Saturday, March 27, 2010

Ubuntu review

Irritated with my desktop after an upgrade gone bad and an incident with the Nouveau driver for my Nvidia card that left me X-less, I decided it was time to re-install. I turned to my bookshelf and found Ubuntu Unleashed 2010 Edition. Normally, by the time a book hits my shelf the material is outdated; not necessarily useless, just not the most up to date. This one is an exception. Ubuntu Unleashed 2010 Edition ships with an Ubuntu 9.10 DVD and a "Free Upgrade to Ubuntu 10.04" offer: if you buy the book before the end of 2010, you can get an upgrade kit in the mail.

So I popped the DVD in the drive and started the installation. Nothing new here for anyone who has installed Linux or Ubuntu recently; for those who haven't, it was a pleasant surprise to see that it actually detected my high-resolution monitor and used it to its advantage. It really is strange not having to squint at an installer. The first chapter covers the installation step by step in more detail; it is relatively short and easy to follow. Most people familiar with installing an operating system will not need to read it, but I think it is good to have it there. Just don't let this first chapter keep you from looking further into the book. After putting the DVD in and getting it started, I found myself reading the book through the entire installation, which for some reason went from 0 to 90% quickly and then spent the majority of its time in the 90% range, but I'm not complaining.

The authors really did a good job of writing in understandable language and organizing the book in a logical format. I've found myself flipping through it and finding many golden nuggets of information. I personally would not have picked this book up because of the title, since I'm not a big Ubuntu user, but Ubuntu Unleashed 2010 Edition is packed full of information: 32 chapters and a hefty appendix, to be exact. It is not all Ubuntu-specific either, meaning most of the content should work on just about any distribution, so the book would not be rendered useless if you decide not to go the Ubuntu route. I recommend taking a look at the table of contents and buying this book, as I'm sure you'll be pleasantly surprised at the topics it covers. I think it would be a great book for someone interested in Linux in general: it reads well but can also be used as a quick reference. I wish I had a book like this when I was getting started; it would have saved me a whole lot of time and effort. I have set aside some of the more advanced chapters and made a note to read them later.

Other reviews I've read have said that it has too much terminal use in it, which is something Ubuntu is trying to eliminate. While this may be true, if you want the most out of your Linux distribution, the fact is you will at some point use a terminal, and commands are less likely to change than graphical interfaces are. Although some things may be slightly outdated, I don't think this book should be re-written; it is in the nature of open source and technology to change. If you keep this in mind, I don't think you'll be disappointed with it.

Reference: http://www.thelinuxblog.com/ubuntu-unleased-2010-edition-revie/
Retrieval Date: 03/27/10

Friday, March 26, 2010

i3 Processor reviews

Intel Core i3 530 processor review

A couple of weeks ago Intel unleashed a new dual-core processor series onto the market, armed with high clock frequencies and an integrated graphics unit for a good 2D experience on that monitor of yours; sure, it can even handle a little gaming as well. Obviously we are talking about the 'dale series processors, specifically Clarkdale, a derivative of the Nehalem family of processors.

Now you'd be saying, "Um, Hilbert, didn't we review these processors already?" Um, yes! And no. See, on the 3rd earlier this month we took an in-depth peek at the Core i5 600 series processors. That day Intel also released the Core i3 series processors: exactly the same thing, yet clocked slightly slower and with Intel's Turbo mode stripped away. The end result, however, is a processor that is priced much more attractively, yet as a dual-core processor offers much more bang for the buck in a mainstream or HTPC build. That processor was not seeded to the Dutch press, hence the review today.

The Intel Clarkdale processor lineup includes the 32 nm Core i3 530 and 540 models, as well as the Core i5 650, 660, 661, and 670, all featuring Hyper-Threading, 4MB of L3 cache and support for dual-channel DDR3-1333 memory.

So let's head onward to the next page, where we'll start with a little overview of the Clarkdale series processors and a photo shoot, then head onward into the test session, where we'll throw some performance tests at the processor to see how well it holds up against all the dual-, triple- and quad-core processors out there.

Reference: http://www.guru3d.com/article/core-i3-530-processor-review/ (Amazon: http://www.amazon.com/Intel-Processor-2-93GHz-LGA1156-BX80616I3530/dp/B0030DN1GO)
Retrieval Date: 3/26/10

i5 Processor reviews

New 2010
Intel® Core™ i5 processor

Intel® Turbo Boost Technology

Automatically speeds up your processor when your PC needs extra performance—that's smart performance with a speed boost. Available in select models of the new 2010 Intel® Core™ i5 processor-based systems.
Intel® Hyper-Threading Technology

Features 4-way multi-task processing that allows each core of your processor to work on two tasks at the same time, delivering the performance you need for smart multitasking. You and your PC won't be slowed down, regardless of how many applications you have open at once.
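
To put that in concrete terms, here is a rough C# sketch (my own illustration, not Intel sample code) that prints the logical processor count Hyper-Threading exposes and spreads independent work across it; on a dual-core i5 with Hyper-Threading, Windows reports four logical processors:

using System;
using System.Threading;
using System.Threading.Tasks;

public class HyperThreadingPeek
{
    public static void Main()
    {
        // On a dual-core Core i5 with Hyper-Threading enabled, Windows
        // exposes four logical processors (2 cores x 2 hardware threads),
        // which is the count reported here.
        Console.WriteLine("Logical processors: {0}", Environment.ProcessorCount);

        // Independent iterations can then be scheduled across all of them.
        Parallel.For(0, Environment.ProcessorCount, i =>
            Console.WriteLine("Iteration {0} on managed thread {1}",
                i, Thread.CurrentThread.ManagedThreadId));
    }
}
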
Intel® HD Graphics

Intel® HD Graphics provides superb visual performance for sharper images, richer color, and life-like audio and video. Watch movies and Internet videos in high-definition, play popular game titles and get full support for Microsoft Windows* 7. It's all built in; no need for an extra add-in video card.
Connect In More Places

Access to the internet, whether in your home or while you're mobile, is increasingly important. At home you have a variety of wired and wireless connectivity options with your Intel® Core™ i5 processor–powered PC. What about when you're on the move? You're covered there as well. Pairing Intel®-powered devices with a wireless broadband service package and a tiny USB modem from your service provider enables you to enjoy high-speed broadband internet just about anywhere without depending on WiFi hotspots. So power up and get connected!

If you're buying a new PC, look for one with embedded WiMAX capability.

Reference:http://www.intel.com/consumer/products/processors/corei5.htm
Retrieval Date:3/26/10

Wednesday, March 17, 2010

The real difference between SQL Server and Oracle

For years now there's been a constant war between Microsoft supporters and Oracle supporters. Oracle has these features, SQL Server has these features, etc. But that's not really where the real importance lies. Sure, functionality is a part of it because your database should be able to do what you need it to do. However, do you want to know what the real difference between the two companies is and why Microsoft has made such a strong impact in the industry?

The answer is simple: information. Microsoft has built such a strong community and its members are committed to helping each other. There are so many forums out there you just don't have time to go to them all. And one of the most amazing things I've found is that the MSDN forums are actually sharked by Microsoft's own PSS and dev teams. You just can't get any better than that. You've got both the guys on the support team, and the guys who actually write the code helping you with your problem. You've got MVPs out there writing new and exciting books like crazy. They're really giving up all the secrets on how SQL works, and what you can do with it.

Oracle is still living in the old days where everything is a good ole boys club. This is the world of Linux and Unix where they started, and it's a dinosaur, man. You just can't afford to do business like that anymore. You have to open up your community and start programs to encourage your best people to help and teach.

If you take any 10 DBAs from each side and ask them to look up a solution to a problem on their platform, the SQL guys will find the answer much faster than the Oracle guys will. And that's just a fact. If you're looking for specifics on how Oracle works internally, it's almost impossible to ferret out the info, but with SQL, there are so many open resources that it's just a matter of minutes to find an answer.

Microsoft also has a Connect Web site where users can enter in bugs and feature requests, and these requests go straight to the dev team. Your voice gets heard.

So the real difference between these two platforms is community. Microsoft has gone to great lengths to build a community and really support it. And Oracle is still doing business the old way. It's almost like Oracle's still proud that they're holding on to the good ole boys club. They're proud of how complicated everything has to be in Oracle. Knowledge is for the few and the special. And this attitude is pervasive in third-party vendors as well. Look at all the vendors out there making video training. I haven't seen any for Oracle, but SQL has tons. OK, I've seen a couple for Oracle, but they're all that old style CBT from the 1990s. But there aren't any high-level Oracle people out there making video training that's affordable for the end user.
And the stuff Oracle posts on its site is incomplete at best. It's just not enough. And it's not like nobody's using Oracle. So the user base is there. Then why isn't the training there too?

I think the third-party training vendors don't have training available for Oracle because of the lack of community. It's that same attitude, propagated by the mothership and the whole Unix world, that keeps information from being available. And it's frustrating because, as someone who would like to learn more about Oracle, I could use some of those resources.

The problem, though, is that I've sought them out before and they're just too hard to find. And I don't have time to run comprehensive searches for everything under the sun just to figure out one little aspect of Oracle. I'd rather go through a modern tutorial by an Oracle expert that actually explains how some of this stuff works. Show me good examples, explain to me what they mean, etc.

You know ... pretend that there are some people out there who haven't been doing this for 20 years. Because people are going to use what's available to them and what will get them up and running the fastest.

And right now, with all the factors involved, Microsoft is a better overall platform than Oracle because it doesn't matter what your platform can do if nobody knows how to make it work.

Reference: http://www.infoworld.com/d/data-management/real-difference-between-sql-server-and-oracle-755
Retrieval Date: 03/17/10

Tuesday, March 16, 2010

Visual Studio 2010 review

Developers expected to see Visual Studio 2010 and .NET Framework 4 beta 1 in the Tech-Ed time frame -- and Microsoft didn't disappoint. On Monday, May 18, the first business day following the educational conference, the company made Visual Studio 2010 and .NET Framework 4 betas available for MSDN subscribers to download. Public downloads of the bits were released a few days later.

Beta 1 gives the broad .NET community a closer look at what's coming in Visual Studio 2010 and the revamped code editor built using Windows Presentation Foundation (WPF). As reported in last month's cover story, ".NET 4 Revealed," the updated framework promises maturing class libraries, new parallelism capabilities and a major upgrade to the ADO.NET Entity Framework (EF) that debuted in .NET 3.5 Service Pack 1 (SP1).

In late May, Microsoft continued its emphasis on the cloud, releasing an updated community technology preview of its Azure Tools for Visual Studio, which added support for the Visual Studio 2010 beta. The Azure Tools provide C# and VB templates for cloud-based projects, debugging of the local development fabric and storage, and access via a Windows Live ID to the Azure Services Developer Portal. The Visual Studio extension installs the Azure SDK, which Microsoft also updated in May. The caveat: Windows Azure and .NET Services work with .NET 3.5 SP1; they do not yet support .NET 4.

Trials and Triumphs
Early reports on the Visual Studio 2010 and .NET Framework 4 beta are fairly positive, considering these are still technology previews. Developers should be able to install the Visual Studio 2010 beta side-by-side with Visual Studio 2008 SP1 on the Windows 7 RC (not the Windows 7 beta) without any issues, according to Microsoft.

Joseph Megkousoglou, associate lead software architect at property services firm Knight Frank LLP, lauds the clear separation between managed and native code in the new IDE's installer. He is also pleased with the performance of Visual Studio 2010 on the Windows 7 RC. "I would happily use it as my main IDE, speedwise," Megkousoglou says.

The Team Foundation Server (TFS) 2010 installation is another matter, however. "[It was] quite long and involved," he explains. "As a Subversion guy, I'm still trying to see the benefits of all the features TFS offers."

Still, the new bits have not caused tremendous headaches. "I've already loaded our main projects in Visual Studio 2010 and compiled them with .NET 4," Megkousoglou says. "No major problems there. Performance is very similar to .NET 3.5. The main solution I loaded, comprised of nine different projects, is an ASP.NET application, uses LINQ and Enterprise Library components, and contains a total of 120K lines of code. This is one of our main applications, which receives a huge [number] of visitors every day."

Other developers ran into performance issues with the Visual Studio 2010 beta. "It's sometimes more sluggish to use than Visual Studio 2008 on the same hardware, but I'm hoping that will be resolved by the time it's released," observes Vaibhav Gadodia, a .NET architect at outsourcer Nagarro Inc.

"The biggest problem I have with it is stability," says David Barnhill, a senior consultant with Lab49 Inc. "It has crashed at random times on my machine. I haven't found a pattern to it. But the stability is not bad for a beta 1 product; I've had much worse experiences."

Working WPF
The much-ballyhooed WPF editor part of the new Visual Studio shell is not the showstopper that some developers feared. "While I think the performance is OK, I'm sure it will get better," says Stephen Forte, chief strategy officer of Telerik Inc. "I'm sure beta 2 will have a full-fledged performance."

Forte is a fan of the new WPF shell. "The WPF editor will give us a ton of benefits," he notes. "You can tell right away when you look at it: It looks different and quite frankly it's easier on the eyes."

Despite the new look and feel, Visual Studio 2010 remains familiar and easy to navigate. "It's actually close enough to the development environment of 2008 where you can find your way around really easily," Barnhill says.

Developers who've downloaded the beta have commonly voiced dismay over certain issues. One is too much white space and the new glyphs, which can make it harder to read code. In a posting on the Visual Studio Editor blog a week after the beta dropped, Brittany Behrens, a program manager on the Visual Studio Platform team, noted that Microsoft will revert to the Visual Studio 2008-style glyphs in the final release. Similarly, the widely used Box Selection, missing in beta 1, will be updated with new multi-line editing capabilities and put back into the final release.

Another point of concern for many developers is blurry, fuzzy fonts, especially at lower point sizes. "One of the big things they're going to fix is the text in the display; [it's] a little blurry now," Barnhill says. "There are some problems in WPF's font rendering that they'll be fixing in beta 2 of .NET Framework version 4; that's something that a lot of people really hate."

WPF supports only TrueType fonts, but other bitmap fonts still appear in the font menu in Visual Studio 2010 beta 1. Those fonts will be removed in the final release, according to Behrens. (To learn more about common editor issues in beta 1 and Microsoft's fixes, you can take a look at Behrens' June blog posting at http://tinyurl.com/p93xhm.)

Gadodia says he's willing to wait for Microsoft to add the Compatible Width Layout stack in beta 2 to address WPF's font rendering. "It would be crazy to sit in front of an IDE that has these problems," he notes. "Since a lot of the UI is WPF-based in Visual Studio 2010, it's definitely more painful."

Despite these issues, Barnhill applauds the new shell, which is built on WPF and the new Managed Extensibility Framework. "It's just a much better technical base to start with than before," he says. "It's a lot easier for developers to make extensions to the editor and make it work differently without running a little C++ code. You can do it right in WPF and C#."
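
To get a feel for what the Managed Extensibility Framework buys you, here is a minimal, self-contained MEF sketch; the IGreeter interface and class names are invented for the example, and Visual Studio's real editor contracts are far richer than this:

using System;
using System.ComponentModel.Composition;
using System.ComponentModel.Composition.Hosting;

public interface IGreeter
{
    string Greet();
}

// An extension "part": exported so the container can discover it.
[Export(typeof(IGreeter))]
public class HelloGreeter : IGreeter
{
    public string Greet() { return "Hello from a composed extension part"; }
}

public class Program
{
    // The host declares what it needs; MEF wires it up.
    [Import]
    public IGreeter Greeter { get; set; }

    public static void Main()
    {
        var program = new Program();
        // Discover [Export] parts in this assembly and satisfy the [Import]s.
        var catalog = new AssemblyCatalog(typeof(Program).Assembly);
        using (var container = new CompositionContainer(catalog))
        {
            container.ComposeParts(program);
        }
        Console.WriteLine(program.Greeter.Greet());
    }
}
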
The historical debugger and new multithreading debugging functionality that Barnhill calls "parallel stacks" are also impressive in beta 1. "I really like the historical debugger, but one issue with that is it's only available in the Visual Studio Team System (VSTS) Developer Edition. It doesn't come with Visual Studio Pro," he explains. "I like, overall, the design."

The "parallel stacks" used for debugging multithreaded applications enable developers to set a break point in their source code so that they can figure out where each thread is at a point in time. "The performance of that is fine," says Barnhill. "When you run it and get to that break point, that's the only time you can see your stacks; that parallel stack only makes sense when you're at a break point in your code."
The early looks at the first version of Parallel LINQ (PLINQ), which is rather limited, are encouraging. "It worked as advertised as far as being able to easily take a collection of objects and do parallel operations on those objects," says Barnhill. "It was very easy to do, and more [efficient] than the old way you had to do it."
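
The pattern Barnhill describes looks roughly like this in C# on .NET 4; the prime-counting workload is invented for illustration, but AsParallel() is the real PLINQ entry point:

using System;
using System.Linq;

public class PlinqSketch
{
    static bool IsPrime(int n)
    {
        if (n < 2) return false;
        for (int i = 2; i * i <= n; i++)
            if (n % i == 0) return false;
        return true;
    }

    public static void Main()
    {
        // An ordinary in-memory collection of objects...
        int[] numbers = Enumerable.Range(2, 1000000).ToArray();

        // ...and a parallel operation over it: AsParallel() partitions the
        // input across the available cores and runs the predicate in parallel.
        int primeCount = numbers.AsParallel().Count(IsPrime);

        Console.WriteLine("Primes found: {0}", primeCount);
    }
}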

'Oslo' Preview Debuts 'Quadrant'

Microsoft released a new CTP of "Oslo" in May, giving developers a first look at "Quadrant," the code name for the graphical tool intended to provide browsing of models. Oslo is the code name for Microsoft's model-driven development platform that runs on a SQL Server 2008 repository.

The May 2009 community technology preview (CTP) updates the SDK with runtime and provides the first public look at Quadrant. (An alpha version of Quadrant appeared in the Virtual PC handed out to attendees of Microsoft's Professional Developers Conference 2008, but the graphic modeling tool was not in the public CTPs.)

"What's interesting about the [Quadrant] tool is, now you can put things into the repository and you now have the ability to view them graphically, as opposed to just using command lines or SQL statements," says Stephen Forte, chief strategy officer at Telerik Inc. "It's obviously in an early form but it looks pretty good. You can really work with your application data and metadata much better."

The latest CTP also adds support for UML 2.1 domain models (use case, component, activity and sequence diagrams) and the respective XMI importer.

Also noteworthy is the M editor called Intellipad. It enables developers to compile DSLs created in the M modeling language into image files, rather than having to use a command-line interface, explains Forte.

"You can actually author M in Visual Studio much easier now with this SDK," Forte says. "You could in the past, but integration is even tighter. For example, if you create and end the project in Visual Studio, you can now open that same M project in Intellipad, which you weren't able to do before. So the tooling definitely interoperates nicely."

-J.S.

Entity Framework 4 Unleashed
With the release of the Visual Studio 2010 and .NET 4 betas, developers are getting their first look at the ADO.NET EF 4, which is Microsoft's preferred model for building applications that access databases.

Given much of the backlash about the first release, the update has been eagerly awaited. EF 4 adds support for n-tier APIs and templates, increases Plain Old CLR Objects (POCO) coverage and improves Persistence Ignorance, in addition to other improvements, according to Microsoft.

"The Entity Framework itself has pretty much undergone a radical transformation," Forte says. "It really addressed some of the concerns of the community in that respect."

Nagarro's Gadodia is among those who passed on the first version of EF. "We were pretty happy with LINQ to SQL," he notes. "We had invested a lot of training effort as well, since it was, and is, such a shoo-in for that ORM layer."

Of course, many who invested in LINQ to SQL felt jilted last fall when Microsoft shifted its focus to EF. With EF 4, Gadodia notes that a lot of the features of LINQ to SQL are now available within EF. "It almost feels as if Microsoft copied some of these into EF from LINQ to SQL as a result of developer feedback," he says. "We'll continue to use LINQ to SQL in the short term in our current projects, for the simple reason that most of our developers are trained in the technology."

Among other features he likes in the updated EF are things like the modeling support, which now generates DDL based on the model. "There are other changes which are useful; for instance, POCO support," he adds. "We're developing a new internal application framework to use across projects, and changes in EF have put it in the front running for use in our framework."
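
For a rough idea of what that POCO support means, an EF 4 entity can now be a plain class with no EntityObject base type and no mapping attributes; the Customer and Order classes below are invented for illustration:

using System.Collections.Generic;

// Plain classes with no EF base type or attributes; EF 4 ties them to the
// entity model by matching class and property names. Marking navigation
// properties virtual lets EF generate change-tracking/lazy-loading proxies.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
    public virtual ICollection<Order> Orders { get; set; }
}

public class Order
{
    public int Id { get; set; }
    public decimal Total { get; set; }
    public virtual Customer Customer { get; set; }
}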

Reference: http://visualstudiomagazine.com/Articles/2009/07/01/Early-Feedback-on-Visual-Studio-2010-Beta.aspx
Retrieval Date: 03/16/10

Friday, March 12, 2010

What is Reimaging

Are you looking to cure your computer problems? Then take a look at Reimage.com, an Internet-based solution that automatically fixes almost any XP problem in as little as 30 minutes.

IT professionals know that supporting Windows XP can be a tricky and time-consuming process. After all, the number of malware infections, spyware incidents and user mistakes is growing at an alarming rate! And, invariably, even the most cautious of users will encounter undesired behavior from a Windows XP PC. That undesired behavior can manifest itself in many forms, ranging from unexplained "blue screens" to system slowdowns to nonworking applications and much more.

For MSPs, solution providers and technicians in general, fixing those XP problems often involves a complete reinstall of the operating system or, worse yet, a site visit, both of which could lead to an expense that no one wants to incur. Further complicating the problem is the fact that most PC users fail to back up their desktops on a regular basis, if ever!

Reimage.com takes a new approach to fixing XP systems: The company provides a service via the Web that works using an ActiveX application to replace all of the standard Windows XP files and then cleans the registry of unidentifiable entries or known problems. The service also scans the hard drive and removes viruses, malware, adware or anything that can cause a problem. If the system is unbootable, Reimage.com provides tools on its Web site to create an emergency boot CD, which will enable the subject PC to boot and connect to the Reimage service.

We put Reimage through its paces and tested the product on a few problematic XP PCs, and had good results. One of our first test machines was a system that would not boot after a motherboard and CPU upgrade. The system would hang while loading a particular system file, in this case a file used by a freeware utility called SpeedFan. We created an emergency boot disk using a utility found on the Reimage site. That utility uses a Windows XP Service Pack 2 disc to create an ISO image with the drivers needed to boot a dead PC and connect it to the Internet and then the Reimage service site.

For Reimage to work, all it needs is the ability to "see" the XP hard drive and perform a scan. We rebooted the test system with our newly created boot disk and connected to the Reimage site, logged in to the service, and ran a recovery scan. The scan ran through the hard drive and reloaded all of the Windows XP system files, removed unidentified pieces of software and cleaned the registry. After about 20 minutes, the system was ready for a reboot. Upon reboot, the system functioned again and was for all intents and purposes fully usable.

We tested Reimage on another system that was plagued with spyware and browser hijacks, among other things. The subject system took over 5 minutes to boot and was basically unusable. Internet Explorer consistently went to unintended sites and adverts for spyware removal and security products would constantly pop up. The system also suffered from other performance issues, random crashes and many other problems. We were able to connect directly to the Reimage site, log in to the service and run the ActiveX-based scanner. The service quickly identified a multitude of problems and went to work.

Overall, the repair process took about 25 minutes and offered informative screens during the analysis and repair phase. What's more, the product offered the ability to generate a custom report at the end of the process and even offered an undo capability.

Reimage is currently offered on a monthly subscription basis or as a pay-per-use service. Monthly subscriptions start at $150 per month for as many as 50 PC repairs. The pay-per-use service is available for $200 and allows the repair of 20 PCs. Reimage is available for Windows XP, and Vista support is expected in the near future. Reimage.com is looking for MSPs to partner with and will offer MSPs the ability to "rebrand" the service.



Reference: http://www.channelinsider.com/c/a/Reviews/Reimage-Works-Like-Magic-to-Solve-Problems-with-XP/
Retrieval Date: 3/12/10

Windows SkyDrive Info

Windows Live Folders [SkyDrive] Review
Update: Soon after we published this review, Microsoft announced that the beta was being opened to the public at large and the service was being renamed to "Windows Live SkyDrive." Microsoft also announced several minor improvements, but the service largely looks and works the same as described below. Tweaks include drag-and-drop uploading, thumbnails view, sequential previews, and a view of recently visited folders from other users.

Part of the company's innovative and interactive—and yes, very Web 2.0—Live series, Windows Live Folders is secure file storage that offers shared folders. And like a lot of the Live services, it's still in beta, in this case early beta; its developers say the final product may look quite different. The service also lets you make files available to the public at large. As a true Web 2.0 app, it works entirely in the browser—and it works as well in Firefox as in Internet Explorer. At present, you get 500MB of free storage, and the service always displays how much you have left at the top of the window. The maximum file size you can upload right now is 50MB.


Reference: http://www.extremetech.com/article2/0,2845,2165275,00.asp
Retrieval Date: 3/12/10

Tuesday, March 9, 2010

Windows Defender

Editor Review

Microsoft Windows Defender is perhaps the best free antispyware application we looked at this year, but it's lacking when compared to brand-name antivirus-plus-antispyware solutions. We also disagree with Microsoft's aggressive need to verify our Windows license (not once but twice) before allowing us the opportunity to download and install Windows Defender. Given that it's a free app, we would prefer that Microsoft see the larger picture and keep all desktops clean of malicious spyware, regardless of their Windows status. Once Windows Defender is installed, it's not bad, though it could be better. Advanced users will appreciate the granularity of its controls. We fault Windows Defender only for being too lenient with some adware and spyware, labeling almost every item we tested as a low threat, an opinion not shared by other vendors.

Setup
Although Windows Defender is free, you cannot simply download and run the product. As mentioned, if you haven't already done so, you must first download and install the Windows Verification tool on your desktop, then validate that you are in fact running a licensed version of Windows. Only then may you download Windows Defender. Guess what? Microsoft then asks you again to validate your copy of Microsoft Windows before continuing with the Windows Defender wizard. If you follow the default settings in the installation wizard, you are automatically signed up for Microsoft SpyNet, Microsoft's in-house database of spyware seen in the wild. If you do not want any information transmitted back to Microsoft, choose the Install Definition Updates Only option instead. You will also need to agree to a supplemental license agreement (one that goes beyond what you agreed to when you installed your genuine version of Windows XP SP2 or Windows Vista). And there you have it. It's like getting frisked (twice) as you walk into the post office; Microsoft makes the process of downloading and installing unpleasant for such a pithy application.

Reference: http://reviews.cnet.com/security-and-encryption/windows-defender/1707-3688_7-32367764.html
Retrieval Date: 3/09/2010

Monday, March 8, 2010

Why use TrackRay

TrackRay is free web-based task and time management software for mobile and desktop use. It can track project task assignments, activities, and progress status; record timesheet entries; and evaluate team members' workload.

TrackRay is:
* Simple to use - no complicated program settings, options, or volumes of confusing manuals.
* Full mobile support - all TrackRay functionality and features are accessible using either mobile devices (cell phones, smart handheld devices...) or desktop computers (PC, Mac, Linux...).
* FREE - no fees, bills, trial periods, hidden charges, usage limits or other obligations.

You don't need to install TrackRay; it is hosted, web-based software that can be used from anywhere in the world where the internet is available. Just use your desktop or mobile browser on a Windows PC, Mac, Linux box, mobile cell phone or other device.

TrackRay is like your web-based email, except it tracks projects, tasks and time.

Task and time tracking can be done using a desktop computer or an internet-capable mobile device such as a cell phone. All features are accessible using either medium.



Reference: http://www.freedownloadmanager.org/downloads/TrackRay_58312_p/
Retrieval Date: 3/12/10

Saturday, March 6, 2010

Workgroup or Domain

What is the difference between a domain and a workgroup?

Computers on a network can be part of a workgroup or a domain. The main difference between workgroups and domains is how resources on the network are managed. Computers on home networks are usually part of a workgroup, and computers on workplace networks are usually part of a domain.
In a workgroup:

* All computers are peers; no computer has control over another computer.
* Each computer has a set of user accounts. To use any computer in the workgroup, you must have an account on that computer.
* There are typically no more than ten to twenty computers.
* All computers must be on the same local network or subnet.

In a domain:

* One or more computers are servers. Network administrators use servers to control the security and permissions for all computers on the domain. This makes it easy to make changes because the changes are automatically made to all computers.
* If you have a user account on the domain, you can log on to any computer on the domain without needing an account on that computer.
* There can be hundreds or thousands of computers.
* The computers can be on different local networks.



Windows 2000 Server: Domain or Workgroup, pros and cons?
hd1840 Jan-16-02 04:10 PM

Larry,

I saw your very reasoned and well-thought-out advice in a recent reply about dedicated vs. non-dedicated file servers in an office environment. I wonder if I can try your patience and ask your advice about my situation.

Background: I'm a professional who telecommutes from home at times and am married to a surgeon who also has robust IT needs. I have a 100 Mbps LAN at home which uses a 1.5 Mbps ADSL connection to access the internet. I have a standard gateway router with an uplink to a hub and from there to the individual ports. I also have a networked HP printer with a network card. I have 3 PCs on the network, all running a version of Windows 2000. Two are desktops that I built using AMD processors; one is an old Pentium II 333 IBM laptop.

My questions center on the best implementation for file and print sharing on the network. I have tried a peer-to-peer network and also a client-server network. Currently I'm using a client-server network with Windows 2000 Server installed on one of the AMD desktops, which I'm also using as a workstation. Both have things that I like, but there are some things that are a real pain. I really enjoy the power that Windows 2000 Server gives with advanced disk management, print server, file server, etc. I also have Active Directory configured. I have no end of problems setting up the DNS server because, even though I do have a registered domain with Network Solutions, I don't host it on any of my machines. There are a lot of issues with Active Directory.

Other things to consider:

Security is not my primary concern
The ability to log on to any machine is important
The ability to share files and the network printer is important
Shared internet connection is important (currently using the DHCP server included in the router to dish out IPs to the network, with the corresponding DHCP service turned off in Windows 2000 Server)

Question 1: What is the best solution for what I need? I really don't like the hassles of Active Directory, but do enjoy the other features of W2K Server. I've been thinking that perhaps the best bet would be to use W2K Server with a workgroup rather than a domain (which I'm using now). In the peer-to-peer setup that I had, it seemed like every time you wanted to use a file on another machine, you had to remember the password you used on that machine, and it became very confusing. If I just reinstall W2K Server with a workgroup, and don't use AD, will that give me the best of both worlds?

Question 2: As you know, using a server also as a workstation can be fraught with peril. Is it reasonable to load W2K Server on the laptop and just let it sit in a corner and act as the server for this network? I'll miss it, but would gladly make the tradeoff if it meant a stable environment. Right now it seems that I lose my network printers, requiring a reboot, and the like. Or, if I go the W2K Server route using a workgroup, can I get away with a non-dedicated server? As you can tell, I'm loath to take a step back to the peer-to-peer environment if I can avoid it.

Question 3: Is there a way to use AD (and thereby have a domain) when you don't host the domain on the network? For instance, my domain is located on a host in another state, but for lack of anything better, I used the same domain name in the home environment when setting up W2K Server. When I look at the error log, there is no end to the AD messages, especially that QoS (I think that is right) is not properly configured. Considering that I have two DNS servers for my domain (although not on my network), is it possible to use them and keep AD content? All attempts so far have met with no success. I have Minasi's book (both for Server and Pro) and use it for most of my reference needs.

I've been chasing myself around in circles, and admit that I really like to push the envelope. Considering that this is a very small LAN, and I have some expertise, I want to use the most robust solution I can without pulling my hair out with reboots and other problems. If there are some things that will allow me to use a full-blown domain and W2K Server, that would be preferred. What can you recommend? Looking for advice on the best way to proceed. Thanks!

1. RE: Windows 2000 Server: Domain or Workgroup, pros and cons?
lbyard Jan-16-02 05:17 PM
In response to message 0

>I've been thinking that perhaps the best bet would be to use W2K server with a workgroup rather than a domain (which I'm using now). In the peer to peer that I had, it seemed like every time you wanted to use a file on another machine, you had to remember the password you used on that machine, and it became very confusing. If I just reinstall W2K server with a workgroup, and don't use AD, will that give me the best of both worlds?

Coincidentally, I have a network similar to yours (three home offices; 5 computers) and I just retired my NT server because I do not need a domain controller, DNS, or WINS. A Windows 2K Pro computer replaced it for shared data. Another reason is that I cannot see purchasing Windows XP Pro vs. XP Home just for the ability to connect to a domain. If you don't need the security and all the bells and whistles in 2000 Server, why run it (or NT Server, or Novell…)? A router works great for this sort of network, especially if it also has a print server. A simpler solution than DNS for a local web server is to use hosts tables. Search C: with *.sam for a sample and instructions.
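
For anyone who hasn't used hosts tables: the hosts file is just a static, per-machine map from names to IP addresses that is checked before DNS. A couple of illustrative entries (the names and addresses are made up):

# On Windows 2000 the file lives at C:\WINNT\system32\drivers\etc\hosts;
# the *.sam sample file mentioned above shows the same format.
192.168.1.10    server    www.duxhq.local
192.168.1.11    laptop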

> Is it reasonable to load W2K server on the laptop and just let it sit in a corner and act as the server for this network? Or, if I go the W2K server route using a workgroup, can I get away with the non-dedicated server?

I have a dedicated server mainly for the accounting database. Don't confuse a computer dedicated to sharing data (and not also acting as a workstation) with a domain controller and all of those other bells and whistles. You do not need a powerful computer for a server. I use old hand-me-downs for that function. My present server has a 300 MHz K6-2 processor, 32 Mbytes of memory, and a 6.4 Gbyte drive. I plan to upgrade the memory to 64 Mbytes. I'd like a bigger drive for backups, but the current one is fine for shared data (which is backed up regularly to one of the other PCs). I am planning to load Linux on the computer I was running NT Server on and try it out as a server. That computer has a 166 MHz processor, which is fast compared to my first server. That one had a 12 MHz 286 processor, 1 MByte of memory (perhaps it got up to 2-4 Mbytes before we replaced it with a 386 and two mirrored 240 MByte drives), and a 40 MByte drive, and was networked to 8-10 computers.

I'd get rid of the local domain, and I certainly would not use a local domain that is the same as one I had running on a hosting service. I called my local domain DUXHQ for exactly that reason: so as not to have a problem with duxcw.com, which is also hosted remotely. Larry

2. RE: Windows 2000 Server: Domain or Workgroup, pros and cons?
hd1840 Jan-16-02 05:32 PM
In response to message 1

Larry,

Thanks for the advice on the hardware issue. It sounds like my laptop will work as the server in whatever configuration I use.

A further question and clarification. It sounds like you recommend against the hassles of having a domain controller and full-blown AD in a setup like mine.

However, I am unclear on the merits of using W2K Server in a client-peer network using a WORKGROUP as opposed to just reverting back to peer-to-peer and using a regular W2K Pro machine to handle the same function. Can you tell me the pros and cons of these two options now that the field has been narrowed down a bit? Won't all the hassles of DNS, WINS and the like go away when running a workgroup? If there is a reference that addresses this issue in print, I'd be grateful to know what it is so I can read up on it. Thanks in advance.

3. RE: Windows 2000 Server: Domain or Workgroup, pros and cons?
lbyard Jan-17-02 04:26 PM
In response to message 2

>I am unclear on the merits of using W2K Server in a client-peer network using a WORKGROUP as opposed to just reverting back to peer-peer and using a regular W2K Pro machine to handle the same function.

I think you answered your own question... I am using Win 2K Pro and it works just fine. If I wanted all of the server bells and whistles (and expense), I would take the time to install NT Server on the 300 MHz computer. I was running a domain controller as a holdover from the days when I had a 9-person, million-a-year storefront shop/business (see About Dux). Now I have a 750 sq foot shop/office attached to my home, my wife and her office, and my son and his office, and I do not need nor want to bother with the security and complexity of a domain controller. I think one of the worst mistakes Novell ever made was to implement Directory Services and make their server software more complex and harder to use than small businesses and home offices need. Perhaps Microsoft is repeating that mistake. Why make a small server more complex than it needs to be? A server should sit in a corner gathering cobwebs and serving without complaint or attention day-in, day-out, year-in, year-out, 24 hours a day, and one shouldn't have to log into it when at home, unless desired. Simple is better. Larry


Reference: http://www.duxcw.com/dcforum/DCForumID2/1605.html, http://windows.microsoft.com/en-US/windows-vista/What-is-the-difference-between-a-domain-and-a-workgroup
Retrieval Date: 2/6/10

Thursday, March 4, 2010

What's better for buying and selling products, Amazon.com or eBay.com?

I have been selling my used books on Amazon for a while.

1- Generally Amazon is a better place to sell stuff, but not for everything. To sell electronics and some other special items, you need a sort of permission from Amazon to be able to list your item.

2- You may sell anything you want on eBay, and you make your own listing. To sell on Amazon, though, you can only sell what is already available on their website. If it is not available, you have to recommend it to them, and after their approval you get a chance to list your item.

3- eBay is an auction website; Amazon is not. Amazon used to have an auction site, but they were not successful with it and closed it down.

4- To sell your item on Amazon you should be patient. Amazon has fewer visitors, and you might end up waiting longer for your item to be sold.

5- From my experience I can tell you that Amazon fees are tiny compared to eBay fees! There is no listing fee on Amazon, but eBay charges you as soon as you list your stuff for sale. Both have final fees if your item is sold.

6- On Amazon you name your price, so you will get at most as much as you list it for, but on eBay, sometimes quite amazingly, you may sell something at a much higher price than you had originally thought!

So, in summary: if you are not in a rush, use Amazon. It has tiny fees, and you get a fair amount of money for what you want to sell.

Reference: http://askville.amazon.com/eBay-Amazon-place-sell-stuff/AnswerViewer.do?requestId=7778359
Retrieval Date: 2/4/10

Dell PowerEdge Servers

Designed for Performance Intensive Applications
Dell makes it easy to find the perfect tower, rack or blade servers. We started with a clean sheet to create a versatile server portfolio with increased memory capacity and I/O rates for intense applications in demanding environments.

Manageable, Scalable and Flexible
Our PowerEdge server portfolio offers a range of options with easy configuration, powerful open-standard systems management, scalable storage and flexible services.

Introducing the Latest Eleventh Generation PowerEdge Servers
The Dell PowerEdge R510 features advanced management capabilities, cost effective RAID options, and an excellent balance of internal storage, redundancy and value in a compact chassis.

Reference: http://www.dell.com/poweredge
Retrieval Date: 2/4/10