Throughout the year I hear complaints about personal computers, whether they run Windows or one of the relatively rare alternatives. These complaints typically concern system reliability, the cost of migrating employees to new hardware, the cost of setting up new employees, and the massive cost of keeping software up to date.
With the cost of new hardware at all-time lows, these secondary costs can exceed 80 percent of the total cost of maintaining personal computers. That figure doesn’t even include the opportunity cost of having trained software engineers dedicated to keeping hardware running rather than to making the company more productive.
Much of this cost traces back to the roots of the PC. In the beginning, the PC was largely a hobbyist’s toy, and while the components have changed, this home-built architecture has changed very little in the past 30 years.
The Failure of Thin Clients
On the server side, however, several dramatic changes have allowed servers running software similar to that on a desktop PC to operate with substantially greater reliability and security. Because some believed this technology could be applied to the desktop, the concept of “thin clients” started making the rounds, pushed hard by desktop PC pretenders Oracle and Sun.
These companies were server experts but about as far from being experts in low-cost client devices as you can get in this business, and their efforts failed as a result. Sun still has the Sun Ray product line, which to date provides one of the most compelling demonstrations of what is possible in this market, but the solution’s cost and Sun’s lack of credibility in the desktop space doomed the platform. While the idea was a good one, it is almost impossible to displace a dominant technology unless you are credible with that technology first.
Thin-client solutions do exist and are now vastly more capable than they initially were, but servers simply were not designed for the kind of load that desktop users can put on them. One user running a particularly resource-intensive application can degrade performance for everyone, and outside of data-entry roles these solutions haven’t done well, even though they have proven to be very robust and cost effective.
People forgot the “personal” aspect of the PC, and the solution simply gave up too much to achieve its admittedly strong cost benefits. It wasn’t until the dot-com boom that another server technology created a more reasonable solution. During the dot-com years, companies sprang up to specialize in hosting Web sites, but it was quickly discovered that people didn’t like the idea of their Web site being hosted on the same hardware as someone else’s.
The Emergence of Blade Servers
But putting lots of servers into a small space was problematic. You could rackmount the servers, but each one still took up far too much space and was difficult to replace if it failed. Blade servers were created to deal with this problem by putting the entire server on a single card. If a blade failed, you could replace it in moments and move the server image (basically, all of the software that made the server unique) to another blade. At worst, hardware failures would result in only a few minutes of actual downtime.
The blade architecture provided cost-effective personal servers and, in hindsight, clearly showcased a technology that could be applied to desktops. A company called Clear Cube took the server blade and created a desktop blade. While the product took a couple of years to mature, it eventually delivered on the promise of a desktop system that was as secure and reliable as a server while still maintaining the performance expected of a desktop.
It had three problems: It cost more than a comparable desktop at a time when IT was tactically focused on cost; few had heard of the company; and the solution couldn’t be switched. This last problem meant every desktop had to be hardwired via Cat 5 cable back to the blades, which was awkward for most deployments. The issue was recently addressed with a version that can be switched, trading some performance for installation flexibility.
While IBM Global Services stepped in to provide installation support and helped the company ramp up to volume, a comprehensive IBM solution never materialized, because IBM continues to operate as if it is unsure whether it will remain in the PC business.
HP Beats IBM to the Punch
HP, which has clearly taken the innovation lead from IBM in the PC space and remains heavily committed to it, entered the segment with a vengeance with a solution the company calls the Consolidated Client Infrastructure (CCI). What makes the HP solution unique is its use of Transmeta chips to balance performance, heat and flexibility, allowing not only higher blade densities but also more OS flexibility.
Wrapped in HP’s global services and management tools, CCI provides a strong counterpoint to Clear Cube’s offering and, more importantly, validates the PC blade approach by giving buyers the two vendors needed for competitive bidding. While Clear Cube’s hard-wired solution remains best where performance is critical, both vendors can now serve the general customer, and the larger the deployment, the more HP will be favored.
Key benefits of a PC blade include the ability to recover from catastrophic failures in the time it takes to reboot, added physical security, resistance to viruses thanks to rapid-patch capabilities, less noise and heat in the workplace, and performance that can be dynamically allocated on the basis of need rather than tenure. It is no wonder the finance and education markets are enamored of this solution. It makes incredible sense.
In the end, what makes the blade PC an interesting solution is that it leverages a broad spectrum of technologies to provide a level of reliability and security unattainable in more traditional products. While it does come with a slightly higher purchase price, the return on investment is expected within 12 months for most companies, and that is without factoring in the other benefits of an environment that is solidly immunized against catastrophic desktop PC failures.
For desktop-computer deployments, IT now has two choices: Continue to complain about the way things are, or do something about it and move to the more robust future of PC blades.
Rob Enderle, a TechNewsWorld columnist, is the Principal Analyst for the Enderle Group, a company founded on the concept of providing a unique perspective on personal technology products and trends.