
EXPERT ADVICE

Top 8 Enterprise Server Predictions for 2010

Although the enterprise server market has been among the hardest hit by the struggling economy, there is reason to be hopeful as 2009 draws to a close. I’d like to take a moment to share eight predictions of what we can look forward to in 2010: trends that have the potential to dramatically change the enterprise in the years to come.

1. The Customer Really Is Always Right

As budgets tightened, C-level executives were forced to make hard decisions to survive, shifting their business models to meet what customers needed. In the year to come, they must remain open to this model. Great innovation is often spurred by client demand: building proof-of-concept prototypes and applying what we learn, as IBM did in creating Blue Gene.

Working with the Lawrence Livermore National Laboratory, IBM sought to create a complete solution: not simply a new chip or server, but an entirely new architecture designed to simulate physical phenomena of national interest, such as aging of materials, fires and explosions, that required computational capability far greater than anything then available.

These types of partnerships continue to push the envelope in terms of how much data can be stored and how quickly systems can scale, enabling the industry to bring supercomputing power to the masses by adapting these systems for commercial applications such as simulations and modeling, data mining or business intelligence.

2. Spending Will Gradually Shift to Pre-Integrated Systems That Are Workload Optimized

Despite the industry-wide tendency to view servers, storage and software in isolation, these pieces are fast becoming much more unified. Interest in pre-integrated hardware and software solutions targeted at specific workloads will continue to grow rapidly in response to this trend.

Rather than investing in servers, storage, networking and software separately, customers will turn to integrated bundles of hardware and software that are optimized to perform extremely well on a particular business problem or workload. This shift won’t happen overnight, but in several stages, with vendors initially offering optimized software on hardware and evolving to customized software running on preconfigured hardware.

Customers will benefit from leveraging hardware that has been fine-tuned and can be up and running within minutes as opposed to days. In the long term, systems will be based on a co-optimized design that takes both the hardware and software components into account.

3. New Flexible Hybrid Computing Platforms Will Emerge

It’s not enough anymore to simply accumulate standard commodity processors and servers into a data center if you want to cost-effectively run a business. Hybrid computing platforms that integrate architectures optimized for different kinds of jobs, such as OLTP, data analytics, and so on, will emerge. Business problems that need some OLTP processing and some data analytics, for example, may thus run entirely on a single flexible, hybrid platform.

Just consider Hoplon Infotainment, a Brazilian company that created the massively multiplayer online game “Taikodom.” Rather than purchase thousands of small servers to support its game deployment platform, the company took a hybrid approach, combining Linux on a mainframe with Cell processors, which enables Hoplon to support its millions of players in real time while scaling up and down to meet demand at peak hours.

From online gaming to enabling emerging applications, such as the ability to simulate an automobile before it’s built, hybrid computing has the potential to become the dominant model in the coming years.
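To make the idea concrete, here is a minimal Python sketch of how a scheduler on such a hybrid platform might route mixed work to the engine best suited for it. The job attributes, engine names and routing rule are illustrative assumptions, not a description of any particular product.

# Illustrative sketch: routing mixed work on a hypothetical hybrid platform.
# Engine names and job attributes are assumptions for illustration only.

from dataclasses import dataclass

@dataclass
class Job:
    name: str
    kind: str          # "oltp" or "analytics"
    rows_touched: int

def route(job: Job) -> str:
    """Pick the engine best suited to the job's characteristics."""
    if job.kind == "oltp":
        return "transaction-engine"          # latency-sensitive, small updates
    if job.kind == "analytics" and job.rows_touched > 1_000_000:
        return "parallel-analytics-engine"   # large scans, batch-friendly
    return "general-purpose-engine"

workload = [
    Job("update-account-balance", "oltp", 1),
    Job("nightly-sales-rollup", "analytics", 50_000_000),
    Job("ad-hoc-report", "analytics", 10_000),
]

for job in workload:
    print(f"{job.name} -> {route(job)}")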

4. IT Applications Will Expand Beyond Traditional Workloads

As we enter the next decade, there will be a continued focus on capitalizing on IT to solve real-world problems. In fact, IT is increasingly pertinent to nontraditional areas that were often dismissed as purely social or societal issues in the past.

In reality, improving traffic management in cities like Stockholm and London, powering smart grids in Malta and San Jose, Calif., and managing water around the globe are real problems that can be solved through the strategic use of computing power.

Enterprise systems tightly coupled with software and virtualization enhancements will play a key role in driving a smarter planet, serving as the cornerstone that powers these new applications.

5. Analytics Will Power the Next Wave of Systems Innovation

Along with the explosion of the Internet came the growth of the “Internet of things.” Simply put, we are increasingly using connected objects such as sensors, mobile phones, cameras, roadways and pipelines to track and share critical information in real time.

Enterprise systems of the future must be able to process endless streams of data arriving at very high speed and to analyze that data in real time. To handle this traffic, systems must be reconfigured not only to manage and visually present that data, but also to conduct predictive analytics, helping clients move beyond the “sense and respond” mode to “predict and act” for improved business outcomes.

Ultimately, systems will go one step further, powering prescriptive analytics that recommend what a client should do and feed those recommendations back into business processes.

One application could be to predict spikes in electricity use so the system powering a smart grid could automatically shift resources to meet capacity demands while reducing power consumption, and consequently costs, during “quiet” hours.
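As a rough illustration of the “predict and act” idea, the following Python sketch forecasts the next meter reading from a short rolling window and triggers an action when the forecast crosses a capacity threshold. The window size, threshold and forecasting rule (a simple average-step trend) are assumptions chosen for brevity, not a recommendation of a particular method.

# Minimal "predict and act" sketch: forecast demand from a rolling window
# and act before a capacity threshold is crossed. All parameters are
# illustrative assumptions.

from collections import deque

WINDOW = 6            # number of recent readings to keep
CAPACITY_MW = 95.0    # hypothetical capacity threshold

readings = deque(maxlen=WINDOW)

def predict_next(values) -> float:
    """Naive trend forecast: last value plus the average step between readings."""
    if len(values) < 2:
        return values[-1]
    steps = [b - a for a, b in zip(values, values[1:])]
    return values[-1] + sum(steps) / len(steps)

def on_reading(megawatts: float) -> None:
    readings.append(megawatts)
    forecast = predict_next(list(readings))
    if forecast > CAPACITY_MW:
        # In a real grid this would shift load or bring capacity online.
        print(f"forecast {forecast:.1f} MW exceeds {CAPACITY_MW} MW: act now")
    else:
        print(f"forecast {forecast:.1f} MW: within capacity")

for mw in [70, 74, 79, 85, 90, 94]:
    on_reading(mw)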

6. Cross-Platform Virtualization Becomes a Necessity

While virtualization has helped CIOs deal with the cost and complexity of IT, it has introduced new challenges as companies struggle to manage disparate platforms based on different technologies.

In 2010, enterprises will take advantage of cross-platform virtualization by clustering server, storage and networking resources to allow for improved collaboration and prioritization of key resources.

In the future, systems must be fitted with a layer of automation that overlays everything, allowing companies to better manage their diverse data center environments as one entity.

7. Green IT Pushes Adoption of Cloud-Based Systems

As servers grow denser, with new blades that incorporate servers, storage, switches, memory and I/O capabilities, enterprises will be forced to take a closer look at energy use.

This re-energized focus on green IT will spur the adoption of new technologies such as cloud computing, which is both driven and defined by concepts like automated processes, elasticity and enormous scalability.

Instead of powering thousands of servers, enterprises will be able to run hundreds that automatically scale up and down, cutting down on emissions and power consumption, as well as costs.
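As a simplified illustration of that elasticity, the Python sketch below adjusts a server count between a floor and a ceiling based on measured utilization. The thresholds, limits and step size are assumptions made for the sake of the example, not a description of any particular cloud product.

# Simplified elasticity sketch: grow or shrink a server pool based on
# utilization so idle capacity is not kept powered on. Thresholds are
# illustrative assumptions.

MIN_SERVERS = 20
MAX_SERVERS = 400
SCALE_OUT_AT = 0.75   # add capacity above 75% average utilization
SCALE_IN_AT = 0.30    # remove capacity below 30% average utilization

def resize(servers: int, utilization: float) -> int:
    if utilization > SCALE_OUT_AT and servers < MAX_SERVERS:
        return min(MAX_SERVERS, servers + max(1, servers // 10))
    if utilization < SCALE_IN_AT and servers > MIN_SERVERS:
        return max(MIN_SERVERS, servers - max(1, servers // 10))
    return servers

pool = 100
for load in [0.82, 0.85, 0.60, 0.25, 0.20, 0.45]:
    pool = resize(pool, load)
    print(f"utilization {load:.0%} -> {pool} servers")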

8. Storage: Data Deduplication and SSDs Will Continue to Prevail

The amount of digital information will grow to 988 exabytes by 2010, according to IDC. That’s equivalent to a stack of books from the sun to Pluto and back. Every day, 15 petabytes of new information is being generated — eight times more than the combined information in all the U.S. libraries.

Technologies such as data deduplication and automated storage “tiering,” which moves data between different layers in the storage hierarchy based on usage, will help clients effectively harness this information while reducing costs.
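The Python sketch below shows one simple way usage-based tiering could work in principle: extents that are accessed frequently within a recent window are promoted to a faster tier, and extents that go cold are demoted. The tier names, thresholds and window are assumptions for illustration; tiering features in real storage products use far more sophisticated policies.

# Illustrative usage-based tiering policy: promote hot extents to a fast
# (SSD) tier and demote cold ones to a capacity (disk) tier. Thresholds
# are assumptions for the example.

from collections import defaultdict

PROMOTE_AT = 10   # accesses per window to earn the SSD tier
DEMOTE_AT = 2     # accesses per window before falling back to disk

tier = defaultdict(lambda: "disk")   # extent id -> current tier
access_counts = defaultdict(int)     # accesses seen in the current window

def record_access(extent: str) -> None:
    access_counts[extent] += 1

def rebalance() -> None:
    """Run at the end of each window: move extents between tiers."""
    for extent, count in access_counts.items():
        if count >= PROMOTE_AT and tier[extent] == "disk":
            tier[extent] = "ssd"
        elif count <= DEMOTE_AT and tier[extent] == "ssd":
            tier[extent] = "disk"
    access_counts.clear()

for _ in range(12):
    record_access("orders-index")
record_access("archive-2004")
rebalance()
print(dict(tier))   # {'orders-index': 'ssd', 'archive-2004': 'disk'}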

SSDs will be an increasingly important layer in the storage hierarchy, as they improve performance and enable new types of applications.


IBM Fellow Jai Menon is vice president, technical strategy and CTO of the Systems & Technology Group.
