IBM Dominates Super High-End, Mind-Blowing Computing

For four years now, IBM’s Blue Gene/L has remained king of the hill among supercomputers, according to TOP500 Supercomputer Sites, the organization that ranks the world’s most powerful systems. On Wednesday, Blue Gene/L again appeared at the top of the TOP500 list.

IBM, however, has another Blue Gene in the works that may someday eclipse the Blue Gene/L.

IBM’s new supercomputer, the Blue Gene/P, launched Tuesday and made the list at number 30; it’s capable of crunching 20.86 teraflops (TFlop/s, or trillions of floating-point calculations per second). The Blue Gene/L, by contrast, cranks out a whopping 280.6 TFlop/s. IBM’s next-closest competitors are both supercomputers built by Cray: the Cray XT4/XT3, which produces 101.7 TFlop/s, and the Cray Red Storm system, which produces 101.4 TFlop/s.

All three systems are used by organizations within the U.S. Department of Energy.

Six in the Top Ten

Of the top ten supercomputers in the world today, six are built by IBM, and the company produces 46 of the top 100. Thirty-nine of those are built on IBM’s Power architecture hardware, running IBM’s AIX flavor of Unix or Linux, according to the company.

“I don’t see anybody else out there who is even close at this point,” Charles King, principal analyst for Pund-IT, told TechNewsWorld.

“I’ve heard other companies make claims, and the lead changes from time to time … but with the kind of sheer scalability IBM has been getting out of the Blue Gene architecture, I don’t see anything on the horizon that will threaten it,” he added.

Sun Microsystems is working on its Sun Constellation System, which is based on reusable components composed of thousands of Sun UltraSPARC T1, AMD Opteron or Intel Xeon processors housed in Sun Blade 6000 servers capable of holding 48 blades per rack.

The first implementation, the so-called Ranger HPC cluster, is expected to go live later this year at the Texas Advanced Computing Center at the University of Texas at Austin. When it’s fully deployed, Sun says it will deliver more than 500 TFlop/s.

3 Petaflops on the Way

While the newer Blue Gene/P debuted on the list at number 30, it represents the future of IBM’s supercomputers. It is a two-rack system that fits in roughly the space of two refrigerators, downright tiny compared to many cluster-based supercomputers. By adding racks, IBM says, the Blue Gene/P can scale out to reach up to three petaflops, a mind-warping 3,000 trillion calculations per second.
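
As a rough back-of-the-envelope check on those figures (a sketch that assumes the perfectly linear rack scaling IBM describes, not IBM’s own sizing method), the two-rack, 20.86-TFlop/s configuration works out to roughly 10.4 TFlop/s per rack:

    # Back-of-the-envelope rack estimate, assuming the linear scaling
    # IBM describes. The figures come from the article; the linear
    # model itself is an illustrative assumption.
    import math

    BASE_RACKS = 2        # the Blue Gene/P system that made the TOP500 list
    BASE_TFLOPS = 20.86   # its measured performance in TFlop/s

    def racks_for(target_tflops: float) -> int:
        """Racks needed to reach target_tflops if performance scales linearly."""
        per_rack = BASE_TFLOPS / BASE_RACKS   # roughly 10.4 TFlop/s per rack
        return math.ceil(target_tflops / per_rack)

    print(racks_for(1_000))   # ~1 petaflop: 96 racks
    print(racks_for(3_000))   # ~3 petaflops: 288 racks

By that arithmetic, the advertised three-petaflop ceiling would take a few hundred racks, consistent with IBM’s pitch that Blue Gene/P grows by adding racks rather than by redesigning the system.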

“We’re confident that we have clients who want to build it up to a petaflop or more,” Herb Schultz, IBM’s Deep Computing marketing manager, told TechNewsWorld. “People might have trouble believing this, but there are some commercial and industrial accounts that have workloads that could consume a petaflop.”

IBM, Schultz said, has made it easier for programmers to write applications for Blue Gene/P, and with the combination of a proven architecture and a roadmap for future development, IBM is seeing new customer interest. The company currently has about 30 Blue Gene customers, a handful of whom are already interested in Blue Gene/P.

With the Blue Gene/P, Schultz said, the architecture lets customers add racks for predictable, linear performance gains: going from two racks to four racks doubles the computing power. That is different from cluster-based systems, Schultz explained, because clusters introduce performance-sucking latencies between nodes, leading to smaller and less predictable gains whenever additional processors are added.

“With regular commodity clusters, you can double the size of it but only get 20 percent more performance,” he noted.
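
Schultz’s 20 percent figure lines up with a simple Amdahl’s-law view of cluster scaling, in which a fixed serial or communication fraction caps the benefit of adding nodes. The sketch below is illustrative only; the 0.002 fraction is a hypothetical value chosen to reproduce his example, not a measured one.

    # Amdahl's-law sketch of why doubling a commodity cluster can yield
    # only ~20 percent more performance. The serial_fraction value is a
    # hypothetical number chosen to reproduce Schultz's example.

    def amdahl_speedup(nodes: int, serial_fraction: float) -> float:
        """Speedup over one node when a fixed fraction of work can't parallelize."""
        return 1.0 / (serial_fraction + (1.0 - serial_fraction) / nodes)

    s = 0.002
    gain = amdahl_speedup(2048, s) / amdahl_speedup(1024, s)
    print(f"Gain from doubling 1,024 nodes to 2,048: {gain:.2f}x")  # ~1.20x

On an interconnect with much lower latency, the effective serial fraction shrinks, which is what makes the near-linear rack-to-rack scaling Schultz describes for Blue Gene/P plausible.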

What Are They Doing?

More than half of the supercomputers on the TOP500 list don’t have a publicly specified use. Those that do are most often used for research and high-end simulations in the earth sciences (geophysics, weather and climate research, and ocean studies) as well as in a variety of unspecified laboratory settings, but they are also increasingly being used in the finance, semiconductor and gaming industries.

“Classified weapons testing has been a constant area of growth for high-end supercomputers, like doing blast testing and computational fluid dynamics, movement of objects through water or air, but one commercial area that has been mentioned recently is insurance,” King said.

“The insurance industry has been doing increasingly sophisticated weather simulation to augment its risk analysis for areas that might be subject to hurricane or tornado damage, but that’s still not the kind of thing where they could justify the cost,” he added.

“What we’ve observed is, if a supercomputer does manage to get past infant mortality and we can derive value from it, we’ll see industry pick it up,” Schultz said. “Pretty soon, things that were once exotic become common in the industry.”
