Emerging Tech

IBM’s Blue Gene Supercomputer To Seek Origins of Universe

IBM and Dutch astronomy group Astron have announced a joint research project that will harness the 34-teraflop supercomputing capability of IBM’s Blue Gene/L to peer back billions of years, deep into the history and even the birth of the universe.

Blue Gene/L, to be completed by mid-2005, will rank among the world’s most powerful supercomputers, performing more than 34 trillion calculations per second. In its role as seeker of the universe’s origins, it will serve as an alternative to the large optical mirrors and radio dishes currently used to observe distant galaxies. Instead, Blue Gene/L will harness more than 10,000 simple radio antennas spread across the northern Netherlands and Germany, interpreting the data collected from them using high-speed calculations, IBM said.
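The article does not say how the antenna signals are combined, but a standard technique for steering such a "software telescope" is delay-and-sum beamforming: each antenna's signal is time-shifted to compensate for its geometric delay toward a chosen sky direction, then the shifted signals are summed so that radiation from that direction adds coherently. The sketch below illustrates the idea; the antenna layout, sample rate, and integer-sample delays are simplifying assumptions, not details of LOFAR's actual pipeline.

```python
import numpy as np

def delay_and_sum(signals, positions, direction, sample_rate, c=3e8):
    """Coherently combine antenna signals toward a given sky direction.

    signals:   (n_antennas, n_samples) array of sampled voltages
    positions: (n_antennas, 3) antenna coordinates in meters
    direction: unit vector pointing toward the source
    """
    # Geometric delay of each antenna relative to the array origin
    delays = positions @ direction / c                 # seconds
    shifts = np.round(delays * sample_rate).astype(int)  # whole samples

    summed = np.zeros(signals.shape[1])
    for sig, s in zip(signals, shifts):
        summed += np.roll(sig, -s)  # undo the delay, then accumulate
    return summed / len(signals)
```

Steered at the true source direction the delays cancel and the sinusoids add in phase; steered elsewhere they add with mismatched phases and largely cancel, which is what lets a field of fixed antennas "point" without any moving parts.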

“The challenge of processing unprecedented data volumes requires us to raise our technology to a new level,” IBM Research director of exploratory server systems Bill Pulleyblank said. “Blue Gene/L provides the flexibility and power to enable us to meet this grand challenge.”

Big Blue Telescope View

IBM said the supercomputer will give Astron the flexibility and speed required to gather information from its Low Frequency Array (LOFAR) “software telescope” — a distributed network of simple antennas whose signals are combined and processed computationally rather than focused by a physical dish.

Supported by the Dutch government’s Ministry of Education, Culture and Science, the supercomputer is the latest in IBM’s Blue Gene family of supercomputers, which are being used to analyze and solve scientific, business and societal problems. The company’s ultimate goal is to produce a petaflop Blue Gene supercomputer — a machine that can perform 1 quadrillion calculations per second.

Harvard Research Group vice president Bill Claybrook told TechNewsWorld that the key advantage of using supercomputers in research is time.

“People can do work they couldn’t in the past and do it [more quickly],” he said.

IT Aiding Astronomy

Like many scientific researchers, astronomers have benefited from advances in high-performance computing, prompting some to call the present the golden age of astronomy. Indeed, today’s astronomers have access to a vast number of databases of varying types, as well as ever-clearer views of far-off celestial objects and events.

Astron director Harvey Butcher confirmed that discovery in astronomy goes hand in hand with technological innovation, adding that Blue Gene/L may provide the best view yet of the origins of the universe.

“Together with IBM researchers, we hope to learn how to design a new generation of radio telescopes capable of revealing the secrets of the early universe,” Butcher said.

He added that the approach will have implications for other supercomputing applications, including geophysics and precision agriculture.

Distributed Competition

Even as IBM presses forward with its Blue Gene initiative, another trend also is taking hold in the supercomputing space. Yankee Group senior analyst Dana Gardner told TechNewsWorld that the traditional, monolithic approach to supercomputing increasingly is being displaced by a distributed-computing model, in which clusters of cheaper commodity machines are harnessed together for comparable performance at lower cost.

One such cluster, Virginia Tech’s Terascale, consists of 1,100 dual-processor, 2-GHz Power Mac G5 computers. It is being used for research on nanoscale electronics, quantum chemistry, computational chemistry, aerodynamics and molecular modeling of proteins, among other work.
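The article does not give the Virginia Tech cluster's peak rating, but a back-of-the-envelope estimate follows from the figures quoted. Assuming the G5's PowerPC 970 processor can retire four floating-point operations per cycle (two FPUs, each performing a fused multiply-add — an assumption about the chip, not a figure from the article), the theoretical peak works out as:

```python
nodes = 1100         # dual-processor Power Mac G5s, per the article
cpus_per_node = 2
clock_hz = 2e9       # 2 GHz, per the article
flops_per_cycle = 4  # assumed: two FPUs, each a fused multiply-add per cycle

peak = nodes * cpus_per_node * clock_hz * flops_per_cycle
print(peak / 1e12)   # prints 17.6 (teraflops, theoretical peak)
```

Real workloads sustain only a fraction of such a peak, which is one reason cluster and monolithic designs are compared on benchmarked rather than theoretical numbers.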

Meanwhile, the U.S. Department of Defense has announced its own commitment to a Linux cluster from Linux Networx, which will provide a 2,132-processor Evolocity II cluster for an Army research facility. Purchased under the High Performance Computing Modernization Program, it is the largest cluster using new Intel 64-bit extension technology ever sold to the Pentagon, according to Linux Networx and the DoD. Financial details of the deal were not disclosed.

Pluses and Minuses

However, Claybrook said that while Linux clusters and other distributed-computing models have significant price and performance advantages, they also suffer from manageability issues that are less prevalent in traditional supercomputers.

“You’re still going to have supercomputers,” he said, referring to national laboratories in the United States that use the colossal machines. “People still seem to be buying them.

“Plus, there’s a huge competition among countries to have the biggest supercomputer,” Claybrook added, referring to foreign governments’ investment and development in this arena. “People love to build these big things.”
