Google Sets Eyes on Skies With NASA Deal

Google took a giant step today, announcing jointly with NASA’s Ames Research Center that the two will combine efforts on technology research ranging from the microscopic to the massive.

The search engine company and the space agency signed a memorandum of understanding (MOU) that Google says outlines projects including “large-scale data management, massively distributed computing, bio-info-nano convergence and encouragement of the entrepreneurial space industry.”

Bio-info-nano convergence is the combination of biological, information and nano technology, which NASA says is essential to getting an astronaut to Mars.

Nothing Solid

So far, plans are mushy.

“Everything is tentative at this point — the ink is not yet dry on the MOU — but I think the most immediate project would be to provide better access to NASA’s scientific data and images,” Peter Norvig, Google’s director of search quality, told TechNewsWorld.

“Already, Google Earth and Google Maps provide much wider access to satellite photos than ever before; we’d like to do that with much more of NASA’s data,” he said.

Google will also expand into NASA Research Park at Moffett Field, which is in Mountain View, Calif., very close to Google’s headquarters. The search company said it will develop up to one million square feet of office space there, but will continue to maintain its company headquarters separate from the research center.

The two organizations, seemingly disparate, do in fact mesh, partly because each one deals with terabytes of data, albeit in vastly different ways.

Super Opportunity

Norvig said he’s fascinated by the differences in supercomputer structure between the two organizations.

“To me personally, the most interesting thing is comparing our supercomputer architectures and seeing how they can inform each other. Ames’ Columbia supercomputer last year broke the record as the world’s fastest — it may since have been broken by someone else — and is built around the idea of a large — 24 terabyte — memory, shared among multiple processors. This is appropriate for problems like calculating the wind flow over the space shuttle as it enters the atmosphere. Any part can affect any other part, so you want it all in one memory image.”

“Google has taken a very different approach to supercomputers,” Norvig indicated, “[using] commodity off-the-shelf computers with small memory sizes (a few gigabytes) networked together.”

“This is appropriate when your problem is something like looking at lots of Web pages,” he explained, “where what appears on one page does not have an immediate effect on what appears on the other. Getting our system designers together with theirs, I believe we both can learn how to improve our respective approaches, and perhaps an interesting hybrid can emerge.”
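Norvig’s contrast can be sketched in miniature. The example below (illustrative only, not Google’s actual infrastructure) shows the commodity-cluster style he describes: each “page” is processed independently by a separate worker with no shared state, so the work parallelizes trivially across machines with small memories. The function names and the four-worker pool are assumptions for the sake of the sketch.

```python
# Illustrative sketch of the "independent tasks, no shared memory" style
# Norvig describes -- each page is processed on its own, so no worker
# needs to see any other worker's data.
from multiprocessing import Pool

def count_words(page: str) -> int:
    # Each worker holds only its own small slice of the data in memory.
    return len(page.split())

def total_words(pages: list) -> int:
    # Map the independent tasks across workers, then combine the results.
    with Pool(processes=4) as pool:
        counts = pool.map(count_words, pages)
    return sum(counts)

if __name__ == "__main__":
    pages = ["the quick brown fox", "jumps over", "the lazy dog"]
    print(total_words(pages))  # 9
```

A shared-memory problem like the shuttle airflow simulation he mentions would not decompose this way, because each grid cell’s state depends on its neighbors at every step; that is why Columbia keeps the whole problem in one 24-terabyte memory image instead.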
