Emerging Tech

I Think, Therefore IBM

IBM has announced the latest version of its neurosynaptic processor — that is, a processor whose workings are inspired by the human brain.

Built on Samsung’s 28nm process technology, it has 5.4 billion transistors and an on-chip network of 4,096 neurosynaptic cores.

IBM claims it is the first neurosynaptic chip to achieve 1 million programmable neurons, 256 million programmable synapses, and 46 billion synaptic operations per second.
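
Those headline figures line up with the per-core organization IBM has described elsewhere (256 neurons per core, each with 256 programmable synapses), with "million" counted in the binary sense of 2^20. The quick Python check below uses those assumed per-core numbers, which are not stated in this article.

```python
# Quick check of the published figures, assuming (not stated in the article)
# 256 neurons per core with 256 programmable synapses each, and counting
# "million" in the binary sense of 2**20.
CORES = 4096
NEURONS_PER_CORE = 256
SYNAPSES_PER_NEURON = 256

neurons = CORES * NEURONS_PER_CORE          # 1,048,576   = 2**20        ("1 million")
synapses = neurons * SYNAPSES_PER_NEURON    # 268,435,456 = 256 * 2**20  ("256 million")

print(f"{neurons:,} programmable neurons, {synapses:,} programmable synapses")
```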

The size of a postage stamp, the processor consumes just 70mW of power during real-time operations — about as much as required by a hearing-aid battery, according to IBM.

Children of TrueNorth

This is the second generation of a processor based on IBM’s TrueNorth architecture; the first was unveiled last year.

IBM unveiled a 16-chip system with 16 million programmable neurons and 4 billion programmable synapses to demonstrate the processor’s scalability.
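
Taking the single-chip figures at face value, the 16-chip demonstrator is a straightforward tiling of them, as this short continuation of the earlier sketch shows.

```python
# Scaling the single-chip figures to the 16-chip demonstration system.
CHIPS = 16
board_neurons = CHIPS * 1_048_576       # 16,777,216    ("16 million" neurons)
board_synapses = CHIPS * 268_435_456    # 4,294,967,296 ("4 billion" synapses)
print(f"{board_neurons:,} neurons, {board_synapses:,} synapses")
```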

The processors were built under the SyNAPSE program funded by the United States Defense Advanced Research Projects Agency (DARPA), which has pumped tens of millions of dollars into it since the program's 2008 launch. Participants include IBM, HP and HRL Laboratories.

Some Tech Stuff

Samsung’s 28nm process technology has a dense on-chip memory and low-leakage transistors.

The chip’s event-driven circuit elements use asynchronous design methodology developed at Cornell Tech.

Alongside the hardware, IBM has developed a multithreaded, massively parallel, highly scalable functional software simulator of the TrueNorth cognitive computing architecture.
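
What such a functional simulation involves can be suggested with a toy example. The Python sketch below is not IBM's simulator; it assumes a single hypothetical core with 256 integrate-and-fire neurons and a binary 256 x 256 synaptic crossbar, a far simpler neuron model than TrueNorth's.

```python
import numpy as np

# Illustrative sketch only: one hypothetical neurosynaptic core with
# 256 integrate-and-fire neurons and a binary 256 x 256 synaptic crossbar.
# The real TrueNorth neuron model is far more heavily parameterized.
rng = np.random.default_rng(0)

N = 256                                      # axon inputs and neurons per core
crossbar = rng.integers(0, 2, size=(N, N))   # crossbar[i, j]: axon i wired to neuron j
weights = rng.integers(1, 4, size=N)         # per-axon integer weight (simplified)
potential = np.zeros(N)                      # membrane potentials
THRESHOLD = 20.0
LEAK = 1.0

def tick(input_spikes):
    """Advance the core by one tick given a length-N 0/1 input spike vector."""
    global potential
    potential = potential + (input_spikes * weights) @ crossbar  # integrate
    potential = np.maximum(potential - LEAK, 0.0)                # leak
    fired = potential >= THRESHOLD                               # threshold test
    potential[fired] = 0.0                                       # reset fired neurons
    return fired.astype(int)                                     # output spikes

spikes = rng.integers(0, 2, size=N)          # random input spikes to start
for _ in range(10):
    spikes = tick(spikes)                    # feed outputs back in, just for demo
print("neurons firing on the final tick:", int(spikes.sum()))
```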

IBM researchers also developed a new programming language for TrueNorth. The chip is built around a simple, digital, highly parameterized spiking neuron model that supports both deterministic and stochastic neural computations, codes and behaviors, while each core integrates memory, computation and communication.

Programs are composed from reusable building blocks called “corelets,” each representing a network of neurosynaptic cores; corelets can be combined to produce larger, more complex corelets.
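
IBM's corelet language itself is not reproduced in this article; purely as an illustration of the composition idea, the Python sketch below shows hypothetical building blocks that hide their internal wiring and expose only external inputs and outputs. All names and fields here are made up.

```python
# Hypothetical sketch (not IBM's actual corelet language) of composable
# building blocks that hide their internal wiring and expose only
# external inputs and outputs.
from dataclasses import dataclass, field

@dataclass
class Corelet:
    name: str
    inputs: list
    outputs: list
    subcorelets: list = field(default_factory=list)

    def compose(self, other, wiring):
        """Combine two corelets into a larger one. `wiring` maps this corelet's
        output names onto `other`'s input names; unwired inputs of `other`
        stay externally visible on the combined corelet."""
        exposed = list(self.inputs) + [i for i in other.inputs
                                       if i not in wiring.values()]
        return Corelet(name=f"{self.name}+{other.name}",
                       inputs=exposed,
                       outputs=list(other.outputs),
                       subcorelets=[self, other])

edge_detect = Corelet("edge_detect", inputs=["pixels"], outputs=["edges"])
classifier = Corelet("classify", inputs=["edges"], outputs=["label"])
pipeline = edge_detect.compose(classifier, wiring={"edges": "edges"})
print(pipeline.name, pipeline.inputs, pipeline.outputs)
```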

Cores on the same chip communicate with one another through an on-chip event-driven network, and chips connect to one another through an inter-chip interface, allowing systems to scale seamlessly.
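
As a rough illustration of that routing decision, the sketch below treats each spike as a small address packet and dispatches it either to an on-chip handler or to an inter-chip handler, depending on whether the source and destination share a chip. The packet fields and function names are invented for the example and are not TrueNorth's actual format.

```python
# Hypothetical sketch of spike routing: each spike travels as a small address
# packet, handled by the chip's event-driven network when source and target
# cores share a chip, and handed to the inter-chip interface otherwise.
from collections import namedtuple

SpikeEvent = namedtuple("SpikeEvent", "src_chip src_core dst_chip dst_core dst_axon")

def route(event, deliver_on_chip, deliver_off_chip):
    if event.src_chip == event.dst_chip:
        deliver_on_chip(event)       # stays on the on-chip event-driven network
    else:
        deliver_off_chip(event)      # crosses the inter-chip interface

def on_chip(e): print("on-chip :", e)
def off_chip(e): print("off-chip:", e)

route(SpikeEvent(0, 12, 0, 900, 37), on_chip, off_chip)   # same chip
route(SpikeEvent(0, 12, 3, 5, 200), on_chip, off_chip)    # different chips
```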

The cores operate in an event-driven fashion rather than on a fixed clock, as traditional CPUs do.
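
The practical difference shows up in a toy comparison: a clock-driven design visits every neuron on every cycle, while an event-driven design does work only when spikes arrive. The numbers below are made up purely to illustrate the point.

```python
# Toy contrast between clock-driven and event-driven updating.
# The neuron count and spike list are made up purely for illustration.
neurons = {i: 0 for i in range(1_000)}    # neuron id -> membrane potential
spikes = [(3, 5), (7, 2), (3, 1)]         # (target neuron, weight) events this tick

clocked_work = len(neurons)               # clock-driven: visit every neuron each cycle

for target, weight in spikes:             # event-driven: only touch spiking targets
    neurons[target] += weight
event_work = len(spikes)

print(f"clock-driven touches {clocked_work} neurons; event-driven touches {event_work}")
```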

Individual cores can fail, but, like the brain, the rest of the chip keeps working.

Possible Issues With the Processors

“What about the memory and interface technology?” asked Jim McGregor, principal analyst at Tirias Research. “I’m not really sure how this would be used if it does not require much memory or external communications.”

The processors might be used as controllers “for anything from a robotic system to an artificial organ,” he speculated. “Otherwise software, memory and communications bandwidth could be significant challenges to using this in place of existing computing solutions.”

Where Neurosynaptic Processors Might Be Used

IBM’s latest neurosynaptic processor provides “a level of processing in parallel that comes close to what organic brains can do,” Rob Enderle, principal analyst at the Enderle Group, told TechNewsWorld.

This “moves us from calculations to being able to reason and make decisions,” he explained. “Obvious targets would be automated weapons systems, advanced digital assistants, autopilots, advanced robotics, and systems that could better emulate people.”

They might also come in handy in the burgeoning Internet of Things, because “one of the critical parts of IoT is control, and in large complex systems, human operators can’t monitor enough things or make decisions quickly enough,” Enderle pointed out.

Further, such processors “could open up new areas of prosthetics that can better integrate with the brain and either enhance humans or better repair them after an accident,” he speculated.

Other Players

Qualcomm has been working on its own neuromorphic processors, called “Zeroth” processors, which mimic the workings of the human brain.

Prototype Zeroth processors have been built into mobile robots whose learning and decision-making runs on silicon implementations of algorithms modeled on the anatomy and excitatory spiking behavior of neurons in the brain.

The Massachusetts Institute of Technology also has designed a chip that mimics brain plasticity.

Richard Adhikari

Richard Adhikari has written about high-tech for leading industry publications since the 1990s and wonders where it's all leading to. Will implanted RFID chips in humans be the Mark of the Beast? Will nanotech solve our coming food crisis? Does Sturgeon's Law still hold true? You can connect with Richard on Google+.
