A collaboration between IBM and DARPA has produced next-generation computer chips that adapt to unexpected inputs because their structure mimics the brain's neurons. Applications for these "neuro-synaptic chips" include analyzing financial market fluctuations and even predicting tsunamis.
Five years ago, there was considerable talk among IT scientists about an innovation in computer chip design dubbed "adaptive chips." These theoretical chips would more closely approximate human brain cells (neurons), actually "thinking" in a sense, allowing a new form of computer processing "behavior" and dramatically advancing the field of Artificial Intelligence. And, as always, there were many skeptics who doubted that such behavioral innovations were possible.
But now, such chips (and their functional versatility) are no longer theoretical, thanks to a Cognitive Computing Project funded by IBM and a $40 million grant from DARPA (the Defense Advanced Research Projects Agency). The current project continues IBM's four-year (2006–2009) supercomputing research effort, in which its scientists partially simulated a mouse brain, then completely simulated a rat's brain, and finally simulated just 1% of a human cerebral cortex. In a significant advance in neuromorphic machine intelligence, IBM announced the successful development of the chips this past week.
Technically referred to as neuro-synaptic chips, these chips contain numerous digital processors that are structurally and functionally akin to the brain's neurons, complete with synapses (which in the human brain are vital to memory and learning) and axons, which serve as the "data pathways" between chip cores.
And it is not so much what these chips can do that is causing the excitement, but how they do it.
In modern computers, processing and memory are separate functions. In the neuro-synaptic chips, each core, or processing engine, has its own processing, communication, and memory functions closely tied together. This allows the chips to respond to inputs (information) they were not programmed to recognize. It also allows a tremendous increase in parallel processing, letting a computer perform multiple tasks at the same time, as human brains do.
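To make that contrast concrete, here is a minimal Python sketch of a single core that keeps its synaptic "memory" (the weights) inside the same unit that does the processing. This is purely illustrative; the class name, parameters, and the simple integrate-and-fire update rule are inventions for the example, not IBM's actual design:

```python
# Illustrative sketch only -- not IBM's design. It contrasts the usual
# pattern (fetch data from separate memory, process it, write it back)
# with a neuromorphic core whose synaptic "memory" (weights) lives inside
# the same unit that does the processing.

import random

class NeuroSynapticCore:
    """One core: local synaptic weights (memory) plus a spiking update
    rule (processing), with no round trip to external memory."""

    def __init__(self, n_inputs, threshold=1.0, leak=0.9):
        # Synaptic weights are the core's local, built-in memory.
        self.weights = [random.uniform(0, 0.5) for _ in range(n_inputs)]
        self.potential = 0.0        # internal "membrane potential" state
        self.threshold = threshold  # fire when potential crosses this
        self.leak = leak            # potential decays between steps

    def step(self, spikes):
        """Integrate incoming spikes (0/1 per input line), maybe fire."""
        self.potential = self.leak * self.potential + sum(
            w * s for w, s in zip(self.weights, spikes))
        if self.potential >= self.threshold:
            self.potential = 0.0    # reset after firing
            return 1                # spike travels down the "axon"
        return 0

# Many such cores can update independently -- the "massive parallelism"
# Modha describes -- because each already holds the data it needs.
cores = [NeuroSynapticCore(n_inputs=4) for _ in range(8)]
inputs = [random.choice([0, 1]) for _ in range(4)]
print([core.step(inputs) for core in cores])
```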
Quoting IBM Research project leader Dharmendra Modha in a recent Huffington Post article:
“You have to throw out virtually everything we know about how these chips are designed. The key, key, key difference really is the memory and the processor are very closely brought together. There’s a massive, massive amount of parallelism.”
Although we live in an increasingly digital era, most of the physical world still operates in "analog" mode. To analyze real-world processes and make accurate predictions, a computer has to assimilate billions of analog signals and process them into useful information. That takes more than supercomputing; it takes a revolution in chip design.
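As a toy illustration of that analog-to-digital gap (the signal, sample rate, and summary statistic below are made up for the example, not drawn from IBM's work), consider how a computer reduces a continuous real-world signal to discrete samples and then to a single useful number:

```python
# Toy illustration: a "real world" signal is continuous, but a computer
# only ever sees discrete samples of it, which it must then reduce to
# something actionable.
import math

def analog_signal(t):
    # Stand-in for a real-world process, e.g. a tide gauge reading.
    return math.sin(2 * math.pi * 0.5 * t) + 0.3 * math.sin(2 * math.pi * 3 * t)

sample_rate = 100  # samples per second (an arbitrary choice)
samples = [analog_signal(n / sample_rate) for n in range(sample_rate)]

# Reduce many samples to useful information -- here, just the peak
# amplitude over the sampled one-second window.
print(f"peak amplitude over 1 s: {max(abs(s) for s in samples):.3f}")
```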
It is asserted that these new adaptive chips will lead to perceiving, thinking, and sensing computers, attributes that bring the goal of "true" artificial intelligence closer to reality. The integrated neuro-synaptic chips may also lead to a "revolution" in human-machine communication.
Applications include analyzing global financial data, calculating complex hydrological cycles, and even analyzing tidal variations to predict tsunamis.
Image credit: Nrets; CC BY 2.5
It's not a brain, nor is it brain-like, but it will find its way into a missile delivery system!
Yea RIte:
Thanks for your comment. I share your concern about potential uses (although at this point, the speculated uses seem mostly benign and useful).
The fundamental idea is that by mimicking the structure of a neuron (note: analog neurons have been developed in the past), a device will gain some of the same "behaviors" or performance of a brain cell, most importantly the ability to recognize and process information that it (the chip) was not "programmed to recognize."
Brain cell plasticity (in functionality, not growth) and versatility are, I think, the immediate goals…ultimately, the goal may also be to understand how a "mind" can arise from so many interconnected "processors", i.e., neurons.