(Tech Xplore)—IBM and the Air Force Research Laboratory are working to develop an artificial intelligence-based supercomputer with a neural network design that is inspired by the human brain.
The work involves building a supercomputer that behaves like a natural brain, in that its chips operate in a fashion similar to the synapses of a biological brain. The system is powered by a 64-chip array of the IBM TrueNorth Neurosynaptic System.
This 64-chip array, said Andrew Tarantola in Engadget, will contain "the processing equivalent of 64 million neurons and 16 billion synapses."
In technical terms, the two are partnering to improve the "TrueNorth line of chips designed to optimize the performance of machine learning models at the hardware level," said John Mannes in TechCrunch.
The system fits in a 4U-high (7") space in a standard server rack, said IBM, and eight such systems will scale to 512 million neurons per rack.
How are the chips different from conventional CPUs?
"Each core is part of a distributed network and operates in parallel with the others on an event-driven basis. That is, these chips don't require a clock, as conventional CPUs do, to function," said Tarantola. If a core fails, the rest of the array continues to work.
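The event-driven idea can be sketched in a few lines. This is purely illustrative: the queue-based scheme, function names, and toy "cores" below are assumptions for explanation, not TrueNorth's actual design.

```python
import heapq

def run_event_driven(events, handlers):
    """Process (time, core_id, payload) events in time order.

    Cores do no work between events -- there is no global clock tick --
    and a failed core's events are simply skipped while the rest proceed.
    """
    queue = list(events)
    heapq.heapify(queue)
    log = []
    while queue:
        time, core_id, payload = heapq.heappop(queue)
        handler = handlers.get(core_id)
        if handler is None:  # core failed: the rest of the array keeps working
            continue
        log.append((time, core_id, handler(payload)))
    return log

# Two live cores and one failed core (id 2 has no handler).
handlers = {0: lambda x: x + 1, 1: lambda x: x * 2}
events = [(3, 1, 5), (1, 0, 5), (2, 2, 5)]
print(run_event_driven(events, handlers))
# → [(1, 0, 6), (3, 1, 10)] -- the t=2 event for the failed core is dropped
```

Nothing happens unless an event arrives, which is the intuition behind the low power draw reported below.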
Observers are also calling out its low power consumption. "This 64-chip array will contain the processing equivalent of 64 million neurons and 16 billion synapses, yet absolutely sips energy," said Engadget.
StreetInsider said the processor component will consume "the energy equivalent of a dim light bulb – a mere 10 watts."
Beyond the collaboration with the Air Force, IBM believes the low power consumption of its chips could someday bring value, said TechCrunch, "in constrained applications like mobile phones and self-driving cars."
Indeed, an IBM Research posting caption for a smartphone says, "Low power chips could make your mobile phone as powerful as a supercomputer."
Traditional computers (left brain) focus on analytical thinking and language.
Neurosynaptic chips, though, address the senses and pattern recognition (right brain).
IBM said its scientific quest is how to meld these two capabilities into holistic computing intelligence.
The IBM news release on the collaboration said the scalable platform IBM is building for AFRL "will feature an end-to-end software ecosystem designed to enable deep neural-network learning and information discovery."
The news release further described how this melding can occur:
"The IBM TrueNorth Neurosynaptic System can efficiently convert data (such as images, video, audio and text) from multiple, distributed sensors into symbols in real time. AFRL will combine this 'right-brain' perception capability of the system with the 'left-brain' symbol processing capabilities of conventional computer systems. The large scale of the system will enable both 'data parallelism' where multiple data sources can be run in parallel against the same neural network and 'model parallelism' where independent neural networks form an ensemble that can be run in parallel on the same data."
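The distinction between the two parallelism modes in the release can be shown with a toy sketch. The stand-in "networks" below are hypothetical placeholders, not IBM's software stack:

```python
# Illustrative sketch of the two parallelism modes described above.

def net_a(x):          # stand-in for neural network A
    return x * 2

def net_b(x):          # stand-in for neural network B
    return x + 10

# Data parallelism: multiple data sources run against the SAME network.
sources = [1, 2, 3, 4]
data_parallel = [net_a(x) for x in sources]        # each source independent

# Model parallelism: an ensemble of independent networks on the SAME data.
sample = 3
model_parallel = [net(sample) for net in (net_a, net_b)]

print(data_parallel)    # → [2, 4, 6, 8]
print(model_parallel)   # → [6, 13]
```

In both cases the iterations are independent of one another, which is what lets a large system run them concurrently.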
Why is the Air Force interested and how would they use this technology?
Washington Technology said, "AFRL is investigating potential uses of the system in embedded, mobile and autonomous settings where limitations exist on the size, weight and power of platforms."
Tarantola: The Air Force wants to combine TrueNorth's ability to convert multiple data feeds—audio, video or text—into machine-readable symbols with a conventional supercomputer's ability to crunch data.
(AFRL seeks to combine that so-called "right-brain" function with "left-brain" symbol processing capabilities in conventional computer systems, said Washington Technology.)
In the Air Force context, Mannes said applications could include its use in satellites and unmanned aerial vehicles (UAVs).
Meanwhile, reports noted on Friday that the technology is still very much in the early stages. Mannes in TechCrunch: "IBM's chips are still too experimental to be used in mass production, but they've shown promise in running a special type of neural network called a spiking neural network."
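Spiking neural networks differ from conventional ones in that neurons communicate via discrete spikes rather than continuous activations. A toy leaky integrate-and-fire neuron gives the flavor; the parameters and function name are arbitrary assumptions for illustration, not how TrueNorth implements it:

```python
def lif_neuron(inputs, threshold=1.0, leak=0.9):
    """Leaky integrate-and-fire: membrane potential accumulates input,
    leaks over time, and emits a spike (1) when it crosses threshold."""
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current
        if potential >= threshold:
            spikes.append(1)
            potential = 0.0        # reset after spiking
        else:
            spikes.append(0)
    return spikes

print(lif_neuron([0.4, 0.4, 0.4, 0.0, 1.2]))
# → [0, 0, 1, 0, 1] -- small inputs accumulate until a spike fires
```

Because a neuron only emits output when it spikes, downstream hardware can stay idle the rest of the time, which is the link to the chip's event-driven, low-power design.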
At this juncture, it is useful to know that the technology has had its detractors. TechCrunch said that in 2014 a research director at Facebook expressed skepticism about TrueNorth's ability to deliver value in real-world applications. The chips were designed for spiking neural networks, a type of network that, he said, had not shown as much promise as convolutional neural networks on common tasks like object recognition.
Mannes commented: "We haven't fully explored all the potential applications of this type of computing, so while it's very reasonable to be conservative, researchers have little incentive to completely disregard the potential of the project."
The IBM TrueNorth Neurosynaptic System was originally developed under the auspices of Defense Advanced Research Projects Agency's (DARPA) Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE) program in collaboration with Cornell University. Research with TrueNorth is currently being performed by over 40 universities, government labs and industrial partners on five continents.