IBM has unveiled a whole new programming paradigm as it bids to become the leader in cognitive computing.
Big Blue researchers believe that, for brain-like computing to become a reality, new programming languages and architectures are required. IBM is out to create them, in a project funded by the US Defense Advanced Research Projects Agency (DARPA), which wins our award for Acronym of the Week: it is called SyNAPSE (Systems of Neuromorphic Adaptive Plastic Scalable Electronics).
The plan is to create “a network of neurosynaptic processor cores, which, like the brain, is modular, parallel, distributed, fault-tolerant, event-driven, and scalable,” said Dr Dharmendra Modha, principal investigator and senior manager at IBM Research.
“It’s a high-level description of a software program that is based on re-usable building blocks of code – the corelets,” Modha explained, in a blog post.
“Each corelet represents a method for getting something done using the combination of computation (neuron), memory (synapses), and communication (axons) on individual neurosynaptic processor cores along with inter-core connectivity.”
On their own, corelets perform small, simple tasks, but they can be composed to carry out more complex functions.
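To make the idea concrete, here is a minimal sketch of corelet-style composition in Python. The class and method names (`Corelet`, `then`) are illustrative inventions, not IBM's actual Corelet language: the point is that each block hides its internals behind input and output ports, and two corelets wired together yield a new, larger corelet.

```python
# Hypothetical sketch of corelet-style composition. Names are illustrative,
# not IBM's Corelet API. Each corelet pairs computation (neurons), memory
# (synapses) and communication (axons) behind input/output ports.

class Corelet:
    """A reusable building block exposing only its inputs and outputs."""

    def __init__(self, name, fn):
        self.name = name
        self.fn = fn  # stand-in for the core's internal spiking computation

    def __call__(self, spikes):
        return self.fn(spikes)

    def then(self, other):
        # Compose two corelets: wire this corelet's outputs (axons)
        # to the other's inputs, producing a new, larger corelet.
        return Corelet(f"{self.name}->{other.name}",
                       lambda spikes: other(self(spikes)))

# Two toy corelets: one thresholds analogue inputs into spikes, one inverts.
threshold = Corelet("threshold", lambda xs: [1 if x > 0.5 else 0 for x in xs])
invert = Corelet("invert", lambda xs: [1 - x for x in xs])

combined = threshold.then(invert)   # composition builds a new corelet
print(combined([0.9, 0.2, 0.7]))    # [0, 1, 0]
```

The composed corelet can itself be composed further, which is the "re-usable building blocks" property Modha describes.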
Modha hopes the new programming model will have the same impact on programming that FORTRAN once did, making it easy for anyone to create “sophisticated cognitive applications”.
IBM has also produced a software simulator of a cognitive computing architecture, comprising a network of neurosynaptic cores. “We have demonstrated the simulation of the new architecture at unprecedented scale of 10^14 synapses,” Modha noted.
And it has created a “neuron model”, made up of equations that can carry out “a wide variety of computational functions and neural codes and can qualitatively replicate the 20 biologically-relevant behaviours of a dynamical neuron model”, Modha said.
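The article does not reproduce IBM's equations, but a generic leaky integrate-and-fire neuron gives a flavour of what such a model computes; the parameters (`leak`, `threshold`) below are illustrative, not IBM's published values.

```python
# A minimal leaky integrate-and-fire neuron sketch. This is a generic LIF
# model for illustration, not IBM's published neuron equations.

def lif_neuron(inputs, leak=0.9, threshold=1.0):
    """Integrate the input each tick while the membrane potential decays
    ('leaks'); emit a spike (1) and reset whenever it crosses threshold."""
    v = 0.0
    spikes = []
    for x in inputs:
        v = leak * v + x          # leaky integration of synaptic input
        if v >= threshold:
            spikes.append(1)      # spike once the threshold is reached
            v = 0.0               # reset the membrane potential
        else:
            spikes.append(0)
    return spikes

print(lif_neuron([0.4, 0.4, 0.4, 0.4]))  # [0, 0, 1, 0]
```

Richer behaviours (bursting, adaptation and so on) come from extending this basic integrate-fire-reset loop with extra state and parameters.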
The research team produced a library of software designs that can use this kind of massively parallel computing.
The ultimate goal is to create a chip system with 10 billion neurons and 100 trillion synapses that consumes just one kilowatt of power. That sounds like a lot, but it is still only a tenth of the human brain’s 100 billion neurons, and its synapse count sits at the low end of the brain’s range: humans start out with around a quadrillion synapses, declining to between 100 and 500 trillion in adulthood.
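The comparison works out as follows, using only the figures quoted above:

```python
# Back-of-the-envelope comparison of the SyNAPSE target against the human
# brain, using the numbers quoted in the article.

chip_neurons = 10e9       # target: 10 billion neurons
chip_synapses = 100e12    # target: 100 trillion synapses
brain_neurons = 100e9     # human brain: ~100 billion neurons
adult_synapses = 100e12   # adult low end: 100-500 trillion synapses

print(chip_neurons / brain_neurons)    # 0.1 -> a tenth of the brain's neurons
print(chip_synapses / adult_synapses)  # 1.0 -> matches the low adult estimate
```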