Computers could soon work like the HUMAN BRAIN: IBM unveils groundbreaking new PC chip architecture
- New programming architecture will allow developers to design applications for brain-like chips that are currently in development by IBM
- Chips could lead to machines that have the capacity for perception and thought
PUBLISHED: 06:19 EST, 8 August 2013 | UPDATED: 06:23 EST, 8 August 2013
Scientists believe we are one step closer to creating an artificial mind.
IBM has announced a new programming architecture for chips inspired by the human brain.
The company claims the chips could pave the way for smart sensor networks that mimic the brain’s capacity for perception, action, and thought.
One day, it could allow computer scientists to develop a machine with a brain that is even more intelligent than that of humans.
IBM said its new programming architecture will allow developers to design applications for these brain-like chips once they are released.
‘Architectures and programs are closely intertwined and a new architecture necessitates a new programming paradigm,’ said Dr Dharmendra Modha, the principal investigator for the project at IBM Research.
‘While complementing today’s computers, this will bring forth a fundamentally new technological capability in terms of programming and applying emerging learning systems.’
The computers we use today were designed decades ago for sequential processing according to a pre-defined program.
Although they are fast and precise ‘number crunchers,’ our computers struggle to deal with real-time processing of the noisy, voluminous, big data produced by the world around us.
In contrast, the brain, which operates comparatively slowly and at low precision, excels at tasks such as recognising, interpreting, and acting upon patterns.
Overall, the brain consumes about the same amount of power as a 20-watt light bulb and occupies the volume of a two-litre bottle.
In August 2011, IBM demonstrated a building block of a novel brain-inspired chip architecture based on a scalable, interconnected, configurable network of ‘neurosynaptic cores.’
The chip’s memory functions as synapses would in the brain, the processors as neurons and communication as nerve fibres.
These chips attempt to replicate and improve on the brain’s ability to respond to biological sensors and to analyse vast amounts of data from many sources at once.
To achieve this, the researchers developed a highly scalable functional software simulator of a cognitive computing architecture comprising a network of neurosynaptic cores.
Alongside this they created a digital neuron model which acts as the main information processing unit.
The company claims that within this, a network of such neurons can sense, remember, and act upon a variety of different situations.
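A common way to illustrate the kind of simple digital neuron the researchers describe is a leaky integrate-and-fire model: the neuron accumulates weighted input over time, gradually ‘leaks’ charge, and fires a spike when its potential crosses a threshold. The sketch below is illustrative only; the parameter values and function are not IBM’s actual design.

```python
def simulate_lif(inputs, leak=0.9, threshold=1.0):
    """Leaky integrate-and-fire neuron (illustrative sketch).

    Integrates input current at each time step, applies a leak,
    and emits a spike (1) when the membrane potential crosses
    the threshold, resetting afterwards.
    """
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current  # leak, then integrate
        if potential >= threshold:
            spikes.append(1)   # neuron fires
            potential = 0.0    # reset after spiking
        else:
            spikes.append(0)   # no spike this step
    return spikes

# Three weak inputs accumulate until the neuron fires; a single
# strong input makes it fire immediately.
print(simulate_lif([0.5, 0.5, 0.5, 0.0, 1.2]))  # → [0, 0, 1, 0, 1]
```

Networks of such neurons, wired together through synapse-like memory, are what allow the architecture to sense, remember, and act as the company describes.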
IBM, in collaboration with Cornell University and iniLabs, has been awarded $12 million in new funding from the Defense Advanced Research Projects Agency (DARPA) to advance the research for this part of the Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE) project.
The project is part of the same research that led to IBM’s announcement in 2009 that it had simulated a cat’s cerebral cortex, the thinking part of the brain, using a massive supercomputer.
Using progressively bigger supercomputers, IBM had previously simulated 40 per cent of a mouse’s brain in 2006, a rat’s full brain in 2007, and one per cent of a human’s cerebral cortex in 2009.
Eventually IBM wants to build a chip system with ten billion neurons and a hundred trillion synapses.
IBM REVEALS ITS VISION FOR FUTURE TECHNOLOGY IN 2018
If you’ve only just got used to talking to your phone, get ready for a major change.
In December, IBM revealed its predictions for the computer we will all be using in 2018 – and it believes they will have all five senses, and will communicate with us in radically different ways.
‘Infrared and haptic technologies will enable a smart phone’s touchscreen technology and vibration capabilities to simulate the physical sensation of touching something,’ the firm said.
‘So you could experience the silkiness of that catalog’s Egyptian cotton sheets instead of just relying on some copywriter to convince you.’
‘It’s amazing when you look back over the 60+ years of the computing revolution and see how far we have come in such a relatively short time,’ said IBM’s Bernard Meyerson.