DARPA, the US military’s ultra-secretive research agency, has unveiled a $65 million plan to plug a human brain directly into a computer and allow the two to communicate with each other.
If that sounds like science fiction, we’re here to tell you that it is, but not for long.
The hope is that by allowing humans and computers to communicate directly, the technology could usher in a new era of people with ‘super senses’, or allow soldiers to learn new skills simply by having them ‘downloaded’ into their brains.
DARPA has been working on what is known as a neural interface system for over a decade, but in the last few years progress has started to pick up pace.
In January 2016 the organisation unveiled a brand-new program called Neural Engineering System Design (NESD) which would focus specifically on creating an “implantable neural interface able to provide unprecedented signal resolution and data-transfer bandwidth between the human brain and the digital world.”
This new $65m injection will see the money split between six research organisations, all of which are working to develop that neural interface.
So how would such an interface work? Put simply, it would take the electrical signals used by neurons in the brain and convert them into the ones and zeros that today’s computers use.
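To make that conversion concrete, here is a toy illustration, not DARPA’s actual method: a crude analogue-to-digital step that maps a simulated neural membrane voltage onto binary codes, the literal ones and zeros a computer works with. The voltage range and bit depth here are illustrative assumptions.

```python
import math

def quantise(voltage_mv, n_bits=8, v_min=-70.0, v_max=30.0):
    """Map a membrane voltage (mV) onto an n-bit binary code.

    The -70 to +30 mV range is a typical neuronal voltage span,
    chosen here purely for illustration.
    """
    levels = 2 ** n_bits - 1
    clamped = max(v_min, min(v_max, voltage_mv))
    code = round((clamped - v_min) / (v_max - v_min) * levels)
    return format(code, f"0{n_bits}b")  # the literal ones and zeros

# A crude simulated spike: resting potential with a brief peak.
trace = [-65.0 + 90.0 * math.exp(-((t - 5) ** 2) / 2) for t in range(10)]
bits = [quantise(v) for v in trace]
print(bits)
```

Real neural interfaces face the same problem at vastly greater scale and fidelity, which is exactly why DARPA’s goal of a million parallel channels is so ambitious.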
Phillip Alvelda, the founding NESD program manager, said: “By increasing the capacity of advanced neural interfaces to engage more than one million neurons in parallel, NESD aims to enable rich two-way communication with the brain.”
If getting a computer to communicate with one million neurons sounds complex, consider the fact that there are 86 billion neurons in the human brain.
One million would still be a huge accomplishment, however, and would, explains Alvelda, allow us to deliver rich sensory signals directly to the brain.
For those of you curious about what each of the six teams will be working on, DARPA provides a complete breakdown of each contract:
A Brown University team led by Dr. Arto Nurmikko will seek to decode neural processing of speech, focusing on the tone and vocalization aspects of auditory perception. The team’s proposed interface would be composed of networks of up to 100,000 untethered, submillimeter-sized “neurograin” sensors implanted onto or into the cerebral cortex. A separate RF unit worn or implanted as a flexible electronic patch would passively power the neurograins and serve as the hub for relaying data to and from an external command center that transcodes and processes neural and digital signals.
A Columbia University team led by Dr. Ken Shepard will study vision and aims to develop a non-penetrating bioelectric interface to the visual cortex. The team envisions layering over the cortex a single, flexible complementary metal-oxide semiconductor (CMOS) integrated circuit containing an integrated electrode array. A relay station transceiver worn on the head would wirelessly power and communicate with the implanted device.
A Fondation Voir et Entendre team led by Drs. Jose-Alain Sahel and Serge Picaud will study vision. The team aims to apply techniques from the field of optogenetics to enable communication between neurons in the visual cortex and a camera-based, high-definition artificial retina worn over the eyes, facilitated by a system of implanted electronics and micro-LED optical technology.
A John B. Pierce Laboratory team led by Dr. Vincent Pieribone will study vision. The team will pursue an interface system in which modified neurons capable of bioluminescence and responsive to optogenetic stimulation communicate with an all-optical prosthesis for the visual cortex.
A Paradromics, Inc., team led by Dr. Matthew Angle aims to create a high-data-rate cortical interface using large arrays of penetrating microwire electrodes for high-resolution recording and stimulation of neurons. As part of the NESD program, the team will seek to build an implantable device to support speech restoration. Paradromics’ microwire array technology exploits the reliability of traditional wire electrodes, but by bonding these wires to specialized CMOS electronics the team seeks to overcome the scalability and bandwidth limitations of previous approaches using wire electrodes.
A University of California, Berkeley, team led by Dr. Ehud Isacoff aims to develop a novel “light field” holographic microscope that can detect and modulate the activity of up to a million neurons in the cerebral cortex. The team will attempt to create quantitative encoding models to predict the responses of neurons to external visual and tactile stimuli, and then apply those predictions to structure photo-stimulation patterns that elicit sensory percepts in the visual or somatosensory cortices, where the device could replace lost vision or serve as a brain-machine interface for control of an artificial limb.
Computers have already advanced to the point where the brain can communicate with them, albeit in a more indirect fashion.
Modern prosthetic limbs allow a remarkable amount of movement and precision by tapping into either brainwaves or nerve signals directly. What DARPA wants to do is improve the quality and bandwidth of that communication.
“Today’s best brain-computer interface systems are like two supercomputers trying to talk to each other using an old 300-baud modem,” Alvelda explains.
Alvelda isn’t the only one who strongly believes computers and humans will become physically linked: Elon Musk has made much the same prediction. Just like Alvelda, Musk pointed out that the problem isn’t the computers themselves, but the way we communicate with them.
“It’s mostly about the bandwidth, the speed of the connection between your brain and the digital version of yourself, particularly output,” he said.
Musk explained that computers are able to communicate at “a trillion bits per second”, while humans trail in their wake at roughly ten bits per second when texting or communicating via smartphones.
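Taking Musk’s round figures at face value, the size of that gap is easy to quantify; this back-of-envelope sketch uses his numbers plus an assumed file size for scale.

```python
# Back-of-envelope comparison using Musk's round figures:
# one trillion bits/s for machines vs ~10 bits/s for a texting human.
computer_bps = 1_000_000_000_000  # 1 trillion bits per second
human_bps = 10                    # rough rate of typing on a phone

ratio = computer_bps / human_bps
print(f"Machines are ~{ratio:.0e}x faster")  # eleven orders of magnitude

# For scale: moving 100 MB of data (8e8 bits, an assumed example size).
data_bits = 8e8
print(f"Computer: {data_bits / computer_bps:.4f} seconds")
print(f"Human:    {data_bits / human_bps / 86400 / 365:.1f} years")
```

At ten bits per second, pushing that 100 MB out of a human brain would take roughly two and a half years, which is precisely the bottleneck a high-bandwidth neural interface is meant to remove.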