Darpa funds brain-machine interface project for controlling weapons via thoughts
Image credit: DARPA
The US Defense Advanced Research Projects Agency (Darpa) has officially funded a programme to develop a brain-to-machine interface – in the form of a headset designed to let military personnel control weapons through brain activity alone.
The Next-Generation Nonsurgical Neurotechnology (N3) programme, first announced in March 2018, funds the development of high-resolution, bidirectional brain-machine interfaces for use by able-bodied service members.
According to Darpa, these wearable interfaces could ultimately enable diverse national security applications such as control of active cyber-defence systems and swarms of unmanned aerial vehicles.
The agency is also hoping such an interface could make it easier for service members to carry out complex tasks as well as help them multitask.
“Darpa is preparing for a future in which a combination of unmanned systems, artificial intelligence, and cyber operations may cause conflicts to play out on timelines that are too short for humans to effectively manage with current technology alone,” said N3 programme manager Al Emondi.
“By creating a more accessible brain-machine interface that doesn’t require surgery to use, Darpa could deliver tools that allow mission commanders to remain meaningfully involved in dynamic operations that unfold at rapid speed.”
Over the past 18 years, Darpa has demonstrated increasingly sophisticated neurotechnologies that rely on surgically implanted electrodes to interface with the central or peripheral nervous systems.
For the military’s primarily able-bodied population to benefit from neurotechnology, nonsurgical interfaces are required, with the agency highlighting how the N3 project aims to “create reliable neural interfaces without the need for surgery or implanted electrodes”.
According to Darpa, the N3 teams are pursuing a range of approaches that use optics, acoustics and electromagnetics to record neural activity and/or send signals back to the brain at high speed and resolution. The research is split between two tracks.
The teams are pursuing either completely non-invasive interfaces that are entirely external to the body or minutely invasive interface systems, including nanotransducers, that can be temporarily and non-surgically delivered to the brain to improve signal resolution (see graphic below).
Darpa also said that the technology has to be “read and write”, meaning bidirectional: it will not only let soldiers control a drone swarm – an example used by Darpa – but will also feed sensory information into people’s brains, making them feel pressure or even see things.
The latter scenario is something Rice University – one of the recipients of Darpa’s multi-million-dollar funding for N3 – is working on: a system that would allow a blind person, or anyone else connected to it, to see what another person is seeing.
If research in this area is successful, the next step will be to emulate brain activity to reproduce images captured by a digital camera.
Darpa envisions two ways to make this happen. One is completely non-invasive: something similar to a helmet or a diadem that uses radio-frequency waves to carry information into and out of the brain.
Ultrasound, light, RF and magnetic fields – combined with algorithms that decode and encode the brain’s motor and cognitive signals – would target specific areas of the brain.
The target for the non-invasive technology is a closed-loop latency – the time it takes a signal to travel out of the brain, through the system and back in – of 50 milliseconds, shorter than the blink of an eye. Darpa’s report also specifies six degrees-of-freedom control of a machine.
The other approach, ‘minutely invasive’ neural interfaces, will require a substance to enter the subject’s body orally, through a nasal spray or another mechanism such as injection.
Rather than affecting areas of the brain, Darpa expects this to work at single-neuron resolution, connecting to each neuron individually, with the agency anticipating that this method could achieve ten degrees of freedom.
“If N3 is successful, we’ll end up with wearable neural interface systems that can communicate with the brain from a range of just a few millimetres, moving neurotechnology beyond the clinic and into practical use for national security,” Emondi said.
“Just as service members put on protective and tactical gear in preparation for a mission, in the future they might put on a headset containing a neural interface, use the technology however it’s needed, then put the tool aside when the mission is complete.”
Darpa has awarded funding to six organisations to support the Next-Generation Nonsurgical Neurotechnology (N3) programme.
The multidisciplinary teams developing the interfaces are led by Battelle Memorial Institute, Carnegie Mellon University, Johns Hopkins University Applied Physics Laboratory, Palo Alto Research Center (PARC), Rice University and Teledyne Scientific.
In April 2019, researchers from UC Berkeley and the US Institute for Molecular Manufacturing (iMM) predicted that exponential progress in nanotechnology, nanomedicine, artificial intelligence and computation will lead to the development of a “human brain/cloud interface” (B/CI), which connects neurons and synapses in the brain to vast cloud-computing networks in real time.