It’s been half a century since University of California, Los Angeles, computer science professor Jacques Vidal coined the term “brain-computer interface” (BCI) and produced the first peer-reviewed papers on the idea of interpreting brain signals with a computer.
Today, BCI offers huge potential to improve the lives of millions of people around the world, including those with amyotrophic lateral sclerosis (ALS), spinal muscular atrophy (SMA), and other neuromuscular diseases, as well as other conditions that affect the ability to move and communicate.
The two-hour session “Progress in Brain Interface Technology” at MDA’s 2022 Clinical & Scientific Conference in Nashville, Tennessee, in March shed light on this complex and still experimental technology. Leigh Hochberg, MD, PhD, a neurocritical care physician at Massachusetts General Hospital (MGH) in Boston, co-hosted the session with Daniel Rubin, MD, PhD, a neurologist and researcher at MGH and Harvard Medical School.
Dr. Hochberg explained that all BCI systems have three main components: a neural sensor (which detects brain activity), a decoder (which translates brain activity into motor commands), and assistive technology equipment, which can range from a tablet to a wheelchair to a semi-autonomous robot.
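As a rough illustration only, the three-stage pipeline Dr. Hochberg describes can be sketched in a few lines of Python. Every name and signal value below is invented for the sketch; it does not represent any actual research system.

```python
# Hypothetical sketch of the three BCI stages described above: a neural
# sensor produces raw signals, a decoder translates them into a motor
# command, and an assistive device carries the command out.
# All names and numbers here are illustrative, not from a real system.

def neural_sensor():
    """Stand-in for a sensor: returns one fake sample of neural activity."""
    return [0.1, 0.9, 0.2]  # e.g., activity levels on three channels

def decoder(signal):
    """Stand-in for a decoder: maps the strongest channel to a command."""
    commands = ["move_left", "move_forward", "move_right"]
    strongest = signal.index(max(signal))
    return commands[strongest]

def assistive_device(command):
    """Stand-in for assistive technology, e.g., a wheelchair controller."""
    return f"executing: {command}"

signal = neural_sensor()
command = decoder(signal)
print(assistive_device(command))  # executing: move_forward
```

In a real system each stage is vastly more complex, but the division of labor — sense, decode, act — is the same.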
“The goal for any brain-computer interface is to be able to extract signals from the nervous system — from the brain specifically — that are relevant to an intended movement, for example, a movement of the arm, or for somebody with severe dysarthria [weakness in muscles used for speech], to extract intended speech,” Dr. Hochberg said. Those signals from the brain could be turned into actions, such as speaking through a computer that synthesizes speech or moving a robotic arm.
“All kinds of questions need to be answered in order to do that, including what signals we want to be recording from, and what sensor is going to be used,” Dr. Hochberg said.
Using BCI to improve communication and quality of life is a prime focus of Lynn M. McCane, MS, a research scientist at the National Center for Adaptive Neurotechnologies (NCAN), housed at the Stratton VA Medical Center in Albany, New York.
“Back in the ’80s, Farwell and Donchin [researchers with the University of Illinois at Urbana-Champaign] had this idea that you could use an event-related potential for people to communicate because it’s very reliable,” Lynn said. “So, imagine yourself looking for your friend in a crowd. You see him, and you have this recognition. That recognition also creates a synchronous response in the brain that you can actually measure with scalp electrodes.”
Researchers developed a crude visual keyboard in which each letter would flash on a screen. As the letters flashed, an electroencephalogram (EEG) would detect electrical activity in the subject’s brain, an algorithm would decode it, and the letter whose flash evoked the highest-amplitude response would be selected. “But as you can imagine, printing one letter at a time is kind of slow,” Lynn said.
“We thought brain-computer interfaces would be a good way to solve this problem, bypassing the motor system and getting signals directly from the brain,” she explained. “But this technology created a whole new set of problems. Our challenge was, could we use BCI reliably in the environment where these people were?”
To find out, NCAN launched a multi-site clinical trial involving veterans. The study was funded by the US Department of Veterans Affairs (VA).
“Veterans have a much higher incidence of ALS than the civilian community, and the VA was interested in quality of life for their veterans,” Lynn said. “Quite a few veterans were interested in the study.”
So far, NCAN has evaluated 119 people and installed 41 in-home BCI systems. “We have gotten to the point where it’s on a laptop and the amplifier is very small; we can put the whole thing in a suitcase,” she said.
While the NCAN clinical trial is advancing innovation in bringing BCI into people’s homes, turning brain signals into actions is still a challenge. But some studies have shown it is possible, including a 2019 study by Edward Chang, MD, a neurosurgeon at the University of California, San Francisco. His team used electrodes temporarily placed on the brains of five volunteers to record signals from the brain’s speech centers — which control muscles in the tongue, lips, jaw, and larynx — as the volunteers read sentences out loud. Then, a computer decoded the signals and used them to synthesize speech.
Dr. Hochberg and Dr. Rubin are involved with a collaborative research effort called BrainGate, which is developing technology that has successfully allowed people who are paralyzed to control a robotic arm by pretending they are moving their own arm.
Australian neurologist Thomas Oxley, MD, PhD, is an instructor and director of innovation strategy for Mount Sinai Health System’s Department of Neurosurgery in New York. He’s also CEO of Synchron, which builds next-generation BCI solutions, including a novel stent electrode, called the Stentrode, which is inserted through blood vessels. It is currently the only brain-interface device that can be implanted without open brain surgery.
The market for such technology is immense, considering more than 5 million Americans live with severe paralysis. The most frequent causes are stroke, multiple sclerosis, ALS, muscular dystrophy, myasthenia gravis, and spinal cord injury. Worldwide, according to Dr. Oxley, some 90 million people have mild limb impairment.
BCI’s potential and the number of people who could benefit are growing every year. Dr. Oxley pointed out that the devices being developed in 2022 are beyond what was envisioned in 2012. “And we don’t have any idea what these devices will look like in 2032,” he said.
The leaders in the BCI field who presented at MDA’s conference will certainly have a hand in shaping that vision.