A new research report from Stanford University describes a high-performance brain-computer interface that enables people with paralysis to type words and messages far faster than has previously been demonstrated.
One of the first authors of the report, published today in eLife, is Chethan Pandarinath, Ph.D., assistant professor in the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory University. Pandarinath helped lead the research at Stanford before his recent move to Emory and Georgia Tech.
"The performance is really exciting," said Pandarinath, in a Stanford news release. "We're achieving communication rates that many people with arm and hand paralysis would find useful. That's a critical step for making devices that could be suitable for real-world use."
The research team worked with two people with amyotrophic lateral sclerosis (ALS) and one person with a spinal cord injury. Each participant had small, pill-sized electrode arrays implanted in the brain, allowing the researchers to read out electrical activity as they thought about moving. The researchers then decoded this activity so the participants could control a cursor on a computer screen and type out words and messages.
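For readers curious about what that decoding step looks like in practice, the sketch below illustrates one simple approach: mapping binned neural firing rates to cursor velocity with a linear decoder. This is not the study's actual method; the channel count, the simulated data, the ridge-regression fit, and all variable names are illustrative assumptions.

```python
# Illustrative sketch only: a simple linear (ridge-regression) decoder that
# maps binned neural firing rates to 2-D cursor velocity. The study's real
# decoder was more sophisticated; sizes and names here are assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Simulated calibration data: 96 electrode channels over 2000 time bins,
# paired with the (x, y) cursor velocity the participant intended.
firing_rates = rng.poisson(lam=5.0, size=(2000, 96)).astype(float)
intended_velocity = rng.normal(size=(2000, 2))

# Fit decoder weights W so that velocity ~ firing_rates @ W.
lam = 1.0  # regularization strength
F, V = firing_rates, intended_velocity
W = np.linalg.solve(F.T @ F + lam * np.eye(F.shape[1]), F.T @ V)

# At run time, each new bin of firing rates is mapped to a velocity and
# integrated into an on-screen cursor position.
def decode_step(rates, position, dt=0.02):
    velocity = rates @ W  # shape (96,) -> (2,)
    return position + velocity * dt

pos = np.zeros(2)
pos = decode_step(firing_rates[0], pos)
print("decoded cursor position:", pos)
```

In a real system, the calibration data would come from sessions in which the participant imagines moving while observing a cursor, and the decoder would be retrained or adapted over time as the neural signals change.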
"Here at Emory, we will follow-up on this work in several ways," says Pandarinath. "First, while this is a promising proof-of-concept, in the long-term we want to be able to restore much more function, for instance, being able to control a robotic arm with this same level of performance, to enable reaching and grasping of objects."
Over the next few years, Pandarinath and his Atlanta-based biomedical engineering team plan to bring a clinical trial of brain-machine interfaces to Emory and Georgia Tech, where they can further push the performance of these devices by drawing on world-class resources across both communities, including neurosurgery and neurology at Emory and engineering at Georgia Tech.
"Second, ultimately we want brain-machine interfaces that restore more natural control of external devices. To do so we want to be able to restore sensation as well -- providing the user with sensory feedback so they can 'feel' when they grasp objects. To do this we need to 'write' that information into the brain (rather than just reading information out)," says Pandarinath.
To do that, he explains, the team will need a better understanding of what natural sensory responses look like and will need to develop new technologies for interfacing with the brain. His laboratory is developing new techniques in animal models that it hopes to eventually translate to human subjects.
Resources:
- Link to published journal paper or abstract.
- Link to Stanford news release.
- Link to Stanford video.
Media Contact:
Walter Rich
Communications Manager
Wallace H. Coulter Department of Biomedical Engineering
Georgia Institute of Technology