A Blog by Jonathan Low


May 18, 2021

Brain Implant Turns Thoughts Into Text With 90 Percent Accuracy

One stroke away from telepathy. JL

Shelly Fan reports in Singularity Hub:

By imagining the motions of writing letters, a man with a spinal injury was able to translate thoughts into text at a speed that rivals thumb typing on a smartphone. At 90 characters per minute and an accuracy of over 90%, the system leapfrogs every record previously set using neural implants. A neural implant uses AI to convert the electrical brain signals generated as someone imagines handwriting into text displayed on a computer in real time.

Texting might not be faster than speech, but for many of us it’s a natural way to communicate.

Thanks to a new brain-computer interface (BCI), people with paralysis can now do the same, with a twist. By imagining the motions of writing letters, a man with a spinal injury was able to translate thoughts into text at a speed that rivals thumb typing on a smartphone. At 90 characters per minute and an accuracy of over 99 percent after autocorrect, the system leapfrogs every record previously set using neural implants.

The crux is an algorithm based on a popular and powerful type of neural network, the recurrent neural network (RNN), plus a few tricks from the machine learning community. The result is a neural implant that uses AI to convert the electrical brain signals generated as someone imagines handwriting into text that's displayed on a computer in real time.
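
The article doesn't include code, but the general shape of such a decoder is easy to sketch. Below is a minimal, hypothetical PyTorch example: a GRU (one common recurrent architecture, assumed here) that maps binned multichannel neural features to per-time-step character probabilities. The layer sizes, class count, and bin setup are illustrative assumptions, not the study's actual configuration.

```python
# Hypothetical sketch of an RNN character decoder, not the study's code.
# Assumes neural activity has already been binned into fixed-size feature
# vectors (e.g., spike counts from 192 electrodes per short time bin).
import torch
import torch.nn as nn

N_CHANNELS = 192   # two 96-electrode arrays (per the article)
N_CLASSES = 31     # e.g., 26 letters plus a few punctuation marks (assumed)
HIDDEN = 256       # illustrative hidden-state size

class CharDecoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.rnn = nn.GRU(N_CHANNELS, HIDDEN, num_layers=2, batch_first=True)
        self.head = nn.Linear(HIDDEN, N_CLASSES)

    def forward(self, x):
        # x: (batch, time_bins, N_CHANNELS) -> (batch, time_bins, N_CLASSES)
        out, _ = self.rnn(x)
        return self.head(out)  # per-bin character logits

decoder = CharDecoder()
fake_bins = torch.randn(1, 50, N_CHANNELS)   # 50 bins of fake neural data
probs = decoder(fake_bins).softmax(dim=-1)   # character probabilities over time
print(probs.shape)  # torch.Size([1, 50, 31])
```

In practice the per-bin outputs would still have to be collapsed into a character stream by thresholding or sequence decoding; this sketch covers only the signal-to-probability mapping.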

“This … could help restore communication in people who are severely paralyzed, or ‘locked-in,’” said study author Dr. Frank Willett at Stanford’s Neural Prosthetics Translational Laboratory. “It should help people express themselves and share their thoughts. It’s very exciting.”

“Mindtexting” may be just the start. The study suggests that, counter to common intuition, AI seems to be better at decoding the brain signals underlying our more complex behaviors than those behind simple ones, an invitation to reimagine the potential of a brain-computer symbiosis.

“Although much work remains to be done, Willett and co-workers’ study is a milestone that broadens the horizon of iBCI [invasive brain-computer interface] applications,” said Drs. Pavithra Rajeswaran and Amy Orsborn of the University of Washington, who were not involved in the study. “Because it uses machine learning methods that are rapidly improving, plugging in the latest models offers a promising path for future improvements.”

Typing Without Hands

The study is part of the legendary BrainGate project, which for the past decade has led the development of neural interfaces that restore communication in people who are paralyzed. To be clear, these “implants” are true to their name: they are microarrays of tiny electrodes on a chip that’s surgically inserted into the top layer of the brain.

BrainGate has scored many mind-blowing hits. One is an implant that lets people pilot robotic arms with thought. Another helped paralyzed people move a computer cursor on an Android tablet with their minds, expanding their digital universe to the entire Android app sphere and, of course, email and Google.

This is all possible because the central processor, the motor cortex, is still intact even after paralysis, at least for relatively simple movements such as reaching or grasping. It’s like cutting your wireless router cable: you lose online access, but the network itself is still there. Neural implants tap straight into the source—the electrical signals that underlie our every move—decode them into language that computers understand, and use them to control another output: a robotic hand, exoskeleton, or a cursor on the screen.

The problem? Using your mind to steer a cursor toward letters on a digital keyboard is terribly slow. The most successful implant so far averages 40 characters per minute, and it requires surgery and training. Even an off-the-shelf, non-invasive eye-tracking keyboard lets people with paralysis type only marginally faster.

The new study took a completely different approach: toss the keyboard.

A Spark of Genius

The study participant, dubbed T5, is a long-time BrainGate volunteer.

Back in 2007, T5 suffered a traumatic injury that damaged his spinal cord and deprived him of movement below the neck. In 2016, Dr. Jaimie Henderson, a neurosurgeon at Stanford, implanted two microarray chips into the “hand area” of T5’s left precentral gyrus, a part of the brain that normally helps us plan and control movement. Each chip contained 96 microelectrodes that tapped into his electrical brain activity. These neural signals were then sent through wires to a computer for further processing.
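
As a concrete (and assumed) picture of that “further processing”: pipelines like this commonly reduce each electrode’s raw voltage trace to counts of spikes or threshold crossings per short time bin, producing the feature vectors a decoder consumes. The sampling rate, bin width, and threshold below are illustrative guesses, not the study’s published parameters.

```python
# Illustrative preprocessing sketch: turn raw multichannel voltage traces
# into binned threshold-crossing counts. Details are assumptions, not the
# study's published pipeline.
import numpy as np

FS = 30_000        # samples/sec, a typical microelectrode sampling rate
BIN_MS = 20        # bin width in milliseconds (illustrative)
N_CHANNELS = 192   # two 96-electrode arrays

def bin_threshold_crossings(voltage, threshold=-3.5):
    """voltage: (n_channels, n_samples) array in units of noise SD.
    Returns (n_bins, n_channels) counts of downward threshold crossings."""
    crossings = (voltage[:, 1:] < threshold) & (voltage[:, :-1] >= threshold)
    bin_samples = FS * BIN_MS // 1000
    n_bins = crossings.shape[1] // bin_samples
    trimmed = crossings[:, : n_bins * bin_samples]
    # Sum crossings within each bin, then put time on the first axis.
    return trimmed.reshape(N_CHANNELS, n_bins, bin_samples).sum(axis=2).T

rng = np.random.default_rng(0)
fake_voltage = rng.standard_normal((N_CHANNELS, FS))  # 1 s of fake data
features = bin_threshold_crossings(fake_voltage)
print(features.shape)  # (49, 192): time bins x channels
```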

Here’s where the magic comes in. Neurons are a loud, noisy bunch, and deciphering the specific signals, or neural codes, that control single movements is incredibly difficult. That’s partly why it’s currently impossible for someone to simply imagine a letter and have a BCI read it off their mind: the electrical signals that encode different letters are too subtle for any algorithm to decode accurately.

The new study’s workaround is outside the box and utterly brilliant. Because the motion of writing each letter is distinct, the team reasoned, it may trigger neural signals that are different enough for an algorithm to tell apart which imagined movement, and its associated brain signal, corresponds to which letter.
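
One way to make that reasoning concrete is a toy nearest-template classifier: if each letter evokes its own characteristic time course of activity across channels, the average trajectories for different letters sit far apart, and a new trial can be labeled by its closest template. The sketch below simulates this with random data; it illustrates the intuition only and is not the paper’s analysis.

```python
# Toy nearest-template classifier over simulated per-letter neural
# trajectories. Purely illustrative; the study's analysis is more involved.
import numpy as np

rng = np.random.default_rng(1)
LETTERS = list("abc")   # tiny alphabet for the sketch
T, CH = 40, 8           # time bins and channels (illustrative)

# Each letter gets its own underlying trajectory; trials are noisy copies.
templates = {L: rng.standard_normal((T, CH)) for L in LETTERS}

def trial(L):
    return templates[L] + 0.5 * rng.standard_normal((T, CH))

def classify(x, means):
    # Nearest class-mean trajectory by Euclidean distance.
    return min(means, key=lambda L: np.linalg.norm(x - means[L]))

# Estimate class means from a few training trials, then test.
means = {L: np.mean([trial(L) for _ in range(10)], axis=0) for L in LETTERS}
tests = [(L, classify(trial(L), means)) for L in LETTERS for _ in range(20)]
acc = np.mean([true == pred for true, pred in tests])
print(f"toy accuracy: {acc:.0%}")  # high, because trajectories differ
```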

To start, T5 repeatedly traced individual letters in his mind (in print, not cursive). Although his hand was completely still, the authors said, he “reported feeling as though an imaginary pen in his hand was physically moving and tracing out the letter shapes.” T5 then spent hours imagining writing groups of random sentences.

All the while, his implants captured the neural signals related to writing each letter, which were “remarkably consistent.” The data was then used to train the RNN, which is “especially good at predicting sequential data.” Because RNNs tend to be data-hungry, the team used a trick called data augmentation, reshuffling previously recorded neural signals to generate artificial training data and beef up the algorithm. They also injected some noise into the data, in the hope that the eventual BCI would be more robust against slight changes in brain activity.
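
A hedged sketch of those two tricks, stitching stored per-letter snippets into artificial “sentences” and injecting noise; every name and number here is an assumption rather than the study’s implementation:

```python
# Illustrative data augmentation for neural handwriting data: stitch
# stored per-letter neural snippets into artificial "sentences" and add
# noise. A sketch of the idea, not the study's implementation.
import numpy as np

rng = np.random.default_rng(2)
CH = 192  # channels

# Suppose we stored a few noisy snippets of neural activity per letter,
# each of shape (time_bins, CH). Here they're random stand-ins.
snippets = {L: [rng.standard_normal((rng.integers(20, 40), CH))
                for _ in range(5)] for L in "abcdefgh"}

def synthesize_sentence(text, noise_sd=0.3):
    """Concatenate a randomly chosen snippet per character, then add
    Gaussian noise so the decoder doesn't overfit to exact recordings."""
    parts = [snippets[c][rng.integers(len(snippets[c]))] for c in text]
    x = np.concatenate(parts, axis=0)
    return x + noise_sd * rng.standard_normal(x.shape)

fake = synthesize_sentence("badge")
print(fake.shape)  # (total_time_bins, 192), varies with snippet lengths
```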

Mind-Texting Dominance

Over time, the RNN was able to decode neural signals and translate them into letters, which were displayed on a computer screen. It’s fast: within half a second, the algorithm could guess which letter T5 was attempting to write, with 94.1 percent accuracy. Add in the kind of run-of-the-mill autocorrect found on every smartphone, and accuracy climbed to over 99 percent.
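
The article doesn’t detail the autocorrect step; one simple stand-in that shows the flavor is snapping each decoded word to its nearest dictionary entry by string similarity. A minimal, assumed illustration:

```python
# Minimal dictionary-based autocorrect sketch: snap each decoded word to
# its closest vocabulary entry by string similarity. Illustrative only.
import difflib

VOCAB = ["hello", "world", "brain", "computer", "interface"]

def autocorrect(word):
    match = difflib.get_close_matches(word.lower(), VOCAB, n=1, cutoff=0.6)
    return match[0] if match else word

decoded = "helko wrold brqin computer interfase".split()
print(" ".join(autocorrect(w) for w in decoded))
# -> "hello world brain computer interface"
```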

When asked to copy a given sentence, T5 was able to “mindtext” at about 90 characters per minute (roughly 18 words, at the standard five characters per word), “the highest typing rate that has yet been reported for any type of BCI,” the team wrote, and a twofold improvement over previous setups. His freestyle typing, answering open-ended questions, performed about as well, matching the average thumb-texting speed of his age group.

“Willett and co-workers’ study begins to deliver on the promise of BCI technologies,” said Rajeswaran and Orsborn, not just for mind-texting, but also for what comes next.

Tapping into machine learning algorithms is smart because the field is rapidly improving, and it illustrates another solid link between neuroscience and AI. But perhaps more importantly, an algorithm’s performance relies on good data. Here, the team found that the timing differences between writing different letters, a rather complex behavior, are what made the algorithm perform so well. In other words, for future BCIs, “it might be advantageous to decode complex behaviors rather than simple ones, particularly for classification tasks.”

The new system isn’t yet ready for the clinic. It will have to be tested in more people and gain common typing functions, such as deleting and editing text. The team also wants to add support for mindtexting capital letters and symbols.

But the new BCI doesn’t have to function alone. Other BCIs that translate the neural activity of speech into text already exist, and it’s conceivable that a person could shift between the two methods, mental writing and speaking, to communicate with others. “Having those two or three modes and switching between them is something we naturally do [in daily life],” said Dr. Krishna Shenoy at Stanford University, who supervised the study with Dr. Henderson.

But that’s all for the future. The immediate next step, the authors said, is to work with patients who can’t speak, such as people who have lost the ability due to stroke or neurodegenerative disease, or those who are conscious but cannot move at all, and restore their ability to interact with the outside world. “The authors’ approach has brought neural interfaces that allow rapid communication much closer to a practical reality,” said Rajeswaran and Orsborn.
