Surgically implanted devices that allow paralyzed people to speak can also listen to their inner monologue.
That’s the conclusion of a study on brain-computer interfaces (BCIs) in the journal Cell.
The discovery could lead to BCIs that allow paralyzed users to produce synthesized speech more quickly and with less effort.
But the idea that the new technology can decode a person’s inner voice is “disturbing,” says Nita Farahany, professor of law and philosophy at Duke University and author of the book The Battle for Your Brain.
“The further we advance in this research, the more transparent our brains become,” says Farahany, adding that measures to protect people’s mental privacy are lagging behind the technology that decodes brain signals.
From Brain Signal to Speech
BCIs are capable of decoding speech using tiny arrays of electrodes that monitor activity in the brain’s motor cortex, which controls the muscles involved in speech. Until now, these devices relied on signals produced when a paralyzed person is actively trying to speak a word or phrase.
“We’re recording the signals as they try to speak and translating those neural signals into the words they’re trying to say,” says Erin Kunz, a postdoctoral researcher at Stanford University’s Neural Prosthetics Translational Laboratory.
Relying on signals produced when a paralyzed person tries to speak makes it easier for that person to hold back and avoid sharing too much. But it also means they have to make a concentrated effort to convey a word or phrase, which can be tiring and time-consuming.
So Kunz and a team of scientists set out to find a better way—by studying the brain signals of four people who already used BCIs to communicate.
The team wanted to know if they could decode much subtler brain signals than those produced by attempted speech: imagined speech.
During attempted speech, a paralyzed person does their best to physically produce comprehensible spoken words, even if they can no longer do so. In imagined or inner speech, the individual simply thinks of a word or phrase—perhaps imagining how it would sound.
The team found that imagined speech produces signals in the motor cortex similar to those of attempted speech, but weaker. And with the help of artificial intelligence, they were able to translate those weaker signals into words.
“We achieved up to 74% accuracy in decoding sentences from a vocabulary of 125,000 words,” says Kunz.
Decoding a person’s inner speech made communication faster and easier for the participants. But Kunz says the success raised an unsettling question: “If inner speech is similar enough to attempted speech, could it leak unintentionally when someone is using a BCI?”
Their research suggested yes, in certain circumstances, such as when a person was silently recalling a sequence of directions.
Password Protection?
So the team tested two strategies to protect BCI users’ privacy.
First, they programmed the device to ignore inner speech signals. That worked, but it removed the speed and ease associated with decoding inner speech.
Then Kunz says the team borrowed an approach used by virtual assistants like Alexa and Siri, which wake up only when they hear a specific phrase.
“We chose Chitty Chitty Bang Bang, because it doesn’t occur very often in conversations and is highly identifiable,” says Kunz.
This allowed participants to control when their inner speech could be decoded.
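Conceptually, the wake-phrase safeguard acts as a gate on the decoder’s output: everything is discarded until the unlock phrase appears. Here is a minimal Python sketch of that idea, assuming a tokenized stream of already-decoded words (a simplification; the real system gates neural signals, and the function and variable names here are illustrative, not from the study):

```python
# Hypothetical sketch of keyword-gated decoding, NOT the study's actual
# pipeline: decoded words are dropped until the wake phrase is detected.
WAKE_PHRASE = "chitty chitty bang bang"  # the study's chosen unlock phrase

def gate_decoded_stream(tokens, wake_phrase=WAKE_PHRASE):
    """Yield decoded words only after the wake phrase has occurred.
    Everything before (and including) the phrase is discarded unread."""
    wake_words = wake_phrase.split()
    window = []          # sliding window of recent tokens
    unlocked = False
    for token in tokens:
        if unlocked:
            yield token
            continue
        window.append(token.lower())
        window = window[-len(wake_words):]   # keep only the last few words
        if window == wake_words:
            unlocked = True                  # start sharing from here on

# Example: nothing leaks until the phrase occurs.
stream = ["private", "thought", "chitty", "chitty", "bang", "bang",
          "hello", "world"]
print(list(gate_decoded_stream(stream)))  # ['hello', 'world']
```

The design mirrors how Alexa- or Siri-style wake words work: the matching logic sees everything, but nothing upstream of the phrase is ever passed along.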
But the safeguards tested in the study “assume that we can control our thoughts in ways that may not actually correspond to how our minds work,” says Farahany.
For example, says Farahany, the study participants couldn’t prevent the BCI from decoding the numbers they were thinking of, even if they didn’t intend to share them.
This suggests that “the boundary between public and private thought may be blurrier than we assume,” says Farahany.
Privacy concerns are less problematic with surgically implanted BCIs, which are well understood by users and will be regulated by the Food and Drug Administration when they hit the market. But that kind of education and regulation may not extend to next-generation consumer BCIs, which will likely be worn like caps for activities like video gaming.
Early consumer devices won’t be sensitive enough to detect words like implanted devices do, says Farahany. But the new study suggests that capability could be added someday.
If so, says Farahany, companies like Apple, Amazon, Google, and Meta could find out what’s happening in a consumer’s mind, even if that person doesn’t intend to share the information.
“We have to recognize that this new era of brain transparency is truly a whole new frontier for us,” says Farahany.
But it’s encouraging, she says, that scientists are already thinking of ways to help people keep their private thoughts secret.
Source: npr.org by Jon Hamilton