Technology Decodes Inner Monologue in Those With Paralysis

We all engage in inner dialogue—whether it’s a pep talk or silent reflection. These unspoken thoughts often reveal more than what we say aloud.

Now, Stanford researchers have developed an AI system that decodes internal speech to help people with paralysis communicate—especially those who struggle with current brain-to-speech tools.

Unlike older systems that require attempted speech, this decoder translates silently imagined words directly into text with up to 74% accuracy. To protect privacy, users can activate it with a mental “neural password.”

“This is the first time we’ve identified brain activity linked solely to thinking about speech,” said study author Erin Kunz. “For those with severe motor and speech impairments, this could enable easier, more natural communication.”

How Brain Implants Turn Thoughts into Words

Normally, speech begins with brain signals that control the muscles involved in speaking. Implants can detect these signals, allowing people with paralysis to communicate again. Some recent systems even translate these signals into speech in real time. For example, a man with ALS used an AI implant to convert his brain activity into clear sentences. Another stroke survivor’s thoughts were similarly decoded into real-time conversation.

However, these systems still depend on users attempting to speak. For people unable to engage their vocal muscles at all, researchers are now developing decoders that translate pure internal thoughts into words.

Earlier studies found that inner speech activates similar—but not identical—brain regions as spoken language. Yet pinpointing those regions has been difficult.

To dig deeper, the Stanford team partnered with four BrainGate2 trial participants with implanted microelectrode arrays. One, a 68-year-old woman with ALS, could still vocalize but not clearly. Another, a 33-year-old man with locked-in syndrome, could only move his eyes but was mentally sharp.

How Brain Signals Reveal Silent Thoughts

To decode inner speech, researchers recorded electrical signals from the participants’ motor cortices during two tasks: attempting to speak and silently thinking of single-syllable words like “kite” or “day.” In some cases, participants also listened to or mentally read the words. By comparing brain activity across these scenarios, the team identified the specific regions of the motor cortex involved in inner speech.

With those brain maps, they trained an AI model to interpret the participants’ internal thoughts.
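The article doesn’t detail the model, but the basic idea of mapping neural features to word labels can be illustrated with a toy sketch. Everything below is synthetic and hypothetical: the channel count, the simulated “firing rates,” and the nearest-centroid classifier are stand-ins, not the study’s actual pipeline.

```python
import random

# Hypothetical sketch: decoding a tiny word vocabulary from synthetic
# neural "firing rate" feature vectors using a nearest-centroid rule.
# The real study used microelectrode recordings and far richer models.

random.seed(0)
VOCAB = ["kite", "day"]     # single-syllable words mentioned in the article
N_CHANNELS = 16             # assumed number of electrode channels

def simulate_trial(word):
    """Synthetic trial: each word produces a distinct mean firing pattern."""
    base = 5.0 if word == "kite" else 9.0
    return [random.gauss(base + ch % 3, 1.0) for ch in range(N_CHANNELS)]

# "Training": average the feature vectors per word to form one centroid each.
train = {w: [simulate_trial(w) for _ in range(50)] for w in VOCAB}
centroids = {
    w: [sum(t[ch] for t in trials) / len(trials) for ch in range(N_CHANNELS)]
    for w, trials in train.items()
}

def decode(features):
    """Classify a trial as the word whose centroid is nearest (squared distance)."""
    def dist(word):
        return sum((f - m) ** 2 for f, m in zip(features, centroids[word]))
    return min(VOCAB, key=dist)

# Evaluate on fresh synthetic trials.
test_trials = [(w, simulate_trial(w)) for w in VOCAB for _ in range(25)]
accuracy = sum(decode(x) == w for w, x in test_trials) / len(test_trials)
print(f"decoding accuracy on synthetic data: {accuracy:.0%}")
```

Because the synthetic patterns are well separated, this toy decoder scores near-perfectly; the study’s 14–33% error rates reflect how much noisier and more overlapping real inner-speech signals are.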

Balancing Vocabulary Size and Decoding Precision

The system, while promising, was not flawless. Even with a restricted 50-word vocabulary, decoding accuracy varied—error rates ranged from 14% to 33% depending on the individual. For two participants, the AI attempted to decode phrases using a much larger 125,000-word vocabulary, but accuracy dropped further. In one case, a sentence like “I think it has the best flavor” was mistakenly translated as “I think it has the best player.” Still, some sentences—such as “I don’t know how long you’ve been here”—were decoded correctly.

Despite some errors, study author Benyamin Meschede-Krasa noted that simply thinking about speech—rather than trying to speak—could make communication faster and easier.

Early inner speech tests were prompted, like being told “don’t think of an elephant.” To explore spontaneous inner speech, the team had a participant play a memory game involving arrows and visual cues. The decoder successfully picked up her mental speech linked to the game.

Researchers also asked participants to think about personal topics, like their favorite food or movie. While the system picked up more words than during blank-mind tests, most results were still gibberish, with only occasional clear phrases.

In other words, the AI isn’t reading minds—yet.

Safeguards for Thought-Decoding Technology

As the tech advances, it may one day detect unintentional thoughts, raising privacy concerns. To address this, the team built in safeguards: one distinguishes inner speech from speech attempts (useful only if users can still try to speak), and another uses a mental “password” to activate decoding. In one trial, the system detected the password with 99% accuracy.
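The article doesn’t describe how the password gate is implemented, but the control logic is simple to sketch: ignore all decoded output until the password word is recognized with high confidence. The password string, confidence scores, and token stream below are all invented for illustration.

```python
# Hypothetical sketch of a "neural password" gate: the decoder's output is
# suppressed until an imagined keyword is detected with high confidence.
# Tokens and confidences here are scripted stand-ins for real decoder output.

def make_gate(password, threshold=0.95):
    state = {"unlocked": False}

    def process(token, confidence):
        """Drop every token until the password arrives; then pass tokens through."""
        if not state["unlocked"]:
            if token == password and confidence >= threshold:
                state["unlocked"] = True
            return None          # nothing is emitted while the gate is locked
        return token

    return process

gate = make_gate("daydream")     # hypothetical imagined password
stream = [("hello", 0.80), ("daydream", 0.99), ("hello", 0.80), ("world", 0.70)]
out = [t for t in (gate(tok, conf) for tok, conf in stream) if t is not None]
print(out)  # only words decoded after the unlock are emitted
```

Note that the first “hello” is discarded because it precedes the password, which is the privacy property the safeguard is after: stray inner speech never leaves the device until the user deliberately opts in.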

As brain-computer interfaces evolve, protecting mental privacy will be crucial. The new decoder, which taps directly into inner speech, could reduce user effort and speed up communication—especially for those with locked-in syndrome.

“The future of BCIs is bright,” said study author Frank Willett. “This work gives real hope that they could one day enable fluent, natural conversations.”


Read the original article on: Singularity Hub
