A team from MIT can read the words you think and turn them into speech

  • A team at MIT is using subvocalisation to allow subjects to speak with their thoughts
  • Idle curiosity and some serious concerns got them thinking about whether this was possible
  • Kapur’s team delivers a non-invasive wearable that can read your mind (through your jaw)
  • The possibilities for this tech are exciting

In neuroscience, researchers have been working with brain-computer interfaces (BCIs) for decades. Since Hans Berger’s discovery of the brain’s electrical activity in 1924, which enabled electroencephalography (EEG), scientists have been trying to understand the language of the mind – ‘reading’ thoughts directly. While some BCIs are invasive, relying on sensors implanted in the brain, more promising approaches are non-invasive. These BCIs typically either use tiny sensors worn on the head to measure brain activity, or functional magnetic resonance imaging (fMRI), which shows real-time blood flow in the brain.

With constant advances in this tech, scientists have reached the point where mind-reading isn’t just a parlour trick. These systems can now detect brain activity and translate those signals into communication and control. And that promises hope for people who’ve lost the ability to move or speak, helping them regain independence and live fuller lives. In fact, recent innovations may allow people to talk directly with their minds, opening new possibilities for treatment and communication, especially in environments that are far too loud for normal speech.

A team at MIT is using subvocalisation to allow subjects to speak with their thoughts

If, while reading, you pronounce each word silently in your mind, you know exactly what subvocalisation is. It helps us better understand the words we’re seeing, though it also slows our reading speed substantially. In a sense, when you do this, your brain is speaking silently, and your body knows it. It’s a neat trick: your brain thinks a word and sends a signal to the muscles of your face, even though they don’t move. And now, a team of researchers at the Massachusetts Institute of Technology has developed a system that can read these silent thoughts and transcribe them into speech.

A man wearing a white wearable device and a computer screen with icons
A team of researchers at the Massachusetts Institute of Technology has developed a system that can read these silent thoughts and transcribe them into speech.

Idle curiosity and some serious concerns got them thinking about whether this was possible

As Arnav Kapur, the lead author of the project from MIT’s Media Lab, explains, there were a number of applications that got them thinking about the tech. On one hand, there were the merely curious (and perhaps not so good) ideas: could they design a system that would allow people to access their mobiles unobtrusively during a conversation? As Kapur explains, “at the moment, the use of those devices is very disruptive. If I want to look something up that’s relevant to a conversation I’m having, I have to find my phone and type in the passcode and open an app and type in some search keyword, and the whole thing requires that I completely shift attention from my environment and the people that I’m with to the phone itself.” We have enough trouble getting and keeping one another’s attention now – that sounds like the last thing we need!

But on the other hand, there were serious, life-changing possibilities. Consider people like the late Stephen Hawking, who, by the end of his life, was unable to speak or move his body. Hawking used a computer to speak for him, but it was a slow and laborious process, not really useful for ordinary conversation. And just imagine how difficult it would be for him to do things you take for granted, like checking email or selecting a movie on Netflix. Kapur and his team wondered if they could find a way to help people like him who suffer from traumatic injuries or debilitating diseases. And then there were the really specialised scenarios where silent communication is critical: noisy environments with a lot going on, which demand clear communication but make it all but impossible. Think here of busy factory floors or ground crews at airports.

Kapur’s team wanted to give control to those who had lost it, and make communication possible for those who need it. That’s pretty amazing stuff, if you think about it. And the good news is that they’ve made it work.

Kapur’s team delivers a non-invasive wearable that can read your mind (through your jaw)

Kapur’s team developed a wearable headset, called AlterEgo, that measures the minute electrical impulses the brain sends to your jaw muscles when you think a word. Packed into a small wearable that fits along your jaw and lip, this isn’t the scary helmet you’re probably imagining. The team trained these sensors to recognise a small vocabulary, and then demonstrated that the wearable was useful for a pretty formidable array of tasks. Larry Hardesty, writing for MIT News, reports that “the researchers began collecting data on a few computational tasks with limited vocabularies — about 20 words each. One was arithmetic, in which the user would subvocalize large addition or multiplication problems; another was the chess application, in which the user would report moves using the standard chess numbering system.” After a few minutes of calibration for each test subject, they found that the system was startlingly accurate – showing an “average transcription accuracy of about 92 percent”!

A man wearing the AlterEgo headset
Kapur’s team developed a wearable headset, called AlterEgo, that measures the minute electrical impulses the brain sends your jaw muscles when you think a word.
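To make those numbers a little more concrete, here’s a toy sketch of the general idea: map noisy sensor readings to words in a small, fixed vocabulary, then measure transcription accuracy. Everything here is invented for illustration – synthetic signals, a simple nearest-centroid classifier, and a made-up five-word vocabulary – not MIT’s actual pipeline, which uses neural networks trained on real electrode data.

```python
import numpy as np

rng = np.random.default_rng(0)
VOCAB = ["add", "multiply", "knight", "pawn", "select"]  # hypothetical vocabulary
N_FEATURES = 16  # pretend features extracted from the jaw electrodes (invented)

# Each word gets a characteristic (synthetic) signal pattern.
true_patterns = {word: rng.normal(size=N_FEATURES) for word in VOCAB}

def simulate_reading(word, noise=0.3):
    """A noisy synthetic 'electrode reading' for one subvocalised word."""
    return true_patterns[word] + rng.normal(scale=noise, size=N_FEATURES)

# "Calibration": average a few readings per word, standing in for the few
# minutes of per-user adjustment the article describes.
centroids = {w: np.mean([simulate_reading(w) for _ in range(20)], axis=0)
             for w in VOCAB}

def classify(reading):
    """Return the vocabulary word whose centroid is nearest the reading."""
    return min(centroids, key=lambda w: np.linalg.norm(reading - centroids[w]))

# Transcription accuracy on fresh simulated readings.
trials = [(w, classify(simulate_reading(w))) for w in VOCAB for _ in range(50)]
accuracy = sum(truth == guess for truth, guess in trials) / len(trials)
print(f"Accuracy on a {len(VOCAB)}-word vocabulary: {accuracy:.0%}")
```

The point of the sketch is just the shape of the problem: with a small vocabulary and a short calibration step, even a crude classifier can get high accuracy on clean-ish signals – which is why the limited-vocabulary setting was a sensible place for the researchers to start.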

Kapur’s pretty humble about their accomplishment. “We’re in the middle of collecting data, and the results look nice … I think we’ll achieve full conversation some day,” he says. Don’t let this dampen your excitement, though; he’s just being really, really cautious.

The possibilities for this tech are exciting

But Thad Starner, a full professor in the School of Interactive Computing at the Georgia Institute of Technology, doesn’t want Kapur to fool you with his humility. As he says, “I think that they’re a little underselling what I think is a real potential for the work … Like, say, controlling the airplanes on the tarmac at Hartsfield Airport here in Atlanta.” In noisy environments, silent speech among a headphone-wearing team could be a life-saver, literally. “You’ve got jet noise all around you, you’re wearing these big ear-protection things — wouldn’t it be great to communicate with voice in an environment where you normally wouldn’t be able to? You can imagine all these situations where you have a high-noise environment, like the flight deck of an aircraft carrier, or even places with a lot of machinery, like a power plant or a printing press,” Starner explains. In these contexts, Kapur’s innovation could be revolutionary for safety.

And just take a look at how Kapur’s system, transformed into a tool to control a television, could give someone with limited motor control more freedom:

That’s pretty incredible, right? Kapur surely knows this, and what his team has accomplished is nothing short of miraculous. For people who can’t easily communicate due to injury or illness, or for those working in dangerous, noisy environments, this is life-changing tech.

This article is written by Richard van Hooijdonk


Trendwatcher, futurist and international keynote speaker Richard van Hooijdonk takes you to an inspiring future that will dramatically change the way we live, work and do business.
