Mind Matters News and Analysis on Natural and Artificial Intelligence

AlterEgo Does Not Read Your Mind

What it really does may surprise you, but many of the claims made for it are deceptive

A recent article from the British newspaper The Guardian lauds an MIT invention christened AlterEgo. The headline, dripping with seductive-semantics clickbait, reads “Researchers develop device that can ‘hear’ your internal voice,” and the article tells us,

Researchers have created a wearable device that can read people’s minds when they use an internal voice, allowing them to control devices and ask queries without speaking …

Four electrodes under the white plastic device make contact with the skin and pick up the subtle neuromuscular signals that are triggered when a person verbalises internally. When someone says words inside their head, artificial intelligence within the device can match particular signals to particular words, feeding them into a computer.

Samuel Gibbs, “Researchers develop device that can ‘hear’ your internal voice” at The Guardian

So that “internal voice” turns out to be your tongue moving around inside your mouth (subvocalization). A tendency to mentally read words aloud can lead to movement of muscles in the mouth, which can slow reading speed. In fact, some readers suppress the tongue movement to increase their reading speed. In any event, subvocalization is the key to the operation of AlterEgo.

A TechRepublic article on the same topic shouts the headline, “Could MIT’s AI headset transcribe your future strategy straight from your brain?”

The answer is no. That headline is more seductive-semantics clickbait. There is no direct interface between AlterEgo and the brain. There is no mind reading involved.

Newsweek’s headline on the invention claims: “This Strange Headset Lets You Interact with Digital Devices Simply by Reading Your Mind”

This headline is also boldly misleading. AlterEgo no more reads your mind than a computer keyboard does. You can say that your mind communicates with your computer when you type, therefore interfacing your mind with the computer. Similar communication happens with AlterEgo. But the process has absolutely nothing to do with directly reading your mind.

Reading deeper into AlterEgo’s technical details reveals that communication comes from “neuromuscular signals in the jaw and face triggered by so-called internal verbalizations (saying words ‘in your head’) that are not detectable by the human eye.” Your mouth is in your head, so I suppose silently saying words “in your head” describes words “in your mouth.” But it’s a stretch.

Here’s what essentially happens. Think of your teeth as computer keys, activated by moving your tongue around as if typing inside your mouth. As your tongue moves, tiny facial muscle movements result. AlterEgo’s facial sensors detect these small neuromuscular signals and use simple AI to translate the movements into words. The words can then be fed into a computer search engine that, like Alexa or IBM Watson, spits out a response to any question asked. When you talk inside your mouth, not only do your lips not move but there is no sound. So AlterEgo is interpreting a silent query via muscle motion that a casual observer would not notice. It’s a bit like ventriloquism.
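As a rough illustration of this signal-to-word step (not MIT’s actual code), one could imagine a nearest-neighbor match between a fresh sensor reading and previously recorded examples. All of the electrode values, the tiny vocabulary, and the canned answers below are invented for the sketch:

```python
import math

# Hypothetical training data: each entry pairs a 4-electrode signal
# snapshot (invented numbers) with the silently "spoken" word.
TRAINING_DATA = [
    ([0.9, 0.1, 0.4, 0.2], "yes"),
    ([0.1, 0.8, 0.3, 0.7], "no"),
    ([0.5, 0.5, 0.9, 0.1], "time"),
]

def classify(signal):
    """Match a new sensor reading to the nearest trained word.
    Euclidean distance stands in for whatever model MIT actually uses."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(TRAINING_DATA, key=lambda pair: dist(signal, pair[0]))[1]

def answer(query_word):
    """Toy stand-in for the search-engine step (the Alexa/Watson role)."""
    canned = {"time": "It is 3:00 pm.", "yes": "Noted.", "no": "Noted."}
    return canned.get(query_word, "I don't know.")

reading = [0.52, 0.48, 0.88, 0.15]   # a fresh (invented) sensor snapshot
word = classify(reading)              # matches the "time" example above
print(answer(word))                   # -> It is 3:00 pm.
```

The point of the sketch is how ordinary the pipeline is: pattern-match muscle signals to words, then hand the words to a conventional question-answering service.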

Once the words originating “in your head” are given to the computer, an answer is found and wirelessly communicated back to AlterEgo, which relays it to the user through vibrations on the bones of the face. Think of your muted cell phone’s vibration sending tactile Morse code to your jaw.
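The Morse-code analogy can be made concrete. A reply could, in principle, be encoded as a timed schedule of short and long pulses for a small vibration motor. The letter codes below are standard International Morse, but the pulse durations are invented, and AlterEgo itself conveys audio via bone conduction rather than literal Morse; this is only a sketch of the analogy:

```python
# Standard International Morse for a few letters; timings are invented.
MORSE = {"h": "....", "i": "..", "o": "---", "k": "-.-"}

DOT_MS, DASH_MS, GAP_MS = 100, 300, 100  # hypothetical pulse durations

def to_pulses(text):
    """Encode text as (duration_ms, vibrate_on) pairs -- the kind of
    tactile schedule a buzzing motor could play against the jaw."""
    pulses = []
    for ch in text.lower():
        for symbol in MORSE[ch]:
            pulses.append((DOT_MS if symbol == "." else DASH_MS, True))
            pulses.append((GAP_MS, False))       # gap between symbols
        pulses.append((GAP_MS * 2, False))       # gap between letters
    return pulses

schedule = to_pulses("ok")   # "---" then "-.-": 14 timed on/off steps
print(len(schedule))         # -> 14
```

Nothing in that feedback path requires reading anyone’s mind; it is one device buzzing a pattern at another surface, the same way a pager once did.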

After the success of IBM’s Watson on the game show Jeopardy! in 2011, the individual technical components comprising AlterEgo do not seem that impressive today. Train the AI offline on the neuromuscular signals generated by moving your tongue around. Translate new signals into a query. The computer finds the answer and vibrates it back to you.

Or one could be trickier. If a Watson were not available, communicate your tongue-wagging wirelessly to a friend at a computer behind the curtain, who translates and Googles whatever you ask and sends a wireless jaw-vibrating answer back to you. AlterEgo doesn’t do that, but the example illustrates that the technology is much less groundbreaking than promoted.

MIT’s own publicity about AlterEgo starts out being somewhat more honest. Their news headline reads: “Computer system transcribes words users ‘speak silently’”

Training AI to translate small facial muscle signals into words looks to be the only individual technical innovation behind AlterEgo. The idea is unique but, I suspect, not that difficult. No deep learning is needed. And the following MIT claim is farfetched: that the AlterEgo AI would “weave into human personality as a ‘second self’ and augment human cognition and abilities.”

This is pure hyperbole. AlterEgo will be no more a “second self” than Alexa or the Google search engine.

MIT’s additional claim that AlterEgo is “intelligence augmentation” is, likewise, seductive semantics. There is an unstated assumption of exclusivity. But such “intelligence augmentation” also applies to Alexa, Watson, and even a Google search.

The composite idea behind AlterEgo is clever and patentable. It may prove invaluable for applications like helping the severely handicapped to electronically interface with the internet of things. It might enable us to politely send and receive text messages in the middle of a movie at the local Cineplex. But despite headlines and publicity claiming otherwise, AlterEgo provides no technical stride forward in the field of AI-brain interface.

Watch the short MIT publicity video below touting AlterEgo. Can the tasks shown be done more easily using oral commands to an Alexa than by AI-interpreted tongue-wagging? I think so.


Here are Dr. Marks’s picks for the Top Ten AI hypes of 2018

Also: It’s 2019! Begin the AI hype cycle again! (Jonathan Bartlett) Media seemingly can’t help portraying today’s high-tech world as a remake of I, Robot (2004), starring you and me.

Featured image: Silent reading/Jilbert Ebrahimi, Unsplash


Robert J. Marks II

Director, Senior Fellow, Walter Bradley Center for Natural & Artificial Intelligence
Besides serving as Director, Robert J. Marks Ph.D. hosts the Mind Matters podcast for the Bradley Center. He is Distinguished Professor of Electrical and Computer Engineering at Baylor University. Marks is a Fellow of both the Institute of Electrical and Electronics Engineers (IEEE) and the Optical Society of America. He was Charter President of the IEEE Neural Networks Council and served as Editor-in-Chief of the IEEE Transactions on Neural Networks. He is coauthor of the books Neural Smithing: Supervised Learning in Feedforward Artificial Neural Networks (MIT Press) and Introduction to Evolutionary Informatics (World Scientific). For more information, see Dr. Marks’s expanded bio.
