Mind-reading technology has arrived

For a few years now, I’ve been writing articles on neurotechnology with downright Orwellian headlines. Headlines that warn “Facebook is building tech to read your mind” and “Brain-reading tech is coming.”

Well, the technology is no longer just “coming.” It’s here.

With the help of AI, scientists at the University of Texas at Austin have developed a technique that can translate people’s brain activity, including the unspoken thoughts swirling through our minds, into continuous text, according to a study published in Nature Neuroscience.

In the past, researchers have shown that they can decode unspoken language by implanting electrodes in the brain and then using an algorithm that reads the brain’s activity and translates it into text on a computer screen. But that approach is very invasive, requiring surgery. It appealed only to a subset of patients, like those with paralysis, for whom the benefits were worth the costs. So researchers also developed techniques that didn’t involve surgical implants. They were good enough to decode basic brain states, like fatigue, or very short phrases — but not much more.

Now we’ve got a non-invasive brain-computer interface (BCI) that can decode continuous language from the brain, so somebody else can read the general gist of what we’re thinking even if we haven’t uttered a single word.

How is that possible?

It comes down to the marriage of two technologies: fMRI scans, which measure blood flow to different areas of the brain, and large AI language models, similar to the now-famous ChatGPT.

In the University of Texas study, three participants each listened to 16 hours of storytelling podcasts like The Moth while scientists used an fMRI machine to track the changes in blood flow in their brains. That data allowed the scientists to train an AI model that predicts how each person’s brain responds when it hears a specific phrase.
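To make that idea concrete, here’s a minimal sketch of what such an “encoding model” could look like. Everything below is an illustrative assumption rather than the study’s actual pipeline: the ridge-regression setup, the feature and voxel counts, and the random stand-in data are all placeholders for the real stimulus features and scans.

```python
# Minimal sketch of an fMRI "encoding model": learn to predict each voxel's
# response from features of the words a listener heard. This is the general
# idea only -- the feature extraction, sizes, and model choice here are
# illustrative assumptions, not the study's exact method.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_timepoints, n_features, n_voxels = 1000, 256, 5000

# Assumed inputs: stimulus_features[t] = a language-model embedding of the
# words heard around fMRI timepoint t; brain_responses[t] = voxel activity then.
stimulus_features = rng.standard_normal((n_timepoints, n_features))
brain_responses = rng.standard_normal((n_timepoints, n_voxels))

# Fit one regularized linear map from text features to all voxels at once.
encoding_model = Ridge(alpha=10.0)
encoding_model.fit(stimulus_features, brain_responses)

def predicted_brain_response(phrase_embedding):
    """If the person had heard this phrase, what would their scan look like?"""
    return encoding_model.predict(phrase_embedding[None, :])[0]
```

With a model like this in hand, decoding can run in reverse: propose a phrase, predict the brain response it should produce, and check how closely that prediction matches the scan actually recorded.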

Because the number of possible word sequences is so vast, and many of them would be gibberish, the scientists also used a language model — specifically, GPT-1 — to narrow down possible sequences to well-formed English and predict which words are likeliest to come next in a sequence.
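Putting the two pieces together, the decoding step can be pictured as a beam search: the language model proposes plausible next words, and each candidate transcript is kept or discarded based on how well the encoding model’s predicted brain activity matches the real recording. This is a toy sketch of that loop under stated assumptions; the helper functions passed in (propose_next_words, embed, predicted_brain_response) are hypothetical placeholders for the machinery described above, not real library calls, and the scoring rule is a simple stand-in for the study’s likelihood.

```python
# Toy sketch of language-model-guided decoding: the LM keeps candidates
# well-formed, and the encoding model's fit to the observed scan decides
# which candidates survive. Helper functions are placeholders (assumptions).
import numpy as np

def score(candidate_text, observed_response, embed, predicted_brain_response):
    """Higher score = the predicted brain response looks more like the real one."""
    predicted = predicted_brain_response(embed(candidate_text))
    # Negative squared error as a simple stand-in for a proper likelihood.
    return -np.sum((predicted - observed_response) ** 2)

def decode_step(beam, observed_response, propose_next_words,
                embed, predicted_brain_response, beam_width=10):
    """Extend every partial transcript on the beam and keep the best few."""
    candidates = []
    for text in beam:
        for word in propose_next_words(text):      # language model proposes
            candidates.append(text + " " + word)   # plausible continuations
    candidates.sort(
        key=lambda t: score(t, observed_response, embed, predicted_brain_response),
        reverse=True,
    )
    return candidates[:beam_width]
```

Repeating that step as new scan data arrives yields a running transcript that tends to capture the meaning of what the person heard or imagined, even when individual words are wrong.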

The result is a decoder that gets the gist right, even though it doesn’t nail every single word. For example, participants were asked to imagine telling a story while in the fMRI machine. Later, they repeated it aloud so the scientists could see how well the decoded story matched up with the original.

When the participant thought, “Look for a message from my wife saying that she had changed her mind and that she was coming back,” the decoder translated: “To see her for some reason I thought she would come to me and say she misses me.”

Here’s another example. When the participant thought, “Coming down a hill at me on a skateboard and he was going really fast and he stopped just in time,” the decoder translated: “He couldn’t get to me fast enough he drove straight up into my lane and tried to ram me.”

It’s not a word-for-word translation, but much of the general meaning is preserved. This represents a breakthrough that goes well beyond what previous brain-reading tech could do — and one that raises serious ethical questions.

The staggering ethical implications of brain-computer interfaces

It might be hard to believe that this is real, not something out of a Neal Stephenson or William Gibson novel. But this kind of tech is already changing people’s lives. Over the past dozen years, a number of paralyzed patients have received brain implants that allow them to move a computer cursor or control robotic arms with their thoughts.

Elon Musk’s Neuralink and Mark Zuckerberg’s Meta are working on BCIs that could pick up signals directly from your neurons and translate them into words in real time, which could one day allow you to control your phone or computer with your thoughts alone.

Non-invasive, even portable BCIs that can read thoughts are still years away from commercial availability — after all, you can’t lug around an fMRI machine, which can cost as much as $3 million. But the study’s decoding approach could eventually be adapted for portable systems like functional near-infrared spectroscopy (fNIRS), which measures the same activity as fMRI, although with a lower resolution.

Is that a good thing? As with many cutting-edge innovations, this one stands to raise serious ethical quandaries.

Let’s start with the obvious. Our brains are the final privacy frontier. They’re the seat of our personal identity and our most intimate thoughts. If those precious three pounds of goo in our craniums aren’t ours to control, what is?

Imagine a scenario where companies have access to people’s brain data. They could use that data to market products to us in ways our brains find practically irresistible. Since our purchasing decisions are largely driven by unconscious impressions, advertisers can’t get very helpful intel from consumer surveys or focus groups. They can get much better intel by going directly to the source: the consumer’s brain. Already, advertisers in the nascent field of “neuromarketing” are attempting to do just that, by studying how people’s brains react as they watch commercials. If advertisers get brain data on a massive scale, you might find yourself with a powerful urge to buy certain products without being sure why.

Or imagine a scenario where governments use BCIs for surveillance, or police use them for interrogations. The principle against self-incrimination — enshrined in the US Constitution — could become meaningless in a world where the authorities are empowered to eavesdrop on your mental state without your consent. It’s a scenario reminiscent of the sci-fi movie Minority Report, in which a special police unit called the PreCrime Division identifies and arrests murderers before they commit their crimes.

Some neuroethicists argue that the potential for misuse of these technologies is so great that we need revamped human rights laws to protect us before they’re rolled out.

“This research shows how rapidly generative AI is enabling even our thoughts to be read,” Nita Farahany, author of The Battle for Your Brain, told me. “Before neurotechnology is used at scale in society, we need to protect humanity with a right to self-determination over our brains and mental experiences.”

As for the study’s authors, they’re optimistic — for now. “Our privacy analysis suggests that subject cooperation is currently required both to train and to apply the decoder,” they write.

Crucially, the process worked only with willing participants who had helped train the decoder. And those participants could throw it off if they later chose to: when they put up resistance by naming animals or counting, the results were unusable. For people whose brain activity the decoder had not been trained on, the output was gibberish.

“However, future developments might enable decoders to bypass these requirements,” the authors warn. “Moreover, even if decoder predictions are inaccurate without subject cooperation, they could be intentionally misinterpreted for malicious purposes.”

This is exactly the sort of future that worries Farahany. “We are literally at the moment before, where we could make choices to preserve our cognitive liberty — our rights to self-determination over our brains and mental experiences — or allow this technology to develop without safeguards,” she told me. “This paper makes clear that the moment is a very short one. We have a last chance to get this right for humanity.”
