
Communications of the ACM

ACM News

A.I. Is Getting Better at Mind-Reading



Scientists recorded M.R.I. data from three participants as they listened to 16 hours of narrative stories, training the model to map between brain activity and semantic features that captured the meanings of certain phrases and the associated brain responses.

Credit: Jerry Tang and Alexander Huth

Think of the words whirling around in your head: that tasteless joke you wisely kept to yourself at dinner; your unvoiced impression of your best friend's new partner. Now imagine that someone could listen in.

On Monday, scientists from the University of Texas at Austin took another step in that direction. In a study published in the journal Nature Neuroscience, the researchers described an A.I. that could translate the private thoughts of human subjects by analyzing fMRI scans, which measure the flow of blood to different regions of the brain.

Already, researchers have developed language-decoding methods to pick up the attempted speech of people who have lost the ability to speak, and to allow paralyzed people to write while just thinking of writing. But the new language decoder is one of the first to not rely on implants. In the study, it was able to turn a person's imagined speech into actual speech and, when subjects were shown silent films, it could generate relatively accurate descriptions of what was happening onscreen.

From The New York Times
View Full Article

