Our private thoughts have always been strictly a private affair. However, they may not remain so for long. Researchers claim to have developed techniques that can be used to eavesdrop on our conversations with ourselves. In other words, agencies in the near future could be able to listen to our internal monologue without our uttering a single syllable out loud.
When we read, our brain appears to be reading aloud to itself, explains Brian Pasley of the University of California, Berkeley.
“If you’re reading text in a newspaper or a book, you hear a voice in your own head. We’re trying to decode the brain activity related to that voice to create a medical prosthesis that can allow someone who is paralyzed or locked in, to speak.”
When a person hears another human being speak, sound waves activate sensory neurons in the inner ear. These neurons pass information to areas of the brain. Interestingly, different aspects of the sound are extracted and interpreted as words in different sections of the brain. For the mind, words are essentially a symphony of sound that needs to be broken down and then interpreted.
To examine this phenomenon, Pasley and his colleagues recorded brain activity in people who already had electrodes implanted in their brains to treat epilepsy. While the patients listened to speech, the research team found that certain neurons in the brain’s temporal lobe fired only in response to particular aspects of sound, such as a specific frequency, Gizmodo reported. One set of neurons might react only to sound waves at, say, 1,000 hertz, while an entirely different set responded only to sound waves at, say, 20,000 hertz.
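This kind of frequency selectivity can be illustrated with a toy model. The sketch below (an illustration only, not the researchers' actual analysis) treats a "neuron" as a unit whose response is the spectral energy of a sound falling within a narrow tuning band around its preferred frequency; the bandwidth and Gaussian tuning curve are assumptions made for the example.

```python
import numpy as np

fs = 44100                       # sampling rate in hertz
t = np.arange(0, 0.1, 1 / fs)    # 100 ms of signal

def tuned_response(signal, center_hz, bandwidth_hz=200.0):
    # Toy "neuron": sums the signal's spectral magnitude, weighted by a
    # Gaussian tuning curve centered on the neuron's preferred frequency.
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    weights = np.exp(-0.5 * ((freqs - center_hz) / bandwidth_hz) ** 2)
    return float(np.sum(weights * spectrum))

tone_1k = np.sin(2 * np.pi * 1000 * t)    # pure 1,000 Hz tone
tone_20k = np.sin(2 * np.pi * 20000 * t)  # pure 20,000 Hz tone

# A 1 kHz-tuned unit responds strongly to the 1 kHz tone and
# barely at all to the 20 kHz tone.
print(tuned_response(tone_1k, 1000) > 10 * tuned_response(tone_20k, 1000))
```

The point of the toy is simply that a bank of such units, each covering a different band, jointly encodes which frequencies are present, which is the raw material a decoder can work with.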
Using this knowledge of how dedicated sets of neurons selectively process sound waves, the team built an algorithm that could decode heard words from neural activity alone, New Scientist reported. The team initially hypothesized that hearing speech and thinking to oneself must spark similar neural signatures in the brain, and so extrapolated that an algorithm trained to identify speech heard out loud might also be able to identify words that are essentially just thoughts.
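In spirit, decoding from activity alone means matching a new recording against patterns learned during training. The minimal sketch below is hypothetical: it assumes each word evokes a characteristic activity pattern across a bank of electrodes and uses a simple nearest-neighbor match, which is far cruder than the team's actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each word evokes a characteristic activity pattern
# across 16 simulated electrodes (illustration only, not the study's model).
words = ["four", "score", "seven", "years"]
templates = {w: rng.normal(size=16) for w in words}  # learned per-word signature

def record_activity(word, noise=0.3):
    # Simulate a noisy recording of the word's neural signature.
    return templates[word] + rng.normal(scale=noise, size=16)

def decode(activity):
    # Pick the word whose learned signature is closest to the recording.
    return min(words, key=lambda w: np.linalg.norm(activity - templates[w]))

print(decode(record_activity("seven")))  # prints "seven"
```

The hypothesis in the article amounts to hoping that signatures recorded during imagined speech are close enough to those recorded during heard speech for this kind of matching to still succeed.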
To test the hypothesis, they recorded brain activity in another seven people undergoing epilepsy surgery while the participants looked at a screen that displayed text from the Gettysburg Address, John F. Kennedy’s inaugural address, or the nursery rhyme Humpty Dumpty. The participants were asked to read the text aloud and then to think through it silently.
Promisingly, although the neural activity from imagined speech differed slightly from that of actual speech, the decoder was able to reconstruct which words several of the volunteers were thinking from neural activity alone. Though the scientists accept that the algorithm isn’t perfect yet, fine-tuning it could one day enable people with partial or complete speech impairment to translate their thoughts into words.
[Image Credit | ReidAboutSex]