Mind Meld: Translating Thoughts Into Speech

Scientists are working to translate the thoughts of speech-paralyzed patients into speech using brain implants. The technique could provide a brain-computer interface (BCI) that enables people with spinal cord injuries, amyotrophic lateral sclerosis, stroke or other paralyzing conditions to “talk” again. Experts think a system that decodes whether a person is silently saying “yes,” “no,” “hungry,” “pain” or “water” is now within reach, thanks to parallel advances in neuroscience, engineering and machine learning. “We think we’re getting enough of an understanding of the brain signals that encode silent speech that we could soon make something practical,” says Brian Pasley, of the University of California, Berkeley.

The first BCIs read electrical signals in the motor cortex corresponding to the intention to move and used software to translate those signals into instructions for operating a computer cursor or robotic arm. In 2016, scientists at the University of Pittsburgh went a step further, adding sensors to a mind-controlled robotic arm so that it produced sensations of touch.

This article appears in the February 2019 issue of Natural Awakenings.