The extraordinary experiment presents a proof-of-concept that could pave the way for a large variety of brain-controlled communication devices in the future.
A major hurdle neuroengineers face on the road to effective brain-computer interfaces is translating the wide array of signals produced by our brains into words and images that can be easily communicated. The science fiction idea of being able to control devices or communicate with others just by thinking is slowly, but surely, getting closer to reality.
Recent advances in machine learning technology have allowed scientists to crunch masses of abstract data. Just last year a team of Canadian researchers revealed an algorithm that could use electroencephalography (EEG) data to digitally recreate faces that a test subject had been shown.
Translating brainwaves into words has been another massive challenge for researchers, but again, with the aid of machine learning algorithms, impressive advances have been made in recent years. The latest leap forward, from a team of American neuroengineers, is a computer algorithm that can decode signals recorded from the human auditory cortex and translate them into intelligible speech.