Neuroscientists have taken another important step toward “reading” the human mind, aiming to “listen” to and decipher thoughts by monitoring the brain waves that correspond to silent speech, the inner dialogue constantly taking place in our minds.
A group of researchers led by Brian Pasley and Robert Knight, professor of psychology and neuroscience at the University of California, Berkeley, has managed to “translate” electrical brain waves into words.
The scientists hope that in the future the new technique, possibly through specialized brain implants and prosthetic devices, will allow patients in a coma or with severe speech impairment caused by stroke or other disorders to communicate with those around them.
The new achievement is part of a series of advances in neuroscience that are gradually bringing the dream (or the nightmare) of mind reading closer to reality. The study focuses on the auditory cortex and, in particular, on the area where sounds are processed, which allows people to understand what they hear. Initially, the researchers recorded the brain waves of 15 patients while they listened to spoken words. Then they used a computer to sort through the chaos of acoustic and electrical impulses and signals in order to find out which sound frequencies correspond to which brain signals. Finally, they managed to associate words with specific brain waves and signals.
Eventually the researchers created a computer model (algorithm) that can “hear” the mind and “guess” some, though certainly not all, of the words a person has heard. As the scientists note, the research is so far based on sounds (words) that a person actually hears in his environment, from which a prediction of what was said is derived.
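The mapping the article describes, from recorded brain signals back to sound features, can be illustrated with a toy linear decoder. This is a minimal sketch using synthetic data, not the researchers' actual model: it assumes neural activity relates roughly linearly to a sound's spectrogram and fits that relationship with ridge regression.

```python
import numpy as np

# Toy illustration of the decoding idea: learn a linear map from
# recorded neural activity to the sound's spectrogram, then use it to
# reconstruct ("guess") what was heard. All data here is synthetic.
rng = np.random.default_rng(0)

n_samples, n_electrodes, n_bands = 500, 32, 16

# Synthetic ground truth: sound features drive neural activity linearly,
# plus recording noise.
spectrogram = rng.normal(size=(n_samples, n_bands))     # sound features
mixing = rng.normal(size=(n_bands, n_electrodes))
neural = spectrogram @ mixing + 0.1 * rng.normal(size=(n_samples, n_electrodes))

# Ridge regression: neural activity -> spectrogram (the "decoder").
lam = 1.0
gram = neural.T @ neural + lam * np.eye(n_electrodes)
weights = np.linalg.solve(gram, neural.T @ spectrogram)

reconstructed = neural @ weights

# How well does the reconstruction match, band by band?
corr = np.mean([
    np.corrcoef(spectrogram[:, b], reconstructed[:, b])[0, 1]
    for b in range(n_bands)
])
print(f"mean reconstruction correlation: {corr:.2f}")
```

With clean synthetic data the reconstruction is nearly perfect; with real intracranial recordings the fit is far noisier, which is why the model only “guesses” some of the words.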
However, predicting completely imaginary conversations that take place only in the mind is not as easy, though neither is it impossible, since the scientists say they have indications that both real and imagined sounds activate similar brain regions. The brain seems to break down both the words it actually hears and those of the inner dialogue into individual frequencies, typically in the range from 1 to 8,000 Hertz. Thus, according to the researchers, a musician (even one deaf, like Beethoven) can imagine and “hear” a musical piece in his mind.
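The frequency breakdown mentioned above can be sketched in a few lines. This is my own illustration, not the study's code: it splits a synthetic audio signal into log-spaced bands spanning the 1 to 8,000 Hz range the article cites, using a Fourier transform.

```python
import numpy as np

# Decompose a signal into frequency bands between 1 and 8000 Hz,
# loosely mimicking how the auditory system separates sound by frequency.
fs = 16000                       # sample rate in Hz (assumed)
t = np.arange(fs) / fs           # one second of audio
# Two tones: a strong 440 Hz note and a quieter 2000 Hz overtone.
signal = np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 2000 * t)

# Power spectrum via FFT, then summed into 8 log-spaced bands.
spectrum = np.abs(np.fft.rfft(signal)) ** 2
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)

edges = np.logspace(0, np.log10(8000), 9)   # 9 edges -> 8 bands, 1-8000 Hz
band_power = np.array([
    spectrum[(freqs >= lo) & (freqs < hi)].sum()
    for lo, hi in zip(edges[:-1], edges[1:])
])
loudest = int(np.argmax(band_power))
print("band edges (Hz):", np.round(edges, 1))
print("loudest band index:", loudest)
```

The louder 440 Hz tone dominates its band, so a decoder working on such band powers would pick it out first; real speech spreads energy across many bands at once.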
The technique is at an early stage, and there are still many things to improve. A major difficulty is creating a small, wireless, portable device that can be used in a patient's everyday life. However, the U.S. researchers hope that within a decade the new method will be widely available.
Jan Schnupp, Professor of Neuroscience at Oxford University, described the study as “remarkable”.
He said: “Neuroscientists have long believed that the brain essentially works by translating aspects of the external world, such as spoken words, into patterns of electrical activity.
“But proving that this is true by showing that it is possible to translate these activity patterns back into the original sound (or at least a fair approximation of it) is nevertheless a great step forward, and it paves the way to rapid progress toward biomedical applications.”
But a major problem remains to be solved: how can a computer program distinguish the thoughts a person wants to communicate to others from those he wants to keep to himself? Can you imagine the consequences if a hacker could sneak into every human skull?