Translating thoughts into language – that is a goal scientists are gradually approaching. A team has now managed quite well to convert brain signals into spoken language. The signals originated in the auditory center of the brain and arose from the conscious hearing of spoken words.

“We found that people could understand and repeat the sounds about 75 percent of the time, which goes far beyond all previous attempts,” said study leader Nima Mesgarani of Columbia University in New York City.

The researchers combined a speech-generation system (a vocoder) with an artificial neural network to turn brain signals into spoken language. The study included five epilepsy patients whose electrodes, implanted for an entirely different purpose, could record signals from the auditory center.

Training like Siri and Amazon Echo

First, the speech production system was trained. “This is the same technology used by Amazon’s Echo and Apple’s Siri to answer our questions verbally,” said neurosurgeon Ashesh Mehta of the Hofstra Northwell School of Medicine in Manhasset, New York, who was also involved in the study.

In this case, however, the training data did not come from spoken language but from signals in the patients’ brains, recorded in response to words that were spoken to them. The system was meant to learn the patterns that arise in the brain for particular sounds.
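The idea of pairing recorded brain responses with the sounds that produced them can be sketched as a supervised learning problem. The toy below is a hypothetical illustration, not the study's actual model: it fits a simple ridge regression (the study used a deep network) from simulated electrode features to simulated acoustic features; all sizes and names are assumptions made up for the example.

```python
import numpy as np

# Hypothetical sketch: learn a map from brain-signal features (e.g. one
# value per electrode) to acoustic features a vocoder could resynthesize.
# Ridge regression stands in for the deep network used in the study.
rng = np.random.default_rng(0)

n_samples, n_electrodes, n_acoustic = 200, 16, 8
brain = rng.normal(size=(n_samples, n_electrodes))      # simulated brain responses
true_map = rng.normal(size=(n_electrodes, n_acoustic))  # hidden brain-to-sound relation
acoustic = brain @ true_map + 0.01 * rng.normal(size=(n_samples, n_acoustic))

# Closed-form ridge solution: W = (X^T X + lambda*I)^-1 X^T Y
lam = 1e-3
W = np.linalg.solve(brain.T @ brain + lam * np.eye(n_electrodes),
                    brain.T @ acoustic)

pred = brain @ W
error = np.mean((pred - acoustic) ** 2)
print(f"mean squared error: {error:.4f}")
```

With noise this small, the learned map recovers the hidden relation almost exactly, which is the point of the illustration: given enough paired examples, the mapping from brain response to sound becomes learnable.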

The researchers then used an extensive network of artificial neurons to analyze and process the data for the speech production system. The result is a robotic-sounding voice that recites the digits from zero to nine.
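A vocoder's final step, turning per-band energy estimates into an audible waveform, can be sketched very roughly. The following toy is an assumption for illustration only (the study's vocoder is far more sophisticated): it sums one sinusoid per frequency band per frame, with all constants invented for the example.

```python
import numpy as np

# Toy vocoder sketch: convert a coarse "spectrogram" (per-band
# amplitudes, one list per frame) into a waveform by summing sinusoids.
SAMPLE_RATE = 8000
FRAME_LEN = 400  # samples per frame (50 ms at 8 kHz)
BAND_FREQS = [200.0, 400.0, 800.0, 1600.0]  # band center frequencies in Hz

def resynthesize(frames):
    """frames: list of per-band amplitude lists -> 1-D waveform array."""
    t = np.arange(FRAME_LEN) / SAMPLE_RATE
    chunks = []
    for amps in frames:
        chunk = sum(a * np.sin(2 * np.pi * f * t)
                    for a, f in zip(amps, BAND_FREQS))
        chunks.append(chunk)
    return np.concatenate(chunks)

# Two frames: all energy in the lowest band, then in the highest band.
wave = resynthesize([[1.0, 0.0, 0.0, 0.0], [0.0, 0.0, 0.0, 1.0]])
print(wave.shape)  # (800,)
```

A decoder that predicts these per-band amplitudes from brain signals, frame by frame, would let such a resynthesizer speak them aloud.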

A 75 percent hit rate

Mesgarani and colleagues played the results obtained from the brain signals of the five epilepsy patients to eleven test subjects, who recognized the spoken words in 75 percent of cases. In addition, in 80 percent of cases they could tell whether the speaker was male or female, the researchers report in the journal “Scientific Reports”.

Mesgarani’s team hopes that a more advanced version of the procedure could become part of an implant that transforms thoughts directly into spoken language. “If the wearer in this scenario thinks, ‘I need a glass of water’, our system could pick up the brain signals generated by this thought and convert them into synthetic spoken language,” Mesgarani said.

Application in patients still a long way off

Niels Birbaumer of the University of Tübingen, who was not involved in the study, dampens expectations: “Basically, the researchers have only recorded the brain’s response to an external stimulus.” This has been possible, with other stimuli, essentially since the introduction of electroencephalography (EEG) in 1929.

To actually translate a person’s thoughts into spoken words, several thousand electrodes in the brain would probably be necessary. “In the case of paralyzed patients, we can read ‘yes’ or ‘no’ from the brain waves, but we cannot reconstruct thoughts.”


Sam Yoon has many years of experience in journalism. He has covered such areas as information technology, science, sports and politics. Yoon can be reached at 82-2-6956-6698.