Mind-reading AI turns thoughts into words using a brain implant

By Jason Arunn Murugesu

30 March 2020

AI can recognise some speech patterns in the brain

Andrew Ostrovsky/Getty Images

An artificial intelligence can accurately translate thoughts into sentences, at least for a limited vocabulary of 250 words. The system may bring us a step closer to restoring speech to people who have lost the ability because of paralysis.

Joseph Makin at the University of California, San Francisco, and his colleagues used deep learning algorithms to study the brain signals of four women as they spoke. The women, who all have epilepsy, already had electrodes attached to their brains to monitor seizures.

Each woman was asked to read aloud from a set of sentences as the team measured brain activity. The largest group of sentences contained 250 unique words.

The team fed this brain activity to a neural network algorithm, training it to identify regularly occurring patterns that could be linked to repeated aspects of speech, such as vowels or consonants. These patterns were then fed to a second neural network, which tried to turn them into words to form a sentence.
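
In machine-learning terms, this is an encoder-decoder design: one network compresses the signal sequence into a summary, another unrolls that summary into words. The sketch below is a minimal illustration of that general idea in PyTorch, not the authors' actual architecture; the channel count, layer sizes and decoding scheme are all assumptions made up for the example.

import torch
import torch.nn as nn

N_CHANNELS = 64     # assumed number of electrode channels
VOCAB_SIZE = 250    # limited vocabulary, as in the study
HIDDEN = 128        # assumed hidden size

class BrainToText(nn.Module):
    def __init__(self):
        super().__init__()
        # Encoder: turns the neural signal sequence into a summary state
        self.encoder = nn.GRU(N_CHANNELS, HIDDEN, batch_first=True)
        # Decoder: unrolls that summary into a sequence of word scores
        self.decoder = nn.GRU(HIDDEN, HIDDEN, batch_first=True)
        self.to_word = nn.Linear(HIDDEN, VOCAB_SIZE)

    def forward(self, signals, max_words=10):
        # signals: (batch, time_steps, N_CHANNELS)
        _, state = self.encoder(signals)
        # Feed the encoder's final state back in at every decoding step
        steps = state.transpose(0, 1).repeat(1, max_words, 1)
        out, _ = self.decoder(steps, state)
        return self.to_word(out)   # (batch, max_words, VOCAB_SIZE)

model = BrainToText()
fake_signals = torch.randn(2, 100, N_CHANNELS)  # 2 sentences, 100 time steps
word_logits = model(fake_signals)
print(word_logits.shape)                        # torch.Size([2, 10, 250])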

Each woman repeated the sentences at least twice, and the final repetition didn’t form part of the training data, allowing the researchers to test the system.

Each time a person speaks the same sentence, the associated brain activity is similar but not identical. “Memorising the brain activity of these sentences wouldn’t help, so the network instead has to learn what’s similar about them so that it can generalise to this final example,” says Makin. Across the four women, the AI’s best performance was an average translation error rate of 3 per cent.
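
A translation error rate of this kind is conventionally scored as a word error rate: the edit distance between the decoded sentence and the true one, divided by the true sentence’s length. Assuming that is the metric in play here, a minimal sketch:

def word_error_rate(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # Classic dynamic-programming edit distance, over words
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / len(ref)

print(word_error_rate("tina turner is a singer",
                      "tina turner is a swimmer"))  # 0.2: 1 wrong word in 5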

Makin says that using a small number of sentences made it easier for the AI to learn which words tend to follow others.

For example, the AI was able to decode, from brain activity alone, that the word “Turner” was always likely to follow the word “Tina” in this set of sentences.
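
The effect Makin describes amounts to bigram statistics: in a small, fixed sentence set, the next word is often fully determined by the current one. A toy illustration, using made-up stand-ins rather than the study’s actual sentences:

from collections import Counter, defaultdict

sentences = [
    "tina turner is a pop singer",
    "tina turner gets the last word",
]
follows = defaultdict(Counter)   # word -> counts of what comes next
for s in sentences:
    words = s.split()
    for a, b in zip(words, words[1:]):
        follows[a][b] += 1

print(follows["tina"])   # Counter({'turner': 2}) -- fully predictable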

The team also tried decoding the brain signal data one word at a time, rather than as whole sentences, but this increased the error rate to 38 per cent even at its best. “So the network clearly is learning facts about which words go together, and not just which neural activity maps to which words,” says Makin.

This will make it hard to scale up the system to a larger vocabulary because each new word increases the number of possible sentences, reducing accuracy.

Makin says 250 words could still be useful for people who can’t talk. “We want to deploy this in a patient with an actual speech disability,” he says, although their brain activity may differ from that of the women in this study, making this more difficult.

Sophie Scott at University College London says we are a long way from being able to translate brain signal data comprehensively. “You probably know around 350,000 words, so it’s still an incredibly restricted set of speech that they’re using,” she says.

Journal reference: Nature Neuroscience, DOI: 10.1038/s41593-020-0608-8
