Now do this put-the-text-back-together activity.
This is the text (if you need help).
Scientists may soon be able to interpret what someone is saying simply by analysing their brainwaves as they speak. This revolutionary advance in neuroscience would help millions of people who suffer from communication problems and neurological disorders. The scientists developed a form of artificial intelligence that can decode brainwaves and translate them into text. Algorithms take the brain activity created as a person speaks and translate it in real time into sentences on a screen. The scientists are from the University of California, San Francisco. They say their algorithms have a 97 per cent translation accuracy rate but are working hard to improve on this.
The scientists say they are at the early stages of being able to machine-translate everything someone says. The software used in their experiments matched frequently repeated features of speech to parts and shapes of the mouth. These included elements of English speech such as vowels, consonants and commands. The experiments were limited to around 40 short and simply-constructed spoken sentences. The scientists said: "Although we should like the decoder to learn and exploit the regularities of the language, it remains to show how much data would be required to expand from our tiny languages to a more general form of English."
Comprehension questions
- Who may be able to interpret what someone is saying?
- What kind of disorders might the software help?
- What translates brain activity as a person speaks?
- When does the software translate brainwaves?
- What is the accuracy rate of the scientists' algorithms?
- What stage are the scientists at in the testing?
- What was matched to parts and shapes of the mouth?
- How many short sentences were used in the experiments?
- What do the scientists want the decoder to learn and exploit?
- What must the scientists expand to get to a more general form of English?