But it struggled with more complex phrases.
Pushing the frontier
Once the mental privacy safeguard was in place, the team started testing their inner speech system, beginning with cued words. The patients sat in front of a screen that displayed a short sentence and had to imagine saying it. Performance varied, reaching 86 percent accuracy with the best-performing patient on a limited vocabulary of 50 words, but dropping to 74 percent when the vocabulary was expanded to 125,000 words.
But when the team moved on to testing whether the prosthesis could decode unstructured inner speech, the limitations of the BCI became quite apparent.
The first unstructured
→ Continue reading at Ars Technica