For most of us, yelling across the room ‘Hey Google! What’s the weather like today?’ is a simple act we barely think about. But millions of people with speech impairments find this everyday action deeply frustrating because, most of the time, our virtual assistants are unable to understand them.
But Google plans to make a difference.
During its I/O developer conference, the company announced that it is training its AI to better understand different speech patterns, including impaired speech caused by brain injury or conditions such as ALS (amyotrophic lateral sclerosis).
To achieve this, Google partnered with the ALS Therapy Development Institute and the ALS Residence Initiative in a project dubbed Project Euphonia. The idea behind it is simple: if the people who work with those living with ALS, along with their friends and loved ones, can understand them, AI should be able to as well.
For starters, Google recorded thousands of voice samples from people with ALS. One volunteer, Dimitri Kanevsky, recorded 15,000 phrases himself; these phrases were turned into spectrograms and used to train the AI to understand what he is saying.
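To give a sense of what that intermediate step looks like: a spectrogram is just a picture of how a recording's frequency content changes over time, computed by slicing the audio into short overlapping frames and taking a Fourier transform of each. The sketch below is a minimal illustration of that idea in Python with NumPy (the frame length, hop size, and synthetic test tone are arbitrary choices, not details of Google's pipeline).

```python
import numpy as np

def spectrogram(signal, frame_len=256, hop=128):
    """Magnitude spectrogram via a windowed short-time Fourier transform."""
    window = np.hanning(frame_len)
    n_frames = 1 + (len(signal) - frame_len) // hop
    frames = np.stack([
        signal[i * hop : i * hop + frame_len] * window
        for i in range(n_frames)
    ])
    # rfft yields frame_len // 2 + 1 frequency bins per frame
    return np.abs(np.fft.rfft(frames, axis=1))

# A one-second synthetic "utterance": a 440 Hz tone sampled at 16 kHz
sr = 16_000
t = np.arange(sr) / sr
audio = np.sin(2 * np.pi * 440 * t)

spec = spectrogram(audio)
print(spec.shape)  # → (124, 129): (time frames, frequency bins)
```

Each row of the resulting array describes one instant of the recording, and it is these time-frequency images, rather than raw waveforms, that speech models are typically trained on.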
“Speech recognition should work for everybody.” – Michael Brenner, Research Scientist at Google
For now, this is still a work in progress, and the developers have a number of challenges to tackle. Google is currently focusing on English speakers with impairments typically associated with ALS, so the next big obstacle will be making the AI available to speakers of other languages and to people with other types of speech impairments. It is a mammoth task, but Google seems determined to take it on.
For now, the company is looking for volunteers who can record a few phrases and fill out a form. If you want to help out or know someone who can, you can do so via this link.
“If I ever need help, I want to be able to say, ‘OK, Google, broadcast call for help!’, have it understand me and send the message.” – Andrea Lytle Peet, person with ALS
In addition to all this, Google wants its AI to translate sounds such as hums, as well as gestures, into actions. This would let users give Google Home commands and send text messages, in the hope that, eventually, the AI will be capable of understanding anyone, regardless of how they communicate.