Speech recognition software should mold to our voices, but for now it seems to be the other way around. Apple users, at least, have started changing their accents so that Siri can understand them more easily.
Apple’s intelligent assistant recognizes and acts on North American accents more readily, so iPhone users are changing their accents to meet her halfway. Siri is not the only software to trigger such behavior; most speech recognition apps encourage a similar response. Users have an inner tendency to make their voices sound as “clinical” as possible. You’ve probably done it yourself: treated the program as you would treat a child, speaking slowly, accentuating every consonant, and taking small pauses between words, thus flattening your accent.
The bright side of this phenomenon is that the software is likely to improve soon, as technology companies realize the importance of voice commands on mobile phones and in the Internet of Things. Google and Apple are two of this movement’s leaders, constantly refining their technologies.