Amazon has announced a new Echo Show feature called Show and Tell, which can recognize pantry items held in front of the device's camera.
The feature was developed by the Alexa for Everyone team in partnership with the Vista Center for the Blind and Visually Impaired as a way to help blind and low-vision users identify everyday objects.
“The whole idea for Show and Tell came about from feedback from blind and low vision customers,” said Sarah Caplener, head of Amazon’s Alexa for Everyone team. “We heard that product identification can be a challenge and something customers wanted Alexa’s help with. Whether a customer is sorting through a bag of groceries, or trying to determine what item was left out on the counter, we want to make those moments simpler by helping identify these items and giving customers the information they need in that moment.”
The feature is available on second-generation Echo Show devices; users simply ask, “Alexa, what am I holding?” to have the item in their hands identified. Alexa uses computer vision and machine learning for object recognition to identify what the user is holding.
Alexa-enabled devices already help visually impaired users in more ways than one: they can change channels on a Fire TV, for example, or turn appliances on and off with a simple voice command, among other things.
Brett Fowler, for example, a Vista Center community member who lost his vision at age 10, loves to cook but often struggled to identify spices. For him, the Show and Tell feature has apparently made a real difference.
“All of these devices that are acting as your eyes, it’s revolutionary,” Fowler said. “For me, the less stress I have to put on somebody else is less stress on me. And it makes me feel good.”