Amazon has released a new feature for Echo Show which makes it easier for people with visual impairments to identify groceries.
Show and Tell lets blind and low vision customers hold up an item to the Echo Show camera and ask, “Alexa, what am I holding?”
Alexa is then able to identify the item through advanced computer vision and machine learning technologies for object recognition.
In announcing the feature, Amazon said it believes in starting from the customer and working backwards, which means paying attention to what all of its customers are telling it.
Sarah Caplener, head of Amazon’s Alexa for Everyone team, said: “The whole idea for Show and Tell came about from feedback from blind and low vision customers.
“We heard that product identification can be a challenge and something customers wanted Alexa’s help with.
“Whether a customer is sorting through a bag of groceries, or trying to determine what item was left out on the counter, we want to make those moments simpler by helping identify these items and giving customers the information they need in that moment.”
Caplener’s team collaborated with the Vista Center for the Blind and Visually Impaired in Santa Cruz, California, and its assistive technology manager Stacie Grijalva, who is visually impaired herself.
Grijalva enlisted other blind and low vision customers for user studies, providing feedback to the Alexa for Everyone team.
Speaking about Show and Tell, Grijalva said: “My job is to help people with visual impairments see how technology can affect people’s lives and make them feel better about what they do on a day-to-day basis.
“It’s a tremendous help and a huge time saver because the Echo Show just sits on my counter, and I don’t have to go and find another tool or person to help me identify something. I can do it on my own by just asking Alexa.”