
WeWALK joins Microsoft’s ‘life-changing’ AI for Accessibility programme


WeWALK has joined Microsoft’s AI for Accessibility programme to accelerate its capabilities by developing and validating a human behaviour model for visually impaired users and creating a voice assistant designed for the visually impaired.

In doing so, it aims to provide the right mobility information when it is needed and allow for even greater control of the WeWALK mobility experience.

Microsoft’s AI for Accessibility is a $25m, five-year programme aimed at harnessing the power of AI to amplify human capability for the more than one billion people around the world with disabilities.


Through grants, technology, and AI expertise, the programme aims to accelerate the development of accessible and intelligent AI solutions, building on recent advancements in Microsoft Cognitive Services to help developers create intelligent apps that can see, hear, speak, understand, and interpret people’s needs.

WeWALK’s new voice assistant will be released later in 2020 and will have immediate usability benefits, improving users’ confidence as they mobilise. The assistant will be built on clearly derived requirements and natural usage patterns; the challenge WeWALK is seeking to overcome is to make it truly ‘smart’ and dynamic, so that it can effectively categorise and act on the user’s commands across a host of different environments.
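The article does not describe how WeWALK implements this categorisation. As a rough, purely illustrative sketch of the underlying idea, intent classification of spoken commands, the example below maps a command string to one of a few hypothetical mobility intents; every intent name and keyword here is an assumption, not part of WeWALK’s product.

```python
# Illustrative sketch only: a minimal rule-based intent classifier showing how
# spoken commands might be mapped to mobility actions. All intents and keywords
# are hypothetical and not taken from WeWALK's assistant.

from dataclasses import dataclass


@dataclass
class Intent:
    name: str
    keywords: tuple


# Hypothetical intent catalogue for a mobility-focused voice assistant.
INTENTS = [
    Intent("navigate", ("navigate", "take me", "directions", "route")),
    Intent("where_am_i", ("where am i", "current location", "what street")),
    Intent("nearby_places", ("nearby", "around me", "closest")),
    Intent("public_transport", ("bus", "train", "next departure")),
]


def classify(command: str) -> str:
    """Return the name of the best-matching intent, or 'unknown'."""
    text = command.lower()
    best_name, best_hits = "unknown", 0
    for intent in INTENTS:
        hits = sum(1 for kw in intent.keywords if kw in text)
        if hits > best_hits:
            best_name, best_hits = intent.name, hits
    return best_name


if __name__ == "__main__":
    for cmd in ("Take me to the nearest bus stop", "Where am I right now?"):
        print(cmd, "->", classify(cmd))
```

A production assistant would more likely use a trained language model rather than keyword rules, but the structure, turning free-form speech into a small set of actionable categories, is the same.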

WeWALK’s human behaviour model is due for release in 2021. It is of significant importance because there are currently no accurate models of how a person who is blind moves and how their mobility evolves holistically, especially after receiving orientation and mobility training.

As a result, healthcare providers, government bodies, and mobility trainers cannot effectively track how a person who is blind mobilises or whether an intervention has had any benefit.

By using WeWALK’s built-in IMU (inertial measurement unit) sensors, including the gyroscope, accelerometer, and compass, as well as data collected from a connected smartphone, the model can be implemented and expanded organically through daily usage. The first stage will be rigorous data collection and user testing, followed by data manipulation and classification to ensure that optimum reliability and system usability can be achieved.
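As a rough illustration of the kind of processing involved, the sketch below shows how windows of IMU samples might be given a coarse activity label from the variability of the acceleration signal. The sample format, threshold, and labels are assumptions made for illustration and are not taken from WeWALK’s model.

```python
# Illustrative sketch only: labelling windows of IMU samples (accelerometer,
# gyroscope, compass) as "walking" or "stationary". The data layout and the
# threshold are hypothetical placeholders, not WeWALK's actual pipeline.

import math
from dataclasses import dataclass
from statistics import pstdev
from typing import List


@dataclass
class ImuSample:
    ax: float  # accelerometer (m/s^2)
    ay: float
    az: float
    gx: float  # gyroscope (rad/s)
    gy: float
    gz: float
    heading_deg: float  # compass heading


def accel_magnitudes(window: List[ImuSample]) -> List[float]:
    """Magnitude of acceleration for each sample in a window."""
    return [math.sqrt(s.ax ** 2 + s.ay ** 2 + s.az ** 2) for s in window]


def label_window(window: List[ImuSample], walk_threshold: float = 0.8) -> str:
    """Very coarse activity label based on acceleration variability.

    A higher spread of acceleration magnitude within the window suggests the
    user is walking; a low spread suggests they are stationary. The threshold
    is an arbitrary placeholder, not a validated parameter.
    """
    spread = pstdev(accel_magnitudes(window))
    return "walking" if spread > walk_threshold else "stationary"
```

A real behaviour model would go well beyond this, classifying richer mobility patterns and tracking how they change over time, but the same ingredients apply: windowed sensor data, feature extraction, and classification validated against user testing.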

Commenting on WeWALK’s entry into the programme, Jean Marc Feghali, R&D lead at WeWALK, said: “By working on these two objectives, WeWALK can set the standard for visually impaired mobility for both the individual user and the organisations that support them. We are now rigorously collecting mobility data through novel experimentation and validating our work by continuously engaging our users to ensure an exceptional product powered by Microsoft’s best.”

He added: “Being a part of the Microsoft family truly excites us, bringing us closer to mobility trainers, researchers, and the global visually impaired community.”

Tags: AI, Microsoft, WeWALK
Alex Douglas
