I’ve written previously about Microsoft’s commitment to accessibility in the Windows 10 and Office 365 products, and recently, while presenting with my colleague Vishesh, he introduced me to the “Seeing AI” app from Microsoft Research.
What is Seeing AI? Essentially, it is a free app from Microsoft (currently only available on iOS – download here) that allows you to scan objects in a few different modes. Using the power of the Azure cloud and related services, it then reads aloud what it sees, giving the user an audible description. Confused? Have a watch of this video to learn more:
So what can it scan? Here’s the list:
What is really cool about Seeing AI is that it also runs on third-party hardware such as the Pivothead Glasses, meaning a visually impaired person could wear these glasses and use Seeing AI to not only take pictures of what is in front of them, but also hear audible feedback. Here is a video of one of the developers (Saqib Shaikh, who is himself blind) using the Pivothead Glasses combined with the Seeing AI app:
I first saw this video in a live presentation from Vishesh and just about flipped out – it seemed like it was straight out of science fiction. To then be able to find it on the iOS App Store directly after the presentation very quickly brought it into the real world for me. I’m very blessed to have reasonable eyesight, but I can only imagine how much of a game changer this must be for visually impaired people. To that end, I found this independent review of Seeing AI from Sam at the Blindspot, who is himself visually impaired:
UPDATE 19th September 2017 // I posted this blog post on one of the many forums with an EduTech focus that I contribute to and within hours got a reply pointing me to the following interview by Jonathan Mosen with Saqib Shaikh about the Seeing AI app – worth a listen:
So how does an app like this actually work? I’m not a developer, but if you listened carefully to the video above featuring one of the developers, Saqib Shaikh, he talks about using the various APIs and Cognitive Services from the Azure cloud. If you’ve not seen the following image before it can be a bit overwhelming, but I’ve circled in red some of the services likely being used to develop Seeing AI:
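To give a flavour of what calling one of these Cognitive Services looks like, here is a minimal sketch of how an app might ask the Azure Computer Vision “describe” endpoint for a caption of an image – the kind of caption that could then be read aloud by a text-to-speech engine. To be clear, this is not Seeing AI’s actual code: the endpoint, API key, and image URL below are placeholders, and I’m only constructing the request here rather than sending it.

```python
# Sketch only: builds (but does not send) a request to the Azure Computer
# Vision "describe" endpoint, one of the Cognitive Services an app like
# Seeing AI likely builds on. All endpoint/key/URL values are placeholders.
import json


def build_describe_request(endpoint: str, key: str, image_url: str):
    """Return the (url, headers, body) for a Computer Vision describe call."""
    # Ask for a single English caption candidate for the image.
    url = f"{endpoint}/vision/v3.2/describe?maxCandidates=1&language=en"
    headers = {
        "Ocp-Apim-Subscription-Key": key,  # Cognitive Services API key
        "Content-Type": "application/json",
    }
    body = json.dumps({"url": image_url})  # the image to be described
    return url, headers, body


# Example usage with placeholder values; the JSON response would contain a
# caption string that the app could pass to the device's speech engine.
url, headers, body = build_describe_request(
    "https://example.cognitiveservices.azure.com",  # placeholder endpoint
    "<your-api-key>",                               # placeholder key
    "https://example.com/photo.jpg",                # placeholder image URL
)
```

The interesting part is how little application code is needed: the heavy lifting (object recognition, captioning) happens entirely in the cloud service, which is presumably what lets a small team ship something this capable.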
Coming on the back of my previous post about using Azure cloud services to support the early detection and diagnosis of dyslexia, this sort of technology just reinforces how people’s lives are genuinely being changed for the better through this type of development. It is going to be awesome to see what comes in the future.