In 2016, Microsoft put out a challenge to the tech world.
Through its AI for Accessibility scheme, the company wanted to see just how artificial intelligence could be used to help people with disabilities. Many companies and startups are now working on technologies born from that challenge, but it was a project from Microsoft's own research team that drew attention at the recent Microsoft Ability Summit.
Project Tokyo is an ongoing project with the aim of communicating the surrounding social environment to a blind or visually impaired person. Currently, using a customised version of Microsoft's own HoloLens headset, it's capable of not only recognising the faces of nearby people, but also giving audio cues as to where they are in relation to you. With several cameras covering a 180-degree field of view, a high-resolution camera for facial recognition and built-in speakers, the ability to recognise people around you is just the beginning of what artificial intelligence could offer.
Being blind or visually impaired can bring with it some social interaction challenges. One interesting experimental feature is the ability to know when someone is looking at you: if Project Tokyo detects a face looking in your direction, you'll hear a chime through the built-in speakers. And rather than having to move your head to scan the room to work out who's around you, which in itself can be a little socially awkward, it can give a room overview that announces the names of nearby people. The names are spoken using 3D spatial audio, so you know where each person is standing.
As the project continues, it's very interesting to see just how much artificial intelligence can help not only with everyday tasks like reading printed text, but also with more nuanced ones like social interaction.
For more from the Microsoft Ability Summit, check out the MSFT Enable YouTube channel.
Co-host & audio producer on the Double Tap Canada radio show. Occasional contributor to Double Tap TV, full time shed resident.