Visually impaired internet users, like their sighted counterparts, benefit tremendously from the information and services the web has to offer. However, as web pages have become more complex, the screen readers that visually impaired users rely on have struggled to keep pace.
High-level research into the potential market of visually impaired users revealed roughly 285 million visually impaired individuals worldwide. Of that population:
- 39 million are fully blind
- 246 million have low vision
- About 90% of the world's visually impaired live in developing countries.
Given that 90% of the world's visually impaired population lives in developing countries, we felt it was important to incorporate additional design considerations so that our prototype:
- Is financially accessible to a wide range of demographics
- Serves the needs of users in markets where internet access may not be consistent
Only about 13% of the visually impaired market is fully blind, which means we have to design an AI-driven experience that augments physical and digital interactions for a broad spectrum of users, some fully blind and others with varying degrees of low vision. As such, it will be important to design an experience that draws on a wide range of sensory channels, including:
- Auditory: Image-to-speech and text-to-speech, as well as other auditory cues
- Tactile: Touch-based cues such as vibration patterns (haptic feedback)
- Visual: Basic and/or varying levels of visual object cues
How can we leverage advances in artificial intelligence and image/text-based recognition tools to improve digital and physical accessibility for the visually impaired in emerging markets?
Our prototype repurposes old or recycled touchscreen phones from developed markets and uses image- and text-based recognition to provide auditory and tactile cues via haptic feedback:
Haptic feedback is the use of the sense of touch in a user interface design to provide information to an end user. When referring to mobile phones, this generally means using vibrations from the device's vibration motor to denote that a touchscreen button has been pressed. In this particular example, the phone would vibrate slightly in response to the user's activation of an on-screen control, making up for the lack of the normal tactile response the user would experience when pressing a physical button.
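The recognition-to-cue pipeline described above can be sketched as follows. This is a minimal illustrative sketch, not our actual implementation: all names here (`Cue`, `text_to_cue`, the confidence threshold) are assumptions chosen for the example, and a real prototype would feed the speech text to a text-to-speech engine and the vibration pattern to the phone's vibration API.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class Cue:
    """A combined auditory and tactile response for one recognition event."""
    speech: str                 # text to hand to a text-to-speech engine
    vibration_ms: List[int]     # haptic pattern: alternating buzz/pause durations


def text_to_cue(recognized_text: str, confidence: float) -> Cue:
    """Map an image/text recognition result to auditory and tactile cues.

    A high-confidence result is read aloud with a single short confirmation
    buzz; a low-confidence result gets a distinct double buzz, so the user
    can tell a failed scan from a successful one by touch alone.
    The 0.8 threshold is an arbitrary value for illustration.
    """
    if confidence >= 0.8:
        return Cue(speech=recognized_text, vibration_ms=[50])
    return Cue(speech="Text unclear, please scan again",
               vibration_ms=[100, 50, 100])


# Example: a confident OCR result produces speech plus one short buzz
cue = text_to_cue("Exit on the left", confidence=0.95)
```

The key design choice this sketch highlights is that the tactile channel carries information on its own, which matters in the noisy environments and low-end hardware conditions common in the target markets.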
Our haptic feedback for visually impaired users
A future iteration of this model