We are working with the Perkins School for the Blind to build assistive technology applications. In one conversation, a member of our focus group mentioned that she was unable to set the cycles on her washing machine because it used a touch panel, a problem that is showing up in more and more modern appliances. Others in the group wholeheartedly agreed that this is a problem.
We designed a phone app whose camera can be aimed at the touch screen and which provides audio feedback telling the user where their finger is pointing on the screen. We named this solution the TouchScreen Navigator.
We will use crowdsourcing to tailor the Navigator to each touch screen, since there is no standard for the icons used.
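To make this concrete, here is a minimal sketch of what a crowdsourced layout and lookup might look like: each contributed layout describes one appliance's panel as a set of regions, in coordinates normalized to the panel, each paired with the text to be spoken. The names (Region, PanelLayout, labelAt) and the sample washer layout are illustrative assumptions, not part of the actual app.

```cpp
#include <iostream>
#include <string>
#include <vector>

// Illustrative sketch: one crowdsourced layout for a specific appliance,
// expressed in coordinates normalized to the panel (0.0-1.0) so the lookup
// is independent of camera distance and zoom.
struct Region {
    double x, y, width, height;  // normalized bounding box on the panel
    std::string label;           // text to be spoken to the user
};

struct PanelLayout {
    std::string applianceModel;
    std::vector<Region> regions;
};

// Return the spoken label for a normalized fingertip position,
// or an empty string if the finger is not over any known control.
std::string labelAt(const PanelLayout& layout, double x, double y) {
    for (const Region& r : layout.regions) {
        if (x >= r.x && x <= r.x + r.width &&
            y >= r.y && y <= r.y + r.height) {
            return r.label;
        }
    }
    return "";
}

int main() {
    // Hypothetical layout a contributor might submit for one washer model.
    PanelLayout washer{"ExampleWasher 3000", {
        {0.05, 0.10, 0.20, 0.15, "Power"},
        {0.30, 0.10, 0.25, 0.15, "Cycle: Normal"},
        {0.60, 0.10, 0.25, 0.15, "Cycle: Delicate"},
        {0.05, 0.40, 0.30, 0.20, "Start / Pause"},
    }};

    // A fingertip detected at (0.35, 0.17) would be announced as:
    std::cout << labelAt(washer, 0.35, 0.17) << std::endl;  // "Cycle: Normal"
    return 0;
}
```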
We are currently experimenting with running OpenCV on the iPhone. This opens the way to creating the TouchScreen Navigator and possibly other useful applications such as the CrossWalk Navigator, facial recognition, indoor navigation, and many others.
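As a rough illustration of the kind of OpenCV experiment involved, the sketch below finds a fingertip as the topmost point of the largest skin-colored blob in a camera frame and reports its position normalized to the frame; in the app, that position would be mapped through a layout like the one above and spoken aloud through the phone's speech output. The skin-color threshold and contour approach are assumptions for illustration only, not the project's actual detection method, and the sketch targets a desktop build of OpenCV rather than the iPhone.

```cpp
#include <iostream>
#include <opencv2/opencv.hpp>

int main() {
    // Desktop sketch: read frames from the default camera. On the iPhone
    // the frames would come from the device camera instead.
    cv::VideoCapture cap(0);
    if (!cap.isOpened()) {
        std::cerr << "No camera available" << std::endl;
        return 1;
    }

    cv::Mat frame, hsv, mask;
    while (cap.read(frame)) {
        // Very rough skin-color segmentation in HSV space; the range is an
        // illustrative assumption and would need tuning or a better detector.
        cv::cvtColor(frame, hsv, cv::COLOR_BGR2HSV);
        cv::inRange(hsv, cv::Scalar(0, 30, 60), cv::Scalar(25, 180, 255), mask);
        cv::medianBlur(mask, mask, 7);

        // Take the largest skin-colored blob as the hand.
        std::vector<std::vector<cv::Point>> contours;
        cv::findContours(mask, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);
        if (contours.empty()) continue;

        size_t largest = 0;
        for (size_t i = 1; i < contours.size(); ++i) {
            if (cv::contourArea(contours[i]) > cv::contourArea(contours[largest]))
                largest = i;
        }

        // Treat the topmost point of the hand contour as the fingertip.
        cv::Point fingertip = contours[largest][0];
        for (const cv::Point& p : contours[largest]) {
            if (p.y < fingertip.y) fingertip = p;
        }

        // Report the fingertip in coordinates normalized to the frame; the app
        // would map this into the panel layout and speak the matching label.
        std::cout << "fingertip at ("
                  << static_cast<double>(fingertip.x) / frame.cols << ", "
                  << static_cast<double>(fingertip.y) / frame.rows << ")" << std::endl;

        cv::imshow("skin mask", mask);
        if (cv::waitKey(30) == 27) break;  // Esc to quit
    }
    return 0;
}
```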