Low-cost tech set-up that can help blind people ‘see’

We live in a visual world and build environments that rely heavily on visual perception. Want to find somewhere? You look at a map or read a road sign. Perhaps the GPS on your touch screen smartphone can help you. It probably couldn’t help you find your mislaid house keys or choose between a red and a green apple, but we don’t need it to: we have our eyes for that.

For blind and visually impaired people worldwide, functioning in these visual environments is much more problematic. The signs can’t be read and, without tasting them, the apples are the same. At least the GPS will “talk”, but the touch screen keyboard is awkward.

Efforts to increase accessibility to the environment for the visually impaired are not new – white stick, guide dog, Braille – but rapid technological advancements in the past three or four decades have facilitated not only new assistive devices but also techniques to restore a visual percept in the blind.

There is another way, however, which requires no surgery and works with technology you may have in your home. Rather than replace the damaged part of the visual system this method tries to provide the “visual” information in another way – by using a different sensory system. This is sensory substitution.

Tactile sensory substitution devices include the BrainPort, which uses a camera, a processing device and a display unit to convert visual stimuli into tiny electrical signals on a person’s tongue. The main auditory devices, which convert the stimuli into sound, include The vOICe, the PSVA and EyeMusic. The differences between them come down to the algorithms they use to convert the environment into sound.

For auditory devices the technology can be basic: you need a camera to extract information from the environment, a PC or smartphone to run the conversion algorithm, and headphones to relay the converted signal back to the user. The magic, though, lies in how the brain processes sensory information and how this is used to inform the algorithm.
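As a rough sketch of this set-up, the loop below grabs frames from a webcam with OpenCV, hands each one to a conversion routine and plays the resulting soundscape through the default audio output. The library choices (opencv-python, sounddevice), the sample rate and the image_to_sound function are illustrative assumptions, not part of any particular device; one possible version of the conversion routine is sketched after the next paragraph.

```python
import cv2
import sounddevice as sd

SAMPLE_RATE = 44100

def run_loop(image_to_sound):
    """Grab camera frames, convert each one to sound and play it back."""
    cap = cv2.VideoCapture(0)                       # default webcam
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            audio = image_to_sound(gray)            # conversion algorithm (see sketch below)
            sd.play(audio, SAMPLE_RATE, blocking=True)  # to headphones; blocks until the clip ends
    finally:
        cap.release()
```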

The vOICe was developed by Dutch engineer Peter Meijer and uses a conversion algorithm built on three principles to tell the user where something is in the visual scene and how bright it is, using auditory features such as pitch, volume and a stereo scan. If an object is high up, on a shelf perhaps, then it has a high pitch. If it is to the left, you hear it in the left headphone; if it is visually bright, then it is aurally loud.
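The mapping can be illustrated with a short sketch. This is not Meijer’s implementation, just one way to follow the three principles described above: the image is scanned left to right over roughly a second, each row is given a sine tone whose pitch rises towards the top of the image, pixel brightness sets that tone’s loudness, and stereo panning tracks the horizontal position. The grid size and frequency range here are arbitrary choices.

```python
import numpy as np

SAMPLE_RATE = 44100
SCAN_SECONDS = 1.0
ROWS, COLS = 64, 64                      # coarse grid the image is reduced to (assumed)
FREQS = np.geomspace(5000, 500, ROWS)    # row 0 (top of image) -> highest pitch

def image_to_sound(gray):
    """Convert a grayscale image (2-D array) into a one-second stereo soundscape."""
    img = np.asarray(gray, dtype=float) / 255.0
    # Downsample by picking evenly spaced rows and columns.
    r_idx = np.linspace(0, img.shape[0] - 1, ROWS).astype(int)
    c_idx = np.linspace(0, img.shape[1] - 1, COLS).astype(int)
    img = img[np.ix_(r_idx, c_idx)]

    samples_per_col = int(SAMPLE_RATE * SCAN_SECONDS / COLS)
    t = np.arange(samples_per_col) / SAMPLE_RATE
    left, right = [], []
    for c in range(COLS):
        # One sine per row, weighted by that pixel's brightness (bright = loud).
        column = sum(img[r, c] * np.sin(2 * np.pi * FREQS[r] * t) for r in range(ROWS))
        pan = c / (COLS - 1)             # 0 = far left of the image, 1 = far right
        left.append((1 - pan) * column)  # objects on the left end up in the left ear
        right.append(pan * column)
    stereo = np.stack([np.concatenate(left), np.concatenate(right)], axis=1)
    return 0.3 * stereo / (np.abs(stereo).max() + 1e-9)   # normalise to a safe volume
```

Sweeping the columns over a fixed scan is what lets a trained listener parse the soundscape as a left-to-right picture of the scene, with the stereo panning reinforcing where in the sweep each sound sits.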

In a study published in Multisensory Research, we evaluated how much information is needed to successfully recognise simple objects using the vOICe. The level of recognition was more precise than in simulations of retinal implants.
