
 SEE-THROUGH-SOUND 

Research Project in Sensory Substitution

See-Through-Sound is a project in computer-aided vision whose aim is to create small portable hardware devices that convey qualitative and quantitative information about physical objects, people and the environment around the user, through the conversion of visual information into sound and sonic patterns. Different attributes of color and luminosity captured from live video are mapped in real-time to sound parameters such as pitch, timbre, loudness, rhythm and spatial location. Resulting sounds are structured into short patterns of pitch and time units of varying complexity, organically assembled to function as a 'language' that can be taught, transmitted, and continuously expanded. 
The devices are designed to help the visually impaired navigate the surrounding environment. They are also useful as general-purpose tools that sonically translate and decode visual information from a remote location, without the user directly observing or having access to its source.
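The color-to-sound mapping described above can be illustrated with a minimal sketch. The specific mapping choices here (hue to pitch over two octaves above 440 Hz, luminosity to loudness, saturation to timbral brightness) are illustrative assumptions, not the project's actual parameters:

```python
import colorsys

def color_to_sound(r, g, b):
    """Map an RGB pixel (0-255 per channel) to illustrative sound parameters.

    Hue selects pitch across two octaves above 440 Hz, value (luminosity)
    controls loudness, and saturation controls timbral brightness.
    All mapping ranges are hypothetical placeholders.
    """
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    pitch_hz = 440.0 * 2 ** (2 * h)  # hue (0-1) spans two octaves
    loudness = v                     # brighter pixel -> louder tone
    brightness = s                   # more saturated -> brighter timbre
    return pitch_hz, loudness, brightness

# A pure red pixel has hue 0, so it maps to the base pitch at full loudness.
pitch, loud, bright = color_to_sound(255, 0, 0)
```

In a real-time system a function like this would run per frame on downsampled video, with the resulting parameter streams grouped into the short pitch-and-time patterns that form the project's sonic 'language'.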
The See-Through-Sound project started as a research effort by an international team sponsored by the Portuguese Foundation for Science and Technology in 2011-2012. The team, to which I contributed as Principal Investigator, produced five research publications and a software prototype. To learn more about the research and download its deliverables, click here.
The project is ongoing and has evolved to incorporate machine-learning elements and algorithms borrowed from computer-vision research.