The goal of this project is to sonify data. The main program detects moving objects in a video and assigns a pitch to each moving object in each frame based on its area: the larger the object, the lower the pitch. The project aims to offer a new way to interpret data and hear patterns that could be difficult to detect visually, increase science accessibility for the visually impaired, and improve communication with those who have limited science literacy.
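The area-to-pitch mapping can be sketched roughly as below. This is an illustrative sketch only; the function name, area range, and MIDI note range are assumptions, not the project's actual parameters.

```python
# Illustrative sketch (not the project's exact mapping): convert an object's
# area in pixels to a MIDI note number, with larger areas giving lower pitches.
def area_to_pitch(area, min_area=50, max_area=5000, low_note=36, high_note=84):
    """Map an object's area to a MIDI note; bigger object -> lower note."""
    area = max(min_area, min(area, max_area))          # clamp to expected range
    frac = (area - min_area) / (max_area - min_area)   # 0 (small) .. 1 (large)
    return int(round(high_note - frac * (high_note - low_note)))
```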
In collaboration with Tom Zimmerman at IBM Research, the code can be used for plankton detection and response. We will continue to modify and improve the code for research purposes.
This project is supported by the Center for Cellular Construction (ccc.ucsf.edu) and the National Science Foundation under Grant No. DBI-1548297. Disclaimer: Any opinions, findings and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.
Supporting functions for the detect_music_chord file, including functions for matching pitch to the area of moving objects.
Main program to detect moving objects in live or prerecorded videos and play a pitch corresponding to the area of each moving object in the frame.
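A minimal sketch of the detection loop, assuming OpenCV background subtraction is used to find moving objects and measure their areas; the actual program may use a different method and parameters. It reuses the hypothetical `area_to_pitch` helper from the sketch above.

```python
import cv2

# Minimal sketch, assuming OpenCV background subtraction for motion detection.
cap = cv2.VideoCapture(0)                      # 0 = live camera, or pass a video path
subtractor = cv2.createBackgroundSubtractorMOG2()

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)             # foreground = moving pixels
    _, mask = cv2.threshold(mask, 127, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        area = cv2.contourArea(c)
        if area > 50:                          # ignore tiny blobs / noise
            note = area_to_pitch(area)         # map area to pitch (sketch above)
            # play or send `note` here, e.g. as a MIDI note-on message
    cv2.imshow("moving objects", mask)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
```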
To detect and add sound to objects:
1. Open music_v1
2. Open GarageBand on Mac and select "Software Instrument", then choose the desired instrumentation (see the MIDI sketch after these steps).
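Since GarageBand acts as the software instrument, one common way to drive it from a script on a Mac is to send MIDI messages through a virtual MIDI port (the IAC Driver, enabled in Audio MIDI Setup). The sketch below uses the `mido` library; the library choice, port name, and note duration are assumptions, not necessarily what this project uses.

```python
import time
import mido

# Hedged sketch: send a MIDI note to GarageBand via a virtual MIDI port.
# The port name "IAC Driver Bus 1" and the use of mido are assumptions.
out = mido.open_output("IAC Driver Bus 1")

def play_note(note, velocity=100, duration=0.2):
    """Send a short note-on/note-off pair for the given MIDI note number."""
    out.send(mido.Message("note_on", note=note, velocity=velocity))
    time.sleep(duration)
    out.send(mido.Message("note_off", note=note, velocity=0))

play_note(60)   # middle C as a quick test
```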
Example mp4 file.