~~NOTOC~~
====== Sonification of Gesture in Music ======

{{template>projects:summary
|title=Sonification of Gesture in Music|
|participants=[[people:Alexandre Savard]]\\ [[people:Vincent Verfaille]] (SPCL)\\ [[people:Oswald Quek]]\\ [[people:Marcelo M. Wanderley]]\\ [[people:R. Michael Winters]]|
|funding=|
|type= Thesis|
|period=2005|
|status=Completed.|
|image=projects:sonification:sonificationdesktop.png
|caption=The Sonification Desktop.
}}

===== Description =====

This thesis presents a multimodal sonification system that combines video with sound synthesis generated from motion capture data. Such a system allows for fast and efficient exploration of musicians' ancillary gesture data: sonification complements conventional video by stressing certain details that could escape one's attention if not displayed using an appropriate representation. The main objective of this project is to provide a research tool for people who are not necessarily familiar with signal processing or computer science, a tool capable of easily generating meaningful sonifications thanks to dedicated mapping strategies.

{{projects:sonification:sonificationdesktop.png?600|The Sonification Desktop}}

On the one hand, dimensionality reduction is fundamental for data obtained from motion capture systems such as the Vicon, which can produce more than 350 signals describing the gestures. For that reason, Principal Component Analysis is used to objectively reduce the number of signals to a subset that conveys the most significant gesture information in terms of signal variance. On the other hand, movement data show high variability across subjects: additional control parameters for sound synthesis are provided to restrict the sonification to significant gestures that are easily perceived visually in terms of speed and path distance. The following figure presents an example of control signals used to drive sound synthesis parameters (left/right knee angles) and their related principal components.

{{projects:sonification:leganglespca.jpg|Principal Components}}

Signal conditioning techniques are then proposed to adapt the control signals to the requirements of the sound synthesis parameters, or to emphasize certain gesture characteristics that one finds important. All of these data treatments are performed in real time within a single environment, minimizing data manipulation and facilitating efficient sonification design. Real-time processing also allows the system to respond instantaneously to parameter changes and process selection, so that the user can easily and interactively manipulate data and design and adjust sonification strategies.
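The general pipeline (reduce many motion-capture channels with PCA, then condition one component into a synthesis-parameter range) can be illustrated in a few lines of code. The sketch below is only an illustration of the approach and is not the Sonification Desktop implementation itself; the array shapes, smoothing coefficient, frequency range, and function names are assumptions chosen for the example.

<code python>
import numpy as np

def pca_reduce(signals, n_components=4):
    """Reduce (n_samples, n_channels) motion-capture data to the
    n_components directions carrying the most signal variance."""
    centered = signals - signals.mean(axis=0)
    # SVD of the centered data: rows of vt are the principal directions.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:n_components].T   # principal-component scores

def condition(component, out_min=200.0, out_max=2000.0, smooth=0.9):
    """Condition one component for synthesis control: one-pole smoothing
    to suppress jitter, then rescaling into a hypothetical parameter
    range (here an oscillator frequency in Hz)."""
    smoothed = np.empty_like(component)
    state = component[0]
    for i, x in enumerate(component):
        state = smooth * state + (1.0 - smooth) * x
        smoothed[i] = state
    lo, hi = smoothed.min(), smoothed.max()
    norm = (smoothed - lo) / (hi - lo + 1e-12)
    return out_min + norm * (out_max - out_min)

# Example: 350 mocap channels over 10 s at 100 Hz (placeholder data).
mocap = np.random.randn(1000, 350)
components = pca_reduce(mocap, n_components=4)
freq_track = condition(components[:, 0])    # control signal for synthesis
</code>

In the actual system these steps run in real time and are mapped to sound synthesis parameters interactively; the batch-style sketch above only shows the order of operations.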
----

===== Video =====

{{vimeo>32054289?680x400}}

===== Source Code and Use Examples =====

Source code and demonstration videos are kept on a public GitHub account [[https://github.com/IDMIL/SonificationDesktop|here]].

===== Examples of sonification =====

{{projects:sonification:sonification_example1.wav |Sonification example 1}}
{{projects:sonification:sonification_example2.wav |Sonification example 2}}
{{projects:sonification:sonification_example3.wav |Sonification example 3}}
\\

===== Publications =====

  * [[people:R. Michael Winters]], [[people:Alexandre Savard]], [[people:Vincent Verfaille]], and [[people:Marcelo M. Wanderley]]. [[http://airccse.org/journal/jma/4612ijma02.pdf|"A Sonification Tool for the Analysis of Large Databases of Expressive Gesture"]], //International Journal of Multimedia and Its Applications//, 4(6), 2012.
  * [[people:R. Michael Winters]] and [[people:Marcelo M. Wanderley]]. [[https://smartech.gatech.edu/handle/1853/44450|"New Directions for the Sonification of Expressive Movement in Music Performance"]], in //Proceedings of the International Conference on Auditory Display//, Atlanta, Georgia, June 18-22, 2012.
  * [[people:Vincent Verfaille]], [[people:Oswald Quek]], and [[people:Marcelo M. Wanderley]]. "Sonification of Musician's Ancillary Gestures", in //Proceedings of the 2006 International Conference on Auditory Display (ICAD'06)//, London, England, 2006, pp. 194-197.
  * [[people:Vincent Verfaille]] and [[people:Marcelo M. Wanderley]]. "Mapping Strategies for Sound Synthesis, Digital Audio Effects and Sonification of Performer Gestures", paper presented at the Acoustical Society of America Meeting, Rhode Island, June 2006.

==== Unpublished Reports ====

  * {{publications:2011:documentation_of_the_sonification_desktop.pdf|"Documentation of the Sonification Desktop"}}, [[people:R. Michael Winters]], IDMIL Report, December 2011.
  * {{publications:2011:literature_review_for_sonification_of_gesture.pdf|"Literature Review and New Directions for Sonification of Musicians' Ancillary Gestures"}}, [[people:R. Michael Winters]], IDMIL Report, September 2011.

{{tag>Sonification}}