Posted on 03 December 2012
Part 5 in a series of videos recorded from ACM MIRUM 2012 in Nara, Japan.
When performing music, a musician uses physical gestures to translate a musical score into sound. Yet, unlike the score domain and the sound domain, this "gesture domain" is often neglected in music information retrieval research.
Alfonso Perez Carrillo presents a method to identify a violinist's physical gestures using the sound produced by the violin. First, spectral features such as low-level frequency-band information and perceptual descriptors are extracted from the audio signal. Next, instrumental gesture controls such as finger position, bow tilt, and bow velocity are captured with the help of small sensors. Finally, a multi-layer perceptron is trained on this data to predict the gesture controls from the audio features.
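The pipeline above amounts to supervised regression from per-frame audio features to sensor-measured gesture parameters. Here is a minimal sketch of that idea using synthetic data and scikit-learn's `MLPRegressor`; the feature count, the three gesture targets, and the network size are illustrative assumptions, not the actual configuration from the talk.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Stand-ins for spectral features extracted per audio frame
# (e.g. energies in several frequency bands plus perceptual descriptors).
n_frames, n_features = 500, 12
X = rng.normal(size=(n_frames, n_features))

# Stand-ins for sensor-measured gesture controls per frame:
# [finger_position, bow_tilt, bow_velocity]. Here they are a
# synthetic function of the features, just to make the sketch runnable.
true_weights = rng.normal(size=(n_features, 3))
y = X @ true_weights + 0.05 * rng.normal(size=(n_frames, 3))

# Normalize the features, then fit a multi-layer perceptron that maps
# audio features to the three gesture parameters.
scaler = StandardScaler().fit(X)
mlp = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000,
                   random_state=0).fit(scaler.transform(X), y)

# At run time, gestures are estimated from audio features alone.
pred = mlp.predict(scaler.transform(X[:5]))
print(pred.shape)  # one 3-value gesture estimate per frame
```

In the real system the sensors are only needed to collect training data; once the network is trained, gestures can be estimated from the violin sound alone.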