Workshop Abstracts

 

Synchronized Multimodal Recording and Analysis of Violin Performance with Motion Capture Systems

 

Ksenia Kolykhalova
Erica Volta
Simone Ghisio
Corrado Canepa
Gualtiero Volpe

Casa Paganini - InfoMus
DIBRIS - University of Genova, Italy

 

 

Abstract

    Learning to play a musical instrument (either traditional or new) encompasses complex motor tasks that require long and accurate training. Multimodal interactive systems can offer students guidance during self-study and can help teachers and students focus on details that would otherwise be difficult to appreciate. This contribution introduces a multimodal corpus consisting of recordings of expert models of success, provided by four professional violin performers. The corpus includes audio, video, motion capture, and physiological (EMG) data. The initial analyses carried out on the corpus are introduced as well.
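
    A minimal sketch, assuming the corpus streams are available as NumPy arrays with known sample rates, of how recordings captured at different rates (e.g. motion capture and EMG) can be aligned onto a common analysis timeline; the rates, durations, and variable names below are illustrative assumptions, not properties of the actual corpus.

    import numpy as np

    def resample_to_timeline(signal, rate, timeline):
        # Linearly interpolate a 1-D signal sampled at `rate` Hz
        # onto the target timeline (in seconds).
        t = np.arange(len(signal)) / rate
        return np.interp(timeline, t, signal)

    # Illustrative example: 100 Hz mocap marker height and 1000 Hz EMG
    # envelope, both mapped onto a shared 250 Hz analysis timeline.
    mocap_y = np.random.rand(500)     # 5 s of mocap data at 100 Hz
    emg_env = np.random.rand(5000)    # 5 s of EMG envelope at 1000 Hz
    timeline = np.arange(0, 5, 1 / 250)

    mocap_aligned = resample_to_timeline(mocap_y, 100, timeline)
    emg_aligned = resample_to_timeline(emg_env, 1000, timeline)
    assert mocap_aligned.shape == emg_aligned.shape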

The PDF version of the full abstract is available here.

 

 

 

Acquisition of Human Body Kinematic and Dynamic Features in Violin Performances with Kinect

 

Pablo Fernandez Blanco
Alfonso Perez-Carrillo
Mario Ceresa

Music Technology Group
Symbiosis Group
Universitat Pompeu Fabra
Barcelona, Spain

 

 

 

Abstract

    Playing a musical instrument is a highly complex activity that requires a combination of mental and sensorimotor skills acquired over a long learning trajectory. Traditional music performance pedagogy is mostly based on imitation and on feedback delivered through semantic metaphors and imagery that attempt to explain an acoustic quality or a physiological aspect of the performance, such as posture or the feeling of movement. Such an approach rests on subjective and vague perception that, in many cases, results in frustrating lack of progress, chronic problems, forced breaks from performing for extended periods, or even career-ending injury.


    The long-term aim of this research is to address the scientific and technological challenges around the analysis and modeling of human motion in musical practice as a sensorimotor activity; it involves the acquisition, description, storage, and analysis of datasets of motion and audio data captured during real performances. More specifically, this work deals with the measurement and description of the kinematics and dynamics of upper-body motion during violin playing. A model of expert performers is built as a reference against which amateurs can be compared and automatically evaluated.


    Human body motion is typically measured with camera-based systems or with wearable inertial sensors (e.g. accelerometers, gyroscopes) attached to the body. Both techniques, combined with biomechanical models, can be used to extract features such as joint angles, posture, and joint rotation speeds, among many others. These features could support improved performance coordination and reduce the risk of injury through the measurement of, and feedback on, prolonged joint extension and angular rotation intensity. In this work, we aim to develop low-cost implementations using off-the-shelf technology, namely RGB-D cameras, to make the system available to a large potential user group. In general, the accuracy of low-cost systems is low; however, with the advent of consumer-level 3D cameras such as the Microsoft Kinect, body motion tracking becomes feasible.
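
    To make the feature extraction concrete, here is a minimal sketch (our illustration, not the authors' implementation) of computing a joint angle from three 3-D joint positions of the kind a Kinect-style skeleton tracker returns; the joint coordinates are made-up values.

    import numpy as np

    def joint_angle(a, b, c):
        # Angle at joint `b` (in degrees) between segments b->a and b->c.
        u = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
        v = np.asarray(c, dtype=float) - np.asarray(b, dtype=float)
        cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
        return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

    # Made-up shoulder, elbow, and wrist positions (metres, camera frame)
    shoulder = [0.10, 0.40, 2.00]
    elbow = [0.25, 0.15, 1.95]
    wrist = [0.05, 0.00, 1.80]
    print(joint_angle(shoulder, elbow, wrist))  # elbow flexion angle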


The PDF version of the full abstract is available here.

 

 

Studying Movement Coarticulation in Drumset Performance

 

Sofia Dahl

Aalborg University
Copenhagen, Denmark

Minho Song
Rolf Inge Godøy

Department of Musicology
University of Oslo, Norway

 

 

 

Abstract

    So far, studies of drumming movements have primarily focused on isolated tasks, typically played on single instruments or force plates [1]. Outside laboratory settings, however, any typical setup of instruments in the drum set requires the drummer to move feet, hands, arms, shoulders, and often the torso or even the whole body in order to play the musically interesting textural patterns so often found in various grooves and, in particular, in various fills. Coarticulation [2] then comes into play, in both the preparatory motion of the effectors prior to any beater-instrument impact and the motion following such an impact on the way to the next point of impact. 'Coarticulation' may be defined as the fusion of otherwise distinct motion and sound events into more superordinate sound-motion events, e.g. into various patterns, grooves, or gestalts. Because instantaneous body motion is not possible, and because the effectors (e.g. beater/hands, arms, shoulders, and even the torso) need to move ahead of the actual impact with a drum or cymbal, the result is what we call a contextual smearing of otherwise discrete excitatory motion.


    We present our approach to studying coarticulation in drumset performance, with the objective of finding relationships between the singular stick-instrument impact points (i.e. the sound onsets) and the coarticulatory context of the entire effector apparatus (from fingers to whole body) of the drummer. This necessitates plotting the motion trajectories at different points of this effector apparatus, as well as the velocity and acceleration profiles of these points, all in relation to the singular stick-instrument impact points. From these figures and the associated calculations of derivatives, we believe it is possible to obtain a better image of the salient sound-motion features that we may experience as the hallmarks of great drumset performances.
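
    As an illustration of this kind of analysis (a sketch under assumed data formats, not the authors' code), the snippet below computes velocity and acceleration profiles of a motion-capture trajectory by numerical differentiation and extracts fixed windows around each impact time; the frame rate, window lengths, and impact times are assumptions.

    import numpy as np

    RATE = 240  # assumed mocap frame rate in Hz

    def derivatives(position, rate=RATE):
        # First and second time derivatives of an (N, 3) trajectory.
        velocity = np.gradient(position, 1 / rate, axis=0)
        acceleration = np.gradient(velocity, 1 / rate, axis=0)
        return velocity, acceleration

    def windows_around(signal, impact_times, rate=RATE, pre=0.2, post=0.2):
        # Segments of a 1-D signal around each impact time (in seconds).
        n_pre, n_post = int(pre * rate), int(post * rate)
        segments = []
        for t in impact_times:
            i = int(t * rate)
            if n_pre <= i < len(signal) - n_post:
                segments.append(signal[i - n_pre : i + n_post])
        return np.stack(segments)

    # Synthetic example: 10 s trajectory, impact times from onset detection
    traj = np.cumsum(np.random.randn(10 * RATE, 3), axis=0)
    vel, acc = derivatives(traj)
    speed = np.linalg.norm(vel, axis=1)
    profiles = windows_around(speed, impact_times=[1.0, 2.5, 4.0])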


The PDF version of the full abstract is available here.

 

A Virtual Violin System Based on Gesture Capture using EMG and Position Sensors

 

David Dalmazzo
Rafael Ramirez

Music and Machine Learning Lab
Universitat Pompeu Fabra
Barcelona, Spain

 

 

 

Abstract

    We apply sensor technology to the automatic detection and analysis of gestures during music performances. Taking the violin as a case study, we measure both right-hand (i.e. bowing) and left-hand (i.e. fingering) gestures in violin performances. Based on these data, we create a first model of an interactive music instrument, a virtual violin, that gives novice users musical feedback on their gestural performance by producing sound output directly mapped from motion data, and that learns from user gestures to adapt the sound manipulation.
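
    As a purely hypothetical sketch of such a direct mapping (the abstract does not describe the actual model), the snippet below maps bow velocity, an EMG amplitude envelope, and a finger position to amplitude and frequency parameters; all names, ranges, and the mapping itself are assumptions for illustration.

    def map_gesture_to_sound(bow_velocity, emg_amplitude, finger_position):
        # bow_velocity: bow speed in m/s (from the position sensor)
        # emg_amplitude: normalised forearm EMG envelope in [0, 1]
        # finger_position: normalised stop position along the string in [0, 1]
        # Louder with faster bowing, scaled by muscle activation
        # (a crude proxy for bow pressure)
        amplitude = min(abs(bow_velocity) * (0.5 + 0.5 * emg_amplitude), 1.0)
        # Shorter vibrating string -> higher pitch (ideal-string model),
        # with the open G string (196 Hz) as reference
        vibrating_fraction = max(1.0 - finger_position, 0.25)
        frequency = 196.0 / vibrating_fraction
        return amplitude, frequency

    print(map_gesture_to_sound(bow_velocity=0.4, emg_amplitude=0.7,
                               finger_position=0.2))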


The PDF version of the full abstract is available here.