Department of Musicology University of Oslo, Oslo, NORWAY.
Body movement is integral to both the performance and perception of music, and this dissertation suggests that we also think about music as movement. Building on ideas from embodied music cognition, it is argued that ecological knowledge of action-sound couplings guides our experience of music, in both perception and performance. A taxonomy of music-related body movements follows, before several observation studies of perceivers' music-movement correspondences are presented: air instrument performance, free dance to music, and sound-tracing. These studies showed that novices and experts alike seem to associate various types of body movement with features in the musical sound. Knowledge from the observation studies was used in the exploration of artificial action-sound relationships through the development of various prototype music controllers, including the Cheapstick, music balls, and the Music Troll. This exploration showed that it is possible to create low-cost, human-friendly music controllers that may be both intuitive and creatively interesting. The last part of the dissertation presents tools and methods developed throughout the project, including the Musical Gestures Toolbox for the graphical programming environment Max/MSP/Jitter; techniques for creating motion history images and motiongrams from video material; and the Gesture Description Interchange Format (GDIF) for streaming and storing music-related movement data. These tools may be seen as an answer to many of the research questions posed in the dissertation, and have facilitated both the analysis of music-related movement and the creation of artificial action-sound relationships in the project.