Thesica.org, the #1 open access web portal for PhD theses...

Why PhD theses...

A PhD thesis is the result of years of hard work.

Measured by download count, PhD theses are among the most popular items worldwide in open access repositories. But unless a thesis is published, it is very difficult for other researchers to find out about it and gain access to it, so theses are often under-used by other researchers. Thesica.org addresses this issue by making it easy to identify and locate copies of theses in many disciplines.

ACTION – SOUND: Developing methods and tools to study music-related body movement

Alexander Refsum Jensenius

2007

Department of Musicology, University of Oslo, Oslo, Norway.

ABSTRACT

Body movement is integral to both the performance and perception of music, and this dissertation suggests that we also think about music as movement. Building on ideas of embodied music cognition, it is argued that ecological knowledge of action-sound couplings guides our experience of music, in both perception and performance. Then follows a taxonomy of music-related body movements, before various observation studies of perceivers' music-movement correspondences are presented: air instrument performance, free dance to music, and sound-tracing. These studies showed that novices and experts alike seem to associate various types of body movement with features in the musical sound. Knowledge from the observation studies was used in the exploration of artificial action-sound relationships through the development of various prototype music controllers, including the Cheapstick, music balls, and the Music Troll. This exploration showed that it is possible to create low-cost and human-friendly music controllers that may be both intuitive and creatively interesting. The last part of the dissertation presents tools and methods developed throughout the project, including the Musical Gestures Toolbox for the graphical programming environment Max/MSP/Jitter; techniques for creating motion history images and motiongrams of video material; and the Gesture Description Interchange Format (GDIF) for streaming and storing music-related movement data. These tools may be seen as an answer to many of the research questions posed in the dissertation, and have facilitated both the analysis of music-related movement and the creation of artificial action-sound relationships in the project.
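To illustrate the kind of video analysis the abstract mentions, the sketch below shows the standard motion-history-image update rule (pixels that changed recently are bright, older motion fades) together with a simple motiongram-style reduction that collapses each motion frame to a column. This is a minimal NumPy illustration of the general techniques, not code from the dissertation or the Musical Gestures Toolbox; the function names, the decay constant `tau`, and the difference `threshold` are all illustrative assumptions.

```python
import numpy as np

def update_mhi(mhi, prev_frame, frame, tau=30, threshold=15):
    """One step of a motion history image (illustrative, not from the thesis):
    pixels whose grayscale value changed by more than `threshold` are set to
    `tau`; all other pixels decay by 1 toward zero, so recent motion is
    brighter than old motion."""
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    moving = diff > threshold
    return np.where(moving, tau, np.maximum(mhi - 1, 0))

def motiongram(motion_frames):
    """Collapse each motion image to a single column by averaging across
    the horizontal axis, then stack the columns over time. The result is
    a 2-D image with time on the x-axis, in the spirit of a motiongram."""
    return np.stack([f.mean(axis=1) for f in motion_frames], axis=1)
```

A typical use would be to run `update_mhi` over consecutive grayscale video frames and then pass the resulting motion images to `motiongram` to get a compact time-versus-vertical-position overview of where movement occurred.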