This project studies the relationship between melodic contour perception and body motion. Bodily responses to rhythmic content are fairly well understood in terms of physical motion, entrainment, and synchronization. For pitch material, several models of embodiment have been suggested, both as conceptual spaces and as physical motion. In this project, we take melody as the central unit and analyse gestural responses to melodic contour. The project includes a study of theoretical rules for well-formed melodies from various cultures, and aims to develop methods for computational analysis of motion capture data for sound tracings. Exploring the relationships of these data to contours and contour typologies, together with the interaction of melodic contour with sound and timbre features, also forms part of this project.
The aim of this dissertation is to understand the role of embodiment in melodic contour perception. In other words, it studies how we move our bodies in response to music. Melodies play an important role in both speech and music. This thesis consists of two parts, the first being a background section discussing the theoretical motivations and methods used. The second part is a collection of four articles. Each of the articles explores a dimension of melodic contour: verticality, motion metaphors, body use, and multi-feature correlational analysis. The empirical work is based on sound-tracing as an experimental paradigm. This brings together the multimodal mappings of pitched sound, the gestural imagery these sounds evoke, and the geometries that define these contours. Two sound-tracing experiments were conducted, resulting in three datasets that have been used in the analyses. In the experiments, participants listened to 16 melodies from four different genres: operatic vocalise, jazz scatting, North-Indian singing, and Sámi joik. The participants listened to each melody twice, the first time standing still, and the second time "drawing" the sounds in the air. Infrared motion capture was used to record the participants' body movement, and the analysis focuses primarily on the movement of their hands. The sound analysis is based on signal processing algorithms for pitch detection and methods for contour representation. Cross-correlation of the data is performed using a range of methods, from statistical hypothesis testing to canonical correlation analysis. The analysis reveals that although there is a natural propensity to describe pitches in terms of the vertical dimension, the experimental data do not clearly show such an association. Average profiles of movement responses to melodies have an arch-like shape, regardless of the contour of the melody.
Spatial height is associated with relative pitch within the traced melodic context, rather than with an absolute pitch scale. In addition, participants often used movements that represented the sound metaphorically.
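The cross-correlation step mentioned above can be illustrated with a minimal sketch: correlating a pitch contour (on a log-frequency scale) with the vertical position of a tracing hand, both resampled to the same length. The function name and toy signals here are illustrative assumptions, not the actual pipeline used in the articles.

```python
import numpy as np

def normalized_xcorr(pitch, height):
    """Zero-lag normalized correlation between a pitch contour and
    vertical hand position; both arrays must have equal length."""
    p = (pitch - pitch.mean()) / pitch.std()
    h = (height - height.mean()) / height.std()
    return float(np.dot(p, h) / len(p))

# Toy example: an ascending octave glide traced with a rising hand.
t = np.linspace(0, 1, 100)
pitch = 220 * 2 ** t           # one octave up from 220 Hz
height = 1.0 + 0.5 * t         # hand rises 0.5 m over the melody
print(round(normalized_xcorr(np.log2(pitch), height), 3))  # → 1.0
```

A value near 1 would indicate a tracing that follows pitch height literally; the arch-like average profiles reported above correspond to much weaker correlations for descending contours.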
Past Projects > Raga DB
A raag is a melodic structure with grammatical rules for improvised phrases. Raags define tonal relationships between various notes. There are hundreds of raags, each with unique descriptors. In this paper, we visualize the tonal spaces of raags by creating a graph with a force-directed layout, and propose a mapping of colour onto this tonal space. We derive the graph for the visualization by parameterizing raags as described in the theory of Hindustani Classical Music (HCM). We compare a radial layout of these tonal spaces to a colour harmony profile and explain some cross-raag relationships using the methods used to derive colour schemes. We discuss the affective implications and empirical verification of this model.
This model has potential applications in sonification and tone-colour mapping. The implications of such a layout for tonal music are also useful for deriving implicit harmonic relationships. The graphical relationships of raags can be accessed here:
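The force-directed layout described above can be sketched with a standard graph library. The note names and edge weights below are hypothetical placeholders, not values from the Raga DB; they only show how weighted edges from the tonic shape the layout.

```python
import networkx as nx

# Hypothetical tonal weights for one raag: edges connect the tonic (Sa)
# to each scale degree, weighted by its assumed prominence in the raag.
raag_notes = {"Re": 0.6, "Ga": 0.9, "Ma#": 0.5, "Pa": 0.8, "Dha": 0.6, "Ni": 0.9}

G = nx.Graph()
for note, weight in raag_notes.items():
    G.add_edge("Sa", note, weight=weight)

# spring_layout implements a force-directed algorithm: heavier edges
# pull more prominent notes closer to the tonic in the 2-D tonal space.
pos = nx.spring_layout(G, weight="weight", seed=42)
print(sorted(pos))  # node names, each mapped to 2-D coordinates
```

Hue or saturation could then be assigned as a function of each node's position, which is the kind of colour-to-tonal-space mapping the paper proposes.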
Music and Gesture in Hindustani Classical Vocal Performance
My thesis investigates the relationships between the visual and auditory modalities through the medium of North Indian Classical Music. Intermodal binding and unitization can be studied via natural body gestures and hand movements in improvised music. In this thesis, we present experiments that analyze the dependence relationships between musical parameters and body gestures, using data from pitch analysis, motion capture, and video processing. We provide evidence for a cognitive model of music-motion binding. We investigate the typology and categorization of these gestures, and the perception of monodic musical shapes among musicians and non-musicians. We also implement models that convert these gestures back into sound, using rule-based grammars derived from raga models.
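One way to picture the gesture-to-sound conversion is as a quantizer: a continuous hand height is snapped to the nearest note permitted by a raag's scale. The degree set and mapping below are illustrative assumptions, not the grammar implemented in the thesis.

```python
import numpy as np

# Hypothetical permitted scale degrees (semitones above the tonic Sa)
# for an illustrative raag; a real grammar would also encode phrase rules.
PERMITTED_DEGREES = np.array([0, 2, 4, 6, 7, 9, 11, 12])

def height_to_semitone(height, span=12):
    """Map a normalized hand height in [0, 1] onto the nearest
    permitted semitone within a one-octave span."""
    target = height * span
    idx = np.argmin(np.abs(PERMITTED_DEGREES - target))
    return int(PERMITTED_DEGREES[idx])

print([height_to_semitone(h) for h in (0.0, 0.3, 0.55, 1.0)])  # → [0, 4, 7, 12]
```

A real system would apply such a mapping frame by frame to motion capture data and pass the result through the raag's phrase rules before synthesis.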