There is a vast literature of studies analyzing the spontaneous rendering of sonic parameters as movement. The fundamental question of these studies has been to elucidate what kinds of gestures listeners and performers associate with various musical sounds. Typically, participants are asked to draw contours or to move while listening to short excerpts of music, with the intention of capturing instantaneous multimodal associations with sounds or music. The results can be used to inspire intuitive interactions and musical controllers. However, we are not aware of any previous study focusing on the production of physical artifacts envisioned through similar methods. For Tanaka, the cognitive mappings enacted in these types of studies are always informed, mediated, and inspired by the actual materiality of the controller used (i.e., its size, material, shape, acoustic properties, etc.). Therefore, in this paper, we present a study analyzing how people materialize their own sound-producing gestures as physical features in musical interfaces.
According to Clarke, we all have some ecological knowledge of how sound-producing actions relate to sound. The hypothesis that sound gestures identified in music reflect sound-producing gestures was supported by Henbing and Leman. Godoy and Nymoen have extensively studied the phenomenon of 'sound-tracing', understood as the rendering of perceptual sound features through body motion. These authors described a number of music-related action trajectories based on a few features: pitch, dynamics, timbre, and texture. With similar intentions, Caramiaux et al. carried out a quantitative analysis of movement demonstrating that sound energy is correlated with the velocity of movement, and pitch with the vertical position of the hand.