NIME 2017 paper with Ricky Graham et al. – Exploring Pitch and Timbre through 3D Spaces: Embodied Models in Virtual Reality as a Basis for Performance Systems Design

Just a quick preview of some stuff that’s coming up in our NIME 2017 paper (with Ricky Graham and Christopher Manzione of Stevens Institute of Technology and William Brent of American University).

Exploring Pitch and Timbre through 3D Spaces: Embodied Models in Virtual Reality as a Basis for Performance Systems Design

ABSTRACT

Our paper builds on an ongoing collaboration between theorists and practitioners within the computer music community, with a specific focus on three-dimensional environments as an incubator for performance systems design. In particular, we are concerned with providing accessible means of controlling spatialization and timbral shaping in an integrated manner, based on performance data gathered across various modalities from an electric guitar with a multichannel audio output. This paper focuses specifically on the combination of pitch data treated within tonal models and the detection of physical performance gestures using timbral feature extraction algorithms. We discuss how these tracked gestures may be connected to concepts and dynamic relationships from embodied cognition, expanding on performative models for pitch and timbre spaces. Finally, we explore how these ideas support connections between sonic, formal, and performative dimensions, including instrumental technique detection scenes and mapping strategies aimed at bridging musical performance gestures across physical and conceptual planes.


Keywords

Gesture, embodied, schemas, mapping, metaphor, spatialization, timbre, feature, tracking.


Further details

The paper uses Unity to explore embodied ideas within musical gestures (including the use of Brent’s timbreID feature-extraction library for Pure Data). I contributed some ideas on embodied structures that can be used in mapping.
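By way of illustration (this is not code from the paper): timbreID itself is a set of Pure Data externals, but the kind of low-level feature it extracts is easy to sketch. Below is a minimal Python/NumPy example of one such feature, the spectral centroid, which figures in the mapping dynamics later in this post. The function name and frame handling are my own assumptions.

```python
import numpy as np

def spectral_centroid(frame, sample_rate):
    """Spectral centroid of one audio frame: the magnitude-weighted
    mean frequency, a common correlate of perceived brightness."""
    window = np.hanning(len(frame))                 # taper to reduce spectral leakage
    magnitudes = np.abs(np.fft.rfft(frame * window))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)
    if magnitudes.sum() == 0.0:                     # silent frame: avoid divide-by-zero
        return 0.0
    return float(np.sum(freqs * magnitudes) / np.sum(magnitudes))

# Sanity check: a 440 Hz sine should yield a centroid near 440 Hz,
# while brighter (noisier) material pushes the centroid upward.
sr = 44100
t = np.arange(2048) / sr
print(spectral_centroid(np.sin(2 * np.pi * 440 * t), sr))
```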


Figure 3. An embodied timbre space: concepts from Smalley (1997) and Johnson (2008), after Bridges and Graham (2015).


Embodied Dynamics – Potential Mapping Strategies

Dynamic 1: Temporal Synchronicity of Attack Envelopes (X axis). Ranges from motion launching (rapid dynamic change; more synchronous entry of partials) to gradual contour energy (asynchronous entry of partials).

Dynamic 2: Spectral Energy Distribution – Height vs. Rootedness (Y axis). The spectral centroid gives us two scales and dynamics: contour energy (verticality schema: pitch height) and an associated motion rootedness (regions of stability).

Dynamic 3: Spatial Clarity within Individual Sound Sources (Z axis). The presence or absence of attack transients articulates a range from motion rootedness or tension (audible transient products of inertia) to ungrounded events (diffuse or sustained tones); this relates to a diffuse-to-point-source spatial coverage schema.

Table 3. Mapping Strategies based on Embodied Dynamics
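To make Table 3 concrete, here is a hypothetical sketch (not the paper’s implementation) of how the three embodied dynamics might place a sound event in a 3D space. The input feature names, the centroid range, and the log-frequency scaling are all illustrative assumptions on my part.

```python
import numpy as np

def features_to_position(attack_synchronicity, spectral_centroid_hz,
                         transient_strength, centroid_range=(80.0, 8000.0)):
    """Map three timbral features onto a point in a 3D timbre space,
    loosely following the X/Y/Z dynamics of Table 3.

    attack_synchronicity: 0 (asynchronous partial entries) .. 1 (synchronous attack)
    spectral_centroid_hz: spectral centroid in Hz
    transient_strength:   0 (diffuse, sustained) .. 1 (clear attack transient)
    """
    # X: temporal synchronicity of attack envelopes (Dynamic 1).
    x = attack_synchronicity
    # Y: spectral energy distribution (Dynamic 2), on a log-frequency
    # scale so that "height vs. rootedness" varies roughly perceptually.
    lo, hi = np.log(centroid_range[0]), np.log(centroid_range[1])
    y = (np.log(np.clip(spectral_centroid_hz, *centroid_range)) - lo) / (hi - lo)
    # Z: spatial clarity, from diffuse coverage to point source (Dynamic 3).
    z = transient_strength
    return np.array([x, y, z])

print(features_to_position(0.9, 2500.0, 0.8))  # a bright, percussive event
```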


Gesture → Sonic Affordance
Short, repetitive movements (cycles, if complete repetition) → Detached individual sound events; cycle-loops.
Expansive gesture → Clear path or projection outward (versus inertia).
Less expansive gesture, then small gesture again → Weak projection (inertial) and chaotic path.

Table 4. Using Bodily Gestures to Drive Parametric Change
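Again purely as a sketch: Table 4’s pairings could be realised as a small rule-based mapping from tracked gesture descriptors to synthesis and spatialization parameters. The descriptor names, thresholds, and parameter dictionary below are illustrative assumptions, not values from the paper.

```python
def gesture_affordance(extent, repetition_rate):
    """Toy rule-based mapping for Table 4's gesture-to-affordance pairings.

    extent:          normalised gesture size, 0 (small) .. 1 (expansive)
    repetition_rate: repetitions per second of the tracked movement
    """
    if repetition_rate > 2.0 and extent < 0.3:
        # Short, repetitive movements -> detached events, cycle-loops.
        return {"event_mode": "detached", "loop": True, "projection": 0.2}
    if extent > 0.7:
        # Expansive gesture -> clear path or projection outward.
        return {"event_mode": "continuous", "loop": False, "projection": 1.0}
    # Less expansive / small gestures -> weak, inertial projection.
    return {"event_mode": "continuous", "loop": False, "projection": 0.3}

print(gesture_affordance(extent=0.1, repetition_rate=4.0))
```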


Ricky and Christopher have made some great demo videos of how performance gestures can be creatively mapped within a 3D/VR space.

I’m only sorry I won’t be able to be at the conference in Copenhagen in person!