Other Research

OpenMusic

Tools for Spectral Manipulation

To facilitate the extraction and manipulation of spectral materials for use in my doctoral thesis composition Elucide, I developed a set of software tools in the graphical programming language OpenMusic.  These tools provide the composer with musical representations of the pitch and amplitude data contained in a spectral analysis (derived from SDIF files produced in AudioSculpt), and enable them to navigate this information intuitively and select the materials most relevant to the composition.  Once the source materials have been selected, the composer can filter them, project them in time, or otherwise manipulate them using a modular set of patches that provide musical feedback for aesthetic control throughout the process.  A more complete description of these patches can be found here.
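To give a sense of the kind of selection these tools perform, the following sketch (written in Python rather than OpenMusic, with a made-up list of partials standing in for real SDIF analysis data) filters spectral peaks by amplitude and converts the strongest partials into MIDI pitch values:

```python
import math

def freq_to_midi(freq_hz):
    """Convert a frequency in Hz to a (fractional) MIDI note number."""
    return 69.0 + 12.0 * math.log2(freq_hz / 440.0)

def strongest_partials(partials, count, amp_floor=0.0):
    """Discard partials below an amplitude floor, then return the `count`
    loudest ones as (midi_pitch, amplitude) pairs, sorted by pitch."""
    audible = [p for p in partials if p[1] >= amp_floor]
    loudest = sorted(audible, key=lambda p: p[1], reverse=True)[:count]
    return sorted((freq_to_midi(f), a) for f, a in loudest)

# Hypothetical analysis frame: (frequency in Hz, linear amplitude) pairs.
frame = [(220.0, 0.9), (440.0, 0.7), (662.0, 0.4),
         (881.0, 0.1), (1105.0, 0.02)]
print(strongest_partials(frame, 3, amp_floor=0.05))
```

The same threshold-then-rank logic could be applied per analysis frame to thin a dense spectrum down to a playable chord.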



Gestural Controllers

I have used a number of different gestural interfaces to control electronic musical instruments.

Wacom tablet: I have found this to be a versatile and expressive interface, and have used it to control electronic sounds for a number of pieces.


Video tracking: I have used colour-tracking to create an SVP-based instrument, and am currently extending this concept to an instrument based on the Leap Motion controller.




P5 3D Glove: As part of the GRASSP project at UBC (with Dr. Robert Pritchard), I developed an instrument for the manipulation of spectral materials that uses this gaming glove. I used it in my piece from that which could, and presented it in poster sessions at ICMC ’07 and SMC ’07, entitled “Fractured Sounds, Fractured Meanings: A glove-controlled spectral instrument”.
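The tracking-based instruments above share a common pattern: raw sensor data is reduced to a few normalized control values that can then drive synthesis or spectral processing. As a toy illustration of colour tracking (plain Python standing in for a real computer-vision library, with an invented frame format), this sketch finds the normalized centroid of the pixels matching a target colour:

```python
def track_colour(frame, target, tol):
    """frame: 2-D list of (r, g, b) pixel tuples.  Return the normalized
    (x, y) centroid of all pixels within `tol` of `target` on every
    channel, or None if no pixel matches."""
    xs, ys, n = 0.0, 0.0, 0
    for y, row in enumerate(frame):
        for x, px in enumerate(row):
            if all(abs(c - t) <= tol for c, t in zip(px, target)):
                xs += x
                ys += y
                n += 1
    if n == 0:
        return None
    h, w = len(frame), len(frame[0])
    return (xs / n / (w - 1), ys / n / (h - 1))

# An 8x6 black frame with one "red" pixel at column 5, row 2.
frame = [[(0, 0, 0) for _ in range(8)] for _ in range(6)]
frame[2][5] = (255, 40, 40)
print(track_colour(frame, (255, 40, 40), tol=16))  # one match -> (5/7, 2/5)
```

In an instrument, the two returned values might be mapped to parameters such as filter cutoff and gain; a production system would of course use optimized vision routines rather than per-pixel Python loops.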




NoteAbilityPro/Max Score Following

This project centers on the development of an environment for interactive composition and performance, integrating Dr. Keith Hamel’s notation software NoteAbilityPro with Max/MSP-based score following and responsive sound generation.  I presented a paper (co-authored with Dr. Hamel) at ICMC ’07 entitled “A Score-Based Interface for Interactive Computer Music”, detailing the operation of the environment.  In live performance, the system follows the musicians and triggers soundfiles, sends MIDI messages, and controls complex processes at specified score locations, regardless of the performer’s tempo fluctuations.

Extensions to NoteAbilityPro (forming the Integrated Interactive Music Performing Environment, or IIMPE) allow a NoteAbilityPro score to send control messages to 16 different IP addresses and ports during score playback, while 8 ports are available for receiving network messages from the connected applications.  Modules created in Max/MSP receive the messages sent by NoteAbilityPro and can in turn control the score playback.  My involvement in the project has centered on the implementation of IRCAM’s suivi.score and antescofo objects for score following.  In this configuration, a score-following patch tracks the pitches played by the performer and sends score location and tempo information to NoteAbilityPro, which in turn adjusts its playback tempo to align with the performer.
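The tempo-alignment step can be sketched as a simple control calculation: given the score follower’s latest report of the performer’s position and tempo, the playback tempo is nudged so that playback converges on the performer. The function below is an illustrative sketch under assumed names and a simple proportional correction, not the actual IIMPE implementation:

```python
def adjusted_tempo(performer_tempo, performer_beat, playback_beat,
                   catch_up_gain=0.5):
    """Return a playback tempo (BPM) that tracks the performer's tempo
    while steering playback toward the performer's score position.
    `catch_up_gain` (hypothetical) scales how aggressively the position
    error, measured in beats, is converted into a tempo offset."""
    position_error = performer_beat - playback_beat  # + if playback lags
    tempo = performer_tempo + catch_up_gain * position_error * performer_tempo
    return max(tempo, 0.0)  # never run playback backwards

# Performer is at beat 16.5 playing at 72 BPM; playback is at beat 16.0,
# so playback speeds up to close the half-beat gap.
print(adjusted_tempo(72.0, 16.5, 16.0))  # -> 90.0
```

Re-running this calculation on every score-follower report yields playback that smoothly absorbs the performer’s tempo fluctuations rather than jumping between positions.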

A video of a portion of my piece Conduits for clarinet and electronics demonstrates this system.  More information on this project can be found here.  This project received support from SSHRC (the Social Sciences and Humanities Research Council of Canada).



© David Litke 2014