Project title: Real-time Rhythmic Transformation of Polyphonic Audio by Gestural Control
Participants: Joseph Malloch, Jason Hockman
Time period: 2009–2010
This project combines techniques from several fields – music information retrieval (MIR), music theory, digital signal processing (DSP), human-computer interaction (HCI), and machine learning – to create a system for gesturally exploring alternate rhythmic possibilities of recorded polyphonic audio. The system automatically generates a rhythmic transformation space for mapping performer gesture to rhythm modification.
In “edit” mode, the positions of sample tracks can be manually adjusted. In “global” mode, the influence of the sample tracks on the output is linearly weighted according to their distance from the cursor. In “Gabriel” and “Delaunay” modes, the interpolation space is tessellated, so only the subset of sample tracks local to the cursor influences the generated rhythmic feel.
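The two families of weighting described above can be sketched roughly as follows. This is a hypothetical illustration rather than the project's actual code: the normalized inverse-distance scheme for “global” mode and the barycentric weights for a containing Delaunay triangle are assumptions, since the text only states that influence is weighted by distance or restricted to local tracks.

```python
import numpy as np

def global_weights(cursor, track_positions, eps=1e-9):
    """Weight every sample track by its distance to the cursor.

    Nearer tracks get larger weights, and weights are normalized to
    sum to 1.  (Inverse-distance weighting is an assumption; the text
    only says influence is weighted by distance from the cursor.)
    """
    d = np.linalg.norm(track_positions - cursor, axis=1)
    w = 1.0 / (d + eps)  # eps avoids division by zero at a track position
    return w / w.sum()

def barycentric_weights(cursor, triangle):
    """Weights for the three tracks at the vertices of the tessellation
    triangle containing the cursor; tracks outside it get zero influence."""
    a, b, c = triangle
    # Solve cursor = a + u*(b - a) + v*(c - a) for (u, v).
    T = np.column_stack((b - a, c - a))
    u, v = np.linalg.solve(T, cursor - a)
    return np.array([1.0 - u - v, u, v])
```

In the tessellated modes only the simplex under the cursor contributes, so moving the cursor changes the rhythmic feel locally instead of averaging over every sample track at once.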
The working prototype used a Polhemus Liberty six-degree-of-freedom (6-DOF) magnetic tracking device to track the position of the interactor’s hands. This position was mapped to control of the 2D interpolation space shown above. A phase vocoder (pv from the WaoN project) was used to resynthesize polyphonic target audio while adjusting the rhythmic feel in real time. For the initial prototype, example and target tracks must be manually annotated to identify beat and sub-beat timing; future work will address automatic analysis and annotation of this information.
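One way the manual beat annotations could drive the phase vocoder is by converting them into per-segment time-stretch ratios, as sketched below. The function name and the simple ratio-of-durations scheme are illustrative assumptions, not the prototype's documented interface.

```python
def stretch_ratios(source_beats, target_beats):
    """Per-segment time-stretch ratios from annotated beat times.

    Each inter-beat segment of the source audio is stretched so its
    duration matches the corresponding target segment; a ratio above 1
    means the phase vocoder lengthens that segment.
    """
    if len(source_beats) != len(target_beats):
        raise ValueError("annotations must have the same number of beats")
    ratios = []
    for i in range(len(source_beats) - 1):
        src = source_beats[i + 1] - source_beats[i]
        tgt = target_beats[i + 1] - target_beats[i]
        ratios.append(tgt / src)
    return ratios
```

For example, straightening a swung eighth-note pair annotated at a 2:1 duration split toward even timing gives ratios of 0.75 and 1.5: the long half is compressed and the short half is stretched.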