We recently posted a teaser video for our project “Instrumented Bodies: Digital Prostheses for Music and Dance Performance” — enjoy!
I made this video today to quickly document some work I’ve been doing using multiple inertial measurement units (IMUs) to track the orientation, bending, and twisting of deformable objects. Inertial measurement units generally contain 3-axis accelerometers, gyroscopes, and magnetometers – their data can be cleverly combined to obtain an estimate of orientation (as long as the IMU is used in an environment with constant gravity and magnetic field, and is not in free-fall).
This particular object has two Mongoose IMUs embedded inside it, running custom firmware I wrote that performs complementary filtering/sensor fusion to estimate orientation.
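The actual Mongoose firmware isn’t shown here, but the core idea of a complementary filter is simple enough to sketch. Below is a minimal single-axis (pitch) version in Python rather than embedded C: the gyroscope is integrated for short-term accuracy, the accelerometer provides a drift-free (but noisy) absolute tilt reference, and a blending constant `alpha` trades one off against the other. The function name and `alpha` value are illustrative, not taken from the firmware.

```python
import numpy as np

def complementary_filter_step(pitch_prev, gyro_rate, accel, dt, alpha=0.98):
    """One complementary-filter update for a single axis (pitch, in degrees).

    pitch_prev: previous pitch estimate (deg)
    gyro_rate:  angular rate about the pitch axis (deg/s)
    accel:      (ax, ay, az) accelerometer reading (any consistent units)
    dt:         time step (s)
    alpha:      blend factor; closer to 1 trusts the gyro more
    """
    # Integrate the gyro rate: accurate over short times, but drifts
    pitch_gyro = pitch_prev + gyro_rate * dt
    # Tilt from the gravity vector: noisy, but has no long-term drift
    ax, ay, az = accel
    pitch_accel = np.degrees(np.arctan2(-ax, np.hypot(ay, az)))
    # High-pass the gyro path, low-pass the accel path, and sum
    return alpha * pitch_gyro + (1 - alpha) * pitch_accel
```

With the sensor level and stationary (`accel = (0, 0, 1)`, zero gyro rate), the estimate stays at zero; a stale gyro-integrated estimate is slowly pulled back toward the accelerometer’s reference at a rate set by `1 - alpha`.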
An excellent textbook co-authored by my PhD advisor Marcelo Wanderley, which documents developments in the field of new digital musical instruments (gestural controllers / new interfaces for musical expression) while placing them in the field’s historical context.
Authors: Eduardo Reck Miranda (University of Plymouth) and Marcelo M. Wanderley (McGill University)
Who should read it: musicians, instrument designers, music technology researchers.
The SenseStage workshop is meant to bring together people from different disciplines (dance, theatre, sound, video, light) to cooperate in a collaborative environment with interactive technologies. The workshop will take place in the Hexagram BlackBox, a special configurable room equipped with a full set of theatre lights. A multichannel sound setup and video equipment will also be available. Last but not least, a set of sensor devices will be available for use during the workshop.
After quite a lot of fiddling around, I finally finished porting the SensorWiki.org website from its old MediaWiki version to the new DokuWiki version. This afternoon we flipped the switch, and it’s working almost perfectly (except for some new-user registration issues to be worked out later). The old version was spammed daily, but the site should be much more manageable now 🙂
Last Wednesday I was involved with another concert: pieces for solo percussion and live electronics performed by UCSD’s Steven Schick. This amazing performance was part of the Live@CIRMMT performance series, and also has a connection to me and the IDMIL. One of the pieces performed was Chatter/Clatter, part of composer Roger Reynolds’ Sanctuary Project, which was initially workshopped in the IDMIL as part of a project exploring gestural control of sound spatialization. We developed a sensing system using piezoelectric contact microphones on the percussionist’s fingertips, a technique which is now used in the piece.
I also drove the computer for this performance, running software written in Pure Data (Pd) that handled processing of the live sound and 12-channel sound spatialization.
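The Pd patches themselves aren’t reproduced here, but one common building block for this kind of multichannel spatialization is equal-power panning between the two nearest speakers on a ring. As a rough sketch (in Python, not Pd, and assuming 12 evenly spaced speakers with speaker 0 at 0° – the actual speaker layout and panning law used in the performance may well have differed):

```python
import numpy as np

def ring_pan_gains(azimuth_deg, n_speakers=12):
    """Equal-power gains for a source at a given azimuth on a speaker ring.

    Speakers are assumed evenly spaced, with speaker 0 at 0 degrees.
    Only the two speakers adjacent to the source receive signal.
    """
    spacing = 360.0 / n_speakers
    pos = (azimuth_deg % 360.0) / spacing   # position in speaker units
    lo = int(pos) % n_speakers              # nearest speaker at or below
    hi = (lo + 1) % n_speakers              # next speaker around the ring
    frac = pos - int(pos)                   # fractional position between them
    gains = np.zeros(n_speakers)
    # Sine/cosine law keeps total radiated power constant as the source moves
    gains[lo] = np.cos(frac * np.pi / 2)
    gains[hi] = np.sin(frac * np.pi / 2)
    return gains
```

A source at exactly 0° drives only speaker 0 at full gain; at 15° (halfway between speakers 0 and 1) both neighbours get a gain of cos(π/4) ≈ 0.707, so the summed power stays at 1 throughout the pan.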