We recently posted a teaser video for our project “Instrumented Bodies: Digital Prostheses for Music and Dance Performance” — enjoy!
Music Hack Day comes to Montreal to explore and build the next generation of music applications. It’s a full weekend of hacking in which participants will conceptualize, create and present their projects. Music + software + mobile + hardware + art + the web. Anything goes as long as it’s music related. Music Hack Day is presented by developers for developers.
Attendance is free, but you must register to guarantee a spot. We provide workspaces and snacks throughout the day, but need confirmed numbers to make the event great for everyone attending. Visit the Registration Page to register as a hacker or to attend the demo session at the end of the weekend.
Music Hack Day Montreal will be held at Eastern Bloc in Montreal, on the 24th and 25th of September.
What happens at a Music Hack Day? Lots of hacking, lots of pizza, very little sleep. For details read this post: What happens at a Music Hack Day. Read about some of the winning hacks at the recent Music Hack Day in New York City.
Here’s a short video demonstrating some software developed in the IDMIL for mapping digital musical instruments. Mapping refers to the process of connecting gesture parameters with sound synthesis parameters, and forms a crucial part of the interaction design. In the course of the McGill Digital Orchestra project, we developed a number of tools for assisting collaborative creation of mapping layers.
First a quote from Carmine Casciato’s M.A. Thesis:
Originally referred to as an electronic drum controller, the Radio Baton consists of two batons which are in essence radio transmitters. It employs a near-field capacitive measurement. Each baton is driven by an oscillator at a different frequency (50 kHz and 55 kHz respectively) so as to allow for independent tracking of both. They are tracked over a rectangular tablet that houses two pairs of shaped radio receiving antennas. The first pair is shaped so that the X coordinate of each baton can be determined as close to linearly as possible; similarly, the final pair corresponds to the Y coordinate. These coordinates refer to the horizontal plane in front of the user. The incoming signals are processed by a CPU such that a vertical Z-coordinate up to 15 centimeters above the surface is also output.
The IDMIL is currently improving the data-capture methods used by Casciato: tightening synchronization, increasing sample rates, and reducing latency and jitter. One problem is that while the Radio Baton outputs MIDI, the messages are all “Poly Aftertouch” messages, even when the device is struck like a drum. Previously this was dealt with by using a general-purpose computer to listen to the MIDI datastream and output “note on” messages when appropriate. The MIDI was then routed to a Roland TD-20 drum sound module.
One solution is to use a microcontroller dedicated to translating the MIDI messages instead of a PC: the translation routine can run much faster, and jitter is vastly reduced since the microcontroller is not doing anything else. We whipped up something fairly quickly using an Arduino Pro Mini board, with just a few extra parts and some online resources:
- Information on the Radio Baton
- MIDI hardware specification
- MIDI messages
- Discussion on Arduino forum
Below are some photos of the finished box and its firmware: