Using multiple IMUs for sensing deformation

I made this video today to quickly document some work I’ve been doing using multiple inertial measurement units (IMUs) for tracking the orientation, bending, and twisting of deformable objects. Inertial measurement units generally contain 3-axis accelerometers, gyroscopes, and magnetometers – their data can be cleverly combined to obtain an estimate of orientation (as long as the IMU is used in an environment with constant gravity and magnetic field, and is not in free fall).

This particular object has two Mongoose IMUs embedded inside it, running custom firmware I wrote that performs complementary filtering/sensor fusion to estimate orientation.
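
To give a sense of what such a filter does, here is a minimal complementary-filter sketch in Python – an illustration of the idea, not the actual microcontroller firmware; the gain value and the axis conventions are assumptions for the example. It blends the drift-free but noisy orientation implied by gravity with the smooth but drifting integral of the gyroscope:

import math

ALPHA = 0.98  # filter gain: trust the gyro short-term, the accelerometer long-term (assumed value)

def complementary_filter(pitch, roll, accel, gyro, dt):
    """One update step: previous pitch/roll in radians, accel in g, gyro in rad/s, dt in seconds."""
    ax, ay, az = accel
    gx, gy, gz = gyro

    # Orientation implied by the gravity vector alone (noisy, but does not drift).
    pitch_acc = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll_acc = math.atan2(ay, az)

    # Integrate the gyro (smooth, but drifts), then blend with the accelerometer estimate.
    pitch = ALPHA * (pitch + gy * dt) + (1.0 - ALPHA) * pitch_acc
    roll = ALPHA * (roll + gx * dt) + (1.0 - ALPHA) * roll_acc
    return pitch, roll

Yaw is handled the same way, except the magnetometer heading takes the place of the accelerometer reference, since gravity gives no information about rotation around the vertical axis.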

pyo and libmapper

This morning I attended a very interesting presentation on the pyo DSP module for python. From their website:

pyo is a Python module written in C to help with the creation of digital signal processing scripts.

pyo is a Python module containing classes for a wide variety of audio signal processing types. With pyo, users can include signal processing chains directly in Python scripts or projects, and manipulate them in real time through the interpreter. The tools in the pyo module offer primitives, like mathematical operations on audio signals and basic signal processing (filters, delays, synthesis generators, etc.), as well as complex algorithms for sound granulation and other creative audio manipulations. pyo supports the OSC protocol (Open Sound Control) to ease communication between software, and the MIDI protocol for generating sound events and controlling process parameters. pyo allows the creation of sophisticated signal processing chains with all the benefits of a mature and widely used general-purpose programming language.

Here’s an incredibly simple example – it creates and plays a single sine at 200Hz:

from pyo import *

s = Server().boot()
s.start()
a = Sine(freq=200, mul=0.5).out()
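
Since the description above emphasizes chaining objects and manipulating them live, here is a slightly richer sketch along the same lines (the objects used are standard pyo classes; the parameter values are arbitrary), in which one pyo object drives a parameter of another:

from pyo import *

s = Server().boot()
s.start()

# Low-frequency sine used as an amplitude envelope, oscillating between 0.0 and 0.5.
lfo = Sine(freq=0.5, mul=0.25, add=0.25)

# Audio-rate sine whose amplitude is driven by the LFO above.
a = Sine(freq=200, mul=lfo).out()

# Keep the script alive; attributes such as a.freq can be changed from the interpreter.
s.gui(locals())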

Since we already have Python bindings for libmapper, a few lines of code make the single-sine example above “mappable” and interactive.

from pyo import *
import mapper

# Boot the pyo audio server and create the 200 Hz sine from the example above.
s = Server().boot()
s.start()
a = Sine(freq=200, mul=0.5).out()

try:
    # Announce a libmapper device on the network.
    dev = mapper.device('pyo_example')

    # Declare two input signals (name, length, type, unit, min, max, handler);
    # the handlers push incoming values into the running pyo object.
    dev.add_input_signal('/freq', 1, 'f', 'Hz', 200, 500,
                         lambda s, i, v, t: a.setFreq(v))
    dev.add_input_signal('/mul', 1, 'f', 'na', 0, 1,
                         lambda s, i, v, t: a.setMul(v))

    # Service the libmapper network, blocking for up to 5 ms per iteration.
    while 1:
        dev.poll(5)

finally:
    s.stop()
    s.shutdown()

Note: The above code snippet has been updated for compatibility with libmapper v1.0 and later.

Public release of libmapper


This is a belated announcement for libmapper 0.1, the first public release of this C library intended to ease connection and mapping between input devices and synthesizers.

libmapper provides network-enabled middleware that allows an application to announce input and output signals on a subnet using a shared multicast UDP port, and to respond automatically to requests to create dynamic connections. Using multicast avoids dependence on a central hub for managing connections: multicast is used only to negotiate connections, while signal data is transmitted peer-to-peer without any rebroadcasting infrastructure. This decentralized approach also allows collaborative manipulation of signal connections on the local subnet, encouraging an experimental approach to mapping design.

Mappings between signals can be constructed dynamically; they translate OSC message addresses for the receiver and can additionally apply arbitrary transformations to signal values based on a given formula.
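
As a rough sketch of what announcing a signal looks like from the sending side, using the same Python bindings as the pyo example above – the device and signal names here are made up, and the add_output_signal and update calls are assumed to mirror the input-signal API shown earlier, so they may differ between libmapper versions:

import time
import mapper

# Announce a device on the multicast bus; peers can discover it and
# request connections to its signals.
dev = mapper.device('sensor_example')

# Declare one outgoing signal: name, length, type, unit, min, max.
out = dev.add_output_signal('/bend', 1, 'f', 'na', 0, 1)

# A session manager (such as the GUI mentioned below) can then connect '/bend'
# to another device's input, optionally applying a transformation such as y = x*440 + 220.
i = 0
while 1:
    dev.poll(10)                    # service the network
    out.update((i % 100) / 100.0)   # send a new value to any connected peers
    i += 1
    time.sleep(0.05)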

Documentation, source code, and binaries can be found on the project’s website.

libmapper is a new C implementation of a protocol previously developed in Max/MSP (download publication), and marks the beginning of an effort to support this protocol in several different languages and audio environments.  We currently provide Python bindings (through SWIG) as well as external objects for PureData and Max/MSP.  Managing connections on the network is still done via our Max/MSP GUI, but we are working on a cross-platform application to replace it.  Although libmapper 0.1 was actually made public in December, we originally wanted to finish a first version of this new GUI before announcing it; we have since decided it would be best to announce libmapper sooner for those who might be interested in using it or contributing to development.

A video of the GUI being used with a previous version of the system can be seen here.  In this video, the GUI is shown interacting with a controller and a synthesizer, both in Max/MSP; with libmapper, the synthesizer and the program that communicates with the controller could now be written in C, or in any language with bindings to the library.

libmapper’s only dependency is liblo, which handles sending and receiving the OSC messages used for all communication.  It has been developed and tested on the Linux and Mac OS X operating systems.

Please see the README and Tutorial on that page for further details on concept and usage.

Thank you for your attention; feedback is welcome on the project mailing list.

Yours,
The libmapper team, Input Devices and Music Interaction Laboratory, McGill University.

DOT Mapper documentation video

Here’s a short video demonstrating some software developed in the IDMIL for mapping digital musical instruments. Mapping refers to the process of connecting gesture parameters with sound synthesis parameters, and forms a crucial part of the interaction design. In the course of the McGill Digital Orchestra project, we developed a number of tools to assist in the collaborative creation of mapping layers.