- API simplification
- Basic support for vector signals
- API for querying the values of device input signals
- Python bindings now include monitor functionality
- Java bindings for devices and signals
This is a belated announcement for libmapper 0.1, the first public release of this C library intended to ease connection and mapping between input devices and synthesizers.
libmapper provides network-enabled middleware that allows an application to announce input and output signals on a subnet using a shared multicast UDP port, and to respond automatically to requests for dynamic connections. Using multicast avoids dependence on a central hub for managing connections, while the signal data itself is transmitted peer-to-peer, with no rebroadcasting infrastructure required for data sharing. This decentralized approach also allows collaborative manipulation of signal connections on the local subnet, encouraging an experimental approach to mapping design.
Mappings between signals can be constructed dynamically; each mapping translates OSC message addresses for the receiver, and can additionally apply an arbitrary transformation of signal values based on a given formula.
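As a rough illustration of the idea (this is a sketch, not libmapper’s actual API — the function names and the `/synth/cutoff` address are made up for the example): a mapping retargets an OSC address and applies a user-supplied expression to each incoming signal value.

```python
# Illustrative sketch only: a "mapping" pairs a destination OSC address
# with an expression applied to each incoming value.

def make_mapping(dest_address, expression):
    """Return a function mapping value -> (dest_address, f(value)).

    'x' stands for the incoming signal value in the expression,
    e.g. "x * 1000 + 200"."""
    def apply(value):
        return dest_address, eval(expression, {"x": value})
    return apply

# e.g. map a normalized sensor axis onto a filter cutoff:
mapping = make_mapping("/synth/cutoff", "x * 1000 + 200")
print(mapping(0.5))  # ('/synth/cutoff', 700.0)
```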
Documentation, source code, and binaries can be found on the project’s website.
libmapper constitutes a new C implementation of a protocol previously developed in Max/MSP (download publication), and marks the beginning of an effort to support this protocol in several languages and audio environments. We currently provide bindings for Python (through SWIG) as well as external objects for PureData and Max/MSP. Managing connections on the network is still accomplished via our Max/MSP GUI, but we are working on a cross-platform application to replace it. Although libmapper 0.1 was actually made public in December, we had originally planned to finish a first version of this new GUI before announcing the library; we have since decided it would be best to announce libmapper sooner for those who might be interested in using it or contributing to its development.
A video of the GUI being used in a previous version of the system can be seen here. In the video, the GUI interacts with a controller and a synthesizer, both implemented in Max/MSP; with libmapper, the synthesizer and the program communicating with the controller could now be written in C, or in any language with bindings to the library.
libmapper’s only dependency is liblo, which handles the sending and receiving of the OSC messages used for all communication. The library has been developed and tested on Linux and Mac OS X.
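For context, the messages liblo carries follow the standard OSC encoding: a null-terminated address pattern padded to a multiple of 4 bytes, a type tag string beginning with “,”, and big-endian argument data. A minimal sketch of that wire format (the address `/synth/freq` is an arbitrary example, not a libmapper-defined path):

```python
import struct

def osc_pad(s: bytes) -> bytes:
    """Null-terminate a string and pad it to a multiple of 4 bytes, per OSC."""
    s += b"\x00"
    return s + b"\x00" * (-len(s) % 4)

def encode_osc_float(address: str, value: float) -> bytes:
    """Encode a one-float OSC message: address, ',f' type tag, big-endian float."""
    return osc_pad(address.encode()) + osc_pad(b",f") + struct.pack(">f", value)

msg = encode_osc_float("/synth/freq", 440.0)
print(len(msg))  # 20 bytes: 12 (address) + 4 (type tag) + 4 (float)
```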
Please see the README and Tutorial on that page for further details on concepts and usage.
Thank you for your attention; feedback is welcome on the project mailing list.
The libmapper team, Input Devices and Music Interaction Laboratory, McGill University.
Here’s a short video demonstrating some software developed in the IDMIL for mapping digital musical instruments. Mapping refers to the process of connecting gesture parameters with sound synthesis parameters, and forms a crucial part of the interaction design. In the course of the McGill Digital Orchestra project, we developed a number of tools for assisting collaborative creation of mapping layers.
Read more at www.idmil.org/software/mappingtools
Andrew Stewart recently posted some short videos demonstrating the new mapping and synthesis he has been working on for the soprano T-Stick, a DMI I built for my Master’s thesis. He is currently working on a composition for two soprano T-Sticks, which, when paired with his mapping and synthesis, he has dubbed the “sonar jo”. A jō is a short staff used in some Japanese martial arts, but it is also my first name…