Implicit Mapping

matmap

A screenshot of the matmap software application, bridging MnM tools from IRCAM with libmapper.

Time period: 2011–present
Links: Implicitmap external, MaxMSP patches

One of the design philosophies of the libmapper project is to avoid reinventing or reimplementing existing mapping tools – instead we endeavour to make it easy to link disparate tools together with low compatibility overhead. As an example, we linked several tools that use supervised machine learning to create implicit mappings between gesture and sound synthesis, such as the MnM tools from IRCAM (Bevilacqua et al., NIME 2006), with libmapper through a custom external object for MaxMSP. This object – implicitmap – takes the place of the regular MaxMSP binding objects for libmapper, adding support for querying destination devices for their current state (already supported by libmapper) and for dynamically adjusting the number of local inputs and outputs as needed.
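As a rough illustration of the underlying library calls, the sketch below declares a device whose signal lists can change at runtime. It uses the current (2.x) libmapper C API rather than the 2011-era API the external was originally written against, and the device name, signal names, and value ranges are placeholders, not those of the actual external:

```c
#include <stdio.h>
#include <mapper/mapper.h>

/* Called whenever a mapped source updates one of our inputs. */
static void input_handler(mpr_sig sig, mpr_sig_evt evt, mpr_id inst,
                          int length, mpr_type type, const void *value,
                          mpr_time time)
{
    if (value && type == MPR_FLT)
        printf("received %f\n", *(const float*)value);
}

int main(void)
{
    float min = 0.f, max = 1.f;

    /* Declare a device on the libmapper network. */
    mpr_dev dev = mpr_dev_new("implicitmap", 0);

    /* One input and one output; the external adds and removes signals
       like these at runtime as connections are made and broken. */
    mpr_sig in  = mpr_sig_new(dev, MPR_DIR_IN, "in1", 1, MPR_FLT, NULL,
                              &min, &max, NULL, input_handler,
                              MPR_SIG_UPDATE);
    mpr_sig out = mpr_sig_new(dev, MPR_DIR_OUT, "out1", 1, MPR_FLT, NULL,
                              &min, &max, NULL, NULL, 0);

    /* Wait until the device has been allocated a unique name and port. */
    while (!mpr_dev_get_is_ready(dev))
        mpr_dev_poll(dev, 25);

    /* Service the network and push values to any mapped destinations. */
    for (int i = 0; i < 100; i++) {
        float v = i / 100.f;
        mpr_sig_set_value(out, 0, 1, MPR_FLT, &v);
        mpr_dev_poll(dev, 100);
    }

    /* A signal that is no longer needed can be removed dynamically. */
    mpr_sig_free(in);
    mpr_dev_free(dev);
    return 0;
}
```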

In addition, we created a new graphical interface to support the needs of implicit mapping, with a live display of mapped inputs and outputs and curved lines indicating samples of the source and destination parameter spaces. Each time a snapshot is requested, libmapper retrieves the current destination state over the network, so the parameters of (for example) a synthesizer can be adjusted using its own interface or manipulated by a third source of control data. A “randomize” button was also added to allow quick exploration of the destination parameter space without leaving the implicit mapping GUI.
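Conceptually, taking a snapshot pairs the most recent gesture vector with the queried destination state to form one training example. The sketch below illustrates this in simplified form – the array sizes, names, and value-caching scheme are all hypothetical, and in the real GUI the remote query and the delivery of randomized values are handled by libmapper itself:

```c
#include <stdlib.h>
#include <string.h>

#define N_IN  8    /* gesture dimensions (hypothetical size) */
#define N_OUT 12   /* synthesizer parameters (hypothetical size) */

/* One training example pairs a gesture pose with a synth state. */
typedef struct { float in[N_IN]; float out[N_OUT]; } snapshot_t;

static float latest_in[N_IN];    /* updated by input signal handlers */
static float latest_out[N_OUT];  /* filled by the remote state query */
static snapshot_t examples[256];
static int n_examples = 0;

/* Pair the current gesture with the current destination state. */
static void take_snapshot(void)
{
    if (n_examples >= 256)
        return;
    memcpy(examples[n_examples].in,  latest_in,  sizeof latest_in);
    memcpy(examples[n_examples].out, latest_out, sizeof latest_out);
    ++n_examples;
}

/* "Randomize": jump to a random point in the destination parameter
   space by drawing each parameter uniformly within its range. */
static void randomize(const float *lo, const float *hi)
{
    for (int i = 0; i < N_OUT; i++) {
        float r = (float)rand() / (float)RAND_MAX;
        latest_out[i] = lo[i] + r * (hi[i] - lo[i]);
        /* the external then pushes each value to the mapped
           destination signal over the network */
    }
}
```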

We have successfully used this software framework to interface with two different mapping techniques from MnM, as well as with our own implementations of artificial neural networks.
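For the neural-network variant, the trained model is conceptually a small multilayer perceptron mapping a live gesture vector to synthesizer parameters. A minimal forward pass, assuming a single tanh hidden layer and hypothetical layer sizes (not a description of our actual network), could look like:

```c
#include <math.h>

#define N_IN  8    /* gesture dimensions (hypothetical) */
#define N_HID 16   /* hidden units (hypothetical) */
#define N_OUT 12   /* synthesizer parameters (hypothetical) */

/* Forward pass of a one-hidden-layer perceptron: once trained on the
   recorded snapshot pairs, it maps a gesture vector to synth params. */
static void mlp_forward(const float w1[N_HID][N_IN],  const float b1[N_HID],
                        const float w2[N_OUT][N_HID], const float b2[N_OUT],
                        const float in[N_IN], float out[N_OUT])
{
    float hid[N_HID];
    for (int j = 0; j < N_HID; j++) {
        float s = b1[j];
        for (int i = 0; i < N_IN; i++)
            s += w1[j][i] * in[i];
        hid[j] = tanhf(s);          /* nonlinear hidden layer */
    }
    for (int k = 0; k < N_OUT; k++) {
        float s = b2[k];
        for (int j = 0; j < N_HID; j++)
            s += w2[k][j] * hid[j];
        out[k] = s;                 /* linear output layer */
    }
}
```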