The Routledge Companion to Embodied Music Interaction

The Routledge Companion to Embodied Music Interaction captures a new paradigm in the study of music interaction, as a wave of recent research focuses on the role of the human body in musical experiences. This volume brings together a broad collection of work that explores all aspects of this new approach to understanding how we interact with music, addressing issues that have roused the curiosity of scientists for ages: to understand the complex and multi-faceted way in which music manifests itself not just as sound but also as a variety of cultural styles, not just as experience but also as awareness of that experience.
With contributions from an interdisciplinary and international array of scholars, including both empirical and theoretical perspectives, the Companion explores an equally impressive array of topics, including:

  • Dynamical music interaction theories and concepts
  • Expressive gestural interaction
  • Social music interaction
  • Sociological and anthropological approaches
  • Empowering health and well-being
  • Modeling music interaction
  • Music-based interaction technologies and applications

This book is a vital resource for anyone seeking to understand human interaction with music from an embodied perspective.

Along with many other interesting chapters, the volume includes a contribution I co-wrote with Marcelo Wanderley titled Embodied Cognition and Digital Musical Instruments: Design and Performance.

A NIME Reader: Fifteen Years of New Interfaces for Musical Expression

A little more than 15 years have passed since the small NIME workshop was held during the ACM Conference on Human Factors in Computing Systems (CHI) in 2001 (Poupyrev et al. 2001b). By 2002, NIME had become a conference in its own right, and today it is an important annual meeting point for researchers, developers, designers, and artists from all over the world. The participants have very different backgrounds, but they all share a passion for groundbreaking music and technology.

More than 1200 papers have been published through the conference so far, and, in keeping with the open and inclusive atmosphere of the community, all of the papers are freely available online. The archive is great if you know what to look for, but it has grown to a size that is difficult to handle for newcomers to the field. Even for long-time participants and occasional visitors, it is difficult to get an overview of the history and development of the community.

At recent editions of the conference, we have seen a growing number of papers focusing on historical, theoretical, and reflective studies of the NIME community (or even communities) itself. As this body of meta-studies grew, we began to see the potential for a collection of articles that could broadly represent the conference series. This thought has now materialized in the anthology you are currently holding in your hands, or reading on a screen.

The anthology includes the chapter 2005: Towards a Dimension Space for Musical Devices by David Birnbaum, Rebecca Fiebrink, Joseph Malloch, and Marcelo M. Wanderley.

CIRMMT Symposium on Force Feedback and Music

Dec 9-10, 2016. See the symposium website for more information and for registration.

Haptics research in music is a very active field, but it is presently dominated by tactile interfaces, due in part to the widespread availability of vibrotactile feedback in portable devices. Research on force feedback in musical applications is not recent, with some of its early contributions dating back to the end of the 1970s, but it has traditionally suffered from exogenous issues such as hardware cost, as well as the lack of community-wide access to software and hardware platforms for prototyping musical applications. Despite this situation, several recent works have addressed the topic, proposing software platforms and simulation models.

This symposium will discuss the current state of research and future trends on force-feedback and music (FF&M).

Speakers

  • Bret Battey, De Montfort University, England
  • Edgar Berdahl, Louisiana State University, USA
  • Christian Frisson, Inria Lille, France
  • Alexandros Kontogeorgakopoulos, Cardiff School of Art and Design, Wales
  • James Leonard, Grenoble, France
  • Joseph Malloch, Dalhousie University, Canada
  • Julian Neri, McGill University, Canada
  • Thomas Pietrzak, Université Lille 1, France
  • Ian Sinclair, MPB Technologies Inc, Canada
  • Stephen Sinclair, Inria, Chile
  • Marcelo Wanderley, IDMIL/CIRMMT, McGill University, Canada

The Spine

The Spine is a “prosthetic” digital musical instrument developed for the collaborative project Les Gestes, in which we endeavoured to design new instruments for dancers. The new instruments build on the T-Stick, which we had already used in the performance Duo pour un violoncelle et un danseur with the same collaborators. Starting with foam prototypes, the Spine and its companion instruments the Rib and the Visor were developed iteratively using participatory design, through frequent workshops, parallel problem solving, and digital fabrication methods. The current models are fabricated from laser-cut transparent acrylic, transparent PVC tubing, and PETg rods. The entire structure is assembled using interference fits rather than glue or fasteners.

The Spine tracks and reports its orientation and shape in real time, using inertial and magnetic-field sensing at each end of the structure. Sensor-fusion algorithms run on-board the instrument.
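The post doesn't go into implementation details, so the following is only a rough sketch of the kind of computation involved, not the Spine's actual firmware. It assumes that sensor fusion at each end already yields an orientation quaternion, and estimates the structure's deformation from the relative rotation between the two ends; the struct, function names, and sample values are all hypothetical.

```cpp
// Sketch: estimate bend/twist of a flexible structure from the fused
// orientation quaternions reported at its two ends (hypothetical example).
#include <cmath>
#include <cstdio>

const double kPi = 3.14159265358979323846;

struct Quat { double w, x, y, z; };

// Hamilton product a * b
Quat mul(const Quat& a, const Quat& b) {
    return {
        a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z,
        a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y,
        a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x,
        a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w
    };
}

Quat conjugate(const Quat& q) { return {q.w, -q.x, -q.y, -q.z}; }

// Rotation taking the base end's frame to the tip end's frame.
Quat relative(const Quat& base, const Quat& tip) {
    return mul(conjugate(base), tip);
}

// Total angle of a rotation quaternion, in radians (0 = straight, untwisted).
double rotationAngle(const Quat& q) {
    double w = std::fabs(q.w);
    if (w > 1.0) w = 1.0;  // guard against numerical drift
    return 2.0 * std::acos(w);
}

int main() {
    // Hypothetical readings: identity at the base, 30 degrees about x at the tip.
    Quat base{1, 0, 0, 0};
    Quat tip{std::cos(kPi / 12), std::sin(kPi / 12), 0, 0};
    double bend = rotationAngle(relative(base, tip));
    std::printf("estimated deformation: %.1f degrees\n", bend * 180.0 / kPi);
    return 0;
}
```

In practice the per-end orientations themselves would come from a fusion filter (complementary or Madgwick-style) combining the gyroscope, accelerometer, and magnetometer data, and the relative rotation could be further decomposed into separate bend and twist components.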

I have previously blogged some teaser photos and a couple of videos: a demonstration of the orientation and deformation sensing I developed for the Spine, and a promo for the upcoming shows.