The MINT Forum 2018 is devoted to the advancement of music through new technologies. How can advances in new technologies change the experience of making music (from the musician’s perspective) and listening to music (from the audience’s perspective)? What are the musical and performative implications and applications of the vast array of new technologies now emerging?

WHERE: University of King’s College, Halifax, NS
WHEN: 16–18 November 2018
This project is a collaboration between GEM Lab and the Narratives in Space+Time Society (NiS+TS). It serves as a public platform for exploring the past and present urban geography of the area surrounding the Narrows where the Halifax Explosion took place in December 1917. The tabletop consists of a semi-opaque glass projection surface representing the harbour and solid pine CNC-shaped forms for land. Projection mapping is used to project various content on the tabletop, including aerial photographs and historical maps showing the devastation caused by the explosion. Additional computer-generated content can be interactively explored using a Microsoft HoloLens head-mounted display.
Now available online: Joseph Malloch, Stephen Sinclair, and Marcelo M. Wanderley. “Generalized Multi-Instance Control Mapping for Interactive Media Systems”. In IEEE MultiMedia, 25(1), January–March 2018. DOI: 10.1109/MMUL.2018.112140028
We articulate a need for the representation of temporal objects reflecting dynamic, short-lived mapping connections instantiated from a template, in tools for designing and using interactive media systems. A list of requirements is compiled from an examination of existing tools, practical use cases, and abstract considerations of node connectivity and information propagation within a graph of connected devices. We validate the concept through implementation in the open source software libmapper, and explore its application by integration with existing controller/synthesizer software and hardware.
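The core idea of the paper can be illustrated with a small sketch. The snippet below is not the libmapper API; it is a hypothetical, minimal model of a mapping "template" that spawns short-lived per-event instances (for example, one per active touch on a multitouch controller), each with its own state, destroyed when the event ends. All class and method names here are invented for illustration.

```python
# Illustrative sketch only (not libmapper): a mapping template that
# instantiates and releases per-event mapping connections.

class MappingTemplate:
    """Describes how a source signal maps to a destination signal.

    Each distinct event id (e.g. a touch id) gets its own short-lived
    instance of the mapping, holding its own state.
    """
    def __init__(self, transform):
        self.transform = transform   # per-update processing function
        self.instances = {}          # live instances, keyed by event id

    def update(self, event_id, value):
        # Create an instance on the first update for this event id
        # (e.g. touch-down), then accumulate its processed history.
        inst = self.instances.setdefault(event_id, [])
        inst.append(self.transform(value))
        return inst[-1]

    def release(self, event_id):
        # Destroy the instance when the event ends (e.g. touch-up).
        self.instances.pop(event_id, None)

# Usage: two simultaneous touches share one template definition
# but keep separate instance state.
m = MappingTemplate(lambda x: x * 0.5)  # e.g. scale position to pitch bend
m.update(1, 0.8)      # touch 1 begins -> instance 1 created
m.update(2, 0.2)      # touch 2 begins -> instance 2 created
m.release(1)          # touch 1 ends  -> instance 1 destroyed
print(len(m.instances))  # prints 1
```

The design choice the paper argues for is exactly this separation: the template (the declared connection) persists in the mapping graph, while instances come and go dynamically as events occur, without requiring the designer to declare every connection in advance.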
The Digital Orchestra Toolbox is now available in the Max Package Manager
We tend to focus a lot on new Max objects in the Package Manager, but with Max there are many ways to solve problems without compiling externals. This Package Manager release brings a collection of highly practical Max abstractions from McGill University’s IDMIL, designed with music and digital orchestra projects in mind. The well-organized abstractions in this package are clearly the result of real-world patching, and we can all learn a few tricks from them.
The Routledge Companion to Embodied Music Interaction captures a new paradigm in the study of music interaction, as a wave of recent research focuses on the role of the human body in musical experiences. This volume brings together a broad collection of work that explores all aspects of this new approach to understanding how we interact with music, addressing issues that have roused the curiosity of scientists for ages: to understand the complex and multi-faceted way in which music manifests itself, not just as sound but also as a variety of cultural styles, and not just as experience but also as awareness of that experience.
With contributions from an interdisciplinary and international array of scholars, including both empirical and theoretical perspectives, the Companion explores an equally impressive array of topics, including:
- Dynamical music interaction theories and concepts
- Expressive gestural interaction
- Social music interaction
- Sociological and anthropological approaches
- Empowering health and well-being
- Modeling music interaction
- Music-based interaction technologies and applications
This book is a vital resource for anyone seeking to understand human interaction with music from an embodied perspective.
Along with many other interesting chapters, the volume includes a contribution I co-wrote with Marcelo Wanderley titled “Embodied Cognition and Digital Musical Instruments: Design and Performance”.
A little more than 15 years have passed since the small NIME workshop was held during the ACM Conference on Human Factors in Computing Systems (CHI) in 2001 (Poupyrev et al. 2001b). By 2002, NIME had become a conference in its own right, and today it is an important annual meeting point for researchers, developers, designers, and artists from all over the world. The participants have very different backgrounds, but they all share a passion for groundbreaking music and technology.
More than 1200 papers have been published through the conference so far, and staying true to the open and inclusive atmosphere of the community, all of the papers are freely available online. The archive is great if you know what to look for, but it has grown to a size that is difficult to navigate for newcomers to the field. Even for veterans and occasional visitors, it is difficult to get an overview of the history and development of the community.
At recent editions of the conference, we have seen a growing number of papers focusing on historical, theoretical, and reflective studies of the NIME community (or even communities) itself. As this level of meta-studies started to grow, we began to see the potential for a collection of articles that could broadly represent the conference series. This thought has now materialized in the anthology you are currently holding in your hand, or reading on a screen.
The anthology includes the chapter “2005: Towards a Dimension Space for Musical Devices”.
December 9–10, 2016. See the symposium website for more information and for registration.
Haptics research in music is a very active field, though it is presently dominated by tactile interfaces, due in part to the widespread availability of vibrotactile feedback in portable devices. Research on force-feedback in musical applications is not recent (some of its early contributions date back to the end of the 1970s), but it has traditionally suffered from exogenous issues such as hardware cost, as well as a lack of community-wide access to software and hardware platforms for prototyping musical applications. Despite this situation, several recent works have addressed the topic, proposing software platforms and simulation models.
This symposium will discuss the current state of research and future trends on force-feedback and music (FF&M).
- Bret Battey, De Montfort University, England
- Edgar Berdahl, Louisiana State University, USA
- Christian Frisson, Inria Lille, France
- Alexandros Kontogeorgakopoulos, Cardiff School of Art and Design, Wales
- James Leonard, Grenoble, France
- Joseph Malloch, Dalhousie University, Canada
- Julian Neri, McGill University, Canada
- Thomas Pietrzak, Université Lille 1, France
- Ian Sinclair, MPB Technologies Inc, Canada
- Stephen Sinclair, Inria, Chile
- Marcelo Wanderley, IDMIL/CIRMMT, McGill University, Canada