IDMIL News!
Marcelo Wanderley awarded a Francqui Chair at the University of Mons
Prof. Marcelo Wanderley was awarded a Francqui Chair at the University of Mons, Belgium, by the Francqui Foundation.
He will present five lessons in October 2017 at the Numédiart Institute in Mons, where he will discuss several aspects of multidisciplinary research on music, science and technology.
Lesson Schedule:
- Inaugural lecture, October 5: “Easier Said than Done” - On More than Two Decades of Applied Multidisciplinary Research Across Engineering, Science and Music
- October 6: Human-Computer Interaction and Computer Music: Similarities, Differences and Opportunities for Cross-Fertilization
- October 10: Digital Musical Instruments: Design of and Performance with New Interfaces for Musical Expression
- October 11: Analysis of Performer Movements: Techniques and Applications
- October 12: Creating Large-Scale Infrastructures for Multidisciplinary Research in Music Media and Technology: The Case of the Centre for Interdisciplinary Research in Music Media and Technology, CIRMMT/McGill University
Best Paper Award at HCII 2017
Ian Hattwick, Ivan Franco and Marcelo Wanderley win Best Paper Award at 2017 HCI Conference
The HCI International Conference recently announced that the paper “The Vibropixels: A Scalable Wireless Tactile Display System” received the Best Paper Award of the Human Interface and the Management of Information thematic area at HCI International 2017 in Vancouver, BC. The paper presents a brief overview of the VibroPixels tactile display system and discusses how the system was designed to facilitate its use in the artistic creation process.
Call for Abstracts - CIRMMT Symposium on Force Feedback and Music
The Centre for Interdisciplinary Research in Music Media and Technology (CIRMMT) and the Input Devices and Music Interaction Laboratory (IDMIL), McGill University, will organize a two-day workshop on December 9-10, 2016, to discuss the state of the art and possible future directions of research on force feedback and music (FF&M).
We would like to invite submissions of research reports (abstracts) to be presented during the event.
Summary of Important Information:
- CIRMMT Symposium on Force Feedback and Music
- When: December 9 – 10, 2016
- Where: CIRMMT, McGill University, Montreal, Qc, Canada (www.cirmmt.org)
- Deadline for abstract submission: Monday, November 14, 2016
- Acceptance notification: Thursday, November 17, 2016
Context: Though haptics research in music is a very active field, it is presently dominated by tactile interfaces, due in part to the widespread availability of vibrotactile feedback in portable devices. Though not recent (some of its early contributions date back to the late 1970s), research on force feedback in musical applications has traditionally suffered from issues such as hardware cost, as well as the lack of community-wide access to software and hardware platforms for prototyping musical applications. In spite of this, several recent works have addressed the topic, proposing software platforms and simulation models.
We believe it is now time to probe the current state of FF&M research.
Topics: We invite the submission of abstracts summarizing innovative work addressing applications of force feedback to music and media, including, but not limited to, the following topics:
- Music and media applications of force-feedback systems
- FF models of musical actions
- Audio and mapping systems adapted to force-feedback applications
- Open-hardware and open-source solutions for developing and sharing force-feedback/musical projects
- Evaluation of system quality vs cost in musical applications
- The pros/cons of existing hardware and software systems
- The obsolescence of hardware and software systems and ways to deal with it
- Perceptual issues in FF&M
Abstract submission: To propose a contribution, please send a one-page summary of your proposal detailing:
- the research you carry out
- its current results
- its future perspectives
Deadlines and how to submit: The deadline for abstract submission is Monday, November 14, 2016. Selection results will be announced on Thursday, November 17, 2016.
Please send your one-page abstract to reception@cirmmt.mcgill.ca, with the email subject: [FF&M 2016] Contribution by “XXX”
Support: The CIRMMT Symposium on Force Feedback and Music is sponsored by the Input Devices and Music Interaction Laboratory (IDMIL), McGill University; the Inria-FRQNT MIDWAY “équipes de recherche associées” project; and the Centre for Interdisciplinary Research in Music Media and Technology (CIRMMT), McGill University.
Video featuring fMRI-compatible musical instruments
The development of an fMRI-compatible cello (Avrum Hollinger's Ph.D. thesis) was recently featured in a video shown on Canal Savoir and referenced in Wired Magazine. These instruments are used by neuroscientists and psychologists to study the brains of musicians while they perform.
Tech showcase at SERI Montreal
SERI Montreal will take place on Monday, May 1st, 2016. This year's theme is “La mobilité, du véhicule à la molécule” (“Mobility, from vehicle to molecule”). The purpose of the event is to give visibility to researchers from all Montreal universities. Approximately 250 companies will participate, and it is an opportunity to present what the IDMIL is doing in the field of mobility (in the broad sense). One third of the companies work in areas such as big data, virtual reality and medical devices (see the video on their YouTube channel).
IDMIL researchers Ian Hattwick and Baptiste Caramiaux will attend the event and demonstrate some of the lab's latest movement and music technologies, from real-time movement analysis to hardware controllers.
IDMIL to present research at EuroHaptics 2016 Conference
IDMIL PhD researcher John Sullivan will present recent haptics research at the EuroHaptics 2016 Workshop “Musical Haptics: use and relevance of haptic feedback in musical practice”. The workshop will take place on July 4, 2016, from 9:15 to 13:15 at Imperial College London, UK.
The workshop, organized by Stefano Papetti (Institute for Computer Music and Sound Technology, Zurich University of the Arts, Switzerland) and Ercan Altinsoy (Institute of Acoustics and Speech Communication, Dresden University of Technology, Germany), will include presentations by Vincent Hayward, Gareth Young, Nicolas Castagné, and Edgar Berdahl.
Abstract: Tactile Augmented Wearables for Delivery of Complex Musical Score Information
John Sullivan, Deborah Egloff, Marcello Giordano, Marcelo M. Wanderley
Tactile augmented wearables have been the object of much research in recent years, both in academia and industry, and have been used to convey information such as navigational cues or system notifications. In the music domain, tactile wearables have been used to convey simple musical information to performers in the form of, for instance, tempo cues or instantaneous feedback about the interaction with a live-electronics system. More complex tactile cues can also be designed. “Musicking the body electric” is a multidisciplinary project aimed at developing a set of tactile augmented garments for professional musicians, together with a vocabulary of complex tactile icons (“tactons”) that composers can use to deliver score information. What are the perceptual limitations of delivering complex, whole-body patterns of vibration to performing musicians? Can musicians learn to reliably recognize icons and associate them with score elements? What are the best strategies for actuator choice and placement? These research questions are at the core of the “Musicking the body electric” project, and prompt a more general reflection on the many issues to be addressed when evaluating abstract languages of tactile icons delivered by specialized wearable devices.
See the full workshop program here.
IDMIL Researchers at the Workshop "Haptics and musical practice"
Prof. Marcelo M. Wanderley and Ph.D. student Deborah Egloff participated in the workshop “Haptics and musical practice”, which took place on February 4-5, 2016 at the Institute for Computer Music and Sound Technology, Zurich University of the Arts, Zurich, Switzerland.
They were part of a group of researchers from Switzerland, Italy, Austria, France, Scotland and Germany who research haptics and music.
The full program is available at the Workshop webpage.
IDMIL researchers at CNMAT, University of California Berkeley
Prof. Marcelo Wanderley and Dr. Baptiste Caramiaux will be at the Center for New Music and Audio Technologies (CNMAT), University of California, Berkeley, on May 6th, 2016. In the morning, Baptiste will give a joint talk with Frédéric Bevilacqua from IRCAM on movement and sound interaction, an overview of research ranging from music performance to motor cognition. In the afternoon, Marcelo will give a lecture on research conducted at CIRMMT.
Details are available on the CNMAT website: http://cnmat.berkeley.edu/
A Comprehensive Review of Sensors and Instrumentation Methods in Devices for Musical Expression
Now available online: Carolina Brum Medeiros and Marcelo M. Wanderley (2014). A Comprehensive Review of Sensors and Instrumentation Methods in Devices for Musical Expression. Sensors 14, no. 8: 13556-13591.
Abstract:
Digital Musical Instruments (DMIs) are musical instruments typically composed of a control surface where user interaction is measured by sensors whose values are mapped to sound synthesis algorithms. These instruments have gained interest among skilled musicians and performers in recent decades, leading to artistic practices including musical performance, interactive installations and dance. The creation of DMIs typically involves several areas, among them arts, design and engineering. Balancing these areas is an essential task in DMI design, so that the resulting instruments are aesthetically appealing, robust, and allow responsive, accurate and repeatable sensing. In this paper, we review the use of sensors in the DMI community as manifested in the proceedings of the International Conference on New Interfaces for Musical Expression (NIME 2009–2013), focusing on the sensor technologies and signal conditioning techniques used by the NIME community. Although it has been claimed that specifications for artistic tools are harder to meet than those for military applications, this study reveals a paradox: in most cases, DMIs are based on a few basic sensor types and unsophisticated engineering solutions, not taking advantage of more advanced sensing, instrumentation and signal processing techniques that could dramatically improve their response. We aim to raise awareness of the limitations of any engineering solution and to assert the benefits of advanced electronic instrumentation design in DMIs. To this end, we propose the use of specialized sensors such as strain gages, advanced conditioning circuits and signal processing tools such as sensor fusion. We believe that careful electronic instrumentation design may lead to more responsive instruments.
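To make the kind of technique the paper advocates concrete, here is an illustrative complementary-filter sensor fusion in C (a sketch of the general idea, not code from the paper), combining a gyroscope's high-frequency rate signal with an accelerometer's low-frequency, gravity-based tilt estimate:

```c
#include <math.h>

/* Illustrative complementary filter fusing accelerometer and gyroscope
 * data into a single tilt-angle estimate -- a minimal sketch of the
 * sensor-fusion idea discussed in the paper, not code from it. */
float fuse_tilt(float angle_prev,         /* previous estimate (rad) */
                float gyro_rate,          /* gyroscope rate (rad/s)  */
                float acc_y, float acc_z, /* accelerometer axes (g)  */
                float dt)                 /* sample period (s)       */
{
    const float alpha = 0.98f;                      /* trust in gyro path */
    float gyro_angle = angle_prev + gyro_rate * dt; /* integrate the rate */
    float acc_angle  = atan2f(acc_y, acc_z);        /* tilt from gravity  */
    /* High-pass the drifting gyro estimate, low-pass the noisy
     * accelerometer estimate, and blend the two. */
    return alpha * gyro_angle + (1.0f - alpha) * acc_angle;
}
```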
Distributed tools for interactive design of heterogeneous signal networks
Now available online: Malloch, J., Sinclair, S. and Wanderley, M. M. (2014). Distributed tools for interactive design of heterogeneous signal networks. Multimedia Tools and Applications. DOI: 10.1007/s11042-014-1878-5
Abstract:
We introduce libmapper, an open source, cross-platform software library for flexibly connecting disparate interactive media control systems at run-time. This library implements a minimal, openly-documented protocol meant to replace and improve on existing schemes for connecting digital musical instruments and other interactive systems, bringing clarified, strong semantics to system messaging and description. We use automated discovery and message translation instead of imposed system-representation standards to approach “plug-and-play” usability without sacrificing design flexibility. System modularity is encouraged, and data are transported between peers without centralized servers.
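In practice, a device simply declares its signals and lets the library handle discovery and transport. A minimal sketch in C, assuming the v0.3-era API names from the libmapper tutorial (`mdev_new`, `mdev_add_output`, `msig_update_float`); these should be checked against the headers, as the API has changed in later releases:

```c
#include <mapper/mapper.h>

int main(void)
{
    /* Create a device; libmapper announces it on the local network. */
    mapper_device dev = mdev_new("sketch", 0, 0);

    /* Declare one float output signal, with range metadata that
     * mapping tools can use to scale connections automatically. */
    float min = 0.0f, max = 1.0f;
    mapper_signal out = mdev_add_output(dev, "/pressure", 1, 'f',
                                        "normalized", &min, &max);

    for (int i = 0; i < 100; i++) {
        mdev_poll(dev, 50);                 /* service network traffic */
        msig_update_float(out, i / 100.0f); /* publish a new value     */
    }
    mdev_free(dev);
    return 0;
}
```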
Lab Photo
From left: R. Michael Winters, Stephen Sinclair, Carolina Brum Medeiros, Marcelo M. Wanderley, Vanessa Yaremchuk, Ian Hattwick, Mark Zadel, Joseph Malloch. Missing: Marcello Giordano, Avrum Hollinger, Marlon Schumacher, Erika Donald, Bruno Angeles, Mahtab Ghamsari-Esfahani.
photo by: Scott Levine
Sonification Talk at CIRMMT Workshop on Symbolic Music
Mike Winters will present his research on sonification at the CIRMMT Workshop on Symbolic Music Processing, Semantic Audio, and Music Information Retrieval. The talk, entitled “Sonification in MIR: Corpora Analysis and Music Emotion Recognition Model Matching,” concludes the workshop and will cover motivations and techniques for sonification in symbolic music analysis, as well as his research on sonification of emotion, specifically emotion model matching.
Sonification of Emotion Research to Appear in Organized Sound
Master's student R. Michael Winters' research on sonification of emotion is scheduled to appear in the April 2014 issue of Organised Sound. The paper is entitled “When the Data is Emotion: Strategies and Results from the Intersection of Sonification and Music,” and features the first computational design and evaluation of sonifications of emotion. He used the MIREmotion function from the MIRToolbox and designed two special-purpose MATLAB frameworks for interactive exploration and iterative design. The paper will also appear as Chapter 4 of his manuscript-based thesis, “Exploring Music through Sound: Sonification of Emotion, Gesture, and Corpora.”
Instrumented Bodies article in Gizmag
There is a new article on the Instrumented Bodies project at Gizmag:
Instrumented Bodies gives music and dance some backbone by Paul Ridden (with accompanying image gallery)
Software Download Statistics
Top 10 downloads (external to the lab) since November 2010:
Software | Number of downloads |
---|---|
Digital Orchestra Toolbox | 1001 |
DIMPLE | 849 |
libmapper | 647 |
libmapper bindings for MaxMSP/Pure Data | 567 |
libmapper GUI | 387 |
Different Strokes | 306 |
OMPrisma | 282 |
Multiplayer for OMPrisma | 237 |
granul8 synth | 76 |
libmapper bindings for Python | 39 |
IDMIL in Jyväskylä
R. Michael Winters presented two IDMIL research projects at the International Conference on Music and Emotion in Jyväskylä, Finland, June 11-15. Mike was the recipient of a CIRMMT student travel award and a SEMPRE award to attend this conference. The first paper, a collaboration between IDMIL and Emotional Imaging Incorporated, is entitled “Emotional Data in Music Performance: Two Audio Environments for the Emotional Imaging Composer” and was co-written with Ian Hattwick and Marcelo M. Wanderley. The second paper, “Sonification of Emotion: Strategies for Continuous Auditory Display of Arousal and Valence”, was co-written with advisor Marcelo M. Wanderley.
libmapper 0.3 released
In addition to many improvements and bugfixes, libmapper v0.3 includes several major new features:
- Time: libmapper now supports timetagging of every signal update. With this we include the ability to bundle several signal updates together using “queues”, which defer the sending of updates. We also add an optional `rate` property to signals, which can be used to represent regularly-sampled signals. This makes it possible to support “blocks” of data in which each sample does not carry the overhead of an individual message header. The default behaviour, however, is still to send each signal update as an individual, time-stamped OSC message, assuming irregular updates. Software that samples sensors should distinguish between regularly- and irregularly-sampled signals and use the corresponding behaviour. At low sample rates, of course, the difference matters less. (A queue sketch follows this list.)
- Instances: libmapper now supports the concept of mapping multiple “instances” of a signal over a single connection. This can be used to support interaction with multi-touch devices as well as polyphonic musical instruments. A sophisticated “ID map” between instances created on a sender and instantiated on the receiver is automatically maintained, allowing the synchronization of behaviours between the two connection endpoints. Allocation of (possibly limited) resources to individual instances can be managed by a callback, which is called when new instances are created and old instances are destroyed. A complete treatment of this topic is not possible here, but will be documented in a forthcoming paper. (An instance sketch follows this list.)
- Reverse connections: A connection mode called `MO_REVERSE` has been added which can be used to establish signal flow from output to input. The intention is to use this for training machine learning systems, allowing the construction of so-called example-based “implicit mapping” scenarios. `MO_REVERSE` of course does not invert the mapping expression, so it is best used for direct connections to an intermediate device that sits between a data source and destination. Single “snapshots” of output state can also be acquired by a query system, using `msig_query_remotes()` (sketched after this list).
- Compatibility with select(): It is now possible to get a list of the file descriptors that libmapper reads from. These can be passed to `select()` or `poll()` in a program that uses libmapper alongside other sockets or files (see the sketch after this list).
- Local link/connection callbacks: Programs using libmapper can now be notified when a local device has been linked or connected by registering a callback for these events. Although it is not encouraged at this time, this can be used, for example, to implement data transport over alternative protocols, using libmapper only to determine the destination IP and agree on the connection details. In the future we plan to add support for alternative protocols within the libmapper framework, so that mapping expressions can still be used even when alternatives to OSC are used to send and receive data.
- Null values: An important change that users should be aware of is that the signal callback (whose signature has changed) can now be called with the `value` pointer being zero. This indicates that the mapper signal is non-existent, i.e. no value is associated with it: “null” is now considered a valid state for a signal, distinct from, for example, a value of zero. User code can choose to ignore this case or to make use of it if there is a semantically relevant action to take. One example of this condition is a signal that has no valid value while some other signal is in a certain state; for reverse connections, a null value might indicate that the output has not yet been set to anything. Local code can query the last-updated value of any signal using the `msig_value()` function. (A handler sketch follows this list.)
The tutorial has also been translated for the Python and Max/MSP language bindings, and build documentation has been added.
EIC Performance System Video
The performance system for the Emotional Imaging Composer, described in this video, presents an approach to using emotion data in live music performance. For this system, biosignal data from the performer is analyzed to determine emotional valence and arousal, and this emotional data is used to control the parameters of a spectral delay.
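The video does not detail the mapping itself, but conceptually it might resemble the following sketch, in which every parameter name and scaling choice is hypothetical rather than taken from the actual EIC system:

```c
/* Hypothetical sketch of mapping valence/arousal (both in [-1, 1])
 * to spectral-delay parameters -- not the actual EIC mapping, which
 * is not documented in the video. */
typedef struct {
    float feedback;    /* delay feedback amount, 0..1          */
    float band_tilt;   /* emphasis of high vs. low bands, 0..1 */
} spectral_delay_params;

spectral_delay_params map_emotion(float valence, float arousal)
{
    spectral_delay_params p;
    p.feedback  = 0.5f + 0.4f * arousal;    /* higher arousal, denser tail */
    p.band_tilt = 0.5f * (valence + 1.0f);  /* positive valence, brighter  */
    return p;
}
```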
MIREmotion-Visualizer
After working with MIREmotion for the past month, I decided to make a visualizer for it. The output is a small GUI that generates graphs displaying the relevant audio features for each emotional dimension or category. The graphs include not only the MIREmotion score but also the limits, where relevant. It also saves these graphs in .eps format, along with a new .wav file with a descriptive title.
I've been using it to create “emotional measurements” from sounds generated synthetically in SuperCollider. It hasn't been tested by anyone but myself, so at this point I would expect catastrophe if you were brave enough to try it. Nevertheless, it is available on GitHub: