CoMo | Collective Movements

CoMo is a collection of prototype web apps for movement-based sound interaction, targeting collective interaction using mobile phones.

CoMo is based on a software ecosystem developed by IRCAM, using XMM (see in particular the xmm-soundworks-template), Collective Soundworks, and waves.js. It will also soon integrate the RAPID-MIX API.

CoMo implements a distributed interactive machine learning architecture: lightweight client-side JavaScript libraries record gestures and perform analysis/recognition, while a server handles the training and storage of the statistical models.
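To illustrate the split between the two roles, here is a minimal conceptual sketch (not CoMo's actual API — the class and method names are hypothetical, and CoMo uses XMM statistical models rather than the simple nearest-centroid model shown here): clients send labeled gesture examples to a server, the server trains a model, and clients then classify new gestures against it.

```python
import math

class TrainingServer:
    """Sketch of the server role: collect labeled gesture examples
    sent by clients, train a model, and serve it back."""

    def __init__(self):
        self.examples = {}  # label -> list of recorded feature vectors
        self.model = {}     # label -> trained centroid

    def record(self, label, features):
        """A client uploads one recorded gesture example."""
        self.examples.setdefault(label, []).append(features)

    def train(self):
        """Train a toy model: one centroid per gesture label."""
        for label, vecs in self.examples.items():
            dim = len(vecs[0])
            self.model[label] = [sum(v[i] for v in vecs) / len(vecs)
                                 for i in range(dim)]
        return self.model

def classify(model, features):
    """Client-side recognition against the downloaded model:
    return the label whose centroid is nearest to the input."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(model, key=lambda label: dist(model[label], features))
```

The point of the split is that recording and recognition stay lightweight enough to run in a phone's browser, while the heavier training step happens once, server-side, and the resulting model is shared by all connected clients.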

Other web apps can be developed starting with the template example found here. This template is similar to the app called elements, but runs on localhost. It features OpenSoundControl (OSC) output, which makes it possible to perform gesture recognition on smartphones and to control environments such as Max, Pd, Processing, and openFrameworks running on the local machine.
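Environments like Max or Pd already decode OSC natively, but if you want to consume the OSC output from your own program, it helps to know the wire format. The following sketch parses a basic OSC message (null-padded address string, type-tag string, big-endian arguments) using only the Python standard library; the `/como/...` address used in the usage note is a made-up example, not an address documented by CoMo.

```python
import struct

def parse_osc_message(data: bytes):
    """Parse a simple OSC message with int32 ('i'), float32 ('f'),
    and string ('s') arguments. Returns (address, args)."""

    def read_padded_string(buf, offset):
        # OSC strings are null-terminated, then padded to a 4-byte boundary.
        end = buf.index(b"\x00", offset)
        s = buf[offset:end].decode("ascii")
        offset = end + 1
        offset += (-offset) % 4
        return s, offset

    address, offset = read_padded_string(data, 0)
    typetags, offset = read_padded_string(data, offset)
    args = []
    for tag in typetags.lstrip(","):
        if tag == "i":                       # big-endian 32-bit integer
            args.append(struct.unpack_from(">i", data, offset)[0])
            offset += 4
        elif tag == "f":                     # big-endian 32-bit float
            args.append(struct.unpack_from(">f", data, offset)[0])
            offset += 4
        elif tag == "s":                     # null-padded string
            s, offset = read_padded_string(data, offset)
            args.append(s)
    return address, args
```

For example, a recognition result sent as a message to a hypothetical address `/como/likelihoods` with two float arguments would arrive as UDP payload bytes that this function decodes back into the address and the two likelihood values.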


The CoMo ecosystem is developed within the framework of the RAPID-MIX project, an Innovation Action funded by the European Commission (H2020-ICT-2014-1 Project ID 644862). Collective Soundworks is developed within the framework of the CoSiMa project, funded by the French National Research Agency (ANR).

Developers & Researchers: ISMM team @ UMR STMS Ircam - CNRS - UPMC
Joseph Larralde, Benjamin Matuszewski, Norbert Schnell, Riccardo Borghesi, Frederic Bevilacqua (coordination)
XMM has been developed by Jules Françoise.

Sound samples: Roland Cahen, Andrea Cera, Olivier Houix
Thanks to Xavier Boissarie, Marion Voillot, Anne Dubos, Jan Schacher, Joël Chevrier, Jean-François Jégo