The BOEUF project aims to research and promote advanced musical collaboration between musicians using digital musical instruments (DMIs).
We have proposed a theoretical framework that can be used when analysing, designing, and implementing networked ensembles with a variety of collaboration modes.
We have developed bf-pd, a set of software components running in Pure Data, which musicians and makers can integrate into their digital instruments and systems. Bf-pd allows musicians to share parameters and output data between instruments, to control each other’s instruments, to synchronize between instruments, to visualize each other’s activity, and to exchange messages.
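The sharing, control, and visualization modes described above can be sketched as a simple observer pattern. The sketch below is purely illustrative: the class and method names are invented for this example and are not bf-pd's actual Pure Data objects or API.

```python
# Illustrative sketch (NOT bf-pd's actual API): a minimal in-process model
# of the collaboration modes above -- an instrument exposes parameters,
# and other instruments can observe or control them.

class Instrument:
    def __init__(self, name):
        self.name = name
        self.params = {}      # parameter name -> current value
        self.observers = []   # callbacks notified on parameter changes

    def declare_param(self, pname, value=0.0):
        """Expose a named parameter so peers can see and control it."""
        self.params[pname] = value

    def set_param(self, pname, value, source=None):
        """Change a parameter; 'source' records who changed it (self or a peer)."""
        self.params[pname] = value
        for cb in self.observers:
            cb(self.name, pname, value, source)

    def watch(self, callback):
        """Subscribe to this instrument's activity (e.g. for visualization)."""
        self.observers.append(callback)


# One musician's synth declares a cutoff parameter...
synth = Instrument("synth")
synth.declare_param("cutoff", 0.5)

# ...a second musician visualizes its activity and takes control of it.
log = []
synth.watch(lambda inst, p, v, src: log.append((inst, p, v, src)))
synth.set_param("cutoff", 0.8, source="drum-machine")
```

In a networked ensemble these calls would of course travel between machines rather than within one process; the point of the sketch is only the semantics of shared, mutually controllable parameters.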
We are conducting research on mixed-reality interfaces to improve non-mediated (i.e. in-person) collaboration between musicians, and to enable them to access the advanced musical collaboration modes offered by our framework. These interfaces will allow musicians to preserve both the existing control interfaces of their instruments and the physicality of their performance, and will improve spectators' ability to perceive the musical exchanges (thereby improving visibility).
The BOEUF project is led by Dr Florent Berthaut at the Université de Lille, France, and by Dr Luke Dahl at the University of Virginia, USA.
More details at http://bf-collab.net/
Tabletop interface and Collaboration Window in Pure Data
Adapting & Openness: Dynamics of Collaboration Interfaces for Heterogeneous Digital Orchestras (NIME 2020)
Appropriation of visual feedback on control surfaces through GUI remixing and augmented-reality
This video shows the results of the ControllAR project, a collaboration between Florent Berthaut (CRIStAL, Université de Lille) and Alex Jones (University of the West of England).
Despite the development of touchscreens, many expert systems for working with digital multimedia content, such as in music composition and performance, video editing, or visual performance, still rely on control surfaces. This can be due to the accuracy and appropriateness of their sensors, the haptic feedback that they offer, and, most importantly, the way they can be adapted to the specific subset of gestures and tasks that users need to perform. On the other hand, visual feedback on controllers remains limited and/or fixed, preventing similar personalization. In this paper, we propose ControllAR, a novel system that facilitates the appropriation of rich visual feedback on control surfaces through remixing of graphical user interfaces and augmented reality display. We then use our system to study current and potential appropriation of visual feedback in the case of digital musical instruments and derive guidelines for designers and developers.
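The core "GUI remixing" idea can be illustrated with a toy sketch: crop a widget's pixels out of a captured interface and reposition them over a controller layout. Everything here is an assumption made for illustration (images as plain 2D lists, hand-picked coordinates); the actual ControllAR system captures live GUI pixels and renders them in augmented reality on the control surface.

```python
# Toy sketch of GUI "remixing": crop a widget region from a captured GUI
# image and paste it at a new position over a controller overlay.
# Images are plain 2D lists of pixel values for the sake of the example.

def crop(img, x, y, w, h):
    """Extract a w-by-h region of the image at position (x, y)."""
    return [row[x:x + w] for row in img[y:y + h]]

def paste(dst, src, x, y):
    """Return a copy of dst with src composited at position (x, y)."""
    out = [row[:] for row in dst]
    for j, row in enumerate(src):
        out[y + j][x:x + len(row)] = row
    return out

gui = [[0] * 8 for _ in range(8)]
gui[2][3] = gui[2][4] = 1              # a 'knob widget' drawn at (3, 2)

overlay = [[9] * 8 for _ in range(8)]  # blank controller overlay
widget = crop(gui, 3, 2, 2, 1)         # grab just the widget's pixels
remixed = paste(overlay, widget, 0, 0) # reposition it over a physical knob
```

The design point this mirrors is that the user, not the software vendor, decides which fragments of feedback appear where on the control surface.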