Researchers are using augmented reality displays to let audiences visually follow digital music performances and better understand them.
A team of researchers from the University of Bristol reveals the mechanisms of digital instruments using 3D virtual content and mixed-reality displays, improving the audience experience during performances.
Digital musical instruments have become increasingly prevalent with the emergence of new software and hardware for musicians, but they also make it harder for audiences to appreciate performances.
Unlike acoustic instruments, digital instruments can be used to play any sound: some sounds are controlled in complex ways by the musician, while others are programmed and automated.
The original prototype of the mixed-reality display, called Rouages, was devised in 2013. Since then, the team has focused on two main areas: mixed-reality display technologies adapted to musical performances, and visual augmentations that improve the perception of musical gestures.
Reflets, a mixed-reality environment, displays virtual content anywhere on stage, even overlapping the instruments or the performers. It doesn’t require special glasses or the use of smartphones to see the augmentations, which remain consistent at all times.
It uses reflective transparent surfaces to merge the audience and stage spaces: spectators reveal the virtual content by intersecting it with their bodies or with physical props.
The team plans to organise workshops and demonstrate the new technology at public events.