09.08.24

Audio tracking for VR Live – Orange Lab

Orange Lab

Vectors of innovation where new technologies are developed, the Orange Labs work to improve telecommunications services and digital experiences.

At the Orange Atalante site in Rennes, France, a project has been undertaken to enhance the live virtual reality (VR) experience.

Objectives

The aim is to follow artists in motion for a real-time spatialized audio mix, and to light them using automated tracking. This project is particularly relevant for live music, sports and theater applications in VR, as well as for television and podcasts in Dolby Atmos, Binaural or Ambisonic formats.

Technologies used

  1. K SYSTEM tracking system from Naostage: K SYSTEM tracks artists without the need for tags on the performers or in their pockets; tracking is done solely by cameras.
  2. Spat Revolution software from FLUX::, used for real-time spatialization of the singers (see the coordinate sketch after this list).
  3. Equipment from Kariba Productions and Virtuel Audio: the performance was filmed in 12K VR and recorded with high-definition audio quality.
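
For readers unfamiliar with how camera tracking feeds a spatializer: the tracker outputs Cartesian coordinates, while a spatial mix is usually described in terms of azimuth, elevation and distance around the listener. The short Python sketch below illustrates that conversion under assumed axis conventions (listener at the origin, +y forward, +x right, +z up); it does not reflect the actual coordinate conventions of K SYSTEM or Spat Revolution.

import math

def to_spherical(x, y, z):
    # Convert a tracked position in metres (listener at the origin,
    # +y forward, +x right, +z up) into azimuth/elevation in degrees
    # and distance in metres. Azimuth is measured clockwise from the front.
    distance = math.sqrt(x * x + y * y + z * z)
    azimuth = math.degrees(math.atan2(x, y))
    elevation = math.degrees(math.atan2(z, math.hypot(x, y)))
    return azimuth, elevation, distance

# Example: a performer 2 m to the right and 3 m in front of the listener
az, el, dist = to_spherical(2.0, 3.0, 0.0)
print(f"azimuth={az:.1f}°, elevation={el:.1f}°, distance={dist:.2f} m")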

Project progress

The project involved artists June Caravel and Gwendolyne Coronado, whose positions were sent to the Spat Revolution software for real-time spatialization and to the lighting system for automated tracking. The main stages were as follows:

  1. Studio capture:
    Use of the K SYSTEM in the studio, installed at a height of 3.50 m and at a minimum distance of 3 m from the sources.
    Data transmission via OSC to Spat Revolution (see the sketch after this list).
    Capture with 360° VR cameras for Ambisonic mixing.
  2. Production and recording:
    The “Ma star à moi” performance was filmed in 12K VR by Éric VIDAL of Kariba Productions.
    Audio recording by Eric Munch of Virtuel Audio.
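
As an illustration of the OSC step mentioned above, here is a minimal, hypothetical Python sketch of how tracked positions could be forwarded to a spatialization engine, using the python-osc package. The host, port and address pattern are placeholders, not the actual OSC syntax used by Naostage or Spat Revolution in this project.

from pythonosc.udp_client import SimpleUDPClient

# Hypothetical host, port and address pattern; the actual OSC syntax
# expected by Spat Revolution should be taken from its documentation
# and the project's own configuration.
SPAT_HOST = "192.168.1.50"
SPAT_PORT = 8000

client = SimpleUDPClient(SPAT_HOST, SPAT_PORT)

def send_position(source_id, x, y, z):
    # Forward one tracked position (in metres) for a given source index.
    client.send_message(f"/source/{source_id}/xyz", [x, y, z])

# Example: positions that would normally come from the camera tracker
send_position(1, -1.2, 3.0, 0.0)
send_position(2, 1.5, 2.8, 0.0)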

Results

The results were conclusive: efficient tracking and seamless integration. As expected, tracking the artists enabled real-time audio spatialization without beacons, and everything ran without a hitch from installation to final result. What’s more, the OSC data from the Naostage system worked perfectly with Spat Revolution, ensuring smooth coordination between sound and light.

Conclusion

The project carried out at the Orange Lab in Rennes demonstrates the effectiveness and innovative potential of tag-free tracking for live VR performances. Working with a range of technologies and experts, the team delivered an immersive, real-time experience, opening up new possibilities for live performance and VR capture.

Acknowledgments

Thanks to Marc Emerit, Christophe Daguet and Arnaud Lefort from Orange Lab, Éric VIDAL from Kariba Productions, Alexis Reymond and Nathan Van de Hel from Naostage, Nicolas Erard from FLUX::, as well as artists June Caravel, Gwendolyne Coronado and Vincent Thevenot for their participation and commitment to this project.