This groundbreaking IBC Accelerators 2024 project aims to develop a solution for delivering all performance data between venues in sync, using 5G to unlock ultra-low latencies and enable synchronised and/or distributed performances across multiple locations - all connected by the public internet.

The ultimate aim of the Connecting Live Performances of the Future with ULL-AVLM (Ultra-Low Latency Audio, Video, Light and Media Data) project is to recreate the experience of a live performance in multiple locations and/or bring together remote performers into a seamless and immersive live experience. While that might sound like a prohibitively ambitious challenge, many in the project team have considerable form in this area, having successfully collaborated on the 2023 IBC Accelerator project 5G Motion Capture for Live Performance and Animation.

Indeed, the broader aims of the 2023 project and the 2024 edition have considerable crossover, as Andy Hook, Director of Technology Strategy at d&b, explains: “Last year’s project was hugely successful and facilitated a live performance between two dancers about 500 miles apart. However, the need to synchronise the video, audio and motion capture data stood out as an important thread in the challenge of sharing performances across multiple venues, which formed a significant element of this project. Here, in addition to motion capture data to drive rendered graphics, we also include full lighting control data as well as telemetry required to generate spatial audio, adding to the immersive experience.”

Read more: IBC2023 Accelerator Project: 5G Motion Capture for Performance Art and Animation

Sam Yoffe, Research Associate at the University of Strathclyde and Senior Systems Engineer, Neutral Wireless, adds: “Right from the start, we aimed to run this project slightly differently to previous projects, with a dedicated project manager and equipment resources. The idea was to iterate with regular “hackathon”-style days, which would allow us to build up the concept and avoid endless talking about what we planned to do.

“We quickly made progress developing a protocol to allow us to easily share any kind of synchronised IP data within a standard MPEG transport stream, alongside regular video and audio. From the initial telnet proof-of-concept, we have worked to distribute lighting control data, motion capture and telemetry to remote venues, allowing us to recreate a performance in a remote location with multi-channel positional audio and rendered graphics.”


Watch the Connecting Live Performances of the Future with ULL-AVLM Kickstart Pitch:

The challenges

That protocol is at the heart of the project challenge, as Yoffe explains: “The local network topology is largely unchanged; the technical achievement lies instead in how we package and distribute all of the audio, video and data feeds between venues (over the public internet) while preserving synchronisation and minimising latency.

Sam Yoffe, University of Strathclyde, Neutral Wireless

“While there’s a plethora of ways to distribute all of the individual components, the feeds have no knowledge of each other and it is inevitable that they will not remain synchronised over a long period. Instead, we have developed a way to embed all of our auxiliary and control data into a standard MPEG-TS, which can be transported using whatever protocol is desired (such as UDP, Zixi, SRT, …). The data will always arrive synchronised, and is then handed over to the required hardware at the remote location.”
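For illustration only - the project’s own wrapper has not been published - the sketch below shows the underlying idea in Python: an auxiliary payload such as a lighting or motion-capture frame is wrapped as a private_stream_1 PES packet stamped with a presentation timestamp on the same 90 kHz programme clock as the video and audio. A standard multiplexer can then slice it into 188-byte TS packets on its own PID, and any TS-aware transport (UDP, SRT, Zixi) carries it unchanged. The function names and the DMX-style payload are hypothetical.

```python
# Minimal sketch (not the project's actual wrapper): wrap an auxiliary
# control payload as a private_stream_1 PES packet carrying a PTS on the
# shared 90 kHz programme clock, so a standard MPEG-TS mux can carry it
# alongside the video and audio and keep everything time-aligned.

def encode_pts(pts: int) -> bytes:
    """Pack a 33-bit PTS into the 5-byte '0010'-prefixed field used in PES headers."""
    return bytes([
        0x21 | ((pts >> 29) & 0x0E),
        (pts >> 22) & 0xFF,
        0x01 | ((pts >> 14) & 0xFE),
        (pts >> 7) & 0xFF,
        0x01 | ((pts << 1) & 0xFE),
    ])

def wrap_aux_payload(payload: bytes, pts_90khz: int) -> bytes:
    """Build a private_stream_1 (0xBD) PES packet carrying an opaque data payload."""
    header_data = encode_pts(pts_90khz)
    pes_packet_length = 3 + len(header_data) + len(payload)  # bytes after the length field
    return (
        b"\x00\x00\x01\xBD"                     # start code + private_stream_1
        + pes_packet_length.to_bytes(2, "big")
        + b"\x80"                               # marker bits, no scrambling/alignment flags
        + b"\x80"                               # PTS present, no DTS
        + bytes([len(header_data)])
        + header_data
        + payload
    )

# Example: a hypothetical DMX-style lighting snapshot timestamped 2 s into the programme
lighting_frame = bytes(512)
pes = wrap_aux_payload(lighting_frame, pts_90khz=2 * 90_000)
```

Because the timestamp shares the programme clock, the receiving end can re-align the payload with the pictures and sound regardless of how the stream was carried.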

As Hook points out, however, solving this particular challenge isn’t as simple as inventing a completely new protocol, as interoperability is key to reducing latency and improving usability from a real-world perspective.

“It would have been easy for us to go down the rabbit hole of building a new, proprietary transport protocol to enable the distribution of all of the synchronised feeds, but this is no mean feat and there already exist many mature protocols that have become “industry standards”. Instead, we focussed on how we could embed the data within a standard MPEG-TS, which can then be carried using any of the existing transport protocols and routed through standard IP video networks to our “unwrapper” at the remote venue(s).”
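On the receiving side, the principle behind that “unwrapper” hand-off can be sketched as follows. This is again illustrative, with assumed interfaces rather than the project’s code: payloads recovered from the transport stream are held until their PTS falls due against the local playout clock and only then passed to the venue hardware, for example a lighting desk listening on UDP.

```python
# Conceptual sketch of the remote-venue hand-off (assumed interfaces, not the
# project's code): auxiliary payloads recovered from the TS are released to
# venue hardware when their PTS falls due against the local playout clock,
# keeping lights/mocap/telemetry locked to the video and audio rather than
# arriving whenever the network happens to deliver them.
import socket
import time

PTS_HZ = 90_000  # MPEG-TS presentation timestamps run on a 90 kHz clock

def release_in_sync(aux_packets, dest=("192.0.2.10", 6454)):
    """aux_packets: iterable of (pts, payload) pairs in stream order (hypothetical demux output).
    PTS wrap-around handling is omitted for brevity."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    origin_pts = origin_wall = None
    for pts, payload in aux_packets:
        if origin_pts is None:
            origin_pts, origin_wall = pts, time.monotonic()
        due = origin_wall + (pts - origin_pts) / PTS_HZ
        delay = due - time.monotonic()
        if delay > 0:
            time.sleep(delay)           # hold until this payload's presentation time
        sock.sendto(payload, dest)      # hand over to the lighting desk / renderer
```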

Yoffe agrees: “While the development of the synchronisation mechanism was quick, understanding how to best “wrap” and “unwrap” the relevant data streams has taken some investigation. We wanted to work with industry-standard protocols used within the actual venues. For example, it was important for our system to integrate with Dante audio networks, and not rely on multiple layers of time-consuming and drift-inducing transcoding.

“Driving down the latency has also been a challenge and with so many elements of the chain to consider, this is something that we’re likely to continue working on as a team after the project.”

Solutions and looking to the future

In spite of the varied challenges, progress has been steady, and although the final proof-of-concept performance is still in the works, many pieces of the jigsaw are already in place, not least on the technical side: “We’ve made great progress and demonstrated all of the desired technical components in action. The application of the techniques developed here is actually very flexible. The workflow could find use in many different multiple-venue scenarios, whether that is one-to-many recreation of an event or two-way interactive live performances. We are working towards delivering a “final” showcase event to demonstrate one such application with a live performance between two venues,” states Yoffe.

The Music Venue Trust (MVT) is one of the Champions of the project and represents a potentially ideal use case for such technology – helping grass-roots music venues in local towns get access to a wider pool of artists while promoting sustainability. Hook confirms: “Our final proof-of-concept plans to test the effectiveness of the technology with a live performance between an MVT venue in the South East of England and d&b’s Immersive Technology Experience Centre at the Science Museum in Central London.”

Aside from the imminent POC, the project speaks firmly to the future, both from a connectivity and networking perspective, and from a wider performing arts perspective too. Yoffe references the recent UHD SMPTE ST 2110 IP-based workflow deployed in Paris for the Olympic Games as an example of the innovation that can be achieved at scale, and as an inspiration for a different sphere, in this case the performing arts.

“There is huge potential to deliver new experiences and open up the arts to engage new audiences. There are a myriad of reasons why the performing arts are inaccessible to a large portion of the population, but imagine if the live experience could be recreated elsewhere, in a more accessible setting. Watching a video of a live performance is not the same as being steeped in sound, feeling the lights dance across your face, and potentially interacting with the performers. Live performance inherits some of the innovation developed for live broadcast, but the IBC Accelerator Programme has allowed us to bring together passionate and skilled people to think outside the box and innovate - specifically for live performance.”

The project will be presented and demoed in full at IBC2024.

Project Challenge Proposed by: D&B Solutions & Strathclyde University

Supported by Champions: EBU, Kings College London, TV2, University of Kent, Music Venue Trust, Royal Central School of Speech and Drama

Participants: D&B Solutions, Spectral Compute Ltd, Salsa Sound, Neutral Wireless

Read more: IBC2024 Accelerator Project: IP networks: Finding the needle in the haystack