The Evolution of the Control Room Accelerator 2024 project has been jointly proposed by Champions TRANSMIXR, ITN, BBC and TV2 Denmark with support from Champions HSLU, TCD, TG4, TUS and the University of Strathclyde. The project combines two original Accelerator challenge proposals into a wide-ranging project that seeks to break technical boundaries and pose wider industry questions around live production workflows, architectures and controls.

The original projects were the Evolution of the Control Room - Leveraging XR, Voice & AI for Live Media Production, and an HTML Based Graphics project. Grace Dinan, TUS Senior VP Broadcast Specialist with TRANSMIXR (the EU Horizon-funded research consortium), is leading the headline strand of the project - Evolution of the Control Room, leveraging XR, Voice and AI - together with Jon Roberts, Director of Technology, Production & Innovation at ITN. This strand of the project aims to optimise workflows using extended reality (XR) technology and AI solutions, enabling production teams to realise their creative vision without advanced technical expertise.

Evolution of the Control Room - the aims

Getting a strong start is key when tackling such a complex project, says Dinan, a veteran of IBC Accelerator projects: “We’ve had an unprecedented level of interest in this year’s Accelerator, with more than 30 Champions and Participants signed up so far and a team of more than 60 members at last count. The Kickstart event was very successful for us, which meant we were very quickly in a position to start developing our PoCs, with a wide range of expertise within the team from the outset.

“This is my fourth time participating in the IBC Accelerator programme, and previously the onboarding phase of the project has taken weeks or even months, whereas this year the enthusiasm for the project has put us in a really good position to develop a range of compelling PoCs to showcase in September,” she says.

The core of the project is to develop a suite of flexible options for broadcast media production: “We’ve identified areas within broadcast production that are in need of transformation and we’ve selected an overarching theme focusing on Elections, to demonstrate the potential of an XR-powered control room solution, with integrated voice control, AI and GPT, and HTML-based graphics solutions. Rather than aiming to replace the traditional on-premise control room altogether, we want to develop an alternative solution that can add value where it’s most needed. For instance, during rapid scale-up of production facilities during a special event like an Election, or the Olympics”, explains Dinan.

Evolution of the Control Room - challenges

Of course, while XR is a longstanding buzzword, most recently bolstered by the launch of Apple’s Vision Pro, there are considerable challenges within the maturing niche.


Workshop feedback indicated that wearing traditional VR headsets for extended periods would not be practical

“We conducted a series of XR workshops, led by TRANSMIXR, to get more in-depth feedback on the early-stage PoCs. During the discussion, many participants made the point that XR head-mounted displays (HMDs) wouldn’t be practical to wear for long periods because of their size and weight, and the discomfort they would cause. So the current production model, where the end-user might need to wear an HMD for several hours at a time, isn’t practical.

“It was noted that the current production model needs to change, but not just to accommodate the use of XR devices,” continues Dinan. “More so, because the television production model is no longer fit for purpose. Nowadays, broadcasters are much more than television studios. They’re media production facilities, meeting audience needs across a wide range of platforms and devices. The television production model isn’t suitable for every scenario. For instance, podcasts and social media content were identified as areas that could embrace a new model and incorporate XR devices.”


Another central concern in XR is standardisation and interoperability, which in a fragmented and developing market is always in short supply. “Interoperability is key. We want the crew to be able to move seamlessly between the traditional on-prem gallery and the virtual control room solution, without dramatically changing the production workflow,” says Dinan.

Needless to say, the AI element of the project also poses challenges, not least due to the white-hot development temperature of the sector at the moment. “AI and GPT integration is somewhat challenging, because it’s developing so fast,” she says. “Some of the challenges we were exploring have already been solved with the latest release of GPT-4o, and there’s a risk that by September things will have changed again!”

HTML Graphics - the aim

The HTML Based Graphics project stream is no less groundbreaking, as it aims to develop a modular graphics solution that supports multi-platform delivery and real-time end-device rendering and playback. It aims to use industry-standard graphics and programming tools for development, along with common off-the-shelf web components for storage and visualisation - essential elements to achieve the desired modularity.
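To make the “off-the-shelf web components” idea concrete, the sketch below packages a lower-third graphic as a standard custom element that renders and updates on the end device. The element name, fields and hooks are illustrative assumptions only, not the template format the project or the related EBU work defines.

```typescript
// Hypothetical lower-third graphic packaged as a standard web component.
// Element name, fields and methods are illustrative assumptions.
class LowerThirdGraphic extends HTMLElement {
  // Build the static markup when the element is attached to the page.
  connectedCallback(): void {
    this.innerHTML = `
      <div class="lower-third">
        <span class="name"></span>
        <span class="role"></span>
      </div>`;
  }

  // Generic data hook: a controller pushes new values and the graphic
  // re-renders in real time on whichever end device is hosting it.
  update(data: { name: string; role: string }): void {
    this.querySelector<HTMLSpanElement>(".name")!.textContent = data.name;
    this.querySelector<HTMLSpanElement>(".role")!.textContent = data.role;
  }

  // Playback hooks an automation layer could call.
  animateIn(): void { this.classList.add("on-air"); }
  animateOut(): void { this.classList.remove("on-air"); }
}

customElements.define("lower-third-graphic", LowerThirdGraphic);
```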

Niels Borg, Head of Graphics, TV 2 Danmark, and Ryan McKenna, Executive Product Manager for Graphics and Automation at the BBC Technology Group, have been leading the charge on the HTML Based Graphics project - also originally a standalone pitch at the 2024 Accelerators Pitch day. The central plank of the project is to develop a modular, end-to-end HTML-based graphics workflow, along with a common API for broadcast graphics. Borg has considerable form in this area: the heart of the project is based around an EBU standard that he has been collaborating on, which is due to be ratified imminently.

The aim is to allow broadcasters and their creative teams to mix and match graphics tools as they wish, based on which is best for individual use cases, then seamlessly manage, edit and deliver the results in whichever formats are required. So, for example, a graphics artist might build a graphic in their tool of choice, such as Adobe After Effects, and then export that to a web server. Rights holders such as UEFA or FIFA could create their own HTML graphics templates too, also stored on a server. Those templates are then available to editors via a plugin, can be pulled into the timeline, and are stored in the newsroom computer system (NRCS) as metadata. The same plugin is available in the NRCS to add additional graphics. All graphics elements are finally transferred to studio automation, which in turn instructs the webserver, and the graphics are rendered on the end device - which might be a home TV, a smartphone, or an XR device.
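As a rough illustration of that flow, the sketch below models the kind of lightweight template reference an NRCS might store as metadata and the command studio automation could pass on so the graphic is rendered on the end device. Every name, field and endpoint here is an assumption for illustration, not part of the specification the team is actually developing.

```typescript
// Hypothetical data shapes for the workflow described above: the NRCS holds
// only a reference to the hosted template plus its field values, and studio
// automation later resolves that reference against the web server that
// serves and renders the graphic.
interface GraphicReference {
  templateUrl: string;              // e.g. a template exported from After Effects to a web server
  templateId: string;               // identifier of the HTML template
  fields: Record<string, string>;   // editable values stored as NRCS metadata
}

interface PlayoutCommand {
  action: "play" | "update" | "stop";
  graphic: GraphicReference;
  target: "broadcast" | "web" | "xr"; // same template, different end devices
}

// Studio automation instructs the web server, which renders the graphic on
// whichever end device requests it. The endpoint is a placeholder.
async function sendToAutomation(cmd: PlayoutCommand): Promise<void> {
  await fetch("https://automation.example.com/graphics", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(cmd),
  });
}
```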

HTML Graphics - the challenges

Borg is under no illusions as to the technical and philosophical challenges this project faces:

“Unlike some of the other projects, this one also requires development. It’s not just taking bits and pieces and glueing them together in a new way. We do require that we have a generic HTML format that we can use, we do need to get a generic controller interface [and] an API that everyone will adopt, and these are still in the works [at the time of speaking] and they are a requirement in order to get the whole thing working and [for it to] be successful.”


One of the aims of the HTML Based Graphics project is to develop a common API for broadcast graphics
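What such a common API could look like in practice is sketched below, purely as an assumption: the method names and shape are illustrative and are not the controller interface being standardised through the EBU work Borg refers to.

```typescript
// A minimal sketch of a generic controller interface that any graphics
// renderer or vendor system could implement. All names are assumptions.
interface GraphicsController {
  // Load an HTML template from wherever it is hosted; returns an instance id.
  load(templateUrl: string): Promise<string>;

  // Push data into a loaded instance; the renderer decides how to display it.
  setData(instanceId: string, data: Record<string, unknown>): Promise<void>;

  // Standard playout verbs every implementation would be expected to support.
  play(instanceId: string): Promise<void>;
  next(instanceId: string): Promise<void>;   // step multi-stage graphics
  stop(instanceId: string): Promise<void>;
}
```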


However, there is a considerable roster of key broadcasters collaborating on the project, keen to take the pain out of the live graphics equation. This includes the BBC, represented by McKenna, who summarised the project USP at the IBC Accelerator pitch day: “The big difference here is that we want to pick and choose which components make the most sense. We want to have the ability to preview content as ‘what you see is what you get’. That includes inside a newsroom system, an edit timeline and for real-time graphics - and we don’t want to have to recreate graphics on multiple platforms.

“So the graphic that we use in a traditional broadcast way should also be available for a website or for a stream. The goal here is basically to get away from the monolithic way of thinking and into a much more modular way of thinking.”

Borg is keen to emphasise that modularity does not enforce homogeneity, enabling vendors and broadcasters to continue to innovate. “From day one we have this approach that we have to work with an 80/20 ratio. The 80% has to be generic, which allows you to open any graphic on any system, while the 20% allows you to be unique, so that there is a competitive edge that can be applied to this.

“So for example, if you then open a graphic with the 20% custom features on a system that doesn’t have those, the graphic will still apply and work, it will still animate to a degree. That’s the whole philosophy. I’ve tried to present the project [in this light] because I think in order for this to be a success, you must be able to have some competitive edge as a vendor.”
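A rough sketch of how that 80/20 split could work in code: the generic payload is enough for any compliant system to display and animate the graphic, while vendor-specific extras ride along in an optional block that an implementation is free to ignore. Field names here are hypothetical.

```typescript
// The generic 80%: enough for any system to open and animate the graphic.
// The optional vendorExtensions block carries the proprietary 20%.
interface GenericGraphicPayload {
  templateId: string;
  fields: Record<string, string>;
  vendorExtensions?: Record<string, unknown>; // opaque, vendor-namespaced extras
}

function render(payload: GenericGraphicPayload, supportedVendors: string[]): void {
  // Every system can handle the generic part.
  console.log(`Rendering ${payload.templateId}`, payload.fields);

  // Only apply extensions this renderer actually understands; unknown ones
  // are skipped, so the graphic still appears and still animates.
  for (const [vendor, extra] of Object.entries(payload.vendorExtensions ?? {})) {
    if (supportedVendors.includes(vendor)) {
      console.log(`Applying ${vendor} extension`, extra);
    }
  }
}
```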

Evolution of the Control Room - a modular approach

Dinan wrapped up by highlighting the way in which the outputs and learnings from last year’s Gallery Agnostic Media Production accelerator have been - and will continue to be - central to 2024’s accelerator. “This year’s project is in many ways a continuation of last year’s, where we explored a single common workflow across all tech stacks and all devices, with a focus on the technical components, having the ability to connect the hardware in the on-prem gallery to a common interface, using TinkerList’s rundown and Cuez Automator. Having a common interface allowed us to easily switch between the on-prem facilities, the cloud production software-defined solutions, remote production solutions, and even explore hybrid options for disaster recovery and failover. It’s really important to us to continue with this modular approach, while the focus this year is on the end-user experience and how to leverage XR, voice, AI and GPT, and incorporate HTML-based graphics solutions.”
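The “common interface” idea can be pictured as a thin contract that any front end implements, whether that is the on-prem gallery, a cloud-based gallery or an XR headset. The sketch below is a conceptual illustration only and does not reflect the actual TinkerList or Cuez Automator APIs.

```typescript
// Conceptual sketch of a common control surface: production logic talks to
// one contract, and the on-prem gallery, cloud gallery or XR front end each
// supply an implementation. Method names are assumptions.
interface ControlSurface {
  cueNextItem(rundownId: string): Promise<void>;
  takeToAir(sourceId: string): Promise<void>;
  fireGraphic(graphicId: string): Promise<void>;
}

// The same rundown logic can then drive whichever surface the crew prefers,
// which is what allows switching between environments without changing the
// production workflow.
async function runNextItem(surface: ControlSurface, rundownId: string, sourceId: string): Promise<void> {
  await surface.cueNextItem(rundownId);
  await surface.takeToAir(sourceId);
}
```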


Another learning from last year’s project will directly impact what will be on show at IBC2024’s accelerator area. Demos at IBC2023, which used a Meta Quest Pro XR HMD and XReal mixed reality glasses to interface with TinkerList’s rundown and Cuez Automator, garnered positive feedback overall, but with one key niggle: “It became clear that the interaction methods, particularly the handheld controllers, were not user-friendly,” says Dinan. “Some people initially loved the virtual screens and were really enthusiastic, but once they tried to use the pointer-based interaction some people just removed the headset and said no! So we knew immediately where we needed to focus our attention this year. We have to ensure that an XR solution will make the workflow easier for the end user, otherwise there’s no point. So hand-held controllers won’t be making an appearance at the pod this year.”

Both workstrands will be presented and demoed in full at IBC2024.

Evolution of the Control Room - team summary

Champions:

TRANSMIXR

BBC

ITN

TV2

YLE

EBU

Channel 4

Trinity College Dublin

HSLU Lucerne University, Switzerland

Al Jazeera Media Networks

Participants:

Tinkerlist

NxtEdition
