Broadcast graphics have been a hotbed of innovation ever since someone had the bright idea to animate an on-screen graphic with a computer, but the latest crop of systems bears little relation to the broadcast graphics of the ’80s and ’90s.
Nowadays, visual impact and viewer retention are served by video walls, virtual studios, volumetric avatars and AI-based object tracking.
Virtual sets and AR
“For the producers, having an opening and catching the viewer’s attention of course is super important,” says Vizrt CTO Gerhard Lang. “But the whole thing needs to be functional. You need to tell a story. No matter whether this is a news show or a sports show, people will focus on how you present the content, and how you explain what was happening. In the end, data is king so how can you present the data or bring in new content? Those are the areas that are crucial for producers and where we get the most requests.”
“Trends like virtual studios and AR mean broadcasters can now visualise anything for their audience,” says Hakan Öner, Business Development Manager for Nordics and DACH, Zero Density. “This has huge benefits when it comes to storytelling and engaging audiences with more immersive experiences. For instance, a weather forecaster can now show snow falling in the studio rather than just talk about a snowy storm miles away. Sports personalities can be teleported into virtual sets from across the world. Entire stadiums can be filled with virtual fans or billboard ads. The possibilities are endless.”
Zero Density’s real-time compositing software, Reality, uses Unreal Engine as its renderer, and Epic Games has just released the engine’s fifth iteration.
“We’ll deliver an updated version of Reality to take advantage of Unreal Engine’s new rendering features,” says Öner. “This will include the ability for broadcasters to have virtual assets dynamically lit with real-time global illumination, better reflections and higher resolution shadows for more photoreal virtual and AR graphics than ever before — all with faster performance.”
“We are working on building a new software architecture for the media and entertainment space that will include multi-GPU and multi-node capabilities,” he adds. “This new product will open new horizons to content creators.”
Öner says customers regularly ask for three things: photorealism, performance and scalability. “For photorealistic composites, we’ve responded by developing Reality Keyer, the world’s first real-time image-based keyer that works on the GPU and that is aware of camera tracking. This provides spectacular results for keying contact shadows, transparent objects and sub-pixel details like hair. We’re constantly improving the Keyer and planning to take advantage of the AI algorithms coming onto the scene.”
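Image-based keying differs from conventional chroma keying in that the matte is derived per pixel from a reference image of the empty stage rather than from a single global key colour, which is what preserves contact shadows and fine detail. Below is a minimal sketch of that general idea; it is an illustration only, not Zero Density’s Reality Keyer algorithm, and every name in it is invented.

```python
# A minimal sketch of image-based keying, for illustration only -- not
# Zero Density's actual Reality Keyer. Instead of keying against a single
# global green value, each pixel is compared with a "clean plate" image of
# the empty green screen, which lets contact shadows and soft details
# survive as partial transparency.
import numpy as np

def image_based_key(frame: np.ndarray, clean_plate: np.ndarray,
                    threshold: float = 0.08, softness: float = 0.12) -> np.ndarray:
    """Return an alpha matte in [0, 1] from an RGB frame and a clean plate.

    Both inputs are float arrays of shape (H, W, 3) with values in [0, 1].
    """
    # Per-pixel colour distance from the empty-screen reference.
    diff = np.linalg.norm(frame - clean_plate, axis=-1)
    # Soft ramp: pixels close to the plate go transparent, distant ones opaque.
    return np.clip((diff - threshold) / softness, 0.0, 1.0)

# Usage: composite the keyed talent over a virtual background.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    plate = np.full((4, 4, 3), [0.1, 0.8, 0.2])   # empty green screen
    frame = plate.copy()
    frame[1:3, 1:3] = [0.6, 0.4, 0.3]             # "talent" pixels
    background = rng.random((4, 4, 3))            # virtual set render
    alpha = image_based_key(frame, plate)[..., None]
    comp = alpha * frame + (1 - alpha) * background
    print(comp.shape)
```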
“As a response to the need for performance and scalability, we’ve recently launched a new hardware product, Re Ampere,” he continues. “This plug-and-play virtual graphics box is designed to double the rendering performance of real-time graphics with the highest possible stability.”
Broadcasters including ITV and Turner Sports have been using XR (Extended Reality) from disguise to create hybrid studios without the challenges and limitations of green screens.
“The flexibility that XR stages offer is amazing, and the storytelling capabilities they present are unparalleled,” says Gideon Ferber, Broadcast Product Director at disguise. “During the pandemic, the capabilities of XR became even more apparent when events like sports, concerts and award shows had to be broadcast without a live audience. AR graphics were used to bring the thrill of the event to audiences watching on their TVs at home.”
Since 2019, MRMC has been working with Dimension Studios in the development of Polymotion Stage, a mobile studio environment for the creation of volumetric video, avatars and stills. “Polymotion Stage has been used across first-class TV and film productions,” says Sara Gamble, Head of Volumetric Solutions, MRMC. “Most recently on the broadcast front, Sir David Attenborough was filmed in 4K at the Polymotion Stage for the BBC Studios series The Green Planet AR Experience, presented by Factory 42, combining captivating storytelling and cutting-edge volumetric technology to reveal the magical depths of our natural world.”
“It offers 106 video cameras – 53 RGB cameras and 53 infrared (IR) cameras that record depth and position in space, placed around the walls and ceiling – as well as motion capture and prop tracking equipment, and four overhead mics to record audio if required for projects,” continues Gamble. “Following the capture, all images are stitched together to create a seamless, lightweight 3D video that can be delivered for use in WebAR, broadcast, AR, VR and more. Not only is a volumetric capture suitable for broadcast purposes, but the same asset can be used across marketing platforms or direct-to-consumer engagement opportunities.”
Gamble says all broadcasters, in particular “pay-TV broadcasters who must deliver premium material to sustain subscribers”, are asking how they can enrich their consumer experiences and let viewers interact with content. “We are also getting asked how, with just a short time with talent, we can create captures that can be used across a wide variety of platforms from AR to WebAR, marketing materials and second-screen experiences alongside broadcast or sporting events,” she says. “With Polymotion Stage being mobile, we can turn up to locations where the talent is, ensuring that we utilise the time that we have with them at a location that suits them, while capturing content that lets production make creative or editorial decisions at a later stage with the 3D asset.”
Vizrt has been a pioneering force in broadcast graphics for many years, and the latest Vizrt eXtended Reality (XR) Suite offers a combined AR, virtual set, video wall, mixed reality and telestration toolset.
It has recently taken the concept of the virtual studio to new levels with the BBC’s coverage of the Beijing Winter Games. With Vizrt’s Viz Engine 4 coupled with the Unreal Engine 4 render pipeline and Vizrt’s Fusion Keyer, all driven by Vizrt’s virtual set controller, the BBC could place its presenters into a seamless virtual studio environment with easy operator control.
The aim is always to make the final composition believable. “Our declared goal is to fuse reality and virtual reality so much that you’d need to be super-educated to understand that this is CG and not part of the real scenery,” says Lang, who adds that this can be aided by greater photorealism in rendering, using methods such as ray tracing.
“Ray tracing up until now has seen really limited use inside large productions because it’s still way too demanding for the hardware, but if you want to make things super realistic, having a ray-traced image is the final goal,” says Lang. “The method that is most used now is what is called screen space reflection, where only what is visible inside the screen gets used for reflections in the scene. We have implemented two techniques [to improve this]. One is a multi-pass reflection that allows us to reflect things that are behind objects in the foreground in the scene. For example, you have a video monitor that sits on a bathroom floor. In the usual screen space reflection environments, you will not see anything that is behind that monitor reflected on the floor. But we can render the reflection for the background before we render the other objects so that the reflections on the floor are 100% believable. That workaround is of course eliminated when using ray tracing.
“In addition, shadows are very hard to calculate with standard render procedures. But a ray tracer will make this very believable,” he adds. “Going forward, ray tracing will become a standard for broadcast.”
Nvidia is helping Vizrt with both hardware and software solutions towards this end, and Lang says it’s already possible to train an AI to complete a partially ray-traced image. “You can stop at a certain amount of rays and then have the AI complete the picture,” he explains.
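The general pattern Lang describes can be sketched in a few lines: trace with a low sample count, then hand the noisy frame to a completion step. In the toy below, a simple box filter stands in for the trained network; real pipelines use learned denoisers, and this is an illustration of the idea, not Vizrt’s or Nvidia’s implementation.

```python
# Toy sketch of "stop at a certain amount of rays and have the AI complete
# the picture". The "AI" here is a 3x3 box filter stand-in; in production
# the completion step would be a trained neural denoiser, not a blur.
import numpy as np

def render_low_spp(ground_truth: np.ndarray, spp: int, rng) -> np.ndarray:
    """Simulate a partially converged Monte Carlo render: fewer samples
    per pixel means a noisier estimate of the true radiance."""
    noise = rng.normal(0.0, 1.0 / np.sqrt(spp), ground_truth.shape)
    return np.clip(ground_truth + noise, 0.0, 1.0)

def complete(noisy: np.ndarray) -> np.ndarray:
    """Stand-in for the learned completion step: average each 3x3 patch."""
    padded = np.pad(noisy, 1, mode="edge")
    out = np.zeros_like(noisy)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out += padded[1 + dy : 1 + dy + noisy.shape[0],
                          1 + dx : 1 + dx + noisy.shape[1]]
    return out / 9.0

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    truth = np.linspace(0, 1, 64 * 64).reshape(64, 64)  # pretend radiance field
    partial = render_low_spp(truth, spp=4, rng=rng)     # stop tracing early
    finished = complete(partial)                        # completion step fills in
    print(float(np.abs(partial - truth).mean()),
          float(np.abs(finished - truth).mean()))       # error drops after completion
```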
AI
The use of AI to enhance live graphics is being explored in other ways. Zero Density recently launched Traxis talentS, an AI-powered, markerless stereoscopic talent tracking system. “Providing the 3D location of the talent, Traxis enhances the photorealism of virtual studio graphics,” says Öner. “It also enables effects like automatically triggering AR graphics to pop up or lights to turn on when talent moves to a certain part of the stage.”
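Stripped of the tracking itself, the trigger logic Öner describes amounts to watching the talent’s 3D position against defined stage volumes and firing an event on entry. The sketch below illustrates that pattern; the class and function names are hypothetical and this is not Zero Density’s API.

```python
# Hedged sketch of zone-triggered AR: a tracker streams 3D talent positions,
# and a callback fires once when the position enters a defined stage zone.
# All names are invented; this is not Traxis talentS or its API.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Zone:
    name: str
    min_corner: tuple[float, float, float]
    max_corner: tuple[float, float, float]

    def contains(self, p: tuple[float, float, float]) -> bool:
        return all(lo <= v <= hi for lo, v, hi in
                   zip(self.min_corner, p, self.max_corner))

class ZoneTrigger:
    """Fires on_enter once each time the tracked talent enters the zone."""
    def __init__(self, zone: Zone, on_enter: Callable[[], None]):
        self.zone, self.on_enter, self.inside = zone, on_enter, False

    def update(self, talent_pos: tuple[float, float, float]) -> None:
        now_inside = self.zone.contains(talent_pos)
        if now_inside and not self.inside:
            self.on_enter()          # edge-triggered: fire on entry only
        self.inside = now_inside

# Usage: pop an AR graphic when the presenter walks to the weather wall.
weather_wall = Zone("weather_wall", (4.0, 0.0, -1.0), (6.0, 2.5, 1.0))
trigger = ZoneTrigger(weather_wall, lambda: print("show AR weather graphic"))
for pos in [(0.0, 0.0, 0.0), (5.0, 1.0, 0.0), (5.2, 1.0, 0.1)]:
    trigger.update(pos)   # prints once, on the frame the zone is entered
```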
The new Viz Object Tracker from Vizrt was recently premiered by Fox Sports in its production of the NASCAR season-opening Daytona 500. Powered by Viz AI, it detects and tracks objects, such as race cars, over any incoming video feed and ties 3D elements and informational graphics to them as overlays in eye-catching colours.
Vizrt has been tracking racing objects for enhanced storytelling for some time, notably using remote sensing to deploy AR planes and data graphics in the Red Bull Air Race. Image-based tracking to calculate a virtual camera has long been available in the Viz Arena product line, but Lang says combining this with the object-tracking AI could offer a very powerful solution “with very little influence on the production workflow”.
Vizrt is also looking at AI to repackage graphics and content for multiple outputs, particularly the vertical video format favoured by Millennials and Gen Z consumers.
“I don’t want to have four people working on four different formats,” says Lang. “Ideally, this would be one scene that can be reused and utilised for all aspect ratios and screen resolutions. That’s a key thing we are working on.”
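One way to think about a single scene serving every output is to describe each graphic in normalised coordinates and resolve it per target format. The sketch below illustrates that general idea; it is not Vizrt’s implementation, and the names and layout scheme are invented.

```python
# Hedged sketch of one scene description driving any output format:
# elements are laid out in normalised (0-1) coordinates and resolved
# per target, so the same graphic serves 16:9 broadcast and 9:16
# vertical video. Names and scheme are illustrative, not Vizrt's.
def resolve_rect(anchor: tuple[float, float], size: tuple[float, float],
                 out_w: int, out_h: int) -> tuple[int, int, int, int]:
    """Map a normalised anchor/size into pixel coordinates for one output."""
    x, y = int(anchor[0] * out_w), int(anchor[1] * out_h)
    return x, y, int(size[0] * out_w), int(size[1] * out_h)

# The same lower-third described once, rendered for two targets.
lower_third = {"anchor": (0.05, 0.85), "size": (0.5, 0.1)}
targets = {"16:9 HD": (1920, 1080), "9:16 vertical": (1080, 1920)}
for name, (w, h) in targets.items():
    print(name, resolve_rect(lower_third["anchor"], lower_third["size"], w, h))
```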
Cloud
Some broadcasters are transplanting the heavy lifting of graphics to cloud-based systems, particularly for OTT content services.
For example, hundreds of thousands of hours of content go live every month with Intelligent Overlays from Singular.live, dynamic live graphics that are created and controlled from a web browser. For Sunset + Vine’s live coverage of Crufts 2022, footage of the Obedience competition was sent using a single camera and LiveU attachment to Sunset + Vine’s office in Hammersmith, where a producer added live graphics via Singular before streaming to the Crufts YouTube channel (420,000 subscribers).
Singular.live Head of Marketing Mike Ward says he’s seen a wide range of benefits across the industry from this approach. “Large-scale media companies like Gray.tv have been able to use Singular to deliver graphics at scale as they roll out adoption across their 100+ local news stations. Multiple clients have significantly reduced their costs by integrating Singular with their platforms.”
“Most of the challenges with OTT come down to the fact that the industry has been focused on video delivery,” he adds. “High-quality, reliable, low-latency video was the major challenge, but now that there are good solutions available for that, the next consideration becomes how you enhance that video.”
Singular can be adapted through customisable HTML templates in the intelligent overlays authoring environment, as well as via APIs and SDKs, and Ward says this offers OTT platforms great flexibility to create bespoke solutions.
“As a cloud-native platform, Singular is effortlessly scalable and our SaaS model means OTT platforms can scale up (and down) as they require without having to incur large capex costs for under-utilised graphics hardware,” he adds. “Our device-side rendering enables adaptive overlays which are responsive graphics that deliver the perfect solution for every screen format, as well as personalisation and interactivity.
“While a lot of traditional vendors talk up virtualised solutions, these don’t offer the many benefits of cloud-native solutions, so there is still a lot of education to be done. Our growth is accelerating, so we have been working on new graphics for our template library and integrations. We’re also working on a new offering targeted at simpler productions to take broadcast-quality graphics to other content creators beyond broadcasters; as well as a new infrastructure for interactive overlays that we will be releasing in the next few weeks, to respond to the growing demand for audience engagement.”
Zero Density’s RealityHub is a universal control UI that lets broadcasters automate the entire graphics workflow from a web browser. “You just need to open a browser and you’ll start managing real-time graphics, robotic cameras and more right away. Broadcast teams can manage everything from on-set equipment to external data sources without installing any software on their computers,” explains Öner. “The new edition, RealityHub 1.2, now has support for vanilla Unreal Engine.”
RealityHub also offers real-time integration with data sources to automate the display of weather, election, sports or financial information, as well as the ability to develop custom control and integration modules. It integrates with newsroom control systems through MOS, and with other third-party systems through a REST API.
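The data-driven pattern behind this kind of integration is straightforward: poll an external source and push the values to a graphics controller over HTTP so the on-air graphic updates itself. The sketch below illustrates that pattern with an invented endpoint and payload; it is not RealityHub’s actual REST schema.

```python
# Generic sketch of pushing external data to a live graphic over REST.
# The URL and payload shape are hypothetical -- this is not RealityHub's
# API, just the common pattern such systems expose.
import json
import urllib.request

CONTROLLER_URL = "http://graphics-controller.local/api/v1/elements/scoreboard"  # invented

def push_update(payload: dict) -> None:
    """POST a JSON payload to the (hypothetical) graphics controller."""
    req = urllib.request.Request(
        CONTROLLER_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=2) as resp:
        resp.read()

if __name__ == "__main__":
    # In practice this would poll a sports, elections or finance feed on a timer.
    latest = {"home": 2, "away": 1, "clock": "74:12"}
    push_update(latest)
```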
Polygon Labs, which supplies a cloud-native platform to the likes of CNN, Univision, The Weather Channel and TV Globo, has recently been acquired by disguise, giving XR studios access to workflows that power remote production and cloud-based collaboration.
“We offer direct integration with the vanilla version of the Unreal game engine and integration with the Polygon Labs Ipsum data aggregation platform and Porta control system,” says Ferber. “In the near future, users can expect an integrated solution that enables their XR studios to run easy-to-manage, data-driven graphics workflows on fully native Unreal Engine, as well as extended design and production capability for graphics.”
Ferber feels that current challenges with cloud-based solutions centre around latency and cost.
“While the latency might be acceptable for on-air production graphics, it might be too long for XR stages, as the graphics are constantly in shot,” he says. “Price might be another barrier to keep in mind, as the current cloud solutions on the market charge for usage, meaning that every test of graphics, every rehearsal, every production comes with a price.”
Changing the face of graphics
“Most recently, we’ve seen broadcasters like BT Sport adopting Singular in their cloud productions to help them meet their sustainability targets,” says Ward. “Singular is the only live graphics platform that is accredited for sustainability by the BAFTA-affiliated Albert consortium.”
Sky and the BBC are also using it to test the sustainable benefits of cloud-based production.
There’s also a move to give non-profit news teams and organisations free access to broadcast graphics, as well as to address the skills gap.
“We have over 500 schools, colleges, universities and non-profits who are taking advantage of free Pro-grade accounts through our expanding Singular For Good programme,” says Ward. “This is also helping drive diversity and inclusion in our industry.”
Supported by Zero Density’s recent MegaGrant from Epic Games, the Community version of RealityHub is feature-complete and can be downloaded free from the Zero Density website. The company will also soon be launching ZD Academy. “ZD Academy is a free centralised online learning platform based on Reality,” says Öner. “It will offer comprehensive, hands-on video courses and guided certification paths. Developed using feedback from industry and educational partners, ZD Academy is based on self-paced learning. Anyone can log in and search for a video on anything about virtual studios, XR, on-air graphics, and more. They can also follow pre-defined Reality courses that cover skills for key career tracks. We hope that the academy will support the community to get more creative and help everyone tell their stories better than ever before.”