Football star Eden Hazard’s augmented reality interview on Belgian TV went viral soon after it aired during the World Cup. Here’s how it was produced.
Belgium may have had to settle for third place at this year’s World Cup but it can now claim a first in broadcast graphics following the transmission of RTBF’s ‘Hazard Hologram’ earlier this month, which caught the attention of the world’s press.
The national pubcaster made headlines when it ‘transported’ the country’s star player, Eden Hazard, from a Russian locker room onto the set of La Une in Belgium for post-match live TV interviews.
“Belgium TV may just have given us a glimpse of the future with the Hazard hologram,” proclaimed The Sun newspaper after the nation’s quarterfinal win over Brazil.
This wasn’t a futuristic 3D holographic image created from laser beams, however; the interview was produced via chroma key insertion - a virtual production technique that has been around since the late nineties.
The effect - sometimes referred to as ‘teleportation’ by graphics companies - uses augmented reality to make a remote interviewee appear side-by-side with the studio presenters.
RTBF’s Head of Production Clothilde Burnotte saw a highly photorealistic demo of this technique in March at an open house event held by Dreamwall, the broadcaster’s long-term, Belgium-based virtual studio designer, and set her sights on using it for the World Cup coverage.
“We need to open up the virtual studio market – making photorealistic sets and cost effective camera tracking technology accessible to a new generation of content creators.” Michel Loiseau, Zero Density
The resulting ‘Hazard Hologram’ is a collaboration between Dreamwall, Turkish virtual studio start-up Zero Density and US gaming engine company Epic Games.
The soccer player was interviewed live ‘in studio’ while actually sitting in front of a green screen in Russia. The feed – which is brought into RTBF via a fibre link – enters a Zero Density chroma keyer and is keyed over the output of a conventional camera placed at a specific angle inside the studio.
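As a rough illustration of the underlying principle – and not of Zero Density’s actual implementation – a basic chroma key composite can be sketched in a few lines of Python: derive a matte from the green-screen feed and lay the keyed foreground over the studio camera’s frame.

```python
# A minimal, illustrative chroma key composite in Python/NumPy.
# This is NOT Zero Density's algorithm, just a toy example of the general
# idea: derive a matte from the green-screen feed and lay the keyed
# foreground over the studio camera frame.
import numpy as np

def chroma_key_composite(remote_feed, studio_frame, threshold=0.35, softness=0.10):
    """remote_feed, studio_frame: float32 RGB images in [0, 1], same shape."""
    r, g, b = remote_feed[..., 0], remote_feed[..., 1], remote_feed[..., 2]
    # Simple "greenness" measure: how much green exceeds the other channels.
    greenness = g - np.maximum(r, b)
    # Convert to an alpha matte with a soft transition to avoid hard edges.
    alpha = np.clip((threshold - greenness) / softness, 0.0, 1.0)[..., None]
    # Keyed foreground (the player) over the background (studio camera view).
    return alpha * remote_feed + (1.0 - alpha) * studio_frame

# Toy usage with random frames standing in for the two video feeds.
remote = np.random.rand(540, 960, 3).astype(np.float32)
studio = np.random.rand(540, 960, 3).astype(np.float32)
composite = chroma_key_composite(remote, studio)
```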
Many of the press reports about the broadcast remarked on how the studio anchors were able to maintain direct eye contact with the player.
According to Zero Density’s Director of Sales Michel Loiseau, this was because an off-camera monitor that only the presenters could see was placed in the studio at an angle that aligned exactly with the live feed coming from Russia – all of these measurements had been worked out beforehand.
Loiseau adds that the World Cup presented several unique challenges for Dreamwall and the production team.
“The satellite feed meant that there was more of a time delay; the space where the green screen was placed in Russia was very small and there was also the availability of the players to factor in after a live match,” he says.
Preparation, therefore, was critical. The production team flew to Russia the week before for a trial set-up – experimenting with crane angles and real-time tracking technology provided by German-based outfit TrackMen.
“It was important that we had all the angles worked out beforehand so that we could maximise the time we had with the players during the interview,” Loiseau says.
‘Teleportation’ has been used before – including during CNN’s coverage of the 2008 US presidential election and, more recently, by Groupe M6, which broadcast a successful stadium-to-studio teleportation during UEFA Euro 2016.
But what stands out about the Hazard interview is the quality of the render - the image of the player was so photorealistic that it seemed like a hologram.
This is the culmination of a key trend in virtual set production that has seen broadcast graphics companies team up with games rendering engine firms.
Zero Density and Unreal creator Epic have produced a keyer capable of achieving photorealistic rendering in real time – exploiting technology that exists in Unreal’s triple-A gaming engine.
While gaming engines were never designed for live broadcasts, the companies came up with a solution which has enabled them to deliver the required broadcast signals both on a software level and on a signal level using conventional I/O cards.
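One way to picture the signal side of this – purely as a hypothetical sketch, with invented placeholder classes rather than any vendor’s real SDK – is a render loop that paces the engine to the broadcast frame rate and hands each finished frame to a conventional SDI output card.

```python
# Hypothetical sketch of feeding rendered frames to a broadcast I/O card.
# 'FakeIOCard' and 'render_frame' are invented placeholders, not a real SDK;
# an actual system would use the vendor SDK of a conventional SDI I/O card.
import time

FRAME_RATE = 50.0            # e.g. a 1080p50 programme output
FRAME_PERIOD = 1.0 / FRAME_RATE

class FakeIOCard:
    """Stand-in for an SDI output board exposed by a vendor SDK."""
    def send_frame(self, frame):
        pass  # in reality: DMA the frame buffer into the card's output queue

def render_frame(engine_time):
    """Stand-in for a real-time game-engine render call."""
    return b"\x00" * (1920 * 1080 * 4)  # dummy BGRA frame

io_card = FakeIOCard()
next_deadline = time.monotonic()

for n in range(100):                      # run for two seconds at 50 fps
    frame = render_frame(n * FRAME_PERIOD)
    io_card.send_frame(frame)             # signal level: conventional SDI out
    next_deadline += FRAME_PERIOD
    # Software level: pace the engine so output stays locked to video timing.
    sleep_for = next_deadline - time.monotonic()
    if sleep_for > 0:
        time.sleep(sleep_for)
```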
The resulting Zero Density Reality Keyer software is a compositing pipeline that sits on top of the Unreal Engine, which acts as the renderer.
According to Zero Density’s European Head of Business Development Guillaume Godet, unlike traditional keyers, the Reality Keyer doesn’t use a layer-based compositing technique; it composites internally in 3D, which, he claims, is key to achieving the required level of photorealism.
“Regular chroma key technology relies on layer-based insertion. The presenter will always be in front of the background. Some companies have also developed a layer that can be inserted in front of the presenter – but with these techniques there’s no interaction between the set that is being rendered and the presenter,” he explains.
“This new generation of keyer allows for complete interaction between virtual and real elements, including eye contact and the ability for the presenter to move behind, in front of and around virtual objects; it is also able to handle shadows, reflections and refractions of light,” Godet adds.
“This new generation of keyer allows for complete interaction between virtual and real elements including eye contact,” Guillaume Godet, Zero Density
It’s these subtle nuances that make the Hazard broadcast appear so photorealistic.
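The difference Godet describes can be illustrated with a toy per-pixel depth comparison. The sketch below is a simplification and an assumption about the general approach, not the Reality Keyer’s actual pipeline: given a depth estimate for the keyed presenter and the virtual set’s depth buffer, the composite is decided pixel by pixel rather than by stacking fixed layers.

```python
# Illustrative per-pixel depth composite (not the Reality Keyer's actual
# pipeline): instead of stacking fixed layers, compare depth values so the
# presenter can appear behind, in front of or between virtual objects.
import numpy as np

def depth_composite(real_rgb, real_alpha, real_depth, virtual_rgb, virtual_depth):
    """All inputs are float32 arrays; depths are in metres from the camera."""
    # Where the presenter is closer than the virtual geometry, the keyed
    # presenter wins; elsewhere the virtual set occludes them.
    presenter_in_front = (real_depth < virtual_depth)[..., None]
    alpha = real_alpha[..., None] * presenter_in_front
    return alpha * real_rgb + (1.0 - alpha) * virtual_rgb

h, w = 540, 960
real_rgb = np.random.rand(h, w, 3).astype(np.float32)
real_alpha = np.ones((h, w), dtype=np.float32)        # matte from the keyer
real_depth = np.full((h, w), 3.0, dtype=np.float32)   # presenter ~3 m away
virtual_rgb = np.random.rand(h, w, 3).astype(np.float32)
virtual_depth = np.full((h, w), 2.0, dtype=np.float32) # virtual desk at 2 m
virtual_depth[:, : w // 2] = 5.0                       # half the set is further back
out = depth_composite(real_rgb, real_alpha, real_depth, virtual_rgb, virtual_depth)
```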
Other recent deployments of this technology include Eurosport’s localised coverage of the Winter Olympics in the Netherlands. TF1 show Le Mag in France also used the technology extensively during its World Cup coverage to take its TV audience inside a stadium inhabited by 30,000 moving people and into the locker rooms of the players.
Other collaborations are resulting in similar innovations: Ross Video, The Future Group and Epic offer Frontier, a virtual studios graphics rendering platform that was most recently used in China for state broadcaster CCTV’s World Cup coverage.
The industry still needs to overcome latency issues, however, if it is to take live teleportation to the next level, according to Godet.
There was a slight time delay between the video and audio during the Hazard interview, although this issue may soon be addressed thanks to companies such as NVIDIA, which is trying to improve the transfer of signals from GPUs - primarily to solve latency issues in multiplayer gaming.
“All the companies working in this area – Orad, Vizrt, Ross, Chyron – are GPU based, so once this tech is available – maybe by the end of August, in time for IBC – latency issues should be reduced,” says Godet.
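In the meantime, a fixed offset of this kind is usually masked by delaying whichever path is ahead, a standard lip-sync correction. The snippet below is a toy sketch of that idea – an assumption about one way such an offset could be compensated, not a description of RTBF’s actual chain.

```python
# Toy sketch of fixed-delay A/V realignment: delay the audio by the number of
# frames the video path adds, so both arrive at the mixer in step. The delay
# value is an assumed placeholder, not a measured figure from the broadcast.
from collections import deque

VIDEO_PIPELINE_DELAY_FRAMES = 3   # assumed render + keyer + I/O latency

class AudioDelayLine:
    def __init__(self, delay_frames):
        # Pre-fill with silence so the first outputs are already offset.
        self.buffer = deque([b""] * delay_frames)

    def push(self, audio_frame):
        self.buffer.append(audio_frame)
        return self.buffer.popleft()   # the frame from 'delay_frames' ago

delay = AudioDelayLine(VIDEO_PIPELINE_DELAY_FRAMES)
for n in range(10):
    delayed_audio = delay.push(f"audio-for-frame-{n}".encode())
    # delayed_audio now lines up with the video frame leaving the keyer
```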
Loiseau adds that using Zero Density’s keyer with affordable cameras that come with in-built 3D tracking - such as Panasonic’s AW-UE150 – may also open up further opportunities in the niche web TV market.
“We need to open up the virtual studio market – making photorealistic sets and cost effective camera tracking technology accessible to a new generation of content creators,” he says.
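Cameras in this class typically stream their pan, tilt, position and lens data to the render engine over the network, with the FreeD protocol a common carrier. The sketch below is a rough, assumption-laden decoder for a FreeD-style ‘D1’ packet – field scaling can vary by camera, and the port number is just an example.

```python
# Rough sketch of decoding a FreeD-style tracking packet, the kind of data a
# tracking-enabled PTZ camera can emit over UDP. Offsets and scaling follow
# the commonly documented 29-byte 'D1' message; treat them as assumptions
# that may differ per camera model.
import socket

def _s24(data, offset):
    """Signed 24-bit big-endian integer."""
    value = int.from_bytes(data[offset:offset + 3], "big", signed=False)
    return value - (1 << 24) if value & 0x800000 else value

def parse_freed_d1(packet):
    if len(packet) < 29 or packet[0] != 0xD1:
        return None
    return {
        "camera_id": packet[1],
        "pan_deg":   _s24(packet, 2) / 32768.0,
        "tilt_deg":  _s24(packet, 5) / 32768.0,
        "roll_deg":  _s24(packet, 8) / 32768.0,
        "x_mm":      _s24(packet, 11) / 64.0,
        "y_mm":      _s24(packet, 14) / 64.0,
        "z_mm":      _s24(packet, 17) / 64.0,
        "zoom_raw":  int.from_bytes(packet[20:23], "big"),
        "focus_raw": int.from_bytes(packet[23:26], "big"),
    }

# Typical usage: listen for packets from the camera and hand the pose to the
# render engine once per video frame.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 40000))           # example port; configurable on the camera
# pose = parse_freed_d1(sock.recv(64))  # uncomment on a live network
```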