With the world’s TV viewers cooped up for a year and most spectators barred from attending, never has there been a more pressing need for the International Olympic Committee (IOC) to put on the greatest show on Earth.
Fortunately, the host broadcaster, the IOC-owned Olympic Broadcasting Services (OBS), has been experimenting with new digital technologies over successive Games, and this year it will unleash most of them. It promises a visual and data-rich feast in which IP, cloud, 5G and AI are all breaking the conventional frame of broadcast production and transforming the fundamentals of content delivery and the viewer experience. The ambition is to deliver “the most realistic experience one can get from viewing a sporting event without actually attending in person”.
OBS plans to produce 9,500 hours of content in just over two weeks (of which 3,800-4,000 hours will be live). Here are some of the ways it hopes to generate the most immersive televised Games yet.
Native UHD HDR
Tokyo is the first Olympics to be fully produced natively in UHD HDR. “The technology has reached a maturity level such that we are all confident that it is ready for the Tokyo Games,” says CTO Sotiris Salamouris. “However, this is not something that you can take lightly, especially in our own production environment, since there are so many moving parts that need to be brought together.”
All 31 host OB vans and 22 fly-away systems have been outfitted to work in UHD HDR from 42 Olympic competition venues.
Only production from the seven outside tennis courts will remain in HD, while OBS will also rely on several speciality cameras that, at this time, can only operate in HD 1080p SDR. These sources will be up-converted to UHD HDR and integrated into the main production.
The UHD HDR feeds will be delivered simultaneously alongside a feed in HD 1080i SDR. OBS is doing this in a single HDR/SDR production workflow that will allow the trucks to generate an HD 1080i SDR output converted from the primary UHD HDR signal. A new full IP infrastructure has been built to support the transport of signals for the contribution network. A full ST-2110 platform has been implemented to carry, route and distribute UHD content around the IBC.
OBS has also developed a set of look-up tables (LUTs) to maximise quality across all cross-conversions (between UHD and HD, and between HDR and SDR). The HDR standard will be Hybrid Log-Gamma (HLG). OBS says that because the content is either captured natively in UHD HDR or up-converted to it before being down-converted again, the final HD signal will offer higher quality across all platforms than a standard HD production would.
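For context on what the HLG standard itself prescribes, the short sketch below implements the reference HLG opto-electrical transfer function from ITU-R BT.2100, which maps normalised scene light to the non-linear signal. It is purely illustrative and is not a representation of OBS's production LUT pipeline.

```python
import numpy as np

# Illustrative sketch: the HLG OETF defined in ITU-R BT.2100.
# Not OBS's production workflow, just the reference transfer curve
# behind the HLG standard mentioned above.
A = 0.17883277
B = 1.0 - 4.0 * A
C = 0.5 - A * np.log(4.0 * A)

def hlg_oetf(e):
    """Map normalised scene-linear light E (0..1) to the HLG signal E'."""
    e = np.asarray(e, dtype=np.float64)
    # The maximum() guard only keeps np.where from warning on the unused branch.
    log_branch = A * np.log(np.maximum(12.0 * e - B, 1e-12)) + C
    return np.where(e <= 1.0 / 12.0, np.sqrt(3.0 * e), log_branch)

# Example: scene light of 1/12 maps to a signal of 0.5, and 1.0 maps to 1.0.
print(hlg_oetf([0.0, 1.0 / 12.0, 0.26, 1.0]))
```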
Enhanced audio
UHD also means an audio upgrade to a standard 5.1.4 configuration “to enable viewers to have a more realistic audio experience”. This expands on 5.1 surround sound by adding an overhead layer captured from four height-adjustable hanging ceiling mics. Two new mics were specifically designed for this immersive sound production. In total, OBS will use 3,600 microphones (28 different models).
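For readers unfamiliar with the notation, the minimal sketch below spells out what a 5.1.4 layout contains; the channel names follow common industry practice rather than any published OBS specification.

```python
# Minimal sketch of a 5.1.4 channel layout: the familiar 5.1 bed plus four
# height ("overhead") channels. Names follow common practice, not an OBS spec.
LAYOUT_5_1_4 = {
    "bed": ["L", "R", "C", "LFE", "Ls", "Rs"],   # the 5.1 base layer
    "height": ["Ltf", "Rtf", "Ltr", "Rtr"],      # four overhead channels
}
total_channels = sum(len(v) for v in LAYOUT_5_1_4.values())  # -> 10
```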
Cloud
The host broadcast’s digital transformation continues with the introduction of OBS Cloud in partnership with Alibaba. “It will make rights holder operations far more efficient, far more productive, less costly and will mean they need fewer people on the ground. It is a major innovation,” says OBS CEO Yiannis Exarchos.
Nonetheless, it’s not clear how many broadcasters are taking advantage of OBS Cloud for all or part of their workflows. NBC, planning 7,000 hours of coverage, is not.
Salamouris speaks of “a certain reluctance among broadcast professionals [to use the cloud] for intensive content production workflows, especially in the context of live sports. While this hesitation is understandable up to a point, due to the known demands of our workflows for high bandwidths, large storage and low latency, we can see again that there is also an inertia factor.”
It’s likely cloud workflows will form a bigger part of successive Games. “It is still relatively early days in the full change to cloud technology, and Tokyo 2020 will mark a first step,” notes Exarchos. “The Beijing 2022 Winter Olympics may then become a facilitator for its wider use.”
AI
AI-led solutions will feature in some broadcast workflows in Tokyo as a way of testing how the technology will evolve at future Games. This includes an Automatic Media Description (AMD) pilot based on athlete recognition. The project will combine existing metadata, such as the Broadcast Data Feed and video logs, with image recognition based on an athlete’s bib. Additionally, OBS will use speech-to-text technology to complement and improve the tagging of media assets.
Such applications will allow a faster and more efficient turnaround of workflows such as image selection, automatic searching and clipping. By Beijing 2022, OBS is aiming to expand this process to as many sports as possible and to open the service to rights holders.
Guillermo Jiménez, OBS Director of Broadcast Engineering, explains: “We could customise the automatic content offering based on user preferences, whether by National Olympic Committee, athlete or sport. It means that, instead of broadcasters searching for content, content will be automatically pushed to them.”
OBS has traditionally tagged its footage manually, relying on students from the host city trained in online, real-time tagging. “We employ approximately 120 people just to tag our video sources; however, even with that many human resources, it isn’t enough,” Salamouris says. “They can only do so much and it is not possible to tag everything. AI takes tagging to the next level.
“Also, we only do logging and tagging of our live coverage, simply because we don’t have the capacity to do it for the other types of content we produce during the Games, for example, the content from our ENG coverage. For all the non-live content, our tagging scope is, out of necessity, quite limited.”
In Tokyo and subsequently Beijing, OBS will run proofs of concept of AI technology, attempting to identify which athletes appear where and when. Broadcasters want to use the footage that features their national athletes, but searching for specific content through hundreds of hours of material is laborious if it is not densely logged, which is exactly the case with the ENG footage.
“If we were to tag all our content, it would require an incredible effort and a large number of human operators. By using trained AI systems, this could be accomplished in a fraction of the time (and thus cost), while increasing both the speed of searches and the accuracy of the search response.”
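As an illustration only, here is a minimal sketch of how bib-number recognition might be combined with a start-list feed to auto-tag footage. The data structures and function names are hypothetical and are not OBS's Automatic Media Description implementation.

```python
from dataclasses import dataclass

# Hypothetical sketch of the AMD idea described above: combine a bib number
# recognised in the frame with a start-list feed to produce searchable tags.
# All names and structures here are illustrative, not the OBS system.

@dataclass
class Detection:
    clip_id: str
    timecode: str
    bib_number: str          # produced by an image-recognition model

START_LIST = {               # would come from an existing data feed
    "1234": {"athlete": "A. Runner", "noc": "GBR", "event": "100m"},
    "5678": {"athlete": "B. Sprinter", "noc": "JPN", "event": "100m"},
}

def tag_detection(det: Detection) -> dict:
    """Turn a bib detection into a metadata tag for the asset library."""
    athlete = START_LIST.get(det.bib_number)
    if athlete is None:
        return {"clip": det.clip_id, "tc": det.timecode, "status": "unmatched"}
    return {"clip": det.clip_id, "tc": det.timecode, "status": "tagged", **athlete}

print(tag_detection(Detection("ENG_0042", "10:14:03:12", "5678")))
```

Tags produced this way could then be pushed to a rights holder filtered by NOC, athlete or sport, as Jiménez describes.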
5G tests
Following tests of network performance and quality over 5G at PyeongChang 2018, OBS has partnered with Intel once again to trial contribution from ENG cameras over 5G during Tokyo’s Opening and Closing Ceremonies. Speed will be measured and the overall network performance monitored. OBS is set to adopt 5G tech further for the broadcast from Beijing 2022 where all the competition venues are expected to have 5G network coverage.
The most obvious application for 5G is an enhancement to the existing wireless solutions that rely on bonding technology. “Part of our coverage within specific areas inside the Olympic Stadium will be done in that way,” says Salamouris. “5G has a great potential to help us enhance the options for wireless and mobile video coverage without a further increase in our needs for temporary assigned frequencies − something that has always been one of the major challenges in an event the size of the Olympic Games.”
5G can also be used in conjunction with Internet of Things (IoT) devices equipped with sensors that can pick up signals from athletes during competition.
“Broadcasters are very interested in the use of real-time performance data that can become available through 5G,” says Salamouris. “Such data can help them significantly enhance their storytelling. Of course, data from sensors can also be received through other types of wireless technologies but 5G is a much more efficient and powerful technology that can offer ultra-high data collection speeds with very low latency. It would remove the need for setting up overlay or temporary systems which always come with a certain impact on infrastructure.”
Digital reach
OBS will produce 30% more content compared with Rio 2016, much of it headed to digital platforms. Digital publishers can draw on a repository of up to 9,000 clips and short-form assets called Content+. This includes behind-the-scenes content from the competition venues, the Olympic Village and around the city, plus further content filmed with smartphones. Additionally, OBS plans to produce 1,800 fast-turnaround clips from all sports.
Multi-camera replay systems
Between 60 and 80 4K cameras will be placed at select venues, including those hosting gymnastics, athletics, BMX freestyle, street skateboard, sport climbing and volleyball. Each camera is mounted on a robotic platform capable of precisely panning and tilting the camera in any direction. Each camera’s pan, tilt, focus and zoom are remotely controlled by a single operator. For each replay, the operator selects the point where the motion is frozen and can sweep the replay from side to side around the athlete, as well as zoom in without losing resolution (thanks to the 4K capture). Since the system stitches together these real camera feeds and does not have to virtually create filler frames, no rendering is required, allowing multi-cam replay clips to be ready in under five seconds. OBS says the effect is similar to the bullet-dodging sequences in The Matrix.
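Conceptually, assembling such a frozen-moment sweep amounts to pulling the frame at the chosen timecode from each synchronised camera in turn. The sketch below illustrates that idea with hypothetical data structures; it is not the vendor's actual replay system.

```python
# Illustrative sketch of a frozen-moment "sweep": because every camera in the
# ring is synchronised, a replay operator only needs the frame at the chosen
# instant from each camera in order - no interpolated in-between frames are
# rendered. The data structures here are hypothetical.

def build_sweep(frames_by_camera, freeze_index, start_cam, end_cam):
    """Return the frames for a sweep from start_cam to end_cam at one instant.

    frames_by_camera: list indexed by camera number, each a list of frames.
    freeze_index: the frame index (time) at which motion is frozen.
    """
    step = 1 if end_cam >= start_cam else -1
    return [frames_by_camera[cam][freeze_index]
            for cam in range(start_cam, end_cam + step, step)]
```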
2D image tracking
Video tracking technology will help commentators and viewers keep track of the position of athletes throughout an event in real time across a number of sports, including the marathon and race walks, road cycling and mountain biking, triathlon and canoe sprint.
Instead of GPS positioning or wireless equipment, OBS’ 2D image tracking is based on image processing technology that allows motion tracking. A ‘patch’ (a square) is defined on selected video frames in order to identify each of the athletes/boats. The computer then creates a ‘label’ that is attached to each of the identified athletes/boats and is maintained even as the image changes. This captured data is then made available to a graphics rendering platform for on-screen presentation. Additional data captured using more traditional GPS positioning can be combined with the ‘labels’ to identify athletes, their speed, distance to the finish or relative position to the leader.
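The ‘patch’ and ‘label’ approach is, in spirit, close to classical template matching. The sketch below shows the generic technique using OpenCV; it is an illustration of the principle, not OBS's tracking software.

```python
import cv2
import numpy as np

# Illustration of patch-based tracking in the spirit described above: a small
# "patch" is cut from a reference frame and re-located in the next frame by
# normalised cross-correlation. Generic technique, not OBS's actual system.

def locate_patch(frame: np.ndarray, patch: np.ndarray):
    """Return (x, y) of the best match for `patch` inside `frame`."""
    response = cv2.matchTemplate(frame, patch, cv2.TM_CCOEFF_NORMED)
    _, _, _, max_loc = cv2.minMaxLoc(response)
    return max_loc  # top-left corner of the best-matching window

# Usage: cut a patch around an athlete once, then call locate_patch on each
# subsequent frame and attach the same label to the returned position.
```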
3D Athlete Tracking (3DAT)
A claimed first-of-its-kind broadcast enhancement aims to provide an inside view of the results of a race and how athletes perform and compare against one another. Four in-venue pan-tilt mounted cameras will be installed at the Olympic Stadium to capture the sprinters’ performances live. The 3DAT technology relies on massive computing power able to process large data sets, hosted on Intel-based data centres in Alibaba’s cloud infrastructure. OBS will be able to deliver this new analytical data as part of the multilateral feed within a fast turnaround time. For instance, visual overlays will show viewers the exact moment each sprinter reached their top speed.
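As a rough illustration of the kind of calculation behind “the exact moment each sprinter reached their top speed”, the sketch below differentiates smoothed distance samples and finds the peak. The 50 Hz sample rate and smoothing window are assumptions, and the real 3DAT pipeline is considerably more sophisticated.

```python
import numpy as np

# Illustrative calculation of a sprinter's top-speed moment from tracked
# distance-along-track samples. The 50 Hz sample rate and smoothing window
# are assumptions for this sketch only.

def top_speed_moment(distance_m: np.ndarray, fs: float = 50.0):
    """Return (top_speed_m_s, time_s) from distance-along-track samples."""
    speed = np.gradient(distance_m) * fs                       # m/sample -> m/s
    speed = np.convolve(speed, np.ones(5) / 5, mode="same")    # light smoothing
    i = int(np.argmax(speed))
    return float(speed[i]), i / fs

# Example: a crude 100m-like trace that accelerates and settles near 12 m/s.
t = np.arange(0, 10.5, 1 / 50.0)
d = 12.0 * (t + np.exp(-t) - 1.0)
print(top_speed_moment(d))
```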
Biometric data
Contactless vital-sensing technology will provide live heart rate monitoring of archers. Four cameras will be placed approximately 12m from the athletes, focused on their faces and analysing, from the captured video, the slight changes in skin colour generated by the contraction of blood vessels. Through an on-screen graphic, audiences will be able to witness the heartbeat variations and adrenaline rush experienced by the archer as they shoot their arrow.
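The description matches the general idea of remote photoplethysmography: in its simplest form, the mean green value of the facial skin region pulses with the heartbeat, and its dominant frequency gives the rate. The sketch below shows that simplest form; the frame rate and frequency band are assumptions, and the production system is undoubtedly more refined.

```python
import numpy as np

# Simplest remote-photoplethysmography-style sketch of the idea above: subtle
# colour changes in facial skin pulse with the heartbeat, so the dominant
# frequency of the mean green-channel signal approximates heart rate.
# Frame rate and band limits are assumptions.

def estimate_bpm(green_means: np.ndarray, fps: float = 50.0) -> float:
    """Estimate heart rate (bpm) from per-frame mean green values of the face."""
    signal = green_means - green_means.mean()           # remove the DC component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 3.0)              # 42-180 bpm plausibility band
    peak = freqs[band][np.argmax(spectrum[band])]
    return float(peak * 60.0)

# Example: a synthetic 1.5 Hz (90 bpm) pulse buried in noise.
t = np.arange(0, 20, 1 / 50.0)
sig = 0.05 * np.sin(2 * np.pi * 1.5 * t) + np.random.normal(0, 0.02, t.size)
print(round(estimate_bpm(sig)))
```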
Virtual 3D graphics
Sport climbing makes its Olympic debut in Tokyo and, to help audiences understand the challenges involved, OBS has created a 3D representation of the holds and walls. AR technology will be used to switch between the (real) camera shots and the virtual model, as well as to generate graphics about the holds, the wall’s varying angles and the routes.
360-degree replays
Intel’s True View technology will come into play during basketball matches. Thirty-five 4K cameras are mounted at the concourse level of the Saitama Super Arena to capture volumetric video that, once processed, renders 360° replays, bird’s-eye views and freeze frames from any perspective on the court. OBS will produce up to 10 True View clips for every basketball game.
Virtual reality
OBS plans 110 hours of live immersive 180° stereoscopic and 360° panoramic coverage from the Opening and Closing Ceremonies, as well as from select sports like beach volleyball and gymnastics – sports chosen for the possibility of getting cameras as close as possible to the athletes.
OBS will place up to six 180° stereoscopic cameras, in a fixed position, together with a 360° camera to capture the Ceremonies and Olympic competitions. These live streams are supplemented with a number of highlights and point-of-view clips from almost every sport, some having never been captured in VR before, OBS claims. These include cameras worn by certain athletes during their training sessions (pre-Games) to record that performance for VOD and give the VR user the chance to feel what it is like to be an Olympic athlete. Such clips are delivered in a mix of 180° and 360° formats for broadcasters to deliver to their own VR services.
8K
Although not an official part of the host broadcast, 8K has been a staple piece of futuristic tech at the Games since London 2012. At the Tokyo IBC, NHK has built an 8K HDR 22.2 multichannel sound theatre to wow other broadcasters. In addition to the Opening and Closing Ceremonies, OBS and NHK will offer 8K Super Hi-Vision live coverage of selected sessions of athletics, badminton, football, judo, swimming, table tennis and volleyball. Additional highlights will also be produced by ENG crews in 8K across other sports including artistic gymnastics, artistic swimming, basketball 3x3, skateboarding and sport climbing.
Motion capture
OBS has created a series of short ‘Sports Guides’ to highlight skills required for each sport. To do this, it recorded mocap data of athletes from each sport in a studio at Pinewood. That data was then used to create a 3D ‘avatar’ of the animated athlete which was placed in a futuristic urban rooftop environment – a nod to the high-tech reputation of the host city.
“It was an intricate process ensuring that the avatar truly reflected the movements and techniques of the athletes accurately,” OBS explains. “For each sport, prior to filming, OBS’s creative teams pre-planned the key skills that would be featured in the Sport Guides, prepared detailed scripts and fact checked all technical information with the International Federations to ensure accuracy.”
Fan engagement
With spectators limited on site, OBS has created some virtual propositions. These include an online ‘Cheer Map’ and a ‘Fan Video Wall’ that bring audience participation directly to the venues.
Kit and crew
Production teams have been chosen for their expertise in covering a sport. For example, NBC crews will provide expertise in golf, Sky New Zealand in rugby sevens, China Media Group in table tennis and gymnastics, and NHK in judo and karate. OBS works with these teams to place the cameras and mics, adjust the lighting and ensure the coverage is of a very high standard.
Some 1,049 camera systems will be used, including 210 slow-motion cameras, 145 RF cams and 250 minicams. The kit list features 11 four-point cablecam systems (including at the BMX and skateboarding venues), 27 tracking camera systems and 37 jibs/cranes.
OBS’ Tokyo workforce numbers over 8,100, one-quarter of whom have been hired locally (including 1,200 local student loggers). They will manage 118 HD contribution feeds, 68 UHD contribution feeds, 76 HD distribution feeds and 44 UHD distribution feeds.
Sustainability
The IOC is at pains to stress its environmental response. This includes an IBC footprint 21% smaller than in Rio, mainly because more broadcasters are producing remotely.
The cloud also gives the IOC a long-term way to reduce its dependency on local hardware infrastructure.
“You can build systems on the cloud, test them properly, switch them around and do all your preparation well in advance, all before setting foot in the host city,” explains Salamouris. “Then you can fire it up, just before the Games, with all the systems already configured and ready for operation. So now that you can disassociate yourself from being local in the venue or the IBC and being able to operate off-site in the cloud, it means that you can continue increasing the size, the complexity of your systems, and consequently, the volume of your output, without further increasing your infrastructure in the host city.”