The Responsive Narrative Factory project is one of the eight Challenges in the 2023 cohort of IBC’s Accelerator Media Innovation Programme. The project is championed by the BBC and the IET, and the participants are Infuse Video, MetaRex, Cuvo.io, JPB Media Solutions, and EZDRM.
The Challenge
The specific challenge as stated on the project site is:
The success of future streaming services will depend on how effectively the actual video content itself is tailored to individual preferences and needs. The key to improving engagement will be to align media metadata with personal data dynamically to create highly customised viewing experiences. This has only been achieved in a limited way to date.
To fully realise the potential for customisation, broadcasters will require richer datasets, more efficient ways of generating and transporting metadata, and to adopt a modular approach to content production and delivery.
The Problem
Ruud van der Linden, co-founder of Infuse Video and one of the participants in the project, identified workflows as the biggest obstacle to customisation. “Current workflows are not flexible enough,” he said.
“They treat a movie or an episode as one big file - making even a tiny change often means the whole movie needs to be re-rendered, re-distributed, re-transcoded and re-packaged. If there are multiple language versions, this needs to happen for every language version. This quickly gets out of hand, and because of this inefficient process, smaller studios and directors cannot afford to create multiple versions. And this is just talking about existing use cases. The new use cases we’re addressing are those where we can actually show different versions of programmes or even individual scenes to viewers. Deploying these new use cases today would worsen the aforementioned problems exponentially.”
“Current approaches when it comes to multiple versions are often brute-force,” he continued. “The biggest and most well-known streaming platforms host fully duplicated versions of movies if content is available in multiple visual languages or in multiple compliance versions. This can mean that instead of transcoding, storing, and streaming a 2-hour movie, they have to host upwards of 100 hours for that one title.”
The current ways of using exchange formats don’t address the issue. “The IMF format is already being used for content distribution to the streaming parties, but it stops right there,” explained van der Linden. “The flexibility disappears as soon as the content enters the streaming platform. And IMF isn’t really being used for multiple content versions yet, mostly for localisation and fixing editing mistakes.”
Bruce Devlin, Past Standards President at SMPTE and one of the people behind MetaRex, stressed the importance of the flexibility the project looks to introduce. “Whether it’s human tagging or AI tagging of documentaries, episodic or live content, the pipeline has to accommodate all stakeholders, from in-house at the content creator to facilities doing preparation or QC, as well as distributors and CDNs,” he said.
“Giving flexibility through metadata exchange allows both control of the final product and adaptation to change. We’re showing a simple demo of personalised content from a pipeline that can adapt to many business situations.”
The Goal
“The primary goal is to personalise and customise video content for individual viewers,” said van der Linden. “Making the content better fit their needs. Through this Accelerator Project we do our best to make it easy for content producers and owners to achieve this goal. In a way this is all about inclusiveness. The experience is more inclusive if it’s adapted to your accessibility level and sensitivities. This could be about photosensitive epilepsy, about seeing blood, gore, violence, or about seeing on-screen signs, captions, and other elements in your own language.”
Beyond the obvious advantages for the audience, van der Linden highlighted the business benefits. “For content producers and distributors this helps increase audience size and reach,” he said. “And it introduces the opportunity to deploy novel business concepts such as personalised and regionalised product placements in movies and shows, all while respecting creative intent and giving full control to the content creator. We believe this is the next generation of video experience, and the goal is to show that we can reach this with today’s technology and metadata standards.”
Devlin concurred, stressing “the variety of business benefits that can be achieved if metadata can flow and be transformed easily. Fine-grained personalisation can only work if the media and the audience are described by metadata,” he said. “Making metadata flow easily and cheaply is a key requirement for success. This is the core concern of the MetaRex team.”
Participants
Working in the collaborative environment of the Accelerator was a real plus for van der Linden. “The benefit was working together with real industry players, using real technology and real workflows, which allowed us to face and solve real-life problems around workflows and integration rather than just build a shiny demo,” he said. “It also helped us research the business viability of every step along the way. We learned about other organisations’ visions for the future of video, and the complexities, both technical and non-technical, of implementing dynamic video in existing value chains.”
“Though the technology components already exist,” said van der Linden, “we need to make changes throughout the workflow, from content production to QA, transcoding, metadata management, the origin server and the player. And we need a new type of system that decides which components to show to which viewer, as well as authoring tools that let content creators leverage these new possibilities. The most important part may be that we need content producers to embrace this new form of content creation, where they get to define alternate versions of scenes, and where they decide what parts are seen by which viewers.”
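As a purely illustrative sketch (not code from the project), the decision system van der Linden describes can be pictured as a small rules engine that matches per-scene metadata tags against a viewer profile and picks one variant of each scene. Every name, tag and data structure below is an assumption made for the example, not a project or standards-defined schema.

```python
from dataclasses import dataclass, field

@dataclass
class SceneVariant:
    uri: str                                        # where this rendition of the scene lives
    tags: set[str] = field(default_factory=set)     # e.g. {"flashing_lights"} or {"lang:nl"}

@dataclass
class Scene:
    scene_id: str
    variants: list[SceneVariant]                    # alternate versions defined by the content creator

def build_playlist(scenes: list[Scene], avoid: set[str], prefer: set[str]) -> list[str]:
    """Pick one variant per scene: never one carrying an 'avoid' tag,
    and favour variants matching the viewer's preferences."""
    playlist = []
    for scene in scenes:
        allowed = [v for v in scene.variants if not (v.tags & avoid)]
        candidates = allowed or scene.variants      # fall back to the default cut if nothing passes
        best = max(candidates, key=lambda v: len(v.tags & prefer))
        playlist.append(best.uri)
    return playlist

# A hypothetical viewer who is photosensitive and prefers Dutch on-screen text
playlist = build_playlist(
    scenes=[
        Scene("s1", [SceneVariant("s1_default.mp4", {"flashing_lights"}),
                     SceneVariant("s1_safe.mp4", {"reduced_flashing"})]),
        Scene("s2", [SceneVariant("s2_en.mp4", {"lang:en"}),
                     SceneVariant("s2_nl.mp4", {"lang:nl"})]),
    ],
    avoid={"flashing_lights"},
    prefer={"lang:nl"},
)
print(playlist)   # ['s1_safe.mp4', 's2_nl.mp4']
```

In a real deployment this decision step would sit between the origin server and the player rather than run as a standalone script, but the principle is the same: the content is modular, and the metadata drives which modules each viewer receives.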
For Devlin, the ability to integrate multiple existing technologies is key to success, and the Accelerator programme offers this. “The bigger the variety of inputs to these projects, the better and more generically applicable are the results,” he said. “There have been many personalisation and branching narrative projects. This project attempts to show how it can be done using standards-based, readily available technologies - streaming technologies, IMF principles and free software. The MetaRex view of the project is that the metadata flow should be achievable by linking together services instead of having engineers go through lengthy integration cycles with their products. This should speed up interchange, reduce cost and let the metadata flow. Products from Infuse and CuVo have been integrated with free Open Source Software from MetaRex Media that can be readily downloaded and used.”
The Accelerator Process
IBC’s Accelerator programme is unique in the industry for providing a safe place for a variety of companies, sometimes competitors, to come together and collaborate on solutions that benefit the entire industry.
“Without the Accelerator we could never have built the full workflow, and we’d never have the content to demonstrate this on,” said van der Linden. “Thanks to the Accelerator programme all key players have been brought together to execute on the project vision, including co-participants MetaRex, who are metadata specialists, and the BBC and IET, whose content we make personalisable.”
“The Accelerator has been a unique experience to develop and discuss the business cases, the requirements, and the engineering in a set of short sprints leading up to IBC,” said Devlin. “It has genuinely accelerated the concepts.”
The Showcase at IBC
Each Accelerator project culminates in a showcase on the Innovation Stage at IBC2023 in September. Catch the Responsive Narrative Factory project showcasing its PoC and findings at IBC2023 on Friday 15 September, from 14:00-15:00, at the Innovation Stage.
“We’ll be showing content from the BBC, the IET, and Blender studios that has been tagged to give viewers a dynamic personalised experience on the show floor in the Accelerator Zone,” said Devlin. “Details of how it was done and how it can be tailored for other use cases can be demonstrated at the Responsive Narrative Factory Pod.”
Van der Linden describes the specific demonstrations that visitors to the Innovation Stage should expect to see. “We have experimented with a number of different use cases, which are variations of the same principle,” he said.
The use cases are:
- Accessibility: select a version with a signer, or a specific visual language version of the content, which includes translated on-screen elements.
- Self-filter: allow people to remove certain elements from content, such as flashing lights for people with photosensitive epilepsy, violence, blood and gore for those sensitive to it, and more.
- Select: create your own version of a show, such as seeing the segments featuring one specific actor or presenter.
- Personalised product placement: promotional content which is personalised or regionalised.
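All four use cases are variations of the same principle: segments of content carry descriptive metadata, and the viewer carries a preference profile the pipeline can act on. As a hypothetical illustration (the field names are invented for this article, not a schema from the project or any standard), such a profile might look like this:

```python
# A hypothetical viewer profile covering the four demonstrated use cases.
# Field names and values are illustrative assumptions, not a project-defined schema.
viewer_profile = {
    "accessibility": {
        "sign_language": "bsl",           # prefer a signed version where available
        "on_screen_text_language": "nl",  # translated signs, captions and inserts
    },
    "self_filter": {
        "avoid": ["flashing_lights", "blood_and_gore"],
    },
    "select": {
        "only_segments_featuring": ["presenter_a"],  # build a cut around one presenter
    },
    "product_placement": {
        "region": "NL",                   # regionalised promotional inserts
    },
}
```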
Benefits
Both Devlin and van der Linden see significant benefits coming from the Accelerator programme, not only for their own companies but for the wider industry.
“The Accelerator is a great platform to promote the MetaRex project,” said Devlin. “Our goal is to give away all the metadata technology and infrastructure for free to enable more companies to create slick metadata workflows. The more backers who help us, the greater the reach will be. If MetaRex gets it right, then high schools in Scotland will be learning and using the same media metadata workflows as Hollywood Studios.”
“It also raises awareness in organisations of the importance of dynamic video and how it will shape how users interact with content in the years to come,” concluded van der Linden. “In the long term we hope it creates a movement where inclusive dynamic content becomes the norm, for both entertainment content and other types of content, and it will be easier for anyone to consume video content online.”