Enabling a more personalised UX without adding unduly to workflow complexity is fast becoming the holy grail for streaming services.
With various sources putting the growth of the streaming market at close to 40% last year, there can be no doubt that pandemic conditions have proven highly favourable to OTT providers.
But as the world begins to open up once more and consumers review which subscriptions they wish to maintain post-Covid, the focus is inevitably shifting to long-term viewer retention and quality of experience.
In this context it’s not hard to see why there is presently such a buzz around OTT personalisation in the form of more dynamic and individual user experiences.
Fully aware of the potential complexity of extensive personalisation, OTT services are inclining towards technologies that enable scalable delivery of different content experiences to multiple platforms and devices. With signs of more market consolidation on the way, these platforms also need to be flexible enough to accommodate what could be a cycle of perpetual and unpredictable change.
José Luis Vazquez, founder and CEO of Mirada, remarks: “When there is more consolidation there is more market competition, and there is a need to fight harder for a bigger share of the consumer base. So that’s why we are seeing streaming services doing so much now in this area of personalisation.”
Scaled delivery
Given the huge expected demand it’s not surprising that R&D departments at both broadcasters and technology vendors are awash with major initiatives. A recently announced five-year partnership between BBC R&D and the universities of Surrey and Lancaster is a case in point. Backed by UK government money via the UK Research and Innovation’s Prosperity Partnerships scheme, the project aims to explore how personalised media can be developed to offer scaled delivery of a wide range of content experiences efficiently and sustainably – with object-based media as the particular focus.
Graham Thomas, section leader, immersive & interactive content at BBC R&D, explains the primary goals of the partnership: “Working with the University of Surrey, we will develop AI-powered techniques to allow audio and video to be separated into objects – such as individual audio tracks and distinct ‘layers’ from a video scene – and create metadata describing the scene. This will allow the content to be assembled to meet particular needs (such as accessibility), tailored to specific device characteristics (like screen size) or to audience interests and context.”
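To make that description a little more concrete, the hypothetical TypeScript sketch below shows how a scene described as a set of audio and video objects might be filtered down to a rendition that suits a particular screen size and accessibility need. The object model and selection rules are assumptions invented for illustration; they are not BBC R&D’s actual schema.

```typescript
// Hypothetical sketch of object-based media assembly.
// The object model and rules are illustrative assumptions, not BBC R&D's schema.

interface MediaObject {
  id: string;
  kind: "video-layer" | "audio-track" | "caption-track";
  role: "background" | "presenter" | "dialogue" | "ambience" | "subtitles";
  minScreenWidthPx?: number; // detail layers that only make sense on larger screens
}

interface DeviceContext {
  screenWidthPx: number;
  needsAudioDescription: boolean;
}

interface SceneDescription {
  sceneId: string;
  objects: MediaObject[];
}

// Choose which extracted objects to include for a given device and viewer.
function assembleRendition(scene: SceneDescription, ctx: DeviceContext): MediaObject[] {
  return scene.objects.filter((obj) => {
    // Drop video layers the screen is too small to benefit from.
    if (obj.minScreenWidthPx !== undefined && ctx.screenWidthPx < obj.minScreenWidthPx) return false;
    // Drop ambience when audio description is needed, leaving headroom for narration.
    if (obj.role === "ambience" && ctx.needsAudioDescription) return false;
    return true;
  });
}

// Example: a small phone screen with audio description switched on.
const scene: SceneDescription = {
  sceneId: "highlights-01",
  objects: [
    { id: "v1", kind: "video-layer", role: "background" },
    { id: "v2", kind: "video-layer", role: "presenter", minScreenWidthPx: 1280 },
    { id: "a1", kind: "audio-track", role: "dialogue" },
    { id: "a2", kind: "audio-track", role: "ambience" },
  ],
};
console.log(assembleRendition(scene, { screenWidthPx: 720, needsAudioDescription: true }));
```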
Pursuing this direction requires something of a sea-change among production teams, geared as they are “towards creating ‘finished’ programmes rather than the individual components needed to produce flexible content.” But there are also challenges to negotiate on the delivery and rendering side, which is where Lancaster University enters the picture.
In collaboration with the team there, says Thomas, “we will develop approaches to intelligently distribute the processing and data through the delivery network, making the best use of resources in the cloud, edge-compute nodes, and the audience’s networks and devices. To render customised content at present, high-end audience devices or large amounts of cloud computing resource are required, making it expensive and impractical to reach large audiences.”
Ultimately, the intention is that there will be two major outcomes: firstly, a method by which audio-visual objects can be extracted in real time using AI, avoiding the need for content to be captured specifically to support object-based media; and secondly, a distributed, intelligent network compute architecture that, says Thomas, “will allow people to get an object-based experience without needing the latest consumer equipment and without needing massive cloud compute infrastructure.”
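How a delivery network might decide where that rendering happens is sketched, very loosely, below. The tiers and thresholds are assumptions for the sake of illustration and do not reflect the partnership’s actual design.

```typescript
// Hypothetical sketch of deciding where to render an object-based experience.
// Tiers and thresholds are illustrative assumptions, not the project's actual logic.

type RenderTier = "device" | "edge" | "cloud";

interface ClientCapability {
  supportsObjectRendering: boolean;     // can the device composite objects itself?
  estimatedBandwidthMbps: number;
  nearestEdgeLatencyMs: number | null;  // null if no edge node is reachable
}

function choosePlacement(cap: ClientCapability): RenderTier {
  // Prefer the viewer's own device when it is capable: no extra infrastructure cost.
  if (cap.supportsObjectRendering && cap.estimatedBandwidthMbps >= 10) return "device";
  // Otherwise render at a nearby edge node if one is close enough for interactivity.
  if (cap.nearestEdgeLatencyMs !== null && cap.nearestEdgeLatencyMs < 40) return "edge";
  // Fall back to rendering a conventional stream in the cloud.
  return "cloud";
}

// Example: a low-end device with a reachable edge node.
console.log(choosePlacement({ supportsObjectRendering: false, estimatedBandwidthMbps: 25, nearestEdgeLatencyMs: 18 })); // "edge"
```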
Context and control
Whilst this project still has a long way to go, streamers can already access a growing number of individual vendor-led personalisation solutions. NativeWaves is among the innovators in this area, having recognised that personalisation could be instrumental in both retaining customers on a streaming platform and driving new forms of monetisation.
According to NativeWaves senior web developer Thomas Siller, more and more broadcasters have come to require a solution that can deliver “multiple videos, audio, data, social media and e-commerce streams, in perfect sync with each other, to their viewers, or deliver multiple events in a single place, that viewers could choose from.
“They needed a system to do all this without having to make major changes to their workflows, or having to adopt proprietary solutions that were really not scalable.”
NativeWaves’ response has been to develop an automated solution that allows broadcasters to deliver a personalised OTT experience to viewers “by providing multiple streams of audio, video and data in perfect sync to each other – but also a complete second screen experience in perfect sync with the broadcast signal.”
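One way to picture that “perfect sync” requirement is a periodic check that nudges a companion stream back onto the broadcast timeline. The sketch below is a generic illustration of the idea under assumed interfaces and thresholds; it is not NativeWaves’ implementation.

```typescript
// Hypothetical sketch of keeping a second-screen stream aligned with the broadcast.
// The interfaces and thresholds are assumptions for illustration, not NativeWaves' API.

interface PlaybackClock {
  positionMs(): number;            // current position on a shared programme timeline
  adjustRate(rate: number): void;  // nudge playback speed slightly (e.g. 0.98–1.02)
  seek(positionMs: number): void;
}

const MAX_DRIFT_MS = 40;    // drift we correct by gently adjusting playback rate
const HARD_RESYNC_MS = 500; // drift beyond which we seek outright

function resync(reference: PlaybackClock, companion: PlaybackClock): void {
  const drift = companion.positionMs() - reference.positionMs();
  if (Math.abs(drift) > HARD_RESYNC_MS) {
    companion.seek(reference.positionMs());        // jump straight back onto the timeline
  } else if (Math.abs(drift) > MAX_DRIFT_MS) {
    companion.adjustRate(drift > 0 ? 0.98 : 1.02); // slow down or speed up slightly
  } else {
    companion.adjustRate(1.0);                     // within tolerance, play normally
  }
}

// Typically called periodically, e.g. setInterval(() => resync(broadcastClock, secondScreenClock), 1000);
```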
It has also tackled the choice facing service providers between fully native mobile apps and web-based solutions. A fully native approach can deliver very high performance but requires constant app updating, while web solutions enable rapid changes to the user experience but run into performance problems when it comes to video playback.
Hence the company’s decision to pursue a “hybrid route” with its Dynamic UX solution, which is hosted as a web application and adapted depending on the final user experience being delivered by the broadcaster. “Dynamic UX uses the performance benefits of the native code, allowing users to quickly switch between different streams while delivering a consistent playback experience,” explains Siller. “The web technology used to build the user interface ensures that the user experience on all platforms looks and feels the same, and changes can be deployed easily and quickly while maintaining the same code.”
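The division of labour Siller describes might look roughly like the sketch below: a web-built interface that hands playback to a native player when the hosting app exposes one, and falls back to the browser otherwise. The bridge interface and function are hypothetical, not Dynamic UX’s actual code.

```typescript
// Hypothetical sketch of a hybrid UI: web-built interface, native playback where available.
// The NativePlayerBridge interface is an assumption, not Dynamic UX's actual API.

interface NativePlayerBridge {
  load(url: string): void;          // playback handled by the host app's native player
  switchStream(url: string): void;  // fast stream switching done natively
}

function playStream(url: string, videoElement: HTMLVideoElement, bridge?: NativePlayerBridge): void {
  if (bridge) {
    // The hosting native app exposes a player: use it for performance-critical playback.
    bridge.load(url);
  } else {
    // Pure web fallback: the same UI code plays back through the browser.
    videoElement.src = url;
    void videoElement.play();
  }
}

// The UI layer stays identical on every platform; only the playback path changes.
```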
As well as data, user interfaces and styling, Dynamic UX can also handle “ancillary experiences such as ads, merchandising, e-shopping and other custom experiences that have been agreed upon.”
Optimising the data
One of the most pleasing aspects of the current personalisation boom is the diversity of approaches being taken, but one thing they all have in common is a laser-sharp focus on data optimisation. Mirada is one of the trailblazers in this respect, having developed a data intelligence platform, LogIQ, that allows content creators to obtain consumption analytics across the full range of devices and services. Among other benefits, these insights make it easier for service providers to be proactive about customer churn and consumption trends.
“Using AI makes it possible to define and categorise customers into clusters according to viewer preferences, then you can go ahead and create segments of users for which there can be different personalised content experiences,” says Vazquez.
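At its simplest, the clustering Vazquez describes could group viewers by how their watch time splits across genres. The toy k-means below illustrates the idea with invented data; it is not Mirada’s LogIQ implementation.

```typescript
// Toy k-means over per-genre watch-time shares, to illustrate viewer segmentation.
// Data and parameters are invented for the example; this is not LogIQ's implementation.

type Vec = number[]; // e.g. share of watch time in [drama, sport, news]

const distance = (a: Vec, b: Vec) => Math.hypot(...a.map((x, i) => x - b[i]));

function kMeans(points: Vec[], k: number, iterations = 20): number[] {
  let centroids = points.slice(0, k).map((p) => [...p]); // naive initialisation
  let labels = new Array(points.length).fill(0);
  for (let it = 0; it < iterations; it++) {
    // Assign each viewer to the nearest centroid.
    labels = points.map((p) =>
      centroids.reduce((best, c, i) => (distance(p, c) < distance(p, centroids[best]) ? i : best), 0)
    );
    // Move each centroid to the mean of its assigned viewers.
    centroids = centroids.map((c, i) => {
      const members = points.filter((_, j) => labels[j] === i);
      if (members.length === 0) return c;
      return c.map((_, d) => members.reduce((sum, m) => sum + m[d], 0) / members.length);
    });
  }
  return labels; // segment index per viewer
}

// Example: three viewers with different genre preferences, grouped into two segments.
console.log(kMeans([[0.8, 0.1, 0.1], [0.1, 0.8, 0.1], [0.7, 0.2, 0.1]], 2));
```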
Accelerating the process of altering the user experience has been another priority, hence the development of a back-end tool named UX Evolver: “It facilitates operators with more control of their platform than ever before, allowing them to adapt and evolve elements of the user experience on the fly.” Meanwhile, Mirada’s Iris OTT platform remains its leading solution for personalised content delivery across all devices.
Whilst this brief overview has hopefully underlined the amount of innovation already underway around personalisation, the chances are that we are still in the formative stages of what is likely to be a very long journey. Siller notes of NativeWaves: “It is our view that what the future will bring is a world of complete personalisation. Broadcasters and service providers will focus on content creation and delivery, but the overall user experience will be controlled by the end viewer.”