The transition to IP using SMPTE 2110 has been broadly successful in the studio environment, but interoperability in the live and near-live domain still has some way to go. A recent innovation from the BBC could provide the answer.

The Time-Addressable Media Store (TAMS) API developed by BBC R&D is a new way of working with content in the cloud. It’s an open specification that fuses object storage, segmented media and time-based indexing, expressed via a simple HTTP API. It is intended to lay the foundations for a multi-vendor ecosystem of tools and algorithms operating concurrently on shared content, all via a common interface. In effect, it blends the best of live and file-based working.

Robert Wadge, Lead Research Engineer at BBC R&D

The open-source API specification was launched to the industry at IBC2023, where AWS adopted it as the basis for a proof-of-concept Cloud-Native Agile Production (CNAP) workflow demonstrated at IBC2024.

AWS was particularly interested in the potential of TAMS to streamline the process of fast-turnaround editing in the cloud in an open, modular way.

“The most important outcome of all of this is interoperability,” says Robert Wadge, Lead Research Engineer at BBC R&D. “That’s really what we’re driving at. TAMS enables sharing between systems and sharing between workflows across and between organisations. The aim is to give the media industry a way into near-live fast turnaround cloud production that doesn’t require them to buy into a single vendor’s vertically integrated solution.”

The BBC and AWS approaches are part of a wave of similar software-defined architectures coming to market. TAMS dovetails with the EBU’s Dynamic Media Facility, while systems integrator Qvest is proposing to build video streaming platforms using what it calls ‘Composable OTT’.

What is TAMS?

Work leading up to TAMS stems at least as far back as the IP Studio project, showcased as a live ‘IP end-to-end’ outside broadcast at the 2014 Commonwealth Games in Glasgow.

TAMS-centric supply chain

Source: Robert Wadge, BBC R&D

The initial goal with TAMS was to bring the worlds of live and post-production closer together. Wadge explains: “Until very recently those two worlds have been quite disparate because everything was locked into hardware devices and bespoke systems. It’s almost like you record video onto a bunch of files then you bring it into your post-production and half the referencing gets lost on the way. The move to software means that for the first time we had the opportunity to do things differently and make media addressability reliable and consistent.”

The shift to software certainly promises flexibility benefits, but it’s not enough on its own to solve the problems of scalability and interoperability. Simply replacing signal processing with software won’t move the dial beyond the limitations of workflows designed originally for coax cables and tapes.

Wadge continues: “We wanted to move beyond the ‘lift and shift’ of taking a bunch of black box fixed function devices in racks and putting them in a data centre. Instead, we designed TAMS to be cloud native. With that comes a new philosophy about the way you write and deploy software. You can take a more modular microservices approach to media workflows and the infrastructure that supports that. Crucially, it enables us to architect horizontal capabilities that can be shared among a variety of people rather than having a very specific integration for each workflow in order for people to access and move media around.”

Vendors that may previously have been reluctant to cede a competitive edge by opening up their systems are apparently changing their tune. It helps that AWS has backed the project and brought in partners CuttingRoom, Drastic Technologies, Adobe, Vizrt, and Techex for the CNAP demo at IBC. Sky also participated. It’s worth noting that TAMS is cloud vendor agnostic.

“With this project we’ve seen a different approach to vendor collaboration,” Wadge says. “A lot of vendors we’ve spoken to are facing a situation where they have to do a lot of bespoke integrations themselves on behalf of their end users.

“For example, there are a whole variety of different media asset management systems (MAMS) which any tool vendor in this space is under pressure to integrate with. The interoperability interface that TAMS offers gives vendors an opportunity to integrate at a [foundational] level which means that they do one integration and everybody wins. In that scenario, people are starting to see that it will save a lot of time and effort on integration that could be spent on adding features and innovating their own products.”

Breaking video down into smaller chunks is not new; it’s pretty much ubiquitous in streaming for distribution. HLS and MPEG-DASH are both built on the concept, but they are optimised for linear playout. TAMS effectively takes those short-duration segments, stores them in HTTP-accessible object storage, and applies a time-based index over the top. This creates an immutable database from which any piece of the media can be accessed via the API.
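
In practice, that means a client can ask the store which segments of a flow cover a given time range, then fetch the underlying objects directly over HTTP. Below is a minimal Python sketch of that interaction; the base URL and flow ID are placeholders, and the endpoint and field names follow the published TAMS specification’s flow/segment model but should be checked against the spec itself rather than taken as definitive.

```python
import requests

TAMS_BASE = "https://tams.example.com"  # hypothetical store endpoint
FLOW_ID = "11111111-1111-1111-1111-111111111111"  # placeholder flow ID

# Ask the store which segments of this flow cover a given time range.
# The timerange syntax ("[10:0_20:0)" for seconds 10 to 20, half-open)
# follows the TAMS convention but is illustrative here.
resp = requests.get(
    f"{TAMS_BASE}/flows/{FLOW_ID}/segments",
    params={"timerange": "[10:0_20:0)"},
)
resp.raise_for_status()

# Each segment record points at a media object in HTTP-accessible
# object storage, so the bytes can be fetched directly.
for seg in resp.json():
    media = requests.get(seg["get_urls"][0]["url"])
    media.raise_for_status()
    print(seg["timerange"], len(media.content), "bytes")
```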

Although its prime application is to streamline the production of near-live sports and news content, there’s no reason why TAMS can’t be used further downstream, Wadge says.

The ‘store once, use many’ approach to repurposing media means simple edits can be expressed as a metadata ‘publish’ rather than a new asset or exported file. This reduces duplication, processing time and the overall storage footprint for a given workload. Basic operations like time-shifting, clipping or simple assembly can be described purely in terms of timelines, with no knowledge of the media type or format required.
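
To make that concrete, a clip ‘publish’ under this model amounts to registering the source flow’s existing segments against a new flow. The sketch below reuses the hypothetical store and placeholder IDs from the previous example; only segment metadata is written, and no media bytes are moved or copied.

```python
import requests

TAMS_BASE = "https://tams.example.com"  # hypothetical store endpoint
SRC_FLOW = "11111111-1111-1111-1111-111111111111"   # placeholder IDs
CLIP_FLOW = "22222222-2222-2222-2222-222222222222"

# 1. Find the source segments covering the clip's in and out points.
segments = requests.get(
    f"{TAMS_BASE}/flows/{SRC_FLOW}/segments",
    params={"timerange": "[125:0_154:0)"},
).json()

# 2. Register the same stored objects against the clip flow. This is
#    the metadata 'publish': the clip references existing objects.
for seg in segments:
    requests.post(
        f"{TAMS_BASE}/flows/{CLIP_FLOW}/segments",
        json={"object_id": seg["object_id"], "timerange": seg["timerange"]},
    ).raise_for_status()
```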

Nor does TAMS place any constraints on the media format. Indeed, BBC R&D has experimented with uncompressed video. Most users, however, will want less data-heavy workflows, especially in remote production scenarios that require media to be streamed.

“The idea is to abstract everything to a timeline and that’s the key principle behind interoperability,” Wadge says. “One benefit that flows from that is that TAMS will work with any media type today or media types that might arise tomorrow.”

Next steps

The IBC demo was reportedly a success, attracting interest in the technology from vendors and end users globally.

TAMS block diagram

Source: Robert Wadge, BBC R&D

“We’ve taken a lot of feedback from AWS and the partners who’ve been involved in CNAP to refine the specification, and we expect to continue doing that with a much broader range of vendors,” Wadge says. “What we really want to happen is for people to pick up the TAMS API and build products based on it.”

He says the BBC is looking to use TAMS internally, specifically for fast turnaround news workflows and for extraction of VOD assets from live streams.

“Beyond that, TAMS really starts to come into its own when it’s used to share media by reference more widely across the supply chain. For instance, you can store your media once in a serverless repository which is accessible by everyone who needs to access it and then people can just go and get all or a portion of it to work with. They could transform it and then write that transformation back into the store to be shared with others. That sharing function is extremely valuable. It starts to break down the silos between a lot of the different functional blocks on the supply chain.”
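
That read-transform-write-back loop can also be sketched against the same hypothetical store. Here the storage-allocation call and its response fields are assumptions modelled on the open specification, and the identity ‘transform’ merely stands in for real processing such as a transcode.

```python
import requests

TAMS_BASE = "https://tams.example.com"  # hypothetical store endpoint
SRC_FLOW = "11111111-1111-1111-1111-111111111111"   # placeholder IDs
OUT_FLOW = "33333333-3333-3333-3333-333333333333"

for seg in requests.get(
    f"{TAMS_BASE}/flows/{SRC_FLOW}/segments",
    params={"timerange": "[0:0_30:0)"},
).json():
    # 1. Fetch the shared object by reference and 'transform' it.
    #    The identity transform here stands in for real processing.
    transformed = requests.get(seg["get_urls"][0]["url"]).content

    # 2. Ask the store where to put the new object (this allocation
    #    endpoint is an assumption), upload the bytes, then register
    #    the result on the output flow's timeline so other tools can
    #    pick it up by reference.
    alloc = requests.post(f"{TAMS_BASE}/flows/{OUT_FLOW}/storage").json()
    slot = alloc["media_objects"][0]
    requests.put(slot["put_url"]["url"], data=transformed)
    requests.post(
        f"{TAMS_BASE}/flows/{OUT_FLOW}/segments",
        json={"object_id": slot["object_id"], "timerange": seg["timerange"]},
    )
```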

The identity and timing model that underpins TAMS aligns well with SMPTE 2110 and NMOS, as well as with file-based formats such as MXF and IMF that are used to interchange finished assets between organisations.

“There are common principles that map very nicely between these different areas which we’d like to build on. We think that the real value here is to have that timing and identity flow throughout the supply chain. Then it becomes a foundation which we can use for richer discovery of media and management of media. That’s a big focus.”

TAMS also dovetails with project work undertaken by the BBC with the EBU.

Like BBC R&D’s foundational contributions to SMPTE ST 2110, the JT-NM Reference Architecture and the NMOS family of specifications, this is another project that could only have come out of a body without a vested commercial interest. The BBC will benefit from the work just like any other media organisation if TAMS enables it to integrate best-of-breed solutions from different vendors to build better supply chains.

“We want to build the BBC’s technology estate in a more modern way, one that’s not limited by the interoperability issues that we would have otherwise,” Wadge says. “We’ve removed the barriers to adoption by making TAMS an open and freely available spec with no licence fees. It means that there’s very little friction for vendors to come on board. So we’re really excited to see what people build with it and hopefully it can help them innovate rather than having to focus on reinventing the basics.”

Overlap with EBU Dynamic Media Facility

The EBU Dynamic Media Facility (DMF) initiative is focused on design patterns for systems that integrate software-based Media Functions, proposing a layered model and recommending the use of containers for deployment on a common host platform.

In the reference architecture, published just before IBC, Media Functions are interconnected using the Media Exchange Layer, forming chains or pipelines that can be instantiated and torn down dynamically as needed, on a common infrastructure platform. The Media Exchange Layer “provides high-performance transport of uncompressed or compressed media payloads between software Media Functions running on containers on the same compute node, or on different compute nodes in a compute cluster.” Wadge comments that this is a low-latency transfer path between running processing functions, and a clear point where interoperable approaches will be needed.

TAMS, on the other hand, focuses on how media can be stored in short-duration segments in an object store such as AWS S3 and accessed by ID and time index via an HTTP API. This can be used to share media between tools and systems with a fast turnaround from the live edge of an ingesting stream.

“The two projects are complementary, and there are common threads in the different domains that we’re interested in drawing together,” he says.