IBM has used NAB 2017 to launch a cloud service that aims to help companies extract new insights from video.
The service is part of IBM’s continued push to combine artificial intelligence with the IBM Cloud, helping media and entertainment companies make sense of unstructured data and make more informed decisions about the content they create, acquire and deliver to viewers.
The content enrichment service, which is expected to be available later this year, will use Watson’s cognitive capabilities to provide a deeper analysis of video and extract metadata like keywords, concepts, visual imagery, tone and emotional context.
It will apply a range of artificial intelligence capabilities – including language analysis, concept tagging, emotion detection and visual recognition – to extract these insights.
Semantic cues
The service will use several Watson APIs, including Tone Analyzer, Personality Insights, Natural Language Understanding and Visual Recognition. In addition, it will use new IBM Research technology to analyse the data generated by Watson and segment videos into logical scenes based on semantic cues in the content.
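IBM has not detailed the internals of the enrichment pipeline, but the Watson services it names are publicly documented. As a rough illustration only, the sketch below sends a single, hypothetical transcript segment to the Watson Natural Language Understanding REST API to extract the kinds of metadata the article mentions – keywords, concepts and emotion scores. The endpoint, API version date, credentials and segment text are placeholders, not details taken from IBM’s offering.

```python
# Minimal sketch (not IBM's actual pipeline): analyse one transcript segment
# with the Watson Natural Language Understanding REST API of the 2017 era.
import requests

NLU_URL = "https://gateway.watsonplatform.net/natural-language-understanding/api/v1/analyze"
USERNAME = "your-nlu-username"   # placeholder service credentials
PASSWORD = "your-nlu-password"

transcript_segment = "He sinks the three at the buzzer and the crowd erupts!"

response = requests.post(
    NLU_URL,
    params={"version": "2017-02-27"},
    auth=(USERNAME, PASSWORD),
    json={
        "text": transcript_segment,
        "features": {
            "keywords": {"limit": 5},   # salient terms in the segment
            "concepts": {"limit": 5},   # higher-level concepts implied by the text
            "emotion": {},              # joy/sadness/anger/fear/disgust scores
        },
    },
)
result = response.json()

# Relevant fields in the response include result["keywords"], result["concepts"]
# and result["emotion"]["document"]["emotion"].
```

Metadata like this, produced per segment and combined with visual and tonal analysis, is the raw material the IBM Research technology would then use to group segments into logical scenes.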
This capability identifies scenes based on a deeper understanding of content and context than current offerings in the market provide.
For example, the new offering could enable a sports network to more quickly identify and package basketball-related content containing happy or exciting scenes, based on language, sentiment and imagery, and work with advertisers to promote clips of those scenes to fans ahead of the playoffs.
‘Today, we’re creating new cognitive solutions to help M&E companies uncover deeper insights, see content differently and enable more informed decisions’ - Steve Canepa, IBM
Previously, someone would have had to manually review every piece of video to identify its content and break it into scenes. Now each scene can be identified more quickly, helping attract viewers and advertisers for quick-turn campaigns.
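Once scenes carry this kind of metadata, use cases like the sports example become simple filtering jobs. The sketch below assumes a hypothetical scene-level record format (IDs, keywords, emotion scores and timecodes are invented for illustration) and selects basketball scenes whose joy score clears a chosen threshold.

```python
# Hypothetical scene metadata, one record per automatically segmented scene.
scenes = [
    {"id": "ep12-scene03", "keywords": ["basketball", "buzzer beater"],
     "emotion": {"joy": 0.81, "sadness": 0.05}, "start": "00:12:04", "end": "00:13:31"},
    {"id": "ep12-scene04", "keywords": ["press conference"],
     "emotion": {"joy": 0.22, "sadness": 0.40}, "start": "00:13:31", "end": "00:15:10"},
]

JOY_THRESHOLD = 0.7  # assumption: tuned per campaign

# Keep basketball scenes that read as happy or exciting.
promo_clips = [
    s for s in scenes
    if "basketball" in s["keywords"] and s["emotion"].get("joy", 0.0) >= JOY_THRESHOLD
]

for clip in promo_clips:
    print(f'{clip["id"]}: {clip["start"]} to {clip["end"]}')
```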
The new service can also be used to repackage specific scenes from years of TV shows for an advertiser that wants its brand associated with certain moments, such as a family eating dinner or driving in a car.
Content management
In addition, the service could help media and entertainment companies better manage their content libraries. For example, a company might want to prioritise content that targets viewers who want more uplifting stories about world adventures.
To address this need, the new service could help the company analyse its content library with a new level of detail to determine whether it is meeting this specific interest.
“We are seeing that the dramatic growth in multi-screen content and viewing options is creating a critical need for M&E companies to transform the way content is developed and delivered to address evolving audience behaviors,” said IBM Global Telecommunications, Media and Entertainment General Manager Steve Canepa.
“Today, we’re creating new cognitive solutions to help M&E companies uncover deeper insights, see content differently and enable more informed decisions.”