IBM Watson

IBM’s Watson to help media companies master entertainment

IBM's Watson supercomputer (Image: IBM)

25 April 2017

Vast amounts of data are locked away in video today: cell phone cameras, drone cameras, traffic cameras, security cameras and CCTV all generate ever-larger volumes of footage that could yield powerful insights if it were properly analysed. Human analysis is effective, but it doesn’t scale. Enter IBM Watson.

At the National Association of Broadcasters Show in Las Vegas on Monday, IBM announced plans for a Watson-powered cloud service that brings together artificial intelligence and the IBM Cloud to extract insights from video. The initial focus will be helping media and entertainment (M&E) companies better harness their unstructured data.

For instance, IBM says a sports network using the offering could leverage it to identify and package basketball-related content that contains happy or exciting scenes based on language, sentiment and images, and then work with advertisers to promote clips of those scenes to fans prior to the playoffs. Humans could do that work, but it would require manually going through each segment of video to identify each piece of content and break it into scenes. IBM notes the service could also be used to repackage specific scenes from years of TV shows, which an advertiser could then use to associate its brand with those moments.
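To make that workflow concrete, the short sketch below shows how per-clip metadata of the kind described might be filtered in Python. It is an illustration only: the Clip fields, the excitement score and the threshold are assumptions for the example, not part of IBM's announced service.

    # Hypothetical illustration: assumes the service returns per-clip metadata
    # with keywords and an emotion-derived excitement score (0.0-1.0).
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Clip:
        clip_id: str
        keywords: List[str]      # e.g. ["basketball", "buzzer beater"]
        excitement: float        # assumed score from tone/sentiment analysis

    def select_promo_clips(clips: List[Clip], topic: str = "basketball",
                           min_excitement: float = 0.8) -> List[Clip]:
        """Keep clips about the topic whose emotional tone crosses a threshold."""
        return [c for c in clips
                if topic in c.keywords and c.excitement >= min_excitement]

    library = [
        Clip("c1", ["basketball", "dunk"], 0.92),
        Clip("c2", ["weather", "forecast"], 0.10),
        Clip("c3", ["basketball", "free throw"], 0.55),
    ]
    print([c.clip_id for c in select_promo_clips(library)])   # ['c1']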

The offering would also provide media and entertainment companies with a more efficient way to manage their content libraries. The service could help a company analyse these libraries and identify content that targets specific audiences.

“We are seeing that the dramatic growth in multi-screen content and viewing options is creating a critical need for M&E companies to transform the way content is developed and delivered to address evolving audience behaviours,” Steve Canepa, general manager for IBM Global Telecommunications, Media & Entertainment industry, said in a statement. “Today, we’re creating new cognitive solutions to help M&E companies uncover deeper insights, see content differently and enable more informed decisions.”

IBM expects to make the content enrichment service available later this year. It will bring together a number of Watson APIs, including Tone Analyser, Personality Insights, Natural Language Understanding and Visual Recognition. Those APIs will be combined with new IBM Research technology that will analyse the data generated by Watson and segment videos into logical scenes based on semantic cues in the content. The service will be able to extract metadata from video, including keywords, concepts, visual imagery, tone and emotional context.
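The content enrichment service itself is not yet available, but the individual Watson APIs it builds on can be called directly. As a rough, unofficial illustration, the snippet below runs a scene's transcript text through Natural Language Understanding using IBM's Python SDK; the API key, service URL and transcript are placeholders, and the SDK shown postdates this announcement.

    # Unofficial sketch: extract keywords, concepts and emotion from a scene
    # transcript with the Watson Natural Language Understanding API.
    # Requires `pip install ibm-watson`; credentials below are placeholders.
    from ibm_watson import NaturalLanguageUnderstandingV1
    from ibm_watson.natural_language_understanding_v1 import (
        Features, KeywordsOptions, ConceptsOptions, EmotionOptions)
    from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

    nlu = NaturalLanguageUnderstandingV1(
        version="2022-04-07",
        authenticator=IAMAuthenticator("YOUR_API_KEY"))
    nlu.set_service_url(
        "https://api.us-south.natural-language-understanding.watson.cloud.ibm.com")

    transcript = "He drives past two defenders and hits the buzzer beater!"
    result = nlu.analyze(
        text=transcript,
        features=Features(
            keywords=KeywordsOptions(limit=5),
            concepts=ConceptsOptions(limit=3),
            emotion=EmotionOptions())).get_result()
    print(result["keywords"], result["emotion"])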

IBM Research has already been exploring similar technology. Last year, it partnered with 20th Century Fox to develop a ‘cognitive movie trailer’ for the studio’s AI horror thriller film Morgan. Leveraging a number of experimental Watson APIs, the researchers used machine learning to study 100 horror movie trailers through visual analysis, audio analysis and scene composition analysis. Once trained on that corpus of trailers, the system ‘watched’ Morgan and identified 10 moments as candidates for the trailer.
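The article does not detail how the Morgan system ranked footage, but the general pattern it describes, scoring every scene with a learned model and then surfacing the top handful for a human editor, can be sketched as follows. The scoring weights and scene fields here are invented for illustration and are not IBM's model.

    # Hypothetical stand-in for the trailer-candidate step: score_scene is NOT
    # IBM's model, just an assumed combination of per-scene signals.
    def score_scene(scene: dict) -> float:
        return (0.5 * scene["visual_tension"]
                + 0.3 * scene["audio_intensity"]
                + 0.2 * scene["composition_score"])

    def pick_trailer_candidates(scenes: list, k: int = 10) -> list:
        """Return the k highest-scoring scenes, mirroring the 10 moments
        the system surfaced for a human editor to cut into the trailer."""
        return sorted(scenes, key=score_scene, reverse=True)[:k]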

IDG News Service
