AI Video Technology

Read how cognitive video technology powered by AI is changing our world and shaping end-viewer experiences, while providing smart recommendations and analytics.

Watson the Watchdog: How AI is Changing Media Compliance

In media, context is everything: Just as a hand signal that means “V for victory” in one country may be incredibly offensive in another, what’s culturally acceptable in a television show in the U.S. may be verboten elsewhere around the world.

By flagging questionable content for media compliance, Watson not only saves time and costs, but also opens up opportunities in new markets for content creators and broadcasters alike. For more information about how AI can save time and money on compliance, be sure to download IBM Watson Media’s ROI analysis paper: From AI to ROI: When playback means payback.


Using Big Data to Enrich Customers’ Streaming Experience

When average TV viewers are channel surfing on the couch, they probably aren’t thinking too much about big data. But it’s already beginning to shape the viewing experience, namely by providing recommendations. And as time goes on, it will begin to have an even bigger impact.

As the amount of video content—and video viewers—grows, so does the data available about viewing habits and preferences. This is a boon for content creators, who can begin to leverage these insights to create better experiences for their customers in a variety of different ways. While there’s still a lot of data left to be uncovered, changes are already underway in the realms of service quality, content recommendation and production.

For more details on using data for video enrichment, be sure to read the Uncovering Dark Video Data with AI white paper.


Intelligent Captions Made Simple: IBM Watson Media Brings the Power of AI to Closed Captioning

In the past few years, there has been a major shift toward video content as the primary form of media. Given this momentum, the importance of closed captioning has only increased. However, delivering closed captions at scale is challenging for media and entertainment companies: captions are costly to create, and the manual undertaking can be burdensome for production teams. On top of this is the ever-changing compliance landscape, in which adapting closed captions to meet regional or industry guidelines can be tricky.

Today, IBM introduced Watson Captioning, a new standalone offering that helps solve these challenges. Watson Captioning leverages AI to automate the captioning process while improving accuracy over time through its machine learning capabilities. In turn, this saves businesses both time and money and delivers a scalable solution. Watson Captioning is a customizable offering that provides flexibility and productivity, can be easily managed across compliance standards, and has the potential to transform industries beyond media and entertainment.


Streaming Video Trends 2018: Top 5

By 2021, video is expected to comprise 82 percent of all global internet traffic. For the web audience, expectations of high-quality, personalized content are rising, too.

Many streaming video companies have been looking to artificial intelligence to meet the changing needs of their audiences. And, according to David Kulczar, senior product manager of Watson Video Analytics at IBM Watson Media, that trend will continue in a big way, shaping streaming video trends in 2018.

We sat down with Kulczar to get his predictions for how widespread the industry’s adoption of cutting-edge technologies will be in the coming year.


The Future of Closed Captioning with AI

For all the great strides that live streaming video has made in the 21st century, the captioning process has remained largely stuck in the past. Humans still do the heavy lifting by manually typing captions word by word. Captioning pre-recorded video can take up to 10 times as long as the video itself, and the challenge is even greater with live video, which offers no time for review.

It’s not only clunky and labor-intensive; it can also be costly. In fact, many companies cite budget constraints as one of the top barriers to captioning.

But for full-service video production companies like Suite Spot, manual captioning, arduous as the process may be, remains the quickest and most accurate way to meet clients’ captioning needs.

That may change soon, though, according to Suite Spot co-founder Adam Drescher. Automated captioning technology is maturing fast, he notes, and may even be poised to disrupt the entire video industry in the near future. Case in point: IBM’s video streaming and enterprise video streaming offerings recently introduced the ability to convert video speech to text through IBM Watson.
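For readers curious what that speech-to-text step might look like in practice, here is a minimal sketch using IBM’s publicly available Watson Speech to Text Python SDK to turn a video’s audio track into rough SRT-style caption cues. The API key, service URL, and input file below are placeholders, and the snippet illustrates the general approach rather than the exact pipeline inside IBM’s video streaming products.

```python
# Minimal sketch: automated caption drafts from Watson Speech to Text.
# Assumes the ibm-watson SDK (pip install ibm-watson) and a provisioned
# Speech to Text instance; credentials and file names are placeholders.
from ibm_watson import SpeechToTextV1
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

authenticator = IAMAuthenticator("YOUR_API_KEY")  # placeholder credential
stt = SpeechToTextV1(authenticator=authenticator)
stt.set_service_url("https://api.us-south.speech-to-text.watson.cloud.ibm.com")


def srt_timestamp(seconds: float) -> str:
    """Format seconds as an SRT timestamp (HH:MM:SS,mmm)."""
    ms = int(round(seconds * 1000))
    h, ms = divmod(ms, 3_600_000)
    m, ms = divmod(ms, 60_000)
    s, ms = divmod(ms, 1_000)
    return f"{h:02}:{m:02}:{s:02},{ms:03}"


# Transcribe the extracted audio track and print simple SRT-style caption cues.
with open("episode_audio.wav", "rb") as audio_file:  # hypothetical input file
    response = stt.recognize(
        audio=audio_file,
        content_type="audio/wav",
        timestamps=True,  # word-level timings drive the caption cues
    ).get_result()

for index, result in enumerate(response["results"], start=1):
    best = result["alternatives"][0]
    words = best.get("timestamps", [])
    if not words:
        continue
    start, end = words[0][1], words[-1][2]  # each entry is [word, start_sec, end_sec]
    print(index)
    print(f"{srt_timestamp(start)} --> {srt_timestamp(end)}")
    print(best["transcript"].strip())
    print()
```

In a real workflow, a human editor would still review and correct the generated cues, which is the kind of feedback loop that lets an automated system improve its accuracy over time.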


Powering US Open 2017 Highlights, IBM Watson Impacts Live Broadcasting

When Juan Martin del Potro faced Dominic Thiem on Day 8 of the US Open, die-hard tennis lovers might have been excited, but it didn’t have the hallmarks of a “must see” event for casual fans. Few expected del Potro, ranked 24th, to advance.

But when he staged one of the best comebacks in US Open history, everyone wanted to see how it was done. And within minutes, they were able to, thanks to IBM Watson powering US Open 2017 highlights.

Watson assembled a clip reel within five minutes of the end of every match at this year’s Open, making highlights and key moments available to fans two to ten hours faster than in previous years. The event marked the official launch of IBM Watson Media, a new business unit that leverages Watson’s leading AI capabilities to meet the future needs of broadcasters and their audiences.


Future of Esports & Its Tie to Streaming Video

If any activity is tailor-made for streaming video, it’s esports. Competitive video game playing (and watching) is poised to be a $1.5 billion industry by 2020, according to market research firm NewZoo.

While online streaming services like Twitch and YouTube have built a loyal audience of viewers, the lure of ad dollars has attracted the interest of mainstream broadcasters, too. Several major networks, including ESPN, NBC and TBS, regularly air esports programming. In recent months, tournaments have popped up on The Disney Channel, and Nickelodeon got into the game in June by joining a $15 million investment in esports host Super League Gaming.

It’s setting up what could be an epic battle between old media and new media. The streaming services run by Amazon (Twitch) and Google (YouTube) helped build a following for competitive video games, but now traditional networks want to use their built-in audiences to lure gaming companies, while leveraging their own digital platforms to attract the viewers already hooked on these competitions. What’s more, esports audiences are already proving larger than those of traditional sports: the 2015 League of Legends world finals topped the 2016 NBA Finals by 5 million viewers.


How AI Will Change Live Sports Broadcasting

Innovations constantly change how fans watch their favorite live sports. Integral elements of today’s broadcasts like slow motion and instant replay didn’t exist before the 1950s. On-screen graphics are even more recent: Imagine watching a soccer game without the score in the top left corner of the screen, or a football game without the yellow first down line.

Now, emerging technologies like 3D and virtual reality are giving fans an entirely new perspective—and they may forever change how fans expect to experience the action.

But the real game-changer for live sports broadcasting is artificial intelligence. AI will not only affect viewers, but also advertisers, broadcasters, and even the athletes themselves. It will enrich video content with better insights and better recommendations, as outlined in the Uncovering Dark Video Data with AI white paper. Soon, we may not recognize a sporting event without it.


What the Big Game Might Have Looked Like with Watson

While the audience for the biggest football event of the year (111 million strong) was on par with previous broadcasts, the game itself posted its lowest ratings in the past three years. Coming off a less-than-stellar regular season, which saw a 9 percent drop in ratings, it’s safe to say that viewers felt something was lacking.

One factor that might account for this dip is a lack of personalized content and opportunities for interactivity. Everyone saw the same game in the same way, and while that may have been the standard up until now, artificial intelligence is raising the bar.

IBM Watson is no stranger to the sports world. Just this year, Watson performed sports highlights analysis and assembled highlight reels for the Masters Tournament. It also predicted match outcomes during Wimbledon. And had Watson had a hand in the biggest football game of the year, FOX might have been able to deliver a more engaging broadcast for viewers, and even been alerted to lulls in audience attention so it could counteract them in real time. Artificial intelligence has the power to change the live broadcasting game. Here’s a closer look at what the NFL’s big game might have looked like with Watson in play.

Related to these AI advancements, be sure to read our Outsmart Your Video Competition with Watson white paper on how Watson will be used to unlock deep insights from the untapped video content you’re generating.


Contextual Video Advertising: Why It Matters and How AI Can Help

Have you ever watched a sad movie on TV that was suddenly interrupted by an upbeat, loud ad? How did it make you feel? Did you suddenly find yourself switching gears emotionally? Did the ad seem jarring and inappropriate? Did you wind up resenting the advertiser?

There’s ample evidence to back up the belief that video advertising performs better when it aligns with the consumer’s mood. A 2015 report from Oxford University, for instance, showed that upbeat, cheerful ads run during a moment of tension in a movie made far less of an impact on consumers, leading to diminished brand recall and shorter viewing times. The swing in emotions causes viewers to enter a state of “deactivation,” marked by lower physical and cognitive activity.

Unfortunately, few advertisers are taking context into account. But they have the power to, using contextual video advertising enhanced by artificial intelligence. AI tools and precision targeting are allowing advertisers to better sync their ads with the surrounding content and the viewer’s mood. Read on to get a sense of where this market is and where it could be headed. Also, be sure to check out the Outsmart Your Video Competition with Watson white paper for an idea of how IBM’s Watson will start to change this landscape.
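To make the idea concrete, here is a simplified, hypothetical sketch of mood-aware ad selection in Python: given a tone label for the current scene (the sort of signal an AI video-analysis service could supply), it prefers ads whose tone fits rather than serving a jarring mismatch. The Ad structure, tone labels, and fit scores are illustrative assumptions, not part of any IBM Watson Media offering.

```python
# Hypothetical sketch: pick the ad that balances bid value against tonal fit
# with the surrounding scene. All data structures and weights are illustrative.
from dataclasses import dataclass


@dataclass
class Ad:
    name: str
    tone: str   # e.g. "somber", "neutral", "upbeat"
    bid: float  # what the advertiser is willing to pay for the slot


# Compatibility of (scene tone, ad tone): 0 is jarring, 1 is well aligned.
TONE_FIT = {
    ("somber", "somber"): 1.0, ("somber", "neutral"): 0.7, ("somber", "upbeat"): 0.2,
    ("neutral", "somber"): 0.6, ("neutral", "neutral"): 1.0, ("neutral", "upbeat"): 0.7,
    ("upbeat", "somber"): 0.2, ("upbeat", "neutral"): 0.7, ("upbeat", "upbeat"): 1.0,
}


def pick_ad(scene_tone: str, inventory: list[Ad]) -> Ad:
    """Choose the ad with the best combination of bid and tonal fit."""
    return max(inventory, key=lambda ad: ad.bid * TONE_FIT[(scene_tone, ad.tone)])


inventory = [
    Ad("cheerful soda spot", tone="upbeat", bid=12.0),
    Ad("insurance brand film", tone="somber", bid=9.0),
    Ad("bank retirement ad", tone="neutral", bid=10.0),
]

# During a tense, sad scene the somber ad wins despite its lower bid.
print(pick_ad("somber", inventory).name)  # -> insurance brand film
```

A production system would learn these fit scores from engagement data rather than hard-coding them, but the selection logic is the same: weigh the value of the slot against how well the ad matches the moment.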