This has been a big year for IBM Watson Media. Since launching a year ago, we’ve worked with the Grammys, FOX Sports and the World Cup, and The Masters, and introduced a new product: Watson Captioning. A highly trainable, AI-powered offering, Watson Captioning provides broadcasters and publishers alike with a new tool to take closed captions to the next level. Today, we’re excited to kick off a new collaboration with Sinclair Broadcast Group that will roll out Watson Captioning to all of Sinclair’s local stations across the United States, making live programming more accessible to local viewers, including the Deaf community, senior citizens, and anyone experiencing hearing loss.
Television requirements for closed captioning were established in 1996, but more than two decades later, live captioning remains both challenging and labor-intensive for production teams to deliver in real time. As a result, breaking news, weather, and live sports segments often have delayed or incorrect captions, leading to a confusing and occasionally frustrating viewing experience. With our Watson Captioning technology, Sinclair will be able to improve caption accuracy, automate time-intensive manual processes, and reduce production costs, all while providing captions in real time at scale.
Using Watson to Help Increase Accessibility for the Deaf Community
Nearly 48 million Americans experience some degree of hearing loss. For viewers who depend on closed captioning to fully engage with broadcast content, accurate captions are essential. Watson Captioning is designed to solve this key challenge by automating the captioning process to increase production team efficiency. And thanks to IBM’s deep learning capabilities, Watson Captioning’s accuracy, from spelling to punctuation, improves over time. By generating more timely and precise captions, we’re able to elevate the overall viewing experience and empower all viewers to get the most value out of their content.
As the viewer experience continues to evolve, the quality of closed captioning must evolve in tandem. Breakthroughs in new technology, like artificial intelligence, allow broadcasters to help ensure that all communities can access and enjoy local TV programming.
Training Watson with Hyper-local Data for Hyper-local News
Unique to our collaboration with Sinclair, we’re equipping individual markets with station-specific data sets to generate more precise captions for local broadcast. Hosted on IBM Cloud, Watson Captioning is trained several times a day on each station’s market-specific terminology, so that the solution can accurately identify local landmarks, events, and even names of local personalities and politicians. Over time, we anticipate the solution will get smarter, reducing the need for manual intervention, increasing accuracy rates, and improving the viewer experience.
This collaboration is the latest step in Sinclair’s longstanding commitment to using emerging technology to improve local viewer experiences. The company already leverages weather data, broadcast production tools, and visualization features from The Weather Company, an IBM Business, to help it forecast, detect, and present the weather for audiences across the markets it serves.
Of our joint work, Sinclair’s Chief Technology Officer, Del Parks, said, “We are constantly looking for new ways to optimize the local viewer experience. We noted impressive accuracy and efficiencies with IBM Watson, and look forward to rolling out IBM’s cutting-edge captioning technology across our footprint.”
To learn more about using artificial intelligence to generate captions, download the white paper Captioning Goes Cognitive: A New Approach to an Old Challenge. To hear more about automating real-time captions for broadcast use, contact us or request a demo.