Looking for a mobile video platform?
In October 2016, mobile usage on the Internet exceeded desktop usage for the first time. This landmark occurrence had been a long time coming, and was largely attributed to the influence of smartphones (which accounted for 46.53% of Internet traffic versus 4.73% for tablets). The shift in online video is showing a similar trend. As highlighted in our Video Trends to Look for in 2017, 2016’s data had already shown a major shift to mobiles for video content. In fact, for the year as a whole, mobiles accounted for an average of 47.32% of video streaming traffic. What was particularly enlightening was the growth in the enterprise sector. For 2015, average mobile usage was just 5.85% for streaming video, while in 2016 that had grown to an average of 28.80%.
This change in dynamics has painted a picture where content owners in 2018 and beyond have to support mobile users. This article outlines how services are creating content that is mobile compatible, what codecs content owners should be using, the importance of adaptive streaming, and, especially for live streaming, live transcoding.
- Mobile compatibility for video
- Reaching mobiles with live video
- Adaptive bitrate streaming for mobiles
- Live transcoding for live adaptive streaming
Mobile compatibility for video
To reach mobile devices, content owners need access to an HTML5 video player and a supported protocol for the video content.
For IBM’s video streaming and enterprise video streaming offerings, an HTML5 player is provided with different protocols depending on the type of content. For HTML5 desktop delivery, through HTML5 MSE, the service uses mp4 chunks and supports H.264 for video and AAC for audio delivery. For mobile and CE (consumer electronics) devices, HTTP Live Streaming (HLS), Apple’s HTTP-based streaming protocol, is used and supports H.264 and AAC. This means content owners should be creating or encoding content that uses the H.264 video codec and AAC as the audio codec when possible.
Due to support by many browsers and devices, HLS has surfaced as a key choice of media streaming protocol for reaching mobiles. This includes being able to support iOS devices, such as iPhones and iPads, as well as Android devices, such as the Samsung Galaxy line and Google Pixel line. At its core, HLS for video takes MPEG-TS content and creates short video chunks that are delivered through HTTP. Due to its use of HTTP, the technology is compatible across numerous devices and more easily supported for use with firewalls.
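To make the chunking concrete, here is a minimal sketch of how an HLS media playlist (.m3u8) lists those short chunks for delivery over HTTP. The segment names and durations below are illustrative, not taken from any real service.

```python
# Minimal sketch of an HLS media playlist: a plain-text index of short
# MPEG-TS chunks, each fetched by the player over ordinary HTTP.

def build_media_playlist(segments, target_duration=6):
    """Build a simple HLS media playlist for a list of (uri, duration) chunks."""
    lines = [
        "#EXTM3U",
        "#EXT-X-VERSION:3",
        f"#EXT-X-TARGETDURATION:{target_duration}",
        "#EXT-X-MEDIA-SEQUENCE:0",
    ]
    for uri, duration in segments:
        lines.append(f"#EXTINF:{duration:.3f},")  # chunk duration in seconds
        lines.append(uri)
    lines.append("#EXT-X-ENDLIST")  # marks on-demand (non-live) content
    return "\n".join(lines)

playlist = build_media_playlist([
    ("segment0.ts", 6.0),  # MPEG-TS chunks, typically a few seconds each
    ("segment1.ts", 6.0),
    ("segment2.ts", 4.5),
])
print(playlist)
```

Because the player only ever issues standard HTTP GET requests for the playlist and its chunks, the format traverses firewalls and caches as easily as any web page.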
Reaching mobiles with live video
While HLS is the preferred protocol for delivery, it’s not uncommon for a different protocol to be used as the source for live streams. For many broadcasters, this is RTMP (Real-Time Messaging Protocol). This protocol was originally developed by Macromedia for streaming media content over the Internet, between a Flash player and a server. Flash has compatibility issues, though, first at the mobile level, and browsers like Firefox have announced end-of-life plans for the technology. So the solution is to take the protocol, output by encoders like Telestream’s Wirecast and NewTek’s TriCaster, and convert it into a mobile-friendly format.
Here at IBM Watson Media, that is done through a proprietary media server application. This is part of a solution for large scale live stream delivery with complex server-side business logic and access control that maintains a bidirectional connection with each connected client. This allows for ingesting in RTMP while delivery, through live transcoding, supports HTTP-based streaming, including the HTTP Live Streaming (HLS) protocol, and also legacy RTMP. The latter is for compatibility with older browsers, like Internet Explorer 11 on Windows 7, which has known issues with HLS.
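The ingest-one-protocol, deliver-many pattern can be sketched as a simple server-side decision. This is a hypothetical illustration, not IBM’s actual implementation; the client capability flags are assumptions for the example.

```python
# Hypothetical sketch of server-side delivery selection: RTMP comes in
# from the encoder, and the protocol handed to each viewer depends on
# what their client supports. Capability flags here are illustrative.

def pick_delivery_protocol(client):
    """Choose a delivery protocol for a connected client."""
    if client.get("supports_hls"):
        return "hls"              # mobiles, CE devices, modern browsers
    if client.get("supports_mse"):
        return "http-mp4-chunks"  # HTML5 desktop delivery via MSE
    return "rtmp"                 # legacy fallback, e.g. IE11 on Windows 7

print(pick_delivery_protocol({"supports_hls": True}))  # hls
print(pick_delivery_protocol({}))                      # rtmp
```

The key point is that the broadcaster only sends RTMP once; the server fans it out in whatever format each connected client can play.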
Adaptive bitrate streaming for mobiles
If reaching mobiles is an important market for you, being able to deliver an experience that supports adaptive bitrate delivery can be key. Adaptive streaming, sometimes abbreviated as ABR, is a technology that can “adapt” based on the device trying to watch the content. While this takes into consideration things like the viewing window size, it is often associated and praised for its ability to manage different connection speeds.
This can be important as mobile devices can be in situations where their connection speed is less than desirable. This can happen all too often when someone is using their data plan, such as during their commute. In these situations, without adaptive streaming, the viewer would be stuck with constant buffering. With adaptive streaming, though, the system should switch to an ideal resolution and bitrate from those available. So a mobile user with a slow connection will get a lower resolution and bitrate, while a mobile user with a fast connection can get an HD version. Ultimately, the user gets a version of the stream best suited for their available download speed.
This check is done continuously throughout the viewing experience. So if a viewer starts watching from a fast Wi-Fi connection and then switches to their data plan, the quality of the stream will shift accordingly. Once the system detects that the viewer should be on a different bitrate and resolution combination, it will switch over at the next available keyframe (I-frame).
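The selection logic described above can be sketched in a few lines. The rendition ladder and the safety factor below are assumptions for illustration, not values from any particular player.

```python
# Illustrative sketch of the adaptive-bitrate decision: given the
# renditions available and the bandwidth measured while downloading
# recent chunks, pick the highest bitrate that fits with some headroom.

RENDITIONS = [  # (label, video bitrate in kbps), highest first
    ("1080p", 4000),
    ("720p", 2500),
    ("480p", 1250),
    ("240p", 500),
]

def pick_rendition(measured_kbps, safety=0.8):
    """Return the best rendition whose bitrate fits the measured bandwidth."""
    budget = measured_kbps * safety  # leave headroom for fluctuation
    for label, kbps in RENDITIONS:
        if kbps <= budget:
            return label
    return RENDITIONS[-1][0]  # too slow for everything: serve the lowest

print(pick_rendition(6000))  # fast Wi-Fi -> 1080p
print(pick_rendition(1200))  # congested data plan -> 240p
```

A real player re-runs this check as each chunk downloads and, as noted above, only applies the switch at the next keyframe so playback stays seamless.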
Read this Adaptive Streaming white paper to learn more about the process.
Live transcoding for live adaptive streaming
When choosing a mobile video platform, adaptive bitrate streaming should be a requirement. Thankfully, many providers have this technology already in their solution. What’s not as common, though, is using live transcoding to create additional bitrates for live content. This is the process of taking a source stream and creating more resolution options that can be used as part of the adaptive bitrate process.
Without live transcoding, a broadcaster would have to send multiple live streams with these various bitrate and resolution combinations. For example, they would have to send a 1080p version, a 720p version, a 480p version, and other variants. While possible, this is ill advised for two reasons. The first is that it overly taxes the hardware being used to create the live stream. Be it a software-based encoder or a standalone hardware solution, having to create and send additional streams will cause additional strain. The second, and more important, is that it requires more upload speed as well. A common pitfall of live streaming, especially for those new to it, comes from issues with the available connection.
A good rule of thumb is to secure about twice the upload speed of the content you plan to stream out. So a 4 Mbps broadcast would require an 8 Mbps upload speed. Other variables, like wireless or shared connections being less reliable, contribute to this as well. A lot of preparation goes into ensuring that a single stream can work. This includes going to the location and doing a network test. Broadcasters should do multiple tests as well, especially if using a wireless signal. Variables like the direction of the antenna or physical objects blocking the path can negatively impact the signal or cause it to fluctuate. This fluctuation can leave an inadequate amount of upload speed remaining to actually broadcast the content.
Now if a broadcaster is sending multiple streams themselves to achieve a multi-bitrate experience, securing a fast enough connection becomes that much harder. The reason is that you have to add up all of the streams to find your total output. So, as a simplified example, if you are doing a 1080p stream at 4 Mbps, while supporting a 720p version at 2.5 Mbps, a 480p version at 1.25 Mbps, and a 240p version at 0.5 Mbps, the total output is 8.25 Mbps. This would then require an upload speed of 16.5 Mbps, a large jump, and it is much harder to secure a reliable connection that will not fluctuate below the recommended level. Alternatively, with live transcoding the broadcaster would just have to send a single stream, which would then be used to create the additional bitrate and resolution combinations. As a result, live transcoding makes it much easier to secure an adequate connection while giving mobile viewers an optimal experience.
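The arithmetic above can be checked in a couple of lines, using the same example ladder and the twice-the-output rule of thumb from earlier.

```python
# Worked version of the upload-budget arithmetic: total output when the
# broadcaster sends every rendition themselves, versus sending a single
# stream and letting the service handle live transcoding.

def required_upload_mbps(stream_bitrates_mbps, headroom=2.0):
    """Rule of thumb: secure about twice the total outgoing bitrate."""
    return sum(stream_bitrates_mbps) * headroom

ladder = [4.0, 2.5, 1.25, 0.5]  # 1080p, 720p, 480p, 240p in Mbps

print(required_upload_mbps(ladder))  # all renditions sent: 16.5 Mbps
print(required_upload_mbps([4.0]))   # single stream + transcoding: 8.0 Mbps
```

The gap widens with every rendition added to the ladder, which is exactly why pushing the transcoding work server-side pays off for the broadcaster.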
The mobile audience is expected to continue to rise in importance. More and more users are electing to stream video from their phones and tablets. As a result, content owners need to make their content both compatible with mobile devices and presented in a mobile-friendly format with adaptive streaming. This can be achieved by selecting a mobile video platform that offers an HTML5 player and HLS. When doing live streaming, live transcoding should also be a consideration in order to effectively deliver live content with adaptive streaming.
Interested in learning about how IBM Watson Media offers not just a mobile video platform solution through transcoding, but is also able to scale to reach large audiences? Check out our Live Video Delivery System Built for Scalability white paper to better understand this process.