Adaptive Streaming and Picture Quality
Ever wonder what could happen to picture quality when a given screen is displaying a “downshifted” stream of video, sent using adaptive bit-rate techniques? I did, and was glad to soak up a session about it at the recent SCTE Cable-Tec Expo.
Short version: Arris CTO Tom Cloonan and colleague Jim Allen built an emulator in their lab to sample what happens when different types of traffic get smooshed together on the Internet-protocol plant.
Refresher: Tons of video is moving over the Internet. Unprecedented growth. It uses a lot of bandwidth, comparatively. Everyone’s working on it - by adding IP bandwidth and by working the end points. The “clients,” in the lingo, are your other screens - laptops, tablets, smartphones.
From a bandwidth perspective, here in the twilight of 2011 (and the eve of big channel bonding), adding more IP bandwidth means going beyond the two to four downstream digital channels reserved for broadband and voice-over-IP services. (Watch for this to rise to 12-18 bonded channels in the next few years.)
Consequently and inevitably, video-service providers will start increasing the types of traffic sent over the IP part of the plant. That means plain old Web browsing, plus whatever’s moving “over the top” on the public Internet, plus the newer “managed IP” video services.
On the client (screen) side of the equation, adaptive bit-rate streaming (a.k.a. “fragmented” streaming) is big. It works by encoding a video stream at several bit rates - in the Arris experiments, 3, 2.1, 1.5 and 1 Megabits per second - and chunking each one into short segments, so that if bandwidth isn’t available to play the bigger chunk, the client can request a smaller one next.
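To make that concrete, here’s a minimal sketch - mine, not Arris’s - of how a client might pick its next chunk from that bit-rate ladder. The 80% safety margin and the function name are illustrative assumptions.

# Minimal sketch of adaptive bit-rate chunk selection (illustrative only, not
# the Arris simulator; the 0.8 safety margin is an assumption).
BITRATE_LADDER_MBPS = [3.0, 2.1, 1.5, 1.0]    # the rungs cited in the Arris experiments

def pick_next_chunk(measured_throughput_mbps):
    """Return the highest rung the client thinks the network can sustain."""
    usable = measured_throughput_mbps * 0.8   # leave headroom for other traffic
    for rate in BITRATE_LADDER_MBPS:          # ladder is ordered highest-first
        if rate <= usable:
            return rate
    return BITRATE_LADDER_MBPS[-1]            # congested: fall back to the 1 Mbps rung

# Example: the plant gets busy and throughput drops to 2 Mbps...
print(pick_next_chunk(2.0))   # -> 1.5, i.e. a "downshift"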
Which brings us back to the question: What happens on your various screens when network congestion causes a downshift in video bit delivery?
Nice descriptive language in this wheelhouse, by the way. Example: Things that can go wrong crop up as “rendering engine starvation” and “video-resolution dithering.”
Both conditions stem from network congestion - the former when the software in the end-point device (tablet, TV) doesn’t get enough bits; the latter when not enough bits arrive to render a good-quality picture, causing the screen to “dither” between 1080p and lower resolutions.
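A back-of-the-envelope model shows why starvation happens: the player’s buffer fills at the delivered bit rate, drains at the playback bit rate, and stalls when it hits empty. The numbers below are illustrative assumptions, not figures from the paper.

# Toy model of "rendering engine starvation" (illustrative numbers, not from the paper).
def seconds_until_starvation(buffer_seconds, delivered_mbps, playback_mbps):
    """How long the buffer lasts when delivery can't keep up with playback."""
    if delivered_mbps >= playback_mbps:
        return float("inf")                             # buffer never empties
    drain_rate = 1 - delivered_mbps / playback_mbps     # seconds of video lost per second played
    return buffer_seconds / drain_rate

# A 10-second buffer on a 3 Mbps stream, with only 2 Mbps arriving:
print(seconds_until_starvation(10, 2.0, 3.0))   # -> 30.0 seconds before the picture stalls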
Also factored into the simulator: an “aggressiveness factor,” to assess who does what when bandwidth does become available. As it turns out, some client software is more aggressive than others - meaning it jumps up to a higher-resolution chunk, lickety-split.
Generally speaking, though, the simulator found that most adaptive streaming protocols back off quickly in times of congestion. Sort of a digital cacophony of, “After you. No, after you.”
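Here’s a rough sketch of how those two behaviors - eager upshifting, governed by an aggressiveness factor, and quick back-off under congestion - might look in client logic. The thresholds and step sizes are my assumptions, not the paper’s.

# Sketch of an "aggressiveness factor"; the exact formula is an illustrative assumption.
BITRATE_LADDER_MBPS = [3.0, 2.1, 1.5, 1.0]    # highest rung first

def next_rung(current_mbps, measured_mbps, aggressiveness):
    """Pick the next chunk's rung; aggressiveness in (0, 1], higher = bolder upshifts."""
    i = BITRATE_LADDER_MBPS.index(current_mbps)
    if measured_mbps < current_mbps:
        return BITRATE_LADDER_MBPS[-1]        # congestion: back off hard, right away
    # Upshift only if measured throughput clears a bar that shrinks as aggressiveness grows.
    if i > 0 and measured_mbps >= BITRATE_LADDER_MBPS[i - 1] * (2.0 - aggressiveness):
        return BITRATE_LADDER_MBPS[i - 1]
    return current_mbps

# A timid client (0.5) stays at 1.5 Mbps; an aggressive one (1.0) jumps to 2.1.
print(next_rung(1.5, 2.2, 0.5), next_rung(1.5, 2.2, 1.0))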
This just scratches the surface of the 34-page paper and its companion presentation, titled “Competitive Analysis of Adaptive Video Streaming Implementations.” For more, contact the SCTE (www.scte.org).
Stumped by gibberish? Visit Leslie Ellis at www.translation-please.com or multichannel.com/blog.