When I first got into live streaming video, it was 1998. A group of friends and I rented a small studio space in Austin, TX with a shared T-1 line and began streaming live bands and DJs four nights a week. We used software from Real Media, and the data rate of the postage stamp-sized video was 56 kbit/s. Viewers (all 30 of them) tuning in to the stream connected to a Real Media server running in our office. It wasn’t long before the building’s IT department called and asked why we were using so much data at 8 PM on a Friday night. Fast-forward 18 years, and our infrastructure is regularly serving multiple 1080p HD streams at 4.5 Mbit/s to thousands of concurrent viewers. How do we manage the multi-gigabit/s workload? Content Delivery Networks.
"Fewer servers mean less capital expenditure and reduce our footprint in the colocation facility/cloud, translating into lower recurring charges and maintenance"
The basic principle of CDNs is to cache and serve frequently requested data, such as images, at the network edge, close to the end user.
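To make that principle concrete, here is a minimal sketch in Python: a toy in-memory cache that serves repeat requests locally and only goes back to a (hypothetical) origin on a miss or after a fixed TTL expires. Real edge nodes add eviction, revalidation, and honor the origin's Cache-Control headers; this only illustrates the hit/miss flow.

```python
# Toy edge cache: serve from local memory when fresh, fetch from the
# (hypothetical) origin on a miss. Illustrative only.
import time
import urllib.request

ORIGIN = "https://origin.example.com"  # hypothetical origin server
TTL_SECONDS = 60                       # how long a cached object stays fresh

_cache = {}  # path -> (expires_at, body)

def get(path: str) -> bytes:
    entry = _cache.get(path)
    if entry and entry[0] > time.time():
        return entry[1]  # cache hit: no round-trip to the origin
    with urllib.request.urlopen(ORIGIN + path) as resp:  # cache miss
        body = resp.read()
    _cache[path] = (time.time() + TTL_SECONDS, body)
    return body
```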
Luckily for us, this concept works just as well for live and on-demand video as it does for images. These servers (edge nodes) are distributed around the globe in major population centers and interconnected with high-speed links. As the popularity of streaming video has grown explosively, CDNs have kept pace by optimizing their infrastructure and deploying more and more edge nodes to cope with the increased traffic. End users get a better experience because the video data is served from nearby edge nodes with minimal latency, helping to eliminate those incredibly annoying interruptions in playback caused by buffer underruns.

From a provider standpoint, we leverage this infrastructure to scale elastically and control costs. Our servers (the origin nodes) only fulfill requests from the CDN, which in turn fulfills requests from the viewers. This means our workload and egress traffic for the video streams from the colocation facility or cloud stay the same whether we are serving 100 or 100,000 users. Fewer servers mean lower capital expenditure and a smaller footprint in the colocation facility or cloud, translating into lower recurring charges and less maintenance. Further, because CDNs operate on a pay-per-use pricing model, spiky traffic from popular live events isn’t a problem. It is no longer necessary to buy IP transit on a 95th-percentile billing model and pay huge penalties for bursting while letting excess capacity sit idle.
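For a sense of scale, here is the back-of-the-envelope arithmetic behind that claim, using the 4.5 Mbit/s per-viewer bitrate mentioned earlier. The viewer-facing number grows linearly with the audience, while the origin's cache-fill traffic per stream does not.

```python
# Aggregate viewer-facing bandwidth at the article's 1080p bitrate.
BITRATE_MBPS = 4.5  # Mbit/s per viewer

def edge_egress_gbps(viewers: int) -> float:
    """Total bandwidth the CDN's edge nodes serve to viewers, in Gbit/s."""
    return viewers * BITRATE_MBPS / 1000

# The CDN absorbs viewer demand; the origin only fills edge caches,
# so its egress per stream is roughly constant regardless of audience.
for viewers in (100, 100_000):
    print(f"{viewers:>7} viewers -> {edge_egress_gbps(viewers):,.1f} Gbit/s at the edge")
#     100 viewers ->   0.5 Gbit/s at the edge
#  100000 viewers -> 450.0 Gbit/s at the edge
```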
In recent years, we’ve seen the decline of specialized streaming media servers from vendors like Real Media, Microsoft, and Adobe, along with their corresponding protocols, and CDNs have systematically been retiring these service offerings. Thanks to the HLS and MPEG-DASH formats, HTTP is now a viable solution for streaming video. This is ultimately a good thing, because CDNs tend to have a much larger footprint of HTTP edge nodes, and the result is a better end-user experience. It also means it’s easier than ever to create an in-house video solution that leverages the CDN for cost-effective delivery at scale. While I prefer media server software from vendors like Wowza, it’s entirely possible to serve streaming video with existing HTTP servers. If your organization already has a CDN relationship for static content, you likely have everything needed to get started with streaming video.
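As an illustration of how ordinary the delivery side has become, below is a minimal sketch (not a production setup) of serving pre-packaged HLS files with Python's stock http.server. The MIME types are the standard ones for HLS playlists and segments; the cache-lifetime values are illustrative assumptions, chosen so a CDN would cache immutable segments aggressively while re-checking live playlists frequently. A real packager or a media server like Wowza produces the files; this only shows that delivery is plain HTTP.

```python
# Minimal HLS delivery over plain HTTP. Assumes .m3u8 playlists and
# .ts segments already exist in the working directory.
from http.server import HTTPServer, SimpleHTTPRequestHandler

class HLSHandler(SimpleHTTPRequestHandler):
    # Map HLS file types to their MIME types so players recognize them.
    extensions_map = {
        **SimpleHTTPRequestHandler.extensions_map,
        ".m3u8": "application/vnd.apple.mpegurl",
        ".ts": "video/mp2t",
    }

    def end_headers(self):
        # Live playlists change as the stream advances, so keep their TTL
        # short; finished segments never change, so let the CDN keep them.
        # TTL values here are illustrative assumptions, not a recommendation.
        if self.path.endswith(".m3u8"):
            self.send_header("Cache-Control", "max-age=2")
        elif self.path.endswith(".ts"):
            self.send_header("Cache-Control", "max-age=31536000, immutable")
        super().end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8080), HLSHandler).serve_forever()
```

Point the CDN's origin-pull configuration at a server like this, and the edge nodes handle the viewer-facing load.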