High Availability Video Streams for March Madness Live



[Image: mm-bts-pic]

Three seconds are left on the clock and your team is down by two points in the National Championship game. This is it! The ball is dribbled the length of the court and a 3-point shot is hoisted in desperation as time expires… the video goes dark.

How terrible would that be?

March Madness Live is a huge undertaking and requires delivering millions of high-quality video streams over 15 days to basketball fans throughout the country. We were challenged with providing a great video experience with zero downtime for fans everywhere…and we delivered!

iStreamPlanet is no stranger to high-profile events, having streamed multiple Olympic Games, Super Bowls, NBA basketball, the Masters golf tournament, Premier League soccer, E-League and countless others. This was the first year we delivered March Madness Live, and it provided some interesting challenges, particularly with regard to video stream redundancy. Our goal was to ensure redundant stream coverage from the video sources (arenas, studios and ads) all the way down to the users' video players so that fans could catch every tick of the clock.

In this article, I will share some background information on our video delivery workflow for March Madness Live 2017. This workflow delivered over 275 hours of uninterrupted, high-quality video with iStreamPlanet’s Aventus encoder.

The graphic below depicts a high-level overview of the complete video signal flow for one game. As you can see, there are several steps in the workflow, each built to support redundancy and automatic failover. In the sections that follow, we will break down each of the major components and explain how we solved the challenge of bringing March Madness Live to fans across the country.

[Figure: mm-high-avail-fig1]

Source Feeds from the Arenas and Studios

The tournament began with overlapping games at four different arenas across the country. The video feeds from each arena were sent over fiber to Turner's Broadcast Operations Center (BOC) in Atlanta, GA. Additionally, the studio feeds from CBS were sent to Turner's BOC. These feeds were used for the television broadcast on the CBS, TNT, TBS and truTV networks as well as for Internet distribution. The resultant television broadcast was then received off satellite at Turner's BOC, where it served as a backup in case the arena or studio feeds went down. Finally, a disaster recovery (DR) feed was received over satellite by a third party. Each of these feeds was then converted from the television broadcast format (HD-SDI) into high-bitrate streams using contribution encoders for distribution to iStreamPlanet.

[Figure: mm-high-avail-fig2]

Source Feeds are sent to iStreamPlanet

Once the video feeds are converted to IP, they are sent to our Primary Cloud as well as to a discrete Backup Cloud. To ensure we received all signals successfully, the contribution encoders at Turner sent duplicate copies of the streams over primary and secondary circuits from different network providers, while the DR feed was sent over a third circuit from yet another provider. These feeds were then ingested into our Cloud Video Routing system at iStreamPlanet. Ingesting the feeds over three different circuits provided redundancy should any one of them have issues.
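
For readers curious how duplicate copies of the same stream translate into redundancy, here is a simplified sketch of one common approach (in the spirit of SMPTE ST 2022-7 seamless protection): forward whichever copy of each packet arrives first and drop the duplicates. It is purely illustrative and not the actual Cloud Video Routing implementation.

```python
class DuplicateFilter:
    """Keep the first copy of each packet seen across the redundant circuits."""

    def __init__(self, window=10000):
        self.seen = set()
        self.window = window  # bound memory by forgetting the oldest sequence numbers

    def accept(self, sequence_number):
        """Return True if this packet should be forwarded downstream."""
        if sequence_number in self.seen:
            return False          # a copy already arrived on another circuit
        self.seen.add(sequence_number)
        if len(self.seen) > self.window:
            self.seen.discard(min(self.seen))
        return True


dedup = DuplicateFilter()
for circuit, seq in [("primary", 1), ("secondary", 1), ("dr", 1), ("secondary", 2)]:
    if dedup.accept(seq):
        print(f"forwarding packet {seq} (first copy arrived via {circuit})")
```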

[Figure: mm-high-avail-fig3]

Feeds are Selected and Ads Inserted using a Video Switcher

Once the video streams are in our network we begin processing them. At this point we have 7 different video feeds:

  • Primary & Secondary Arena
  • Primary & Secondary Studio
  • Primary & Secondary Backup
  • Disaster Recovery

We provided production services using a Video Switcher that we developed specifically for March Madness. The video switching is controlled by our Digital Control Panel application whose role in life is to switch between the Arena, Studio, Backup and DR feeds as well as insert all those wonderful ads.

To ensure we always have an active stream coming into the Video Switcher, we run video inputs in an Active/Passive configuration whereby the Video Routers send their respective instances of the source feed to different ports on each Video Switcher. The Video Switcher starts processing video on the Primary port, and if that stream is lost it automatically switches over to the Secondary stream. The Video Switchers themselves run in an Active/Active configuration in that they both receive the same inputs and produce an equivalent output. Each Video Switcher produces one output, which is then sent to both the Primary and Secondary instances of our Aventus encoder.
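
To make the idea concrete, here is a minimal sketch of that Active/Passive input selection: stay on the Primary port and fail over to the Secondary when the Primary signal goes quiet. The port numbers and loss threshold are placeholder values for illustration, not the Video Switcher's actual configuration.

```python
import time

# Placeholder port numbers and loss threshold, chosen only for illustration.
PRIMARY_PORT, SECONDARY_PORT = 5000, 5001
SIGNAL_LOSS_SECONDS = 1.0


class ActivePassiveInput:
    """Prefer the Primary port; fail over to the Secondary if the Primary goes silent."""

    def __init__(self):
        now = time.monotonic()
        self.last_packet = {PRIMARY_PORT: now, SECONDARY_PORT: now}

    def on_packet(self, port):
        # Called by the receive loop for every packet that arrives on a port.
        self.last_packet[port] = time.monotonic()

    def active_port(self):
        # Use the Primary port unless it has gone quiet for too long.
        primary_age = time.monotonic() - self.last_packet[PRIMARY_PORT]
        return SECONDARY_PORT if primary_age > SIGNAL_LOSS_SECONDS else PRIMARY_PORT
```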

Insertion of ads is a critical component since they drive revenue for Turner and CBS. Failure to insert an ad could result in a significant loss of revenue, so one could argue this is the most important component. We utilized an Ad Scheduling service with several instances residing behind a network load balancer, which distributes load and redirects traffic should one of the instances become unavailable. Additionally, a separate repository of ads is maintained in the cloud as well as a local copy on each Video Switcher.
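
As a simplified illustration of that ad redundancy, the sketch below prefers the local copy of an ad on the Video Switcher and falls back to the cloud repository if it is missing. The directory path and repository URL are placeholders, not our real configuration.

```python
import os
import urllib.request

# Placeholder locations; the real local cache path and cloud repository URL differ.
LOCAL_AD_DIR = "/var/ads"
CLOUD_AD_REPO = "https://ads.example.com"


def fetch_ad(ad_id: str) -> bytes:
    """Prefer the local copy on the Video Switcher; fall back to the cloud repository."""
    local_path = os.path.join(LOCAL_AD_DIR, f"{ad_id}.ts")
    if os.path.exists(local_path):
        with open(local_path, "rb") as f:
            return f.read()
    # No local copy: pull the asset from the cloud repository instead.
    with urllib.request.urlopen(f"{CLOUD_AD_REPO}/{ad_id}.ts") as response:
        return response.read()
```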

Should problems arise with the Arena or Studio feeds, our personnel controlling the Digital Control Panel application will manually switch the source selection to the Backup feed. If the Backup is having issues as well, the DR feed will be manually selected.

[Figure: mm-high-avail-fig4]

Video is Encoded for Internet Distribution

Aventus is our cloud-based encoder which takes an input video stream and encodes it for Internet distribution using an 8-layer HLS configuration. For March Madness, we utilized two encoders running in an Active/Active configuration where they are both processing input video. Like the video ingest mechanism we use on the Video Switchers, Aventus receives duplicate copies of the stream from each of the Video Switchers on different ports. Aventus processes video input in an Active/Passive mode by listening on the primary port and then switching to the secondary port if the signal is interrupted…and then back again if ever necessary.
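
To give a sense of what an 8-layer HLS configuration looks like, here is a hypothetical bitrate ladder. The resolutions and bitrates below are placeholders chosen only to convey the shape of the configuration, not the actual ladder we used for March Madness.

```python
# A hypothetical 8-layer HLS ladder; these resolutions and bitrates are
# placeholders to show the shape of the configuration, not the actual values.
HLS_LAYERS = [
    {"name": "1080p",      "resolution": (1920, 1080), "bitrate_kbps": 6000},
    {"name": "720p-high",  "resolution": (1280, 720),  "bitrate_kbps": 4000},
    {"name": "720p",       "resolution": (1280, 720),  "bitrate_kbps": 2500},
    {"name": "540p",       "resolution": (960, 540),   "bitrate_kbps": 1800},
    {"name": "480p",       "resolution": (854, 480),   "bitrate_kbps": 1200},
    {"name": "360p",       "resolution": (640, 360),   "bitrate_kbps": 800},
    {"name": "288p",       "resolution": (512, 288),   "bitrate_kbps": 500},
    {"name": "audio-only", "resolution": None,         "bitrate_kbps": 128},
]
```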

Aventus encodes and publishes all 8 HLS video layers to multiple publishing points on a Content Delivery Network (CDN). Because we are publishing to multiple locations from different encoders, we can tolerate disruptions in service if one of the CDN entry points becomes unreachable.
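
The sketch below shows the general pattern: push each encoded segment to every CDN entry point and tolerate the loss of any single publishing point. The URLs are placeholders and the code is an illustration, not the Aventus publishing pipeline.

```python
import urllib.request

# Placeholder CDN entry points; the real publishing points are not shown here.
PUBLISHING_POINTS = [
    "https://ingest-east.cdn.example.com/mml",
    "https://ingest-west.cdn.example.com/mml",
]


def publish_segment(layer_name: str, segment_name: str, data: bytes) -> int:
    """PUT a segment to every entry point; tolerate individual failures."""
    delivered = 0
    for base_url in PUBLISHING_POINTS:
        try:
            request = urllib.request.Request(
                f"{base_url}/{layer_name}/{segment_name}", data=data, method="PUT")
            urllib.request.urlopen(request, timeout=5)
            delivered += 1
        except OSError:
            # One unreachable entry point is fine as long as another accepts it.
            continue
    return delivered
```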

[Figure: mm-high-avail-fig5]

Duplicate Workflow Running in Backup Cloud

While we have a great deal of redundancy built into the main workflow in our Primary Cloud, we aren't protected should something catastrophic happen to that geographic region. Therefore, we maintain a complete duplicate of the workflow running in a Backup Cloud. This backup system publishes its output to a different entry point on the CDN.

[Figure: mm-high-avail-fig6]

Fans Enjoy the Games

Finally, the video is distributed to 15 different platforms for consumption. Each player receives a video stream and selects one of the 8 video layers exposed to it. Each layer has a different video resolution, bitrate and complexity associated with it. The players monitor network bandwidth, CPU utilization and other factors to automatically select the layer that will produce the best quality. Continual monitoring of these factors allows the media player to dynamically switch between layers so it's always delivering the best experience it can. Additionally, should one of the streams become unavailable, the player can be directed to one of several backup streams.
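
As a simplified illustration of how a player might pick a layer, the sketch below chooses the highest-bitrate layer that fits within the measured bandwidth, with some headroom. Real players weigh additional factors such as CPU load, buffer health and screen size; the numbers here are made up.

```python
# Simplified adaptive-bitrate selection: choose the highest-bitrate layer that
# fits within the measured bandwidth, leaving some headroom. The numbers are
# illustrative; real players also weigh CPU load, buffer health and screen size.
HEADROOM = 0.8  # only budget 80% of the measured bandwidth


def choose_layer(layers, measured_kbps):
    affordable = [l for l in layers if l["bitrate_kbps"] <= measured_kbps * HEADROOM]
    if not affordable:
        return min(layers, key=lambda l: l["bitrate_kbps"])  # fall back to the lowest layer
    return max(affordable, key=lambda l: l["bitrate_kbps"])


layers = [{"name": "1080p", "bitrate_kbps": 6000},
          {"name": "720p", "bitrate_kbps": 4000},
          {"name": "360p", "bitrate_kbps": 800}]
print(choose_layer(layers, measured_kbps=5500)["name"])  # -> 720p
```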

[Figure: mm-high-avail-fig7]

Success

I am extremely happy to say that throughout the entire tournament there were no major problems… and the few minor bumps introduced by external factors were handled by the system with minimal exposure to viewers.

We spent a good deal of time and thought engineering this solution. While there are additional costs associated with the level of redundancy we created, they are well worth the peace of mind of knowing you are prepared for just about anything.

Hopefully you had the opportunity to enjoy the games and experience the high-quality video experience over the Internet that we helped to deliver. Three seconds are left on the clock and your team is down by two points in the National Championship game. This is it! The ball is dribbled the length of the court and a 3-point shot is hoisted in desperation as time expires… SWISH!!!

Dan Penn is a Director of Engineering at iStreamPlanet. His team architected and developed the workflow that powered March Madness Live 2017.
