Since our release of Live Streaming in Azure Media Services last year, you have had access to the same instantly scalable, always available streaming solution that broadcasters have used time and again to deliver live events to millions of customers. You may have read through my colleagues’ blogs on using the Azure Management Portal or our .NET SDK to manage live events, and on how to produce a live feed. Previously, however, in order to use Live Streaming you were required to use an on-premises encoder to produce an adaptive bitrate video stream and push that to the cloud. With the preview release of Live Encoding, you can instead send a single bitrate live feed to Azure Media Services, have it encoded into an adaptive bitrate stream, and deliver it to a wide variety of clients in MPEG-DASH, Microsoft Smooth Streaming, Apple HLS, or Adobe HDS formats.
In this blog, I will provide an overview of this Live Encoding feature, which adds the following capabilities to Azure Media Services:
- Live encoding of a single bitrate live feed into an adaptive bitrate stream
- Ability to ingest a live feed over the RTP protocol (MPEG Transport Streams), RTMP, or Smooth Streaming
- Ability to control insertion of slates, as well as to signal insertion of advertisements to the client
- Ability to get a thumbnail preview of the live feed
What is Live Encoding?
When you are streaming a live event, your goal is to deliver high quality video, under a variety of network conditions, to every device that your customers could possibly have. The problems of quality and network conditions have a solution: adaptive bitrate streaming. And the solution to the problem of multiple devices (and their capabilities) is a re-packaging system such as dynamic packaging.
Adaptive bitrate streaming works by encoding the video into multiple video streams at different resolutions and bitrates, while keeping them synchronized. In addition, for a live event, you need to keep latency manageable by processing the incoming video in real time. This real-time video compression is Live Encoding, and it requires a lot of compute cycles. To stream live events, you are looking at hardware boxes with fast CPUs and perhaps GPU acceleration. In addition, you are producing several (6 to 10) video streams, which means you also need a lot of bandwidth to get those streams to a CDN, which can then deliver the event to your customers. Your infrastructure costs start to add up…
Live Encoding in Azure Media Services is a cloud-based workflow that addresses such infrastructure concerns. With this feature, you only need to send a single (high quality) video feed into an Azure data center, and the service handles the compute-intensive work of encoding it into an adaptive bitrate stream. This means you can run live events from remote locations (paying only for a good, fast WiFi or mobile network), using encoders built into cameras, or less expensive (even free) encoders that require less power. Our services are instantly scalable, which means you can handle spikes in your event schedule while paying only for what you use.
How do I use Live Encoding?
You can set up Live Encoding to deliver live events via the following steps:
- Decide on which protocol you will use for the input live feed (see section below)
- Create a live Channel using our APIs or the Azure Management Portal, and choose the settings that meet your live encoding needs
- Set up an on-premises encoder to send in the single (high quality) video feed
- Preview the output stream, via the Azure Management Portal, for example
- Create Programs to manage your events
Note: details about the APIs and configuration steps will be covered in upcoming blog posts; the sketch below illustrates the overall flow.
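To make these steps concrete, here is a minimal sketch using the Media Services .NET SDK. It is not the definitive implementation: the account credentials, the encoding preset name ("Default720p"), and some property values are assumptions based on the preview SDK, so check the upcoming API posts for the exact details.

```csharp
using System;
using System.Collections.Generic;
using System.Net;
using Microsoft.WindowsAzure.MediaServices.Client;

class LiveEncodingSetup
{
    static void Main()
    {
        // Connect to your Media Services account (placeholder credentials).
        var context = new CloudMediaContext("yourAccountName", "yourAccountKey");

        // Describe a Channel with Live Encoding enabled: a single-bitrate RTMP feed in,
        // an adaptive bitrate stream out. The preset and property names reflect the
        // preview .NET SDK and should be treated as assumptions.
        var options = new ChannelCreationOptions
        {
            Name = "mylivechannel",
            EncodingType = ChannelEncodingType.Standard,
            Encoding = new ChannelEncoding { SystemPreset = "Default720p" },
            Input = new ChannelInput
            {
                StreamingProtocol = StreamingProtocol.RTMP,
                AccessControl = new ChannelAccessControl
                {
                    // Wide open for the sake of the sample; restrict the IP range in production.
                    IPAllowList = new List<IPRange>
                    {
                        new IPRange { Name = "allowAll", Address = IPAddress.Parse("0.0.0.0"), SubnetPrefixLength = 0 }
                    }
                }
            },
            Preview = new ChannelPreview
            {
                AccessControl = new ChannelAccessControl
                {
                    IPAllowList = new List<IPRange>
                    {
                        new IPRange { Name = "allowAll", Address = IPAddress.Parse("0.0.0.0"), SubnetPrefixLength = 0 }
                    }
                }
            }
        };

        // Channel creation and start are long-running operations.
        IChannel channel = context.Channels.Create(options);
        channel.Start();

        // Point your on-premises encoder at the ingest URL.
        Console.WriteLine("Ingest URL: " + channel.Input.Endpoints[0].Url);

        // Create and start a Program to record and publish the event.
        IAsset asset = context.Assets.Create("myprogram-asset", AssetCreationOptions.None);
        IProgram program = channel.Programs.Create("myprogram", TimeSpan.FromHours(4), asset.Id);
        program.Start();
    }
}
```

Once your encoder is pushing to the ingest URL, you can check the feed from the preview endpoint (or the Azure Management Portal) before starting the Program.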
Supported formats and codecs
The input protocols supported by Live Encoding are: RTMP, RTP (MPEG TS) and Smooth Streaming. You can send in a live feed where the video is encoded with MPEG-2 (up to 422 Profile), or H.264 (up to High 422 Profile), and the audio is encoded with AAC-LC (up to 7.1 channels), or Dolby® Digital/AC-3 (up to 7.1 channels), or with MPEG Audio (Layer II and III, up to stereo).
The Live Encoder supports chroma subsampling from 4:2:2 to 4:2:0, and de-interlacing, as well as audio channel downmixing, audio resampling and audio dynamic range compression.
On the output, the Live Encoder can encode video to H.264 (up to High 4:2:0 Progressive), and audio to stereo or mono-channel AAC (LC, HE v1, HE v2 Profile).
The Live Encoder also supports pass-through of EIA/CEA-708 closed captions, if present in the input video feed.
For signaling advertisements, the Live Encoder supports input via API calls, or, if the input protocol is RTP, in-band SCTE-35 SpliceInsert and TimeSignal commands. On the output, our service can emit HLS Playlist Tags (SCTE-67), Smooth Streaming Sparse Tracks (SCTE-35), and HDS CueInfo Elements.
Choosing an ingest protocol
You can send the input live feed to a Channel via one of the following:
- RTMP: Most common in prosumer scenarios, where the input feed can be sent over the open internet to a nearby Azure data center, using encoders built into the camera, or using tools like Telestream Wirecast, Flash Media Live Encoder, Tricaster, etc.
- RTP: Geared towards professional broadcasters, with on-premises live encoders from vendors like Elemental Technologies, Ericsson, Ateme, Envivio, etc. The input stream is typically set up in conjunction with an IT department, using private/dedicated networks such as Microsoft Azure ExpressRoute.
- Smooth Streaming over HTTP: Typically used with on-premises live encoders from vendors like Elemental Technologies, Ericsson, Ateme, Envivio, etc. It is usually possible to send the input stream over the open internet to a nearby Azure data center.
Notes on using RTMP
When sending a live feed into a Channel over RTMP, the following constraints apply:
- Video encoded with H.264 at resolutions up to 1080p30 and stereo audio encoded with AAC-LC
- Audio sampling rate should be 44.1 kHz
- Closed GOP and CBR mode encoding recommended
- Available bandwidth should exceed the aggregate bitrate for video and audio
Notes on using RTP
If you are planning to use RTP to send in a live stream, you should expect the following from your network connection:
- High throughput (up to 1.5 times the bitrate of the input stream). Since this higher bandwidth may be required only during high-profile events rather than year round, your network choice should allow you to easily change the bandwidth commitment to reduce cost
- Low latency (under 150ms), with about 10-15 hops as reported by traceroute
- An SLA on QoS and availability
The two recommended approaches are described below. Regardless of the option chosen, a Tier 1 network provider should be used. The list of Tier 1 network providers can be found here.
RTP over public internet and Border Gateway Protocol (BGP) peering
You can use RTP over the public internet with BGP peering to the Microsoft Azure network. In this case, Internet capacity, sometimes called High Speed IP (HSIP), is provided by one or more network providers. Video data goes through the public Internet and requires a cross-connect between a network provider’s Internet IP edge and the Microsoft Azure network at a co-location facility. The choice of co-location facility depends on the network providers and Microsoft, and can be looked up on PeeringDB. The network providers are responsible for Internet delivery services to Azure with industry-standard SLAs. This is the approach we used in the current NBC Sports/Sochi Olympics solution, and it often results in lower networking cost.
RTP over a private/dedicated network
You can use a network solution designed for general data transfer (rather than video-specific data) over a dedicated private network. Most often, this option is provided through a managed service package from a network provider. What you need may be only a subset of the services offered in the package. The advantage of such managed services is that end-to-end delivery is provided and managed with an enhanced SLA. There are two types of services in this category:
- Microsoft Azure ExpressRoute over either a Network Service Provider (NSP) or Exchange Provider, such as Azure ExpressRoute + Level 3 Cloud Connect Solutions or Azure ExpressRoute + Equinix Cloud Exchange
- Managed video services provided by network provider, such as Level 3 VYVX Solution
If you are sending a live feed over RTP, the common in-transit encoding, container format, and protocols are as follows:
- Encoding: H264/AAC
- Container Format: MPEG-2 TS
- Network Protocol – Application Layer: RTP Unicast
- Network Protocol – Transport Layer: UDP
Using slates and signaling advertisements
When your Channel has Live Encoding enabled, you have a component in your pipeline that processes the video and can manipulate it. In our service, you can have the Channel insert slates and/or advertising signals into the outgoing adaptive bitrate stream. Slates are still images that you can use to cover up the input live feed in certain cases (for example, during a commercial break). Advertising signals, as the name suggests, are time-synchronized signals you embed into the outgoing stream to tell the video player to take special action – such as switching to an advertisement at the appropriate time. See this blog for an overview of the SCTE-35 signaling mechanism used for this purpose. Below is a typical scenario you could implement in your live event (sample code and API details will be available in upcoming blog posts; a rough sketch follows the list below).
- Have your viewers get a PRE-EVENT image before the event starts
- Have your viewers get a POST-EVENT image after the event ends
- Have your viewers get an ERROR-EVENT image if there is a problem during the event (e.g. a power failure in the stadium)
- Send an AD-BREAK image to hide the live event feed during a commercial break
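As a rough sketch (not the definitive API), here is how some of these scenarios could look with the preview .NET SDK; the IChannel method names and parameters below are assumptions to be confirmed in the upcoming sample-code posts.

```csharp
using System;
using Microsoft.WindowsAzure.MediaServices.Client;

static class SlateAndAdSignals
{
    // Cover the live feed with a slate image (a JPEG uploaded as an Asset),
    // e.g. a PRE-EVENT card shown before the event starts.
    public static void ShowPreEventSlate(IChannel channel, string slateAssetId)
    {
        // Arguments: how long to show the slate, then the Asset id of the slate image.
        channel.ShowSlate(TimeSpan.FromMinutes(5), slateAssetId);
    }

    // Signal an ad break to clients (an SCTE-35 style cue in the outgoing stream),
    // optionally covering the live feed with the default slate while the break runs.
    public static void StartAdBreak(IChannel channel)
    {
        // Arguments: duration of the break, a cue id for the client, and whether to show the slate.
        channel.StartAdvertisement(TimeSpan.FromSeconds(60), 1001, true);
    }

    // Return to the live feed once the break is over.
    public static void EndAdBreak(IChannel channel)
    {
        channel.EndAdvertisement();
        channel.HideSlate();
    }
}
```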
Getting a thumbnail preview of a live feed
When Live Encoding is enabled, you can get a thumbnail preview of the live feed as it reaches the Channel. This is a valuable tool for verifying that your live feed is actually arriving at the Channel. You can access the thumbnail via an API.
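For example, a small helper like the one below can pull the thumbnail down for monitoring. The thumbnail URL here is a placeholder you would obtain from the Channel through the API (the exact call will be covered in the upcoming API posts), not a format defined in this post.

```csharp
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

static class ThumbnailMonitor
{
    // Fetches the latest thumbnail of the incoming live feed so you can confirm
    // the Channel is receiving video. The URL is a placeholder; obtain the real
    // thumbnail endpoint from the Channel via the API.
    public static async Task SaveLatestThumbnailAsync(string thumbnailUrl, string outputPath)
    {
        using (var http = new HttpClient())
        {
            byte[] jpegBytes = await http.GetByteArrayAsync(thumbnailUrl);
            File.WriteAllBytes(outputPath, jpegBytes);
        }
    }
}
```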
Summary
In this blog, I have introduced you to the Live Encoding feature in Azure Media Services. In the coming days, there will be many more posts on topics such as using the Azure Management Portal with Live Encoding, configuring on-premises encoders to generate the input live feed, and controlling slates and ads. In the meantime, if you have questions about this feature, please contact AMSLiveD@microsoft.com.