Livestreaming web audio and video
Livestreaming technology is often employed to relay live events such as sports, concerts, and, more generally, TV and radio programmes that are broadcast live. Often shortened to just streaming, livestreaming is the process of transmitting media 'live' to computers and devices. This is a fairly complex and nascent subject with a lot of variables, so in this article, we'll introduce you to the subject and let you know how you can get started.
The key consideration when streaming media to a browser is the fact that rather than playing a finite file we are relaying a file that is being created on the fly and has no pre-determined start or end.
Key differences between streamed and static media
In this case, we are using static media to describe media that is represented by a file, whether it be an mp3 or WebM file. This file sits on a server and can be delivered — like most other files — to the browser. This is often known as a progressive download.
Livestreamed media lacks a finite start and end time; rather than being a static file, it is a stream of data that the server passes on down the line to the browser, and it is often adaptive (see below). Usually, we require different formats and special server-side software to achieve this.
One of the main priorities for livestreaming is to keep the player synchronized with the stream: adaptive streaming is a technique for doing this in the case of low bandwidth. The idea is that the data transfer rate is monitored and if it looks like it's not keeping up, we drop down to a lower bandwidth (and consequently lower quality) stream. In order to have this capability, we need to use formats that facilitate this. Livestreaming formats generally allow adaptive streaming by breaking streams into a series of small segments and making those segments available at different qualities and bit rates.
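The switching logic described above can be sketched as a small selection function. This is an illustrative sketch only: the rendition list, the bitrates, and the 80% headroom factor are assumptions, not part of any particular player.

```javascript
// Hypothetical list of available renditions, highest quality first.
const renditions = [
  { bitrate: 5000000, label: "1080p" },
  { bitrate: 2500000, label: "720p" },
  { bitrate: 1000000, label: "480p" },
  { bitrate: 400000,  label: "240p" },
];

// Pick the highest-bitrate rendition the measured bandwidth can sustain,
// keeping some headroom (here, 80% of measured throughput). If even the
// lowest rendition exceeds the budget, fall back to it anyway.
function pickRendition(measuredBps, list = renditions) {
  const budget = measuredBps * 0.8;
  return list.find((r) => r.bitrate <= budget) ?? list[list.length - 1];
}
```

A real player would re-run a function like this after each segment download, using the observed transfer rate as `measuredBps`, so the stream steps down (or back up) as conditions change.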
Streaming Audio and Video on Demand
Streaming technology is not used exclusively for live streams. It can also be used instead of the traditional progressive download method for Audio and Video on demand:
There are several advantages to this:
- Latency is generally lower so media will start playing more quickly
- Adaptive streaming makes for better experiences on a variety of devices
- Media is downloaded just in time which makes bandwidth usage more efficient
While static media is usually served over HTTP, there are several protocols for serving adaptive streams; let's take a look at the options.
For now, HTTP is by far the most commonly supported protocol used to transfer media on demand or live.
Real Time Messaging Protocol (RTMP) is a proprietary protocol developed by Macromedia (now Adobe) and supported by the Adobe Flash plugin. RTMP comes in various flavors including RTMPE (Encrypted), RTMPS (Secure over SSL/TLS) and RTMPT (encapsulated within HTTP requests).
Note: Real Time Streaming Protocol (RTSP) controls media sessions between endpoints and is often used together with Real-time Transport Protocol (RTP) and with Real-time Control Protocol (RTCP) for media stream delivery. Using RTP with RTCP allows for adaptive streaming. This is not yet supported natively in most browsers.
Some vendors implement proprietary transport protocols, such as RealNetworks with their Real Data Transport (RDT).
RTSP 2.0 is currently in development and is not backward compatible with RTSP 1.0.
Using streaming protocols
The process of using the various protocols is reassuringly familiar if you are used to working with media over HTTP.
<video src="rtsp://myhost.com/mymedia.format"> <!-- Fallback here --> </video>
Media Source Extensions (MSE)
Media Source Extensions is a W3C working draft that plans to extend HTMLMediaElement to allow JavaScript to generate media streams for playback. Allowing JavaScript to generate streams facilitates a variety of use cases like adaptive streaming and time shifting live streams.
Note: Time Shifting is the process of consuming a live stream sometime after it happened.
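Assuming MSE is available, feeding a stream to a `<video>` element looks roughly like the following sketch. The MIME type, segment URLs, segment count, and the `segmentUrl()` helper are all hypothetical placeholders; a real player would drive this from a manifest.

```javascript
// Hypothetical helper that maps a segment index to its URL.
function segmentUrl(base, index) {
  return `${base}/segment-${index}.m4s`;
}

if (typeof MediaSource !== "undefined") {
  const video = document.querySelector("video");
  const mediaSource = new MediaSource();
  video.src = URL.createObjectURL(mediaSource);

  mediaSource.addEventListener("sourceopen", async () => {
    // The codec string here is a placeholder for whatever the stream uses.
    const mime = 'video/mp4; codecs="avc1.42E01E"';
    const sourceBuffer = mediaSource.addSourceBuffer(mime);

    // Fetch segments one by one and append each to the stream
    // that the <video> element is playing.
    for (let i = 0; i < 5; i++) {
      const resp = await fetch(segmentUrl("/media", i));
      const chunk = await resp.arrayBuffer();
      await new Promise((resolve) => {
        sourceBuffer.addEventListener("updateend", resolve, { once: true });
        sourceBuffer.appendBuffer(chunk);
      });
    }
    mediaSource.endOfStream();
  });
}
```

Because the segments are fetched individually by script, the player is free to request the next segment at a different quality — which is exactly the hook adaptive streaming libraries build on.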
Video Streaming File Formats
A couple of HTTP-based livestreaming video formats are beginning to see support across browsers.
Note: You can find a guide to encoding HLS and MPEG-DASH for use on the web at Setting up adaptive streaming media sources.
DASH (Dynamic Adaptive Streaming over HTTP) breaks the stream into small HTTP-served segments, and is typically played via Media Source Extensions: JavaScript fetches the segments and appends them to the media played by the <video> element. So for example, if we detect that the network is slow, we can start requesting lower quality (smaller) chunks for the next segment. This technology also allows an advertising segment to be appended/inserted into the stream.
Note: You can also use WebM with the MPEG DASH adaptive streaming system.
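To illustrate, a DASH manifest (an MPD file) lists the available qualities so the player can choose per segment. The following is a heavily trimmed, hypothetical example; real manifests carry many more attributes:

```xml
<!-- Hypothetical, heavily trimmed DASH manifest (MPD) -->
<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" type="dynamic" minimumUpdatePeriod="PT10S">
  <Period>
    <AdaptationSet mimeType="video/mp4">
      <!-- The same content at two quality levels; the player picks per segment -->
      <Representation id="720p" bandwidth="2500000" width="1280" height="720"/>
      <Representation id="240p" bandwidth="400000" width="426" height="240"/>
      <SegmentTemplate media="video-$RepresentationID$-$Number$.m4s"
                       initialization="init-$RepresentationID$.mp4"/>
    </AdaptationSet>
  </Period>
</MPD>
```

`type="dynamic"` marks the manifest as describing a live stream that the player should periodically refresh.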
HLS or HTTP Live Streaming is a protocol invented by Apple Inc and supported on iOS, Safari and the latest versions of Android browser / Chrome. HLS is also adaptive.
At the start of the streaming session, an extended M3U (m3u8) playlist is downloaded. This contains the metadata for the various sub-streams that are provided.
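As a sketch, such a master playlist might look like this, with one entry per sub-stream (the paths, bandwidths, and resolutions here are hypothetical):

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720
hi/prog_index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=400000,RESOLUTION=426x240
lo/prog_index.m3u8
```

The player downloads the variant playlist that best matches its measured bandwidth, and can switch between variants at segment boundaries.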
Streaming File Format Support
| Browser | DASH | HLS | Opus (Audio) |
| --- | --- | --- | --- |
| Firefox 32 | ✓ | ✓ | ✓ 14+ |
| Chrome 24+ | ✓ | ✓ | |
| Opera 20+ | ✓ | | |
| Internet Explorer 10+ | ✓ 11 | ✓ | |
| Chrome Mobile | ✓ | ✓ | |
| Opera Mobile | ✓ | ✓ | |
Between DASH and HLS we can cover a significant portion of modern browsers but we still need a fallback if we want to support the rest.
One popular approach is to use a Flash fallback that supports RTMP. Of course, we then have the issue that we need to encode in three different formats.
Audio Streaming File Formats
There are also some audio formats beginning to see support across browsers.
Opus is a royalty-free and open format that manages to optimize quality at various bit-rates for different types of audio. Music and speech can be optimized in different ways and Opus uses the SILK and CELT codecs to achieve this.
Currently, Opus is supported by Firefox desktop and mobile as well as the latest versions of desktop Chrome and Opera.
Note: Opus is a mandatory format for WebRTC browser implementations.
MP3, AAC, Ogg Vorbis
Most common audio formats can be streamed using specific server-side technologies.
Note: It's potentially easier to stream audio using non-streaming formats because, unlike video, there are no keyframes.
Server-side Streaming Technologies
In order to stream live audio and video, you will need to run specific streaming software on your server or use third-party services.
GStreamer is an open source cross-platform multimedia framework that allows you to create a variety of media-handling components, including streaming components. Through its plugin system, GStreamer provides support for more than a hundred codecs (including MPEG-1, MPEG-2, MPEG-4, H.261, H.263, H.264, RealVideo, MP3, WMV, and FLV.)
GStreamer plugins such as souphttpclientsink and shout2send exist to stream media over HTTP, or you can integrate with Python's Twisted framework.
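For example, a minimal gst-launch-1.0 pipeline can encode a test tone to Ogg/Vorbis and push it to an Icecast mount via the shout2send sink. The host, port, password, and mount point below are placeholders for your own server's settings:

```
# Encode a test tone and send it to an Icecast mount point.
gst-launch-1.0 audiotestsrc ! audioconvert ! vorbisenc ! oggmux \
    ! shout2send ip=localhost port=8000 password=hackme mount=/stream.ogg
```

Swapping `audiotestsrc` for a real capture or file source turns this into a working live audio stream.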
For RTMP transfer you can use the Nginx RTMP Module.
SHOUTcast is a cross-platform proprietary technology for streaming media. Developed by Nullsoft, it allows digital audio content in MP3 or AAC format to be broadcast. For web use, SHOUTcast streams are transmitted over HTTP.
The Icecast server is an open source technology for streaming media. Maintained by the Xiph.org Foundation, it streams Ogg Vorbis/Theora as well as MP3 and AAC format via the SHOUTcast protocol.
Note: SHOUTcast and Icecast are among the most established and popular technologies, but there are many more streaming media systems available.
Although you can install software like GStreamer, SHOUTcast and Icecast you will also find a lot of third-party streaming services that will do much of the work for you.
- HTTP Live Streaming
- HLS Browser Support
- The Basics of HTTP Live Streaming
- DASH Adaptive Streaming for HTML 5 Video
- Dynamic Adaptive Streaming over HTTP (MPEG-DASH)
- MPEG-DASH Media Source Demo
- DASH Reference Client
- Dynamic Streaming over HTTP
- The State of MPEG-DASH Deployment
- Look, no plugins: Live streaming to the browser using Media Source Extensions and MPEG-DASH
- Media Source Extensions (W3C)
- Streaming GStreamer Pipelines Via HTTP
- GStreamer and Raspberry Pi
- Comparison of Streaming Media Systems
- Mozilla Hacks - Streaming Media on demand with Media Source Extensions