Instructions to do WebM live streaming via DASH

This page describes the recommended ways to create, stream, and play back live WebM files using DASH.


1) FFmpeg - You will need version 2.8 of FFmpeg (or higher) for some of the DASH features to work. You can either download a nightly static build or build FFmpeg yourself from the git repository.

2) A web server.

3) Shaka Player (for playback on Web) -

4) ExoPlayer (for playback on Android) -

5) Dash.js (for playback on Web) -
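The FFmpeg version requirement above can be checked from a script. This is a small sketch; the commented sed pattern assumes FFmpeg's usual "ffmpeg version X.Y.Z" banner, which nightly builds may replace with a git hash:

```shell
# Succeeds if dotted version $1 >= $2 (numeric components only).
version_ge() {
  [ "$(printf '%s\n' "$2" "$1" | sort -t. -k1,1n -k2,2n -k3,3n | tail -n1)" = "$1" ]
}

# Usage (requires ffmpeg in PATH):
# ver=$(ffmpeg -version | sed -n 's/^ffmpeg version \([0-9][0-9.]*\).*/\1/p')
# version_ge "$ver" 2.8 || echo "FFmpeg too old for webm_chunk / webm_dash_manifest" >&2
```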

Creating Live Content

Encode Video and Audio

FFmpeg can be used to create the Audio and Video streams for DASH Live. This will seem familiar if you have used FFmpeg to create VOD (non-live) DASH streams.

For live streaming WebM files using DASH, the video and audio streams have to be non-muxed and chunked. For more information on what this means, see this link.

We are going to use the following encoding settings with FFmpeg's libvpx-vp9 (VP9) encoder:

VP9_LIVE_PARAMS="-speed 6 -tile-columns 4 -frame-parallel 1 -threads 8 -static-thresh 0 -max-intra-rate 300 -deadline realtime -lag-in-frames 0 -error-resilient 1"

Now, the video and audio streams can be generated by using a command as follows:

ffmpeg \
 -f v4l2 -input_format mjpeg -r 30 -s 1280x720 -i /dev/video0 \
 -f alsa -ar 44100 -ac 2 -i hw:2 \
 -map 0:0 \
 -pix_fmt yuv420p \
 -c:v libvpx-vp9 \
   -s 1280x720 -keyint_min 60 -g 60 ${VP9_LIVE_PARAMS} \
   -b:v 3000k \
 -f webm_chunk \
   -header "/var/www/webm_live/glass_360.hdr" \
   -chunk_start_index 1 \
 /var/www/webm_live/glass_360_%d.chk \
 -map 1:0 \
 -c:a libvorbis \
   -b:a 128k -ar 44100 \
 -f webm_chunk \
   -audio_chunk_duration 2000 \
   -header /var/www/webm_live/glass_171.hdr \
   -chunk_start_index 1 \
 /var/www/webm_live/glass_171_%d.chk


This command captures the video and audio from the webcam and microphone respectively and encodes them into a Live WebM Stream.

Things to note:

  • This is a never-ending command, so it should probably be run in the background.

  • If you get warnings about the ALSA or V4L2 thread queue being too small, try increasing the "-thread_queue_size" parameter.

  • File name of the header should conform to the following format: <prefix>_<representation_id>.hdr

  • File name of the chunks should conform to the following format: <prefix>_<representation_id>_%d.chk

  • The input audio/video sources and their parameters can change based upon the OS and drivers that you have in your machine. See FFmpeg's device documentation for more information on which source and parameters to use.

  • The "map" parameter has to be modified in such a way that the input video stream is routed to the video encoder and the input audio stream is routed to the audio encoder. For more information, see FFmpeg's map option documentation.

  • The video and audio chunks have to be in sync. This is ensured by following these rules:

    • The "keyint_min" and "g" parameters to the video encoder should always be the same.

    • The "keyint_min" parameter should be in sync with the "audio_chunk_duration" parameter passed to the audio encoder (keyint_min is expressed in number of frames, whereas audio_chunk_duration is expressed in milliseconds). In the given example, the video frame rate is 30 fps and keyint_min is 60, so each chunk has a duration of 2000 milliseconds (2 seconds), which matches the audio_chunk_duration of 2000.

  • More than one video/audio stream can be created this way by merely adding another "webm_chunk" output to the above command.
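The keyint_min / audio_chunk_duration relationship above boils down to one line of arithmetic; the numbers here are the ones from the example command:

```shell
fps=30          # video frame rate (-r 30)
chunk_ms=2000   # audio_chunk_duration in milliseconds
# keyint_min (and -g) must equal the number of video frames per chunk:
keyint_min=$((fps * chunk_ms / 1000))
echo "$keyint_min"   # prints 60, matching -keyint_min 60 -g 60
```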

Create the DASH Manifest

FFmpeg can be used to create the DASH Manifest by passing the header file created from the previous step as input. Here's a sample command:

ffmpeg \
 -f webm_dash_manifest -live 1 \
 -i /var/www/webm_live/glass_360.hdr \
 -f webm_dash_manifest -live 1 \
 -i /var/www/webm_live/glass_171.hdr \
 -c copy \
 -map 0 -map 1 \
 -f webm_dash_manifest -live 1 \
   -adaptation_sets "id=0,streams=0 id=1,streams=1" \
   -chunk_start_index 1 \
   -chunk_duration_ms 2000 \
   -time_shift_buffer_depth 7200 \
   -minimum_update_period 7200 \
 /var/www/webm_live/glass_live_manifest.mpd


Make sure that the "chunk_start_index" and the "chunk_duration_ms" parameters are the same as in the previous step. Also, ensure that the previous FFmpeg command has actually written the header file (usually happens instantaneously) before running this. If you are wrapping both the commands in a script, it's recommended to add at least 1 second of sleep between these two commands.
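If you wrap both commands in a script, the fixed sleep can be replaced with a small poll that waits for the header files to show up. A minimal sketch, assuming the paths from the commands above:

```shell
#!/bin/sh
# Poll for a file until it exists or a timeout (in seconds) elapses.
wait_for_file() {
  file=$1; timeout=$2; waited=0
  while [ ! -f "$file" ]; do
    [ "$waited" -ge "$timeout" ] && return 1
    sleep 1
    waited=$((waited + 1))
  done
  return 0
}

# With the encoder command running in the background, wait for both
# headers before generating the manifest:
# wait_for_file /var/www/webm_live/glass_360.hdr 10 || exit 1
# wait_for_file /var/www/webm_live/glass_171.hdr 10 || exit 1
# ffmpeg -f webm_dash_manifest -live 1 ...   (manifest command from above)
```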

Streaming Live Content

Web (Shaka Player)

Shaka Player is an open source media player built on top of the HTML5 video API, and it is capable of playing back live WebM via DASH (provided the browser supports Media Source Extensions and WebM/VP9 playback).

Clone and Build Shaka Player:

The above steps will generate "shaka-player.compiled.js". Copy that to your web-server's directory.

Here's a sample piece of HTML and JavaScript that uses Shaka Player for WebM live playback:


 <script src="shaka-player.compiled.js"></script>

 <script>
   function startVideo(mpd) {
     var video = document.getElementById('video');
     var player = new shaka.player.Player(video);
     var source = new shaka.player.DashVideoSource(mpd, null, null);
     player.load(source);
   }
 </script>

 <body onload="startVideo('glass_live_manifest.mpd');">
   <video id="video" controls autoplay></video>
 </body>



Android (ExoPlayer)

ExoPlayer is an extensible open source media player built on top of Android's media APIs. ExoPlayer natively supports WebM live streams via DASH. Please refer to ExoPlayer's sample app to see how to use ExoPlayer to play back live streams via DASH.

To play back VP9 video on Android devices running versions prior to KitKat, the native ExoPlayer VP9 extension can be used.

Web (Dash.js)

Dash.js is an open source media player built on top of the HTML5 video API.

Simply point Dash.js at your manifest URL and it will adaptively stream the live content. Here's a sample piece of HTML and JavaScript that uses Dash.js:


 <script src="dash.all.js"></script>

 <script>
   function startVideo(mpd) {
     var video, context, player;
     video = document.querySelector('video');
     context = new Dash.di.DashContext();
     player = new MediaPlayer(context);
     player.startup();
     player.attachView(video);
     player.attachSource(mpd);
   }
 </script>

 <!-- see note below about manifest URLs -->
 <body onload="startVideo('glass_live_manifest.mpd');">
   <video controls></video>
 </body>


For more information on the Dash.js API and tweaking, refer to the Dash.js wiki here:

[Optional] Time Sync Between Server and Clients

Note: This section only applies for ExoPlayer and Dash.js

DASH live playback works better if the server and client clocks are in sync. To achieve this, the client requests a page from the server, to which the server responds with 200 OK and a response body containing the current server UTC date and time in ISO format. The strftime format specifier for ISO format is "%FT%TZ". A sample time page can be found here:
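As a sketch, the body such a time page should return can be produced with date(1) using that format specifier:

```shell
# Print the current UTC date and time in ISO format (strftime "%FT%TZ"),
# e.g. 2016-01-05T12:34:56Z.
date -u +%FT%TZ
```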

Once you have the time server ready, you can include the UTCTiming URL in the manifest by passing:

-utc_timing_url "<url_to_iso_time_page_as_described_above>"