
Implement a Chromecast HLS backend

Current implementation

The 3.0.x Chromecast stream output module uses the MKV/WebM live streaming capability to expose the media to the Chromecast player. Both in transcoding and standard scenarios, the live MKV container is buffered in memory. This approach works well thanks to the flexibility of the container in terms of codec support, and it uses the same mechanism with or without transcoding.
Subtitle support was implemented by burning the subtitles into the video, forcing an extra transcode step for every video with subtitles. This implementation isn't enabled by default due to the unfortunate power impact it would have on roaming devices.

Downsides and HLS switch motivations

Chromecast devices currently don't support internal subtitle tracks in live MKV; only an extra live WebM subtitle file can be exposed on our side. This file must be updated frequently to avoid the Chromecast timing out, thinking the read reached EOF. A contributor worked around this and managed to get live WebM extra subtitles working [1]. Unfortunately, these patches relied too much on arbitrarily increasing the input-slave demuxers' time barrier to satisfy the Chromecast's need to receive the subtitles a little ahead of the video.

An HLS stream solves the above problem: the Chromecast player and the VLC stream output both agree on what should be played in which time frame. Chunks are fetched when VLC exposes them as ready. Exposing extra tracks, including subtitles, is easy with HLS and supported by both the protocol and the Chromecast.

The fact that chunks are only fetched once exposed as ready also avoids HTTP callbacks stalling for an arbitrarily long amount of time, an issue that was only worked around in 3.0 [2].

Problems

HLS segmenting in a Stream Output context

Segmenting multiple tracks in a stream output is quite a hard task with the current API when you start adding subtitles to the mix:

  • Subtitles aren't continuous; it is common to have empty WebVTT chunks.
  • ESs are ordered arbitrarily by the demuxer, and the order in which data is sent is not standardized, so it's nearly impossible to know which track should decide that a subtitle chunk ends.

To address this issue we decided to forward the PCR value of the demuxer to the Stream Output [3]. This value gives us a time point below which we are certain no data will arrive afterwards. This is a good foundation to base our segmenting decisions on.
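As an illustration of how the forwarded PCR can drive segmenting (a minimal sketch, not the actual VLC API; `hls_on_pcr` and the surrounding types are made up for this example), the segmenter can close the open segment on every track as soon as the PCR crosses the segment boundary, since no later data can belong to that segment:

```c
#include <stdint.h>

typedef int64_t vlc_tick_t;

/* Hypothetical per-track state: one ES = one HLS track. */
typedef struct hls_track
{
    struct hls_track *next;
    void (*close_segment)(struct hls_track *);
} hls_track_t;

typedef struct
{
    hls_track_t *tracks;        /* All the tracks of the HLS stream.       */
    vlc_tick_t segment_length;  /* Target segment duration.                */
    vlc_tick_t segment_end;     /* Time point closing the current segment. */
} hls_sout_sys_t;

/* Called whenever the demuxer forwards a new PCR value.  Because no data
 * with a lower time point can arrive afterwards, it is safe to close the
 * current segment on *every* track, including sparse subtitle tracks that
 * may not have received any data at all. */
static void hls_on_pcr(hls_sout_sys_t *sys, vlc_tick_t pcr)
{
    while (pcr >= sys->segment_end)
    {
        for (hls_track_t *track = sys->tracks; track != NULL; track = track->next)
            track->close_segment(track);
        sys->segment_end += sys->segment_length;
    }
}
```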

Pacing

Pacing in the stream output can't be implemented properly with the current API. Every stream output API call locks the whole stream output, which makes waiting synchronously in a Send() call impossible if you want to keep handling flushes in the meantime, for instance.

The HLS Chromecast implementation therefore keeps the same approach as the 3.0 one, which used a demux filter to pace the demuxer input and forward metadata to the Stream Output.
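For illustration only, the pacing idea behind the demux filter boils down to something like the sketch below; the callback names (`sout_buffered_until`, `cc_playback_time`, `wait_progress`) are hypothetical stand-ins for information the filter already exchanges with the stream output:

```c
#include <stdint.h>

typedef int64_t vlc_tick_t;

/* Hypothetical view of the state shared between the Chromecast demux filter
 * and the stream output. */
typedef struct
{
    vlc_tick_t (*sout_buffered_until)(void *); /* Highest time point sent out.  */
    vlc_tick_t (*cc_playback_time)(void *);    /* Chromecast playback position. */
    void (*wait_progress)(void *);             /* Block until either advances.  */
    void *opaque;
} cc_pacing_t;

/* Called by the demux filter before the next Demux() call: block while the
 * stream output is more than `lead` ahead of what the Chromecast has actually
 * played, so the device paces the whole pipeline instead of the input being
 * read at full speed and buffered entirely in memory. */
static void cc_pace(const cc_pacing_t *p, vlc_tick_t lead)
{
    while (p->sout_buffered_until(p->opaque) - p->cc_playback_time(p->opaque) > lead)
        p->wait_progress(p->opaque);
}
```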

However, these problems definitely need to be discussed in the next Stream Output API rework.

Chromecast default player requirements for HLS

Codec formats

HLS adds restrictions to the codecs we can use with the Chromecast; any codec not listed here will have to be transcoded (see the sketch after this list).

Video
  • H.264, at least up to level 4.1
  • VP8
  • VP9 (starting with Chromecast Ultra)
  • HEVC (starting with Chromecast Ultra)
Packed audio
  • AAC
  • AAC-LC
  • AC3
  • E-AC3
  • MP3
Subtitles
  • WebVTT
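To make the pass-through/transcode decision concrete, here is a minimal sketch of such a check using the usual VLC fourccs; the function name and where it would be called from are made up for this example:

```c
#include <stdbool.h>
#include <vlc_fourcc.h>

/* Illustrative only: returns true when an ES can be forwarded to the
 * Chromecast HLS backend as-is, false when it has to go through the
 * transcode stream output first. */
static bool cc_hls_codec_is_supported(vlc_fourcc_t codec)
{
    switch (codec)
    {
        /* Video */
        case VLC_CODEC_H264:
        case VLC_CODEC_VP8:
        case VLC_CODEC_VP9:   /* Chromecast Ultra and newer */
        case VLC_CODEC_HEVC:  /* Chromecast Ultra and newer */
        /* Packed audio */
        case VLC_CODEC_MP4A:  /* AAC */
        case VLC_CODEC_A52:   /* AC3 */
        case VLC_CODEC_EAC3:
        case VLC_CODEC_MP3:
        /* Subtitles */
        case VLC_CODEC_WEBVTT:
            return true;
        default:
            return false;     /* Anything else must be transcoded. */
    }
}
```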

Secondary HLS tracks

The current HLS stream output design is built around the concept of one ES = one HLS track. Chromecast does support secondary tracks for audio and subtitles, but forces the audio to be a packed audio track instead of the usual TS-muxed chunks.
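For illustration, the resulting multivariant playlist could look something like the snippet below (URIs, group names and bandwidth are made up); the audio rendition points to a packed audio media playlist while the video variant stays TS-muxed:

```
#EXTM3U
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="aud",NAME="Audio",DEFAULT=YES,URI="audio/index.m3u8"
#EXT-X-MEDIA:TYPE=SUBTITLES,GROUP-ID="sub",NAME="Subtitles",DEFAULT=NO,AUTOSELECT=NO,URI="subs/index.m3u8"
#EXT-X-STREAM-INF:BANDWIDTH=8000000,CODECS="avc1.640029,mp4a.40.2",AUDIO="aud",SUBTITLES="sub"
video/index.m3u8
```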

Subtitle auto-select

HLS provides a handy track parameter to select secondary tracks automatically [4]. Unfortunately, the Chromecast implementation does not support it for subtitles, which forces us to enable the track manually via a specific Chromecast request after playback has started.
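In practice this means sending a media-channel request along these lines once playback has started (the values are illustrative; the track id is whatever id was declared for the WebVTT track):

```json
{
  "type": "EDIT_TRACKS_INFO",
  "requestId": 3,
  "mediaSessionId": 1,
  "activeTrackIds": [2]
}
```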

Proof of Concept

Over the last year I've been prototyping an HLS stream output plugin with Chromecast as the main target.
This work will be cleaned up and made available on a branch of my personal repository by the end of the week.

Reusing the current livehttp Access Output

There is already an HLS Access Output implementation inside VLC 4.0; it was modified as little as possible. The biggest change was to add an IO interface allowing it to write either to memory or to the filesystem. This access output generates the m3u8 playlist of its track and exposes the chunks.
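As a rough sketch of that change (the names below are illustrative, not the actual patch), the IO interface can be pictured as a small vtable the access output writes through, with one implementation backed by the filesystem and another backed by in-memory buffers served over HTTP:

```c
#include <stddef.h>
#include <sys/types.h>   /* ssize_t */

/* Illustrative IO abstraction: instead of writing directly to files, the
 * livehttp access output goes through these callbacks, so segments and the
 * track m3u8 can live either on disk or in memory. */
typedef struct hls_io hls_io_t;
struct hls_io
{
    void *sys;                                           /* Backend state.    */
    int     (*open)(hls_io_t *, const char *name);       /* Start a segment.  */
    ssize_t (*write)(hls_io_t *, const void *, size_t);  /* Append data.      */
    int     (*close)(hls_io_t *);                        /* Seal the segment. */
    int     (*remove)(hls_io_t *, const char *name);     /* Expire a segment. */
};
```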

Extra muxer requirements

To handle both packed audio and WebVTT, two extra muxers needed to be written. The video track simply uses MPEG-TS.
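For context on why WebVTT needs its own segmenter: HLS expects each WebVTT segment to start with a WEBVTT header, typically carrying an X-TIMESTAMP-MAP line that maps cue times onto the MPEG-TS timeline of the other tracks, and segments with no cue at all still have to be emitted to keep the tracks aligned. An illustrative segment:

```
WEBVTT
X-TIMESTAMP-MAP=LOCAL:00:00:00.000,MPEGTS:900000

00:00:01.000 --> 00:00:03.500
An example subtitle cue
```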

The Stream Output

The HLS Stream Output is the entry point of the HLS plugin. It generates the main HLS playlist and orchestrates one muxer and one livehttp Access Output per track. The accesses are able to tell the Stream Output instance whenever they've ended a chunk, and the Stream Output is then mandated to terminate the chunk on all the other tracks to keep the HLS stream consistent.

The Stream Output currently also has the responsibility of implementing the HTTP callbacks.

The HLS Pipeline

graph LR
    subgraph Stream Output
    dec1[Input Decoder] -- Video --> sout[HLS]
    dec2[Input Decoder] -- Audio --> sout[HLS]
    dec3[Input Decoder] -- Subtitle --> sout[HLS]
    end

    subgraph Muxer
    sout -- Video --> ts[MPEG TS]
    sout -- Audio --> packed[Packed Audio]
    sout -- Subtitle --> vtt[WebVTT Segmenter]
    end

    subgraph Access Output
    ts --> aco1[livehttp]
    packed --> aco2[livehttp]
    vtt --> aco3[livehttp]
    end

The whole Chromecast integration in VLC

As seen in the following graph, the VLC Chromecast integration stays largely the same.

flowchart LR
    subgraph sout[Chromecast Stream Output]
    direction LR
    transcode --> hls[HLS Pipeline]
    end

    demuxfilter[Chromecast Demux Filter] --> demux[Demuxer]
    demux -- Video --> dec1[Input Decoder] --> sout
    demux -- Audio --> dec2[Input Decoder] --> sout
    demux -- Subtitle --> dec3[Input Decoder] --> sout

    sout <-. Commands and HLS output .-> cc[Chromecast Device]
    demuxfilter <-. Metadata and Pacing .-> sout

Merge roadmap

This list describes the global merging status of the feature:

  • Forward PCR to stream outputs
  • HLS stream output
  • Chromecast HLS stream output backend
    • Adapt the current demux-filter to allow HLS to control pacing
    • Rework the current module entry point to use the HLS backend
    • Implement Chromecast requests to enable a track (used for subtitles)

[1] https://gitlab.com/nirhal/vlc-cc-spu/-/commits/cc-spu-fix
[2] 518ed638
[3] #27050
[4] https://datatracker.ietf.org/doc/html/draft-pantos-hls-rfc8216bis#section-4.4.6.1
