Broadcasting

Broadcasting is a means of transmitting live or pre-recorded content to a wide audience.

We can choose between two approaches to broadcasting media:

  1. HLS - slight delay, better buffering
  2. WebRTC - lower latency, less reliability

It is up to integrators to decide which approach their apps will use for the audience to consume the streams.

Call type for broadcasting

Stream infrastructure recognizes a few pre-built call types. Among them, the livestream type is best suited for broadcasting events. When a livestream call is created, it is set to backstage mode by default.
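
For illustration, creating such a call could look like this (a minimal sketch assuming an already configured client; "my-livestream" is a placeholder call id):

const call = client.call("livestream", "my-livestream");
await call.getOrCreate();
// the call starts in backstage mode; going live opens it up to the audience
await call.goLive();

Backstage mode lets the hosts join and prepare the stream before making it available to the audience.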

Starting and stopping broadcasting

We have the following Call methods at our disposal to start and stop broadcasting:

// start and stop HLS broadcasting directly
call.startHLS();
call.stopHLS();

// or control HLS broadcasting together with the backstage mode
call.goLive({ start_hls: true });
call.stopLive({ continue_hls: false });

Once broadcasting has started, the playback URL is available through the playlist_url property, accessible through the call state:

// omitted code ...
declare const call: Call;
// m3u8 playlist URL
const playlistUrl = call.state.egress?.hls?.playlist_url;

To play the video over HLS, a third-party library is required (for example, HLS.js).

If you want to use WebRTC-based streaming, your viewers have to follow the join steps discussed previously.
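
As a rough sketch, a viewer consuming the stream over WebRTC simply joins the call (assuming the same client setup and the join flow covered earlier):

const call = client.call("livestream", "my-livestream");
// joining over WebRTC delivers the stream with the lowest latency
await call.join();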

Sample HLS.js integration

Below is a sample integration of HLS.js with Stream’s broadcasting feature. The integration is broken down into the following steps:

  • Install and initialize HLS.js
  • Get the HLS m3u8 playlist URL from call.state
  • Attach the HLS stream to a <video /> element and start playing
  • Handle HLS.js events and errors
  • Allow the user to select the quality level (e.g., 720p, 1080p)

import HLS from "hls.js";

const call = client.call(type, id);
await call.get(); // load the call state

const unsubscribe = call.state.egress$.subscribe((egress) => {
  // HLS broadcasting is not available, do nothing...
  if (!egress?.hls?.playlist_url) return;

  // will point to an m3u8 playlist URL
  const hlsPlaylistUrl = egress.hls.playlist_url;

  // get the video element where HLS.js will render the video
  const videoRef = document.getElementById("hls-video-target");

  // get the quality selector element
  const qualitySelectorRef = document.getElementById("hls-quality-selector");

  // create a new HLS.js player instance
  const hls = new HLS();

  // add an event listener to handle the user's quality selection
  // and switch between the available quality levels
  qualitySelectorRef.addEventListener("change", (e) => {
    const selectedLevel = parseInt(e.target.value, 10);
    if (hls.currentLevel !== selectedLevel) {
      // loadLevel switches the quality at the next fragment load
      hls.loadLevel = selectedLevel;
    }
  });

  // listen to the hls.js error event and naively attempt to recover
  hls.on(HLS.Events.ERROR, (e, data) => {
    // non-fatal errors are handled by hls.js internally
    if (!data.fatal) return;
    console.error("HLS error, attempting to recover", e, data);
    setTimeout(() => {
      hls.loadSource(hlsPlaylistUrl);
    }, 1000);
  });

  // listen to the MANIFEST_PARSED event, this will tell you when
  // the available quality levels (720p, 1080p, etc.) have been loaded.
  hls.on(HLS.Events.MANIFEST_PARSED, (e, data) => {
    console.log("HLS levels loaded", e, data.levels);
    // populate the quality selector with the available levels,
    // using the level index as the <option> value
    qualitySelectorRef.innerHTML = data.levels
      .map((level, index) => `<option value="${index}">${level.height}p</option>`)
      .join("");
  });

  // listen to the LEVELS_UPDATED event, this will tell you when
  // the available quality levels (720p, 1080p, etc.) have changed.
  hls.on(HLS.Events.LEVELS_UPDATED, (e, data) => {
    console.log("HLS levels updated", e, data.levels);
  });

  // listen to buffer end of stream event, this will tell you when
  // the stream has ended. e.g. the broadcaster has stopped streaming.
  // keep in mind, the viewer may continue to watch the stream.
  // this event signals that there won't be any new data coming in.
  hls.on(HLS.Events.BUFFER_EOS, (e, data) => {
    console.log("HLS buffer eos", e, data);
  });

  // load the m3u8 playlist URL
  hls.loadSource(hlsPlaylistUrl);

  // and attach it to the video element
  hls.attachMedia(videoRef);
});

// when the player is no longer needed, stop listening to the call state updates
unsubscribe();

For more advanced integration, please refer to the HLS.js documentation.

Broadcasting via RTMP

Our systems provide first-class support for streaming from RTMP clients such as OBS. To connect your OBS project to a Stream call, follow these steps:

RTMP URL and stream key

Our call instance exposes its RTMP address through call.state.ingress and the call.state.ingress$ observable. The stream key in our case is a standard user token.

You can take this information and use it to configure OBS:

const call = client.call(type, id);
await call.getOrCreate();

const { ingress } = call.state;

// the RTMP ingest address exposed by the call
const rtmpURL = ingress?.rtmp.address;
// the stream key is a regular user token, provided here by your own auth service
const streamKey = myUserAuthService.getUserToken(rtmpUserId);

console.log("RTMP url:", rtmpURL, "Stream key:", streamKey);

Configure OBS

  • Go to Settings > Stream
  • Select Custom as the service
  • Server: enter the rtmpURL logged in the console
  • Stream Key: enter the streamKey logged in the console

Press Start Streaming in OBS, and the RTMP stream will show up in your call just like a regular video participant.
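
To verify that the stream is coming through, you can watch the call's participant list. Below is a sketch that assumes the participants$ state observable and follows the same subscription pattern as the HLS example above:

// log the participant list whenever it changes;
// the RTMP publisher shows up here like any other participant
const unsubscribe = call.state.participants$.subscribe((participants) => {
  console.log("Participants in the call:", participants);
});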
