Play adaptive content using Hls.js Player

The following steps show you how to use the Hls.js Player in MSE mode to play adaptive streaming content.

Although the Hls.js Player patches provided by Amazon target a specific version of the player, you can port them to any version of Hls.js Player you want to use. Amazon doesn't prescribe a specific version; choose the version that best suits your requirements. For issues related to Hls.js Player's ability to handle your content, engage with the open source community.

For more details about the Hls.js Player and the different configurations it supports, see Hls.js Player.
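
For background, the open source Hls.js API is configured through an options object passed to the Hls constructor. The following browser-side sketch uses only standard Hls.js calls; on Vega you use the HlsJsPlayer wrapper described below rather than constructing Hls directly, and the stream URL here is a placeholder.

import Hls from 'hls.js';

const video = document.querySelector('video') as HTMLVideoElement;

if (Hls.isSupported()) {
  // Any Hls.js configuration option can be passed to the constructor.
  const hls = new Hls({
    maxBufferLength: 30,        // seconds of forward buffer to maintain
    capLevelToPlayerSize: true, // cap ABR quality to the element's size
  });
  hls.loadSource('https://example.com/master.m3u8'); // placeholder URL
  hls.attachMedia(video);
  hls.on(Hls.Events.MANIFEST_PARSED, () => video.play());
}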

Prerequisites

Before you begin to modify your code to play adaptive content on the Hls.js Player, complete the following prerequisites to set up your app and the player:

  • Set up your app to use the W3C Media Player, as sketched below. For more information, see Media Player Setup.
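
The full example later on this page assumes the media player setup has the following shape; this minimal sketch mirrors that example (createPlayer is an illustrative helper name, not an SDK API).

import { VideoPlayer } from '@amazon-devices/react-native-w3cmedia';

// Minimal sketch of the setup the full example below performs: the player
// must finish initialize() before content is loaded, and the app calls
// play() itself once a video surface is attached.
async function createPlayer(): Promise<VideoPlayer> {
  const videoPlayer = new VideoPlayer();
  await videoPlayer.initialize();
  videoPlayer.autoplay = false;
  return videoPlayer;
}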

Configure the Hls.js Player for Vega

  1. Download the Hls.js Player for Vega to a known location.

Vega supports specific Hls.js versions. The examples in this topic use Hls.js v1.5.11.

  2. Expand the Hls.js Player package.

tar -xzf hls-rel-v<x.y.z>-r<a.b>.tar.gz

For example:

tar -xzf hls-rel-v1.5.11-r1.5.tar.gz

  3. Navigate to the hls-rel/scripts directory.
  4. Run the setup.sh helper script.

./setup.sh

The setup.sh script performs a build that generates a directory named hls.js. After the build completes, copy the output into your app (the sketch after this list shows the resulting import path):

    • Copy the contents of the hls-rel/src/ directory to <app root>/src/.
    • Copy the generated hls.js/dist directory to <app root>/src/hlsjsplayer/dist.
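
After the copy steps, the wrapper and the built library live inside your app's source tree, and the import used in the full example below resolves against that layout:

// With hls-rel/src/* copied to <app root>/src/ and the built hls.js/dist
// copied to <app root>/src/hlsjsplayer/dist, App.tsx imports the wrapper as:
import { HlsJsPlayer } from './hlsjsplayer/HlsJsPlayer';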

Play adaptive content

Complete the following steps to load the Hls.js Player when the Video component is mounted.

To play adaptive content with Hls.js Player

  • Open your src/App.tsx and replace the contents with the following code.


/*
 * Copyright (c) 2024 Amazon.com, Inc. or its affiliates.  All rights reserved.
 *
 * PROPRIETARY/CONFIDENTIAL.  USE IS SUBJECT TO LICENSE TERMS.
 */

import * as React from 'react';
import {useRef, useState, useEffect} from 'react';
import {
  Platform,
  useWindowDimensions,
  View,
  StyleSheet,
  TouchableOpacity,
  Text,
} from 'react-native';

import {
  VideoPlayer,
  VegaVideoSurfaceView,
  VegaCaptionsView,
} from '@amazon-devices/react-native-w3cmedia';
import { HlsJsPlayer } from './hlsjsplayer/HlsJsPlayer';

// set to false if app wants to call play API on video manually
const AUTOPLAY = true;

const DEFAULT_ABR_WIDTH: number = Platform.isTV ? 3840 : 1919;
const DEFAULT_ABR_HEIGHT: number = Platform.isTV ? 2160 : 1079;

const content = [
  {
    secure: 'false', // true : Use Secure Video Buffers. false: Use Unsecure Video Buffers.
    uri: 'https://storage.googleapis.com/shaka-demo-assets/bbb-dark-truths-hls/hls.m3u8',
    drm_scheme: '', // com.microsoft.playready, com.widevine.alpha
    drm_license_uri: '', // DRM License acquisition server URL : needed only if the content is DRM protected
  },
];
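
// A DRM-protected entry would populate the fields above. The following is a
// hypothetical sketch (the example.com URLs are placeholders), using the
// scheme strings listed in the comments on the entry above:
// {
//   secure: 'true', // DRM content generally uses secure video buffers
//   uri: 'https://example.com/stream/master.m3u8',
//   drm_scheme: 'com.widevine.alpha',
//   drm_license_uri: 'https://example.com/license',
// },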

export const App = () => {
  const player = useRef<any>(null);
  const videoPlayer = useRef<VideoPlayer | null>(null);
  const timeoutHandler = useRef<ReturnType<typeof setTimeout> | null>(null);
  const [buttonPress, setButtonPress] = useState(false);
  const [nextContent, setNextContent] = useState({index: 0}); // { index: number }
  // Track the nextContent state for re-rendering
  const nextContentRef = useRef<number>(0);
  // Render in Full screen resolution
  const {width: deviceWidth, height: deviceHeight} = useWindowDimensions();

  useEffect(() => {
    if (nextContent.index !== nextContentRef.current) {
      nextContentRef.current = nextContent.index;
      // Force Re-rendering of <Video> component.
      initializeVideoPlayer();
      setNextContent((prev) => {
        return {...prev};
      });
    }
  }, [nextContent]);

  useEffect(() => {
    console.log('app:  start AppPreBuffering v13.0');
    initializeVideoPlayer();
  }, []);

  const onEnded = async () => {
    console.log('app: onEnded received');
    player.current.unload();
    player.current = null;
    await videoPlayer.current?.deinitialize();
    removeEventListeners();
    onVideoUnMounted();
    setNextContent({index: (nextContent.index + 1) % content.length});
  };

  const onError = () => {
    console.log(`app: AppPreBuffering: error event listener called`);
  };

  const setUpEventListeners = (): void => {
    console.log('app: setup event listeners');
    videoPlayer.current?.addEventListener('ended', onEnded);
    videoPlayer.current?.addEventListener('error', onError);
  };

  const removeEventListeners = (): void => {
    console.log('app: remove event listeners');
    videoPlayer.current?.removeEventListener('ended', onEnded);
    videoPlayer.current?.removeEventListener('error', onError);
  };

  const initializeVideoPlayer = async () => {
    console.log('app: calling initializeVideoPlayer');
    videoPlayer.current = new VideoPlayer();
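    // Expose the player instance on the global object; `gmedia` has no type
    // declaration, hence the @ts-ignore below.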
    // @ts-ignore
    global.gmedia = videoPlayer.current;
    await videoPlayer.current.initialize();
    setUpEventListeners();
    videoPlayer.current!.autoplay = false;
    initializeHls();
  };

  const onSurfaceViewCreated = (surfaceHandle: string): void => {
    console.log('app: surface created');
    videoPlayer.current?.setSurfaceHandle(surfaceHandle);
    videoPlayer.current?.play();
  };

  const onSurfaceViewDestroyed = (surfaceHandle: string): void => {
    videoPlayer.current?.clearSurfaceHandle(surfaceHandle);
  };

  const onCaptionViewCreated = (captionsHandle: string): void => {
    console.log('app: caption view created');
    videoPlayer.current?.setCaptionViewHandle(captionsHandle);
  };

  const initializeHls = () => {
    console.log('app: in initializePlayer() index = ', nextContent.index);
    if (videoPlayer.current !== null) {
      player.current = new HlsJsPlayer(videoPlayer.current);
    }
    if (player.current !== null) {
      player.current.load(content[nextContent.index], AUTOPLAY);
    }
  };

  const onVideoUnMounted = (): void => {
    console.log('app: in onVideoUnMounted');
    // @ts-ignore
    global.gmedia = null;
    videoPlayer.current = null;
  };

  if (!buttonPress) {
    return (
      <View style={styles.container}>
        <TouchableOpacity
          style={styles.button}
          onPress={() => {
            setButtonPress(true);
          }}
          hasTVPreferredFocus={true}
          activeOpacity={1}>
          <Text style={styles.buttonLabel}> Press to Play Video </Text>
        </TouchableOpacity>
      </View>
    );
  } else {
    return nextContent.index === nextContentRef.current ? (
      <View style={styles.videoContainer}>
        <VegaVideoSurfaceView
          style={styles.surfaceView}
          onSurfaceViewCreated={onSurfaceViewCreated}
          onSurfaceViewDestroyed={onSurfaceViewDestroyed}
        />
        <VegaCaptionsView
          onCaptionViewCreated={onCaptionViewCreated}
          style={styles.captionView}
        />
      </View>
    ) : (
      <View style={styles.videoContainer}></View>
    );
  }
};

const styles = StyleSheet.create({
  container: {
    flex: 1,
    flexDirection: 'column',
    backgroundColor: '#283593',
    justifyContent: 'center',
    alignItems: 'center',
  },
  button: {
    alignItems: 'center',
    backgroundColor: '#303030',
    borderColor: 'navy',
    borderRadius: 10,
    borderWidth: 1,
    paddingVertical: 12,
    paddingHorizontal: 32,
  },
  buttonLabel: {
    color: 'white',
    fontSize: 22,
    fontFamily: 'Amazon Ember',
  },
  videoContainer: {
    backgroundColor: 'white',
    alignItems: 'stretch',
  },
  surfaceView: {
    zIndex: 0,
  },
  captionView: {
    width: '100%',
    height: '100%',
    top: 0,
    left: 0,
    position: 'absolute',
    backgroundColor: 'transparent',
    flexDirection: 'column',
    alignItems: 'center',
    zIndex: 2,
  }
});
  • Use the Vega SDK to build the app, run it on a device or the simulator, and collect logs. For more details about building and running apps on the simulator, see Create a Vega App and Vega Virtual Device.
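
For orientation, the example above reduces to the following wiring between the W3C VideoPlayer and the Hls.js wrapper. This condensed sketch only restates calls that appear in the full example; startPlayback is an illustrative helper name, not an SDK API.

import { VideoPlayer } from '@amazon-devices/react-native-w3cmedia';
import { HlsJsPlayer } from './hlsjsplayer/HlsJsPlayer';

// Condensed flow from the example above: initialize the W3C player, hand it
// to the Hls.js wrapper, then start playback once a surface handle arrives
// (the full example receives it in VegaVideoSurfaceView's
// onSurfaceViewCreated callback).
async function startPlayback(uri: string, surfaceHandle: string) {
  const videoPlayer = new VideoPlayer();
  await videoPlayer.initialize();
  videoPlayer.autoplay = false;

  const hlsPlayer = new HlsJsPlayer(videoPlayer);
  hlsPlayer.load(
    {secure: 'false', uri, drm_scheme: '', drm_license_uri: ''},
    true, // autoplay as soon as the player has buffered enough data
  );

  videoPlayer.setSurfaceHandle(surfaceHandle);
  videoPlayer.play();
}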

To integrate the Hls.js Player manually

  1. Clone the Hls.js repository from https://github.com/video-dev/hls.js.


    cd <root dir>
    git clone https://github.com/video-dev/hls.js.git
    
  2. Check out the v1.5.11 tag to a local branch named amz_1.5.11.


     cd hls.js
     git checkout -b amz_1.5.11 v1.5.11
    
  3. Download the Vega Hls.js Player package to a known location where you will expand the file.
  4. Expand the file.


    tar -xzf hls-rel-v<x.y.z>-r<a.b>.tar.gz
    
  5. Apply the Hls.js patches:


    git apply hls-rel/hls-patch/*.patch -3
    
  6. Install Hls.js dependencies by running npm install:


    npm install
    
  7. Build the Hls.js Player by running npm run build:


    npm run build
    

Note: If you receive an error in which the build can't resolve the base-64 and xmldom node modules, run the following commands to install them.


npm install --save base-64
npm install --save xmldom

Last updated: Sep 30, 2025