@amazon-devices/react-native-w3cmedia

The Kepler W3C Media API provides W3C-compliant React Native and JavaScript class components, JavaScript methods, and JavaScript interfaces. Use this API to build playback experiences, such as gapless episodic playback and gapless advertisement insertion during playback of main content.

The API implements (polyfills) HTMLMediaElement, Media Source Extension, and Encrypted Media Extension methods.

Get started

Setup

  1. Add the following library dependency to the dependencies section of your package.json file.

     "@amazon-devices/react-native-w3cmedia": "~2.1.0"
    
  2. Open a terminal window and navigate to your app folder. Run npm to install the package in your app.


     npm install
    
  3. Update your babel.config.js file; otherwise, the app throws a "ReferenceError: Property 'React' doesn't exist" exception at runtime.


     module.exports = {
         presets: [ ['module:metro-react-native-babel-preset', { useTransformReactJSXExperimental: true }] ],
         plugins: [
             [
                 '@babel/plugin-transform-react-jsx',
                 {
                     runtime: 'automatic',
                 },
             ],
         ],
     };
    
  4. Add the following service and privilege declarations to your manifest.toml file.


     [wants]
     [[wants.service]]
     id = "com.amazon.mediametrics.service" # Required for metrics service
    
     [[wants.service]]
     id = "com.amazon.media.server"
    
     [[wants.service]]
     id = "com.amazon.gipc.uuid.*"
    
     [[wants.service]]
     id = "com.amazon.media.playersession.service"
    
     [[wants.privilege]]
     id = "com.amazon.devconf.privilege.accessibility" # Required for captions
    
     [[wants.service]]
     id = "com.amazon.mediabuffer.service"
    
     [[wants.service]]
     id = "com.amazon.mediatransform.service"
    
     [[wants.service]]
     id = "com.amazon.audio.stream"
    
     [[wants.service]]
     id = "com.amazon.audio.control"
    
     [offers]
     [[offers.service]]
     id = "com.amazon.gipc.uuid.*"
    

Usage

Video React Native component

Video is a React Native component that implements the HTMLVideoElement interface, which extends the HTMLMediaElement interface.

Use the component to play content in formats such as MP4 files, or MPEG-DASH and HLS streams that contain both audio and video.

The component supports the default media controls GUI.

Add the component to the render tree and store its reference. Then call the HTMLMediaElement and HTMLVideoElement methods on the reference. The component doesn't allow apps to pre-buffer content before displaying the video on screen. For pre-buffering, use the VideoPlayer JavaScript class component instead.


let video: HTMLVideoElement;
<Video
    ref={(ref) => {
        video = ref as HTMLVideoElement; // store the reference of this component
    }}
/>

VideoPlayer component

The VideoPlayer component implements the HTMLVideoElement interface, which extends the HTMLMediaElement interface.

VideoPlayer is a TypeScript class and not a React Native component.

It doesn't render video to the screen by default, nor does it render the media controls GUI. Apps are expected to build their own media controls UI and control the playback experience.

Apps create an instance of VideoPlayer and use it to start buffering the content without rendering video on screen.

When the app needs to show the video on screen, it must add the KeplerVideoSurfaceView React Native component to the render tree, receive the onSurfaceViewCreated event, and pass the surface handle obtained in the event callback to VideoPlayer through the setSurfaceHandle method.


// Create an instance of the video player.
let videoPlayer = new VideoPlayer();
videoPlayer.initialize().then(() => {
    // wait for the promise resolution
});

// Use it to start pre-buffering through the MSE APIs.
let mediaSource = new MediaSource();
let mediaSourceUrl = URL.createObjectURL(mediaSource);
videoPlayer.src = mediaSourceUrl;
// Wait for the open state change from the MediaSource.
// ...

// Create a source buffer.
let videoSourceBuffer = mediaSource.addSourceBuffer("video/mp4");
// Now append the media chunks downloaded from HLS or DASH manifests.
videoSourceBuffer.appendBuffer(byteArray);
// ...

// Later, when the app wants to render video and start playback,
// add KeplerVideoSurfaceView to the render tree.

// The onSurfaceViewCreated event callback passes the surface handle.
const onSurfaceViewCreated = (_surfaceHandle: string): void => {
    videoPlayer.setSurfaceHandle(_surfaceHandle);
    videoPlayer.play();
}

// When the app goes to the background, the surface is destroyed.
const onSurfaceViewDestroyed = (_surfaceHandle: string): void => {
    videoPlayer.clearSurfaceHandle(null);
}

// Add the KeplerVideoSurfaceView React Native component to the render tree.
<View style={{ backgroundColor: "white", alignItems: "stretch",
    width: deviceWidth, height: deviceHeight}}>
        <KeplerVideoSurfaceView style={{zIndex: 0}}
            onSurfaceViewCreated={onSurfaceViewCreated}
            onSurfaceViewDestroyed={onSurfaceViewDestroyed}
        />
</View>
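The examples above wait for the MediaSource to reach the open state before creating source buffers. That kind of one-shot event wait can be wrapped in a small promise helper; this is a sketch (the waitForEvent name is not part of the API), assuming the target implements addEventListener/removeEventListener as described in the Handling Events section:

```typescript
// Resolve once the target emits the given event type (for example,
// "sourceopen" on a MediaSource). Works with any EventTarget-style
// object that implements addEventListener/removeEventListener.
function waitForEvent(
  target: {
    addEventListener(type: string, handler: (e: unknown) => void): void;
    removeEventListener(type: string, handler: (e: unknown) => void): void;
  },
  type: string
): Promise<unknown> {
  return new Promise((resolve) => {
    const handler = (e: unknown) => {
      target.removeEventListener(type, handler);
      resolve(e);
    };
    target.addEventListener(type, handler);
  });
}
```

For example, an app could write `await waitForEvent(mediaSource, "sourceopen");` before calling addSourceBuffer.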

Audio React Native component

Audio is a React Native component that implements the HTMLAudioElement interface, which extends the HTMLMediaElement interface.

Use the component to play audio-only content such as MP3 files, or MPEG-DASH and HLS streams that contain audio only.

The component supports the default media controls GUI.

Add the component to the render tree and store its reference.

Then call the HTMLMediaElement and HTMLAudioElement APIs on this reference.

The component doesn't allow apps to pre-buffer content before playing audio on device. For pre-buffering, use the AudioPlayer Javascript class component.


let audio: HTMLAudioElement;
<Audio
  ref={(ref) => {
    audio = ref as HTMLAudioElement; // store the reference of this component
  }}
/>

AudioPlayer component

The AudioPlayer component implements the HTMLAudioElement interface, which extends the HTMLMediaElement interface.

AudioPlayer is a TypeScript class and not a React Native component.

Use it to play audio-only formats such as MP3, or MPEG-DASH and HLS streams that contain audio only.

Apps create an instance of AudioPlayer and use it to start buffering the content without starting playback.

It does not render the media controls GUI. Apps are expected to build their own media controls UI and control the playback experience.


// Create an instance of the audio player.
let audioPlayer = new AudioPlayer();
audioPlayer.initialize().then(() => {
    // wait for the promise resolution
});

// Use it to start pre-buffering through the MSE APIs.
let mediaSource = new MediaSource();
let mediaSourceUrl = URL.createObjectURL(mediaSource);
audioPlayer.src = mediaSourceUrl;
// Wait for the open state change from the MediaSource.
// ...

// Create a source buffer.
let audioSourceBuffer = mediaSource.addSourceBuffer("audio/mpeg");
// Now append the media chunks downloaded from HLS or DASH manifests.
audioSourceBuffer.appendBuffer(byteArray);
// ...

// Later, when the app wants to start playback.
audioPlayer.play();

KeplerVideoSurfaceView

KeplerVideoSurfaceView is a React Native component that renders video frames on the screen.

Use this component only when operating in pre-buffering mode with VideoPlayer, where more than one player can be pre-buffered simultaneously with different content.

The app then attaches the surface handle passed by the onSurfaceViewCreated event callback to the video player instance that should be rendered on screen.


// Create an instance of the video player.
let videoPlayer = new VideoPlayer();

// Use it to start pre-buffering through the MSE APIs.
let mediaSource = new MediaSource();
let mediaSourceUrl = URL.createObjectURL(mediaSource);
videoPlayer.src = mediaSourceUrl;
// Wait for the open state change from the MediaSource.
// ...

// Create a source buffer.
let videoSourceBuffer = mediaSource.addSourceBuffer("video/mp4");
// Now append the media chunks downloaded from HLS or DASH manifests.
videoSourceBuffer.appendBuffer(byteArray);
// ...

// Later, when the app wants to render video and start playback,
// add KeplerVideoSurfaceView to the render tree.

// The onSurfaceViewCreated event callback passes the surface handle.
const onSurfaceViewCreated = (_surfaceHandle: string): void => {
    videoPlayer.setSurfaceHandle(_surfaceHandle);
    videoPlayer.play();
}

// When the app goes to the background, the surface is destroyed.
const onSurfaceViewDestroyed = (_surfaceHandle: string): void => {
    videoPlayer.clearSurfaceHandle(null);
}

// Add the KeplerVideoSurfaceView React Native component to the render tree.
<View style={{ backgroundColor: "white", alignItems: "stretch",
    width: deviceWidth, height: deviceHeight}}>
        <KeplerVideoSurfaceView style={{zIndex: 0}}
            onSurfaceViewCreated={onSurfaceViewCreated}
            onSurfaceViewDestroyed={onSurfaceViewDestroyed}
        />
</View>

KeplerCaptionsView

This is a React Native component that renders closed captions and subtitles on the screen. Use the component only when operating in pre-buffering mode using VideoPlayer and AudioPlayer components.

Apps attach the captions handle passed by the onCaptionViewCreated event callback to VideoPlayer and AudioPlayer through the setCaptionViewHandle method so that the player can render captions or subtitles on screen.

Currently, the captions are not rendered by default.

To enable rendering, run the following command:


vdcm set com.amazon.devconf/system/accessibility/ClosedCaptioningEnabled 1

The following example illustrates the usage of the component.


// The onCaptionViewCreated event callback passes the captions handle.
const onCaptionViewCreated = (captionsHandle: string): void => {
    videoPlayer.setCaptionViewHandle(captionsHandle);
}

// Add the KeplerCaptionsView React Native component to the render tree.
<View style={{ backgroundColor: "white", alignItems: "stretch",
    width: deviceWidth, height: deviceHeight}}>
        <KeplerCaptionsView
            onCaptionViewCreated={onCaptionViewCreated}
            style={{ width: '100%',
            height: '100%',
            top: 0,
            left: 0,
            position: 'absolute',
            backgroundColor: 'transparent',
            flexDirection: 'column',
            alignItems: 'center',zIndex: 2}}
        />
</View>

HTMLMediaElement

This is the main player interface that your KeplerScript apps use to control playback. The Video and Audio React Native components, as well as the VideoPlayer and AudioPlayer TypeScript classes, implement this interface.

For more details, see https://html.spec.whatwg.org/multipage/media.html.

The app can initiate playback in two modes, depending on whether the content is adaptive or non-adaptive, as described below.

  • URL mode playback for non-adaptive streaming formats such as MP4, MP3, and MKV media files. The Audio and Video components support playback of these formats through the "src" attribute.

      - Set the src attribute to a media URL.
    
          `video.src = [some url]` or `audio.src = [some url]`
    
  • Media Source Extension (MSE) mode for playback of adaptive streaming formats such as HLS and MPEG-DASH. This API allows apps to inject the media segments of adaptive streaming content into the player. Apps download and parse the manifest to learn the bitrate quality levels and variants that the content supports. The app then downloads the media segments of a quality level chosen based on the available network bandwidth and passes them to the player through the MediaSource and SourceBuffer methods.

     For additional information about how to use these APIs, see [https://www.w3.org/TR/media-source-2/](https://www.w3.org/TR/media-source-2/).
    
     - Create a MediaSource object.<br>
         `let mediaSource = new MediaSource();`
    
     - Create a blob URL for MediaSource:<br>
         `let url = URL.createObjectURL(mediaSource)`
    
     - Attach MediaSource to video or audio component.<br>
         `video.src = url` or `audio.src = url`
    
     - Add a `SourceBuffer` by mime type.<br>
         `let sourceBuffer = mediaSource.addSourceBuffer(mimeType)`
    
     - Download and append a media segment to `SourceBuffer`:<br>
    
         ```javascript
         // Download media segment to ArrayBuffer using fetch API
         let response = await fetch(uri);
         let arrayBuffer = await response.arrayBuffer();
    
         // Pass the downloaded media segment buffer to SourceBuffer.
         sourceBuffer.appendBuffer(arrayBuffer);
         ```
    

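The quality-level choice in the MSE flow above is app logic, not part of this API. A minimal sketch of picking a variant from a parsed manifest based on measured bandwidth (the Variant shape, the selectVariant name, and the safety factor are all hypothetical):

```typescript
interface Variant {
  bandwidth: number; // bits per second required by this quality level
  uri: string;
}

// Pick the highest-bitrate variant that fits within the measured
// bandwidth, scaled by a safety factor; fall back to the lowest level.
function selectVariant(
  variants: Variant[],
  measuredBps: number,
  safetyFactor = 0.8
): Variant {
  const sorted = [...variants].sort((a, b) => a.bandwidth - b.bandwidth);
  let chosen = sorted[0];
  for (const v of sorted) {
    if (v.bandwidth <= measuredBps * safetyFactor) {
      chosen = v;
    }
  }
  return chosen;
}
```

The app would then fetch the chosen variant's segments and pass them to `sourceBuffer.appendBuffer` as in step 5 above.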
Common Media operations

Use the Audio or Video React Native component references to control media playback and perform other common operations through the HTMLMediaElement methods, as follows.

Start playback

video.play() or audio.play()

Pause playback

video.pause() or audio.pause()

Get current time

Read the currentTime attribute to get the current playback position.

console.log(`${video.currentTime}`); or console.log(`${audio.currentTime}`);

Seek to a position

Set the currentTime attribute to seek to the required position.

video.currentTime = seekPosition; or audio.currentTime = seekPosition;
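Seeking past the end of the stream is a common source of stalls, so apps often clamp the seek target before assigning it to currentTime. A small helper sketch (the clampSeek name is hypothetical, not part of the API):

```typescript
// Clamp a requested seek target to the playable range [0, duration].
// A NaN duration (metadata not loaded yet) maps the target to 0.
function clampSeek(target: number, duration: number): number {
  if (Number.isNaN(duration) || target < 0) return 0;
  return Math.min(target, duration);
}
```

For example, `video.currentTime = clampSeek(seekPosition, video.duration);`.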

Handling Events

The Kepler W3C Media API supports all the events described in the W3C media specifications. There are two ways to register an event handler on the Video or Audio components.

  • Use the props on the Audio/Video React Native component. Refer to MediaProps for the list of supported props. For example, you can register an event handler for the ended event as shown in the following example.

      <Video
          onended={onEnded}
      />
    
  • Use the EventTarget addEventListener() method.

    The Video and Audio components implement two methods of the EventTarget interface: addEventListener and removeEventListener.

    Apps can use these methods to register event handlers on the reference of Audio/Video React Native components. For example, you can register the event handler as shown in the following example.

    video.addEventListener("ended", onEnded);

Apart from the Audio and Video React Native components, other non-React Native components such as AudioTrackList, VideoTrackList, MediaSource, SourceBuffer, and SourceBufferList also emit events, which can be handled through the addEventListener method of EventTarget.

Note: The EventHandler attributes of these components are not supported.

The following example shows how to handle the addtrack event on VideoTrackList.

video.videoTracks.addEventListener("addtrack", onVideoTrackAdded);

Or to handle the sourceopen event on MediaSource.

mediasource.addEventListener("sourceopen", onMediaSourceOpen)
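The listener pattern these components implement can be illustrated with a self-contained dispatcher; this is a sketch of the semantics, not the library's implementation:

```typescript
type Handler = (event: { type: string }) => void;

// Minimal EventTarget-style dispatcher illustrating addEventListener,
// removeEventListener, and event delivery (sketch, not library code).
class SimpleEventTarget {
  private handlers = new Map<string, Set<Handler>>();

  addEventListener(type: string, handler: Handler): void {
    if (!this.handlers.has(type)) this.handlers.set(type, new Set());
    this.handlers.get(type)!.add(handler);
  }

  removeEventListener(type: string, handler: Handler): void {
    this.handlers.get(type)?.delete(handler);
  }

  dispatchEvent(event: { type: string }): void {
    this.handlers.get(event.type)?.forEach((h) => h(event));
  }
}
```

A handler registered for "ended" fires on each dispatch of that type until it is removed with removeEventListener, which mirrors how the media components deliver their events.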

The Encrypted Media Extension (EME) APIs support DRM-protected playback in MSE mode only. For additional information, see https://w3c.github.io/encrypted-media/.

Unsupported features in HTMLMediaElement

  • networkState
  • load
  • canPlayType - Reports that any type is supported.
  • fastSeek - Performs the same seek as setting currentTime.
  • getStartDate
  • addTextTrack
  • preload
  • defaultPlaybackRate, playbackRate
  • preservesPitch
  • loop
  • volume, muted

Unsupported features in Media Source Extension (MSE)

MediaSource API (https://www.w3.org/TR/media-source-2/#mediasource)

  • The EventHandler attributes are not supported. Apps can use the addEventListener method of the EventTarget interface that MediaSource implements to register the event handler.

      EventHandler onsourceopen;
      EventHandler onsourceended;
      EventHandler onsourceclose;
    
  • audioTracks, videoTracks and textTracks

SourceBuffer API (https://www.w3.org/TR/media-source-2/#sourcebuffer)

  • changeType
  • The following attributes are not supported. For now, apps can use the addEventListener method of the EventTarget interface that SourceBuffer implements to register event handlers.

    double              appendWindowStart;
    unrestricted double appendWindowEnd;
    EventHandler        onupdatestart;
    EventHandler        onupdate;
    EventHandler        onupdateend;
    EventHandler        onerror;
    EventHandler        onabort;
    

SourceBufferList API (https://www.w3.org/TR/media-source-2/#sourcebufferlist)

  • The following EventHandler attributes are not supported. For now, apps can use the addEventListener method of the EventTarget interface that SourceBufferList implements.

    EventHandler  onaddsourcebuffer;
    EventHandler  onremovesourcebuffer;
    

Last updated: Sep 30, 2025