
Managing Audio Focus (Fire TV)

Applications playing audio must be carefully designed to cooperate with other applications and services in a user-friendly way. To manage audio playback across applications, Android provides the concept of audio focus. This document outlines how to manage audio focus and other related events on Fire TV.

The Complexity of Audio Focus

Do not underestimate the complexity of handling audio focus. Among other topics, consider the following as you develop your application:

  • Whether your application is designed to play audio in the foreground or the background
  • How other applications are notified when they should stop their playback
  • How your application reacts when other apps start playing
  • How and when your app handles incoming multimedia events (for example, pause, fast-forward, etc.)
  • How you handle system notifications and other short audio interruptions
  • How voice search interrupts your application
  • How voice control solutions interact with your application
  • When your application is allowed to hold on to resources and wake locks

Note that expectations for applications that play video can significantly differ from audio-only applications.

Documentation for multimedia applications

Follow the relevant Android developer documentation for API level 22 (Fire OS 5), API level 25 (Fire OS 6), and API level 28 (Fire OS 7). Additionally, follow the guidelines and instructions in the Android documentation.

Handling audio focus

As you develop your app, follow all requirements described in the Audio Focus Event Requirements; a sketch of one way to structure focus handling follows the list below. Note the following:

  • The duration of your playback must exactly match the duration of holding the audio focus and having the MediaSession set to active. This also means you must abandon the audio focus request immediately after finishing playback.
  • All audio focus change callbacks must be handled as described in Audio focus pre-Android 8.0.
  • When an application is playing in the background, it must continue to handle multimedia events while also holding audio focus.
  • Receiving any kind of audio focus loss event should cause your application to stop handling multimedia events until focus is regained.
  • Permanent loss of focus means your application will not get the focus back; it should release all resources.
  • Do not request the audio focus multiple times without abandoning the previous requests (to avoid getting registered in the focus stack multiple times).
  • Ducking volume should be set to a level that does not interfere with other applications or voice-search responses; for example, set the volume to 30-40% of the original level.
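
The following is a minimal Kotlin sketch of one way to meet these requirements on Fire OS 5-7, using the pre-Android 8.0 AudioManager API referenced above. The Player interface is a hypothetical placeholder for your own playback engine (for example, ExoPlayer or a MediaPlayer wrapper); it is not a Fire TV API.

    import android.content.Context
    import android.media.AudioManager

    // Hypothetical placeholder for your own playback engine; not a Fire TV API.
    interface Player {
        fun pause()
        fun stop()
        fun release()
        fun setVolume(volume: Float)
        fun resumeIfPausedByFocusLoss()
    }

    class FocusController(context: Context, private val player: Player) {

        private val audioManager =
            context.getSystemService(Context.AUDIO_SERVICE) as AudioManager

        private var hasFocus = false

        private val focusListener = AudioManager.OnAudioFocusChangeListener { change ->
            when (change) {
                AudioManager.AUDIOFOCUS_LOSS -> {
                    // Permanent loss: focus will not come back. Stop, abandon, release.
                    player.stop()
                    abandonFocus()
                    player.release()
                }
                AudioManager.AUDIOFOCUS_LOSS_TRANSIENT -> {
                    // Short interruption (for example, voice search): pause and wait.
                    player.pause()
                }
                AudioManager.AUDIOFOCUS_LOSS_TRANSIENT_CAN_DUCK -> {
                    // Duck to roughly 30-40% of the original level.
                    player.setVolume(0.3f)
                }
                AudioManager.AUDIOFOCUS_GAIN -> {
                    // Focus regained: restore volume and resume if appropriate.
                    player.setVolume(1.0f)
                    player.resumeIfPausedByFocusLoss()
                }
            }
        }

        fun requestFocus(): Boolean {
            // Request focus only once per playback; do not stack requests.
            if (hasFocus) return true
            val result = audioManager.requestAudioFocus(
                focusListener,
                AudioManager.STREAM_MUSIC,
                AudioManager.AUDIOFOCUS_GAIN
            )
            hasFocus = result == AudioManager.AUDIOFOCUS_REQUEST_GRANTED
            return hasFocus
        }

        fun abandonFocus() {
            // Abandon focus as soon as playback finishes.
            audioManager.abandonAudioFocus(focusListener)
            hasFocus = false
        }
    }

On Fire OS 7 (API level 28) you can use the AudioFocusRequest-based API introduced in Android 8.0 instead; the overall structure of the listener remains the same.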

Voice Interactions

When users press the Microphone button on the remote control ("near field"), or when devices with microphones recognize the wake-word of the device ("far field"), the system-wide voice capabilities start and cannot be overridden. Both audio focus (AUDIOFOCUS_LOSS_TRANSIENT) and activity lifecycle events (onPause()) will be sent out.

In general, your app should pause (or at least mute the audio) in response to voice interactions. Your application should be prepared to receive the events in any order allowed by Android. For example, users can instruct Alexa to start another application or to turn a sound system on or off, so the audio capabilities might change while your playback is paused. You should react properly to all of these changes before continuing playback.
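
One way to handle this, sketched under the assumption that an Activity owns the hypothetical Player from the earlier sketch: pause unconditionally in onPause(), so the app ends in the same paused state whether the lifecycle event or the focus-loss callback arrives first, and re-check the audio output configuration before resuming.

    import android.app.Activity

    class PlaybackActivity : Activity() {

        // Hypothetical playback engine (see the Player interface above);
        // initialization is omitted for brevity.
        private lateinit var player: Player

        override fun onPause() {
            super.onPause()
            // Voice search or the wake word has brought another UI forward.
            // Pause regardless of whether a focus-loss callback has arrived yet,
            // so both event orderings end in the same paused state.
            player.pause()
        }

        override fun onResume() {
            super.onResume()
            // Audio routing may have changed while paused (for example, a sound
            // system was turned on or off). Re-check the output configuration
            // here before resuming playback.
        }
    }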

Applications that correctly implement the MediaSession interfaces can be controlled by voice commands. For details on how to enable control through voice, see Voice-enabling Your App and Content. Voice search on Fire TV can also include content from your application in the search results. For this functionality, see Integrating Your Catalog with Fire TV.
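
As an illustration of the MediaSession side, here is a minimal sketch using MediaSessionCompat from the AndroidX media library: transport controls issued by the system, including voice commands, arrive through the session callback. It reuses the hypothetical Player interface from the earlier sketch, and the session tag is arbitrary.

    import android.content.Context
    import android.support.v4.media.session.MediaSessionCompat
    import android.support.v4.media.session.PlaybackStateCompat

    class SessionController(context: Context, private val player: Player) {

        private val session = MediaSessionCompat(context, "PlaybackSession").apply {
            setCallback(object : MediaSessionCompat.Callback() {
                override fun onPlay() {
                    player.resumeIfPausedByFocusLoss()
                    publishState(PlaybackStateCompat.STATE_PLAYING)
                }

                override fun onPause() {
                    player.pause()
                    publishState(PlaybackStateCompat.STATE_PAUSED)
                }

                override fun onStop() {
                    player.stop()
                    stopPlayback()
                }
            })
        }

        fun startPlayback() {
            // Keep the session active only while the app is actually playing.
            session.isActive = true
            publishState(PlaybackStateCompat.STATE_PLAYING)
        }

        fun stopPlayback() {
            // Deactivate the session as soon as playback ends.
            session.isActive = false
            publishState(PlaybackStateCompat.STATE_STOPPED)
        }

        private fun publishState(state: Int) {
            session.setPlaybackState(
                PlaybackStateCompat.Builder()
                    .setActions(
                        PlaybackStateCompat.ACTION_PLAY or
                        PlaybackStateCompat.ACTION_PAUSE or
                        PlaybackStateCompat.ACTION_STOP
                    )
                    .setState(state, PlaybackStateCompat.PLAYBACK_POSITION_UNKNOWN, 1.0f)
                    .build()
            )
        }
    }

Keeping the published playback state and the session's active flag in sync with actual playback is what allows voice commands and on-screen controls to reflect and control your app correctly.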


Last updated: Oct 10, 2023