Assistive Technologies for Fire OS (Amazon Fire TV)

This page provides a conceptual overview of the assistive technologies that are available from Amazon for your Fire OS apps.

Accessibility Overview

Accessibility is the degree to which a product or service can be used by a customer with a particular disability. Accessible products and services enable users with disabilities to easily and efficiently use those products and services.

Accessible systems have three major components:

  • The application being made accessible
  • The accessibility framework
  • The assistive technology

Assistive technologies help a person with a disability accomplish a task or use a product. Examples of assistive technologies include screen reading software for blind users, screen magnification software for users with low vision, and wheelchairs for users who are unable to walk.

Available Fire OS Assistive Technologies for Amazon Fire TV

Fire OS currently supports the following assistive technology on Amazon Fire TV:

  • VoiceView: Enables a blind user to interact with objects on the screen via speech output and either touch or keyboard input.

Activating VoiceView on Amazon Fire TV

To activate VoiceView on Amazon Fire TV:

  1. Enable VoiceView by holding down the Back and Menu keys on the Fire TV remote for 2 seconds.

    VoiceView has two navigation modes: Standard Navigation Mode and Enhanced Navigation Mode. You can switch to Standard Navigation Mode by holding down the Menu key. Note that in Standard Navigation Mode, VoiceView’s cursor moves only among actionable items, such as buttons.

  2. In Enhanced Navigation Mode, press the Left and Right directional keys on the remote to move VoiceView’s cursor (shown as the green focus rectangle) to an item.

  3. Press the Select key to activate an item.

Differences between VoiceView on Fire OS and TalkBack on Android

While Fire OS’s VoiceView and Android’s TalkBack are both accessibility services that interact with the Android accessibility framework, VoiceView is a completely distinct screen reader, not a modification of TalkBack. VoiceView differs from TalkBack in the following ways:

  • Focus behavior in new windows: When a new window opens, VoiceView always places accessibility focus somewhere on the screen. TalkBack does not place focus anywhere in a new window; instead, it waits for the user to touch the screen and places focus at that location.
  • Linear navigation across windows: VoiceView allows linear navigation across window boundaries, while TalkBack does not. Consider the bottom navigation bar on a tablet, which is actually a window that contains three buttons.
    • In VoiceView, swiping left from the Back button moves the cursor to the last item in the main content window.
    • In TalkBack, swiping left from the Back button will not move you out of the bottom navigation bar; instead, TalkBack produces an “end” earcon.
  • Granular navigation across objects: VoiceView allows granular navigation across objects, while TalkBack does not. Consider a screen containing three objects with titles “Cat”, “Dog”, and “Monkey”.
    • When navigating by word, VoiceView moves seamlessly from “Cat” to “Dog” to “Monkey”.
    • Conversely, TalkBack stops on “Dog” and does not navigate to the next word.
    • Similarly, when navigating by character, if you swipe down to move to the next character after landing on the “g” of “Dog”, VoiceView moves to the “M” of “Monkey”. TalkBack plays a sound indicating the end of the text in this case.
  • Text navigation: When navigating through text, VoiceView announces the character or word after the caret; TalkBack announces the character or word that the caret has just passed over. Note that VoiceView’s behavior is consistent with the behavior of screen readers on the Windows platform, the platform with which most blind users are familiar.
  • Sorting of on-screen objects: TalkBack typically sorts on-screen objects left to right, top to bottom, based on the coordinates of each object’s top-left corner. VoiceView typically sorts on-screen objects left to right, top to bottom, based on the coordinates of each object’s center.
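The sorting difference in the last item above can be sketched with a small comparison. The following is illustrative Python, not actual VoiceView or TalkBack code; the object names and coordinates are hypothetical, chosen so that a tall object sorts differently under the two rules:

```python
# Illustrative sketch: how a top-left-corner sort (TalkBack-style) and a
# center-point sort (VoiceView-style) can order the same objects differently.
# Each object is (name, left, top, width, height); coordinates are hypothetical.

def top_left(obj):
    """Sort key based on the object's top-left corner (TalkBack-style)."""
    name, left, top, width, height = obj
    return (top, left)

def center(obj):
    """Sort key based on the object's center point (VoiceView-style)."""
    name, left, top, width, height = obj
    return (top + height / 2, left + width / 2)

# A tall banner on the left next to two shorter buttons stacked on the right.
objects = [
    ("Banner",   0,   0, 100, 200),  # tall: spans the height of both buttons
    ("Play",   120,   0, 100,  80),
    ("Stop",   120, 120, 100,  80),
]

talkback_order  = [name for name, *_ in sorted(objects, key=top_left)]
voiceview_order = [name for name, *_ in sorted(objects, key=center)]

# The banner's top-left corner is highest, so the corner sort reads it first;
# its center sits lower than the Play button's center, so the center sort
# reads Play first.
print(talkback_order)
print(voiceview_order)
```

Running the sketch shows the corner-based order as Banner, Play, Stop, while the center-based order is Play, Banner, Stop, which is why the same screen can be read in a different sequence by the two screen readers.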