Last month we announced support for blind and visually impaired customers on Amazon Fire TV through our screen reader, VoiceView. In this blog post, we will take a closer look at upcoming changes to VoiceView and how you can get started making your app accessible to visually impaired customers.
VoiceView allows blind and visually impaired customers to discover content, access settings, and control playback in the core Fire TV experience. For an app to be considered accessible, on-screen information needs to be spoken. On-screen information falls into two major categories: focusable items and static content. Focusable items are actionable elements, such as buttons, that a user navigates to; this includes menus, login buttons, and the titles of movies, TV shows, and other items that users encounter while browsing. Static content is on-screen information such as a movie’s description, duration, and rating. On-screen text providing a URL and code to activate a subscription, for example, could also be considered static content.
To get started with VoiceView, the first thing you need to do is make your app compatible with the Android accessibility framework. For tips and best practices, please look at the current documentation.
With appropriate content descriptions for objects, all focusable items will be spoken by VoiceView. If focus is on an item that includes the title of a TV show, the season, and the episode number, it is best practice for all of that information to be spoken. If the user navigates to a new row that contains a different genre of movies, the title of the new row should be spoken.
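As a hedged illustration, here is a minimal sketch of how content descriptions might be set on a focusable item using the standard Android framework; the layout resource, view ID, and strings are hypothetical placeholders for whatever your app actually uses.

```java
import android.app.Activity;
import android.os.Bundle;
import android.widget.ImageView;

public class BrowseActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.browse);  // hypothetical layout resource

        // A focusable poster tile: give it a description that carries the
        // show title, season, and episode so VoiceView speaks all of it
        // when the tile gains focus.
        ImageView posterTile = (ImageView) findViewById(R.id.poster_tile);  // hypothetical ID
        posterTile.setFocusable(true);
        posterTile.setContentDescription("Example Show, Season 1, Episode 3");

        // The same can be done in layout XML with android:contentDescription,
        // or at bind time in an adapter for each row and tile.
    }
}
```

The same idea applies at the row level: give the row (or its header view) a content description that names the genre so VoiceView announces it when the user moves into the row.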
While the features above will give visually impaired customers basic access to your static content without any additional effort on your part, we have APIs available that will allow you to create a truly accessible experience for your customers. Some APIs you might find useful are:
Orientation Text: Describes what is on the screen. It is read the first time a user encounters a screen or object with Orientation Text, and any time the Menu button is pressed.
Usage Hints: Provide a hint about how to navigate the current element, such as a list. For example, when a user lands on a movie detail screen with the Watch Now button in focus, the hint "Use Left and Right to move between options" is read to indicate horizontal navigation between the buttons.
Described By: Connects focusable items to static content that describes them. For example, when navigating movies in the Fire TV launcher, there is text about the focused item that updates to display the description, rating, and so on. Static content defined with Described By is read automatically, after a brief pause, when the item gains focus. This provides a direct way to explicitly associate static content with a focusable item for easy access. A sketch of how this kind of per-item metadata might be attached in code follows below.
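As a rough sketch only: the exact VoiceView API surface, including the key names it reads, should be taken from the Fire TV accessibility documentation, so the extra keys and helper class below are assumptions rather than the real VoiceView interface. What the sketch does show is the standard Android mechanism of attaching extra metadata to a view's AccessibilityNodeInfo through an accessibility delegate, which is one plausible way per-item hints like these could be wired up.

```java
import android.view.View;
import androidx.core.view.AccessibilityDelegateCompat;
import androidx.core.view.ViewCompat;
import androidx.core.view.accessibility.AccessibilityNodeInfoCompat;

public final class VoiceViewMetadata {

    // Placeholder key: substitute the key name documented for VoiceView's
    // Usage Hint feature. This string is an assumption, not the real extra.
    private static final String EXTRA_USAGE_HINT = "com.example.voiceview.usageHint";

    private VoiceViewMetadata() {}

    /** Attaches a usage hint (e.g. "Use Left and Right to move between options") to a view. */
    public static void setUsageHint(View view, final CharSequence hint) {
        ViewCompat.setAccessibilityDelegate(view, new AccessibilityDelegateCompat() {
            @Override
            public void onInitializeAccessibilityNodeInfo(View host,
                    AccessibilityNodeInfoCompat info) {
                super.onInitializeAccessibilityNodeInfo(host, info);
                // Standard Android call: each node carries a Bundle of extras
                // that an accessibility service can read.
                info.getExtras().putCharSequence(EXTRA_USAGE_HINT, hint);
            }
        });
    }
}
```

A call such as VoiceViewMetadata.setUsageHint(watchNowButton, "Use Left and Right to move between options") would then attach the hint to the focused button on a detail screen; a Described By relationship could be wired up the same way with its documented key, pointing the focusable item at the view that holds its description.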
Stay tuned to learn more about how you can develop for VoiceView and any upcoming updates to the screen reader.
-Tanisha