Use the Smart Screen Sample Applications
With the release of the Alexa Voice Service (AVS) Device SDK 3.0, Amazon provides two sample applications. As a screen-based Alexa Built-in device maker, you can use these applications to complete your smart screen Alexa integration:
- The Inter-Process Communication (IPC) Server Sample Application.
- The Alexa Smart Screen Web Components Node.js framework and Sample Web Application, available in a separate GitHub release.
Communicating through the IPC Client Framework, these applications together provide a complete implementation for enabling Alexa smart screen features on your device.

Configure and run the sample applications
After you complete the required quick start guides to build both the IPC Server Sample Application and Smart Screen Sample Web Application, you’re ready to configure and run each application.
To configure the IPC Server Sample Application
- Configure the IPC Server Sample Application through the two configuration files passed at startup: AlexaClientSDKConfig.json and the IPCServerSampleAppConfig file.
To configure the Smart Screen Sample Web Application
- Configure the Smart Screen Sample Web Application by using its own Sample Web Application configuration. The sampleClientConfig property of the IPC Server Sample Application is explicitly designed to provide the Smart Screen Sample Web Application configuration through the IPC Framework API configureClient directive.

Used together, these configurations enable you to implement a wide range of device display variations with distinct interaction modalities. For details about the full collection of Sample IPC Server and Web Application client configurations for different device implementations, see Sample IPC Server configurations.
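As an illustration of that relationship, the following hypothetical fragment sketches how the IPC Server Sample Application configuration could carry the web application's settings. Only the sampleClientConfig, audioInputInitiator, and deviceKeys property names come from this page; the exact nesting and values shown are assumptions.

```json
{
  "sampleClientConfig": {
    "audioInputInitiator": "PRESS_AND_HOLD",
    "deviceKeys": {}
  }
}
```

Whatever sits under sampleClientConfig is what the Sample Web Application receives through the configureClient directive at startup.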
To run the IPC Server Sample Application
- Use the following command to run the IPC Server Sample Application with one of the provided samples.
cd $HOME/sdk-folder/ipc-server-app-build/src
./IPCServerSampleApp \
-C $HOME/sdk-folder/sdk-build/Integration/AlexaClientSDKConfig.json \
-C $HOME/sdk-folder/src/avs-device-sdk/SampleApplications/IPCServerSampleApplication/config/IPCServerSampleAppConfig.json \
-C $HOME/sdk-folder/src/avs-device-sdk/SampleApplications/IPCServerSampleApplication/config/samples/<SAMPLE_CONFIG>.json \
-L INFO
To run the Sample Web Application
1. If you don't have a Chromium-based browser, download and install one so that you can complete step 2.
2. In your Chromium-based browser, run the Sample Web Application.
cd <path-to-web-components-repo-clone>/alexa-smart-screen-web-components
open ./samples/alexa-smart-screen-sample-app/dist/index.html
Authorize with Login with Amazon (LWA)
Before you can use the IPC Server Sample and Sample Web Applications to interact with Alexa, you must first complete authorization through LWA. Both applications facilitate this through their implementation of the completeAuthorization IPC API.
To authorize the applications with LWA
- Make sure both the IPC Server Sample and Sample Web Applications are running and connected.
- In the Sample Web Application user interface (UI) in your Chromium-based browser, follow the instructions.
- Select and copy the provided authorization code, and then click the link to open the authorization flow in a separate tab.

After authorization is complete, the Sample Web Application receives the setAuthorizationState IPC message from the IPC Server Sample Application with the state REFRESHED, which initiates a transition to the Sample Web Application home screen.
Test the sample applications
When you use the default IPCServerSampleAppConfig file, you see a Sample Application home screen for a TV-type device that implements windows for both full-screen and lower-third (also called TV overlay) Alexa visual responses. On the home screen, you also see instructions about the input that you need to start an Alexa interaction. In this case, you press and hold down a key, which simulates the action of a user when they push a microphone button on a TV remote control.
To test the Sample Web Application
- With the Sample Web Application in focus, press and hold the A key on your keyboard to open the microphone, and then begin to talk to Alexa.
- While you hold the A key, notice the text-based attention system in the lower-right of the screen transition from IDLE to LISTENING.
- While you hold the A key, say, "Tell me a joke."
- Release the A key. The attention system text changes from LISTENING to THINKING, and then Alexa responds with a joke and a full-screen visual card.
- Press and hold the A key, and then say, "What's the weather?" Alexa responds with the weather report for your local area and a lower-third overlay card, as supported by this device configuration.
User interactions
The Smart Screen Sample Web Application provides users with a variety of interactions to control the Alexa experience on their device.
Speech input
The Sample Web Application configuration controls the input type and device key that you use to initiate Alexa interactions through its audioInputInitiator property and the talkKey property of its deviceKeys section. These configurations determine the key press that sends the associated recognizeSpeechRequest IPC message to the server to start speech recognition.
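As a sketch, the two properties might appear together in the Sample Web Application configuration as follows. The audioInputInitiator, deviceKeys, and talkKey property names come from this section; the key-descriptor fields shown for the A key are assumptions about the exact schema.

```json
{
  "audioInputInitiator": "PRESS_AND_HOLD",
  "deviceKeys": {
    "talkKey": {
      "code": "KeyA",
      "keyCode": 65,
      "key": "a"
    }
  }
}
```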
Wake word
If your build of the IPC Server Sample Application uses a wake word engine, no input from the Sample Web Application is required to start an Alexa interaction. Just start by saying the "Alexa" wake word. You can set the audioInputInitiator property of the Sample Web Application configuration to WAKEWORD to inform the Sample Web Application that you're using a wake word engine, which updates the home screen accordingly.
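Enabling wake word mode in the Sample Web Application configuration is then a single property change; the property name and value come from this section, while the surrounding object is illustrative.

```json
{
  "audioInputInitiator": "WAKEWORD"
}
```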
Press and hold to talk
The default IPCServerSampleAppConfig uses the PRESS_AND_HOLD audio input type, which means you must hold the input button or keyboard key for the duration of your speech interaction, and release the button or key when you're ready for Alexa to respond. By default, this input key is the A key.
The following example shows a speech interaction when the Sample Web Application is open and in focus.
You press and hold the A key.
You: What's the weather like in Portland?
You release the A key.
Alexa: Right now in Portland, it's 77 degrees with sun…
A weather visual card displays on the screen.

For multi-turn interactions, the PRESS_AND_HOLD audio input type prompts you again to press and hold the input key to answer any follow-up questions Alexa might have.
Tap-to-talk
The Smart Screen Sample Web Application also supports the TAP audio input type, in which a single button press opens the microphone for a speech interaction with Alexa, similar to saying the wake word. Again, by default, this key is the A key.
The following example uses the Hub Large Landscape Sample configuration to show this interaction type when the Smart Screen Sample Web Application is open and in focus.
You tap the A key.
You: What's the population of Seattle?
Alexa: The population of Seattle is…
An accompanying information card displays on the screen.

For multi-turn interactions, the TAP audio input type automatically reopens the microphone for you to answer any follow-up questions Alexa might have.
Navigation control
In addition to speech interactions, the sample applications also provide basic navigation control of Alexa experiences on your device through the IPC Client Framework's navigationEvent API. These interactions represent the types of input expected from devices, such as remote control input on a TV, or system graphical user interface (GUI) buttons on a touch display.
Currently Amazon supports two navigation events: back and exit.
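Both events are bound to device keys in the Sample Web Application configuration, which a hypothetical deviceKeys fragment could express as follows. The backKey and exitKey property names and the descriptor fields are assumptions; only the B and ESC key assignments come from this page.

```json
{
  "deviceKeys": {
    "backKey": { "code": "KeyB", "keyCode": 66, "key": "b" },
    "exitKey": { "code": "Escape", "keyCode": 27, "key": "Escape" }
  }
}
```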
Back
The BACK event enables users to use device-button input to navigate the back stack of any Alexa experiences initiated by the applications.
All the sample IPC Server configurations use the B key to initiate back navigation.
The following example shows how the B key initiates back navigation when the Sample Web Application is open and in focus.
You press and hold the A key.
You: Play some music.
You release the A key.
Alexa: Here’s some music you might like…
Music is played, and the music card appears on the screen.
You press and hold the A key.
You: What's the weather like in Seattle?
You release the A key.
Alexa: Right now in Seattle, it's 74 degrees with sun…
A weather visual card appears over the music card while music playback is attenuated.
You press the B key one time.
The weather card disappears and the music card appears with music playback at full volume.
You press the B key again.
The music card disappears and the music playback stops.



Exit
The EXIT event enables users to use device-button input to exit all Alexa experiences initiated by the applications.
All the sample IPC Server configurations use the ESC key to initiate exit navigation.
The following example shows the EXIT event when the Sample Web Application is open and in focus.
You press and hold the A key.
You: Play some music.
You release the A key.
Alexa: Here’s some music you might like…
Music plays, and the music card appears.
You press and hold the A key.
You: What's the weather like in Seattle?
You release the A key.
Alexa: Right now in Seattle, it's 74 degrees with sun…
A weather visual card appears over the music card while music playback is attenuated.
You press the ESC key.
Both the weather and music card disappear, and all speech and music playback stop.



Captions and DoNotDisturb
The Smart Screen Sample Web Application also enables users to control the AlexaCaptions and DoNotDisturb states.
Captions
Amazon implements the AlexaCaptions IPC API in both applications. This implementation enables you to build integrations with captions support, so users can toggle and render Alexa captions for supported responses.
All the sample IPC Server configurations use the C key to toggle captions through the captionsStateChanged IPC message. The Sample Web Application uses the alexa-smart-screen-sample-captions component to render captions received through the renderCaptions IPC message.
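A corresponding deviceKeys entry for the captions toggle might look like the following sketch. The toggleCaptionsKey property name and descriptor fields are assumptions; the C key binding comes from this section.

```json
{
  "deviceKeys": {
    "toggleCaptionsKey": { "code": "KeyC", "keyCode": 67, "key": "c" }
  }
}
```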
The following example uses the Hub Large Landscape configuration when the Sample Web Application is open and in focus.
You press the C key.
This action toggles the captions-enabled state to on for Alexa captioning.
You press and hold the A key.
You: What's the weather like in Portland?
You release the A key.
Alexa: In Portland Oregon, it's 77 degrees Fahrenheit with clear skies and sun.
A weather visual card appears and captions are rendered over it. Captions disappear when Alexa speech finishes.

DoNotDisturb
Amazon implements the DoNotDisturb IPC API in both applications to enable users to toggle the DoNotDisturb state, and to inform the Sample Web Application when the state changes.
All the sample IPC Server configurations use the D key to toggle DoNotDisturb on and off. The toggle occurs in the Alexa client through the doNotDisturbStateChanged IPC message.
The Sample Web Application is also notified of DoNotDisturb state changes in the Alexa client that result from a user voice interaction or an update from the Alexa Mobile Application.
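The DoNotDisturb toggle key could likewise be expressed as a deviceKeys entry, sketched below. The toggleDoNotDisturbKey property name and descriptor fields are assumptions; the D key binding comes from this section.

```json
{
  "deviceKeys": {
    "toggleDoNotDisturbKey": { "code": "KeyD", "keyCode": 68, "key": "d" }
  }
}
```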
The following example shows the DoNotDisturb toggle functionality with the Sample Web Application open and in focus.
You press the D key.
This action toggles the DoNotDisturb enabled state to ON in the Alexa client, and the Sample Web Application’s text attention system flashes to show DO NOT DISTURB ENABLED.
You press the D key again.
This action toggles the DoNotDisturb enabled state to OFF in the Alexa client, and the Sample Web Application’s text attention system flashes to show DO NOT DISTURB DISABLED.


Use display windows
With Smart Screen Sample Applications, you can configure and implement a variety of different display and window implementations to render Alexa visual experiences on your device. Together, the applications do this through three key mechanisms:
- Assertion of visual characteristics to the Alexa service through the IPC Server Sample Application configuration.
  - Device Display: physical device display characteristics.
  - Window Templates: definitions of windows that can be used on the device for rendering Alexa visual content.
  - Interaction Modes: definitions of the types of interaction modalities applied to windows and any content rendered within them.
- Dynamic configuration of windows in the Sample Web Application from the asserted visual characteristics defined by the IPC Server Sample Application configuration.
- Coordinated WindowState reporting to make sure that the Alexa service is correctly informed of window instances created in the Sample Web Application. The instances are registered with the PresentationOrchestrator.
  - This reporting enables responses from Alexa Skills and Amazon sources to target the available windows on your device with their content, for example, APL RenderDocument directives sent to your device.
The following examples show how to use these mechanisms to achieve different window configurations on your device.
TV overlay window
If you’re looking to implement Alexa visual experiences on your TV-type devices that are similar to those on FireTV devices, consider the TV overlay window. This window is implemented in the default IPCServerSampleAppConfig for the IPC Server Sample Application.
With this window configuration implementation, certain shorter form Alexa visual responses, like weather and general information requests, for example, "Who is Abraham Lincoln?", appear in a transient overlay window on the lower third of your screen. Longer form Alexa visual responses, like music playback, appear full screen.
Configuration
The sample TV Full-screen And Overlay configuration provides you with an example implementation for a TV-type device with both a full-screen and lower-third overlay window for Alexa visual content.
Visual characteristics window configuration:

{ "type": "AlexaInterface", "interface": "Alexa.InteractionMode", "version": "1.1", "configurations": { "interactionModes": [ { "id": "tv", "uiMode": "TV", "interactionDistance": { "unit": "INCHES", "value": 130 }, "touch": "UNSUPPORTED", "keyboard": "SUPPORTED", "video": "SUPPORTED", "dialog": "SUPPORTED" }, { "id": "tv_overlay", "uiMode": "TV", "interactionDistance": { "unit": "INCHES", "value": 130 }, "touch": "UNSUPPORTED", "keyboard": "SUPPORTED", "video": "UNSUPPORTED", "dialog": "SUPPORTED" } ] } }, { "type": "AlexaInterface", "interface": "Alexa.Display.Window", "version": "1.0", "configurations": { "templates": [ { "id": "tvFullscreen", "type": "STANDARD", "configuration": { "sizes": [{ "type": "DISCRETE", "id": "fullscreen", "value": { "unit": "PIXEL", "value": { "width": 1920, "height": 1080 } } }], "interactionModes": [ "tv" ] } }, { "id": "tvOverlayLandscape", "type": "OVERLAY", "configuration": { "sizes": [{ "type": "DISCRETE", "id": "landscapePanel", "value": { "unit": "PIXEL", "value": { "width": 1920, "height": 400 } } }], "interactionModes": [ "tv_overlay" ] } }] } }

Sample Web Application windows configuration:

[{ "id": "tvFullscreen", "displayWindowConfig": { "templateId": "tvFullscreen", "configurations": { "landscape": { "sizeConfigurationId": "fullscreen", "interactionMode": "tv" } } }, "aplRendererParameters": { "supportedExtensions": [ "aplext:backstack:10" ] }, "supportedInterfaces": [ "Alexa.Presentation.APL", "TemplateRuntime" ], "zOrderIndex": 0 }, { "id": "tvOverlayLandscape", "displayWindowConfig": { "templateId": "tvOverlayLandscape", "configurations": { "landscape": { "sizeConfigurationId": "landscapePanel", "interactionMode": "tv_overlay" } } }, "supportedInterfaces": [ "Alexa.Presentation.APL", "TemplateRuntime" ], "windowPosition": "bottom", "zOrderIndex": 1 } ]
Example usage
The following example shows the previous configurations applied, and the Sample Web Application open and in focus.
You press and hold the A key.
You: Play some music.
You release the A key.
Alexa: Here’s some music you might like…
Music plays and the music card displays in the full-screen window.
You press and hold the A key.
You: What's the weather like in Seattle?
You release the A key.
Alexa: Right now in Seattle, it's 74 degrees with sun…
A weather card displays in the overlay window above the music card.


Re-orienting windows
If you plan to implement a device that might change orientation from landscape to portrait or vice versa, you likely want to implement Alexa visual windows that can respond to these changes in orientation. Adaptive windows present the best content to your users for your device’s current orientation.
The Smart Screen Sample Applications support these types of windows, and any dynamic content, such as APL, that can adapt visual presentation based on orientation. These windows and dynamic content use the following systems:
- The three key mechanisms discussed in Use display windows for asserting and configuring windows.
- The displayMode ORIENTABLE property of the Sample Web Application configuration.
- The optional display window orientation configuration of the Sample Web Application configuration.
- The Sample Web Application’s implementation of the alexa-smart-screen-window-manager component, and its corresponding API windowManager.updateDisplayOrientationToWindows(DisplayOrientation.PORTRAIT).
Configuration
The sample Hub Orientable configuration provides you with an example implementation of a re-orientable touch-display device with characteristics similar to an Echo Show 15.
Visual characteristics window configuration:

{ "type": "AlexaInterface", "interface": "Alexa.Display.Window", "version": "1.0", "configurations": { "templates": [ { "id": "hubFullscreen", "type": "STANDARD", "configuration": { "sizes": [ { "type": "DISCRETE", "id": "fullscreenLandscape", "value": { "unit": "PIXEL", "value": { "width": 1920, "height": 1080 } } }, { "type": "DISCRETE", "id": "fullscreenPortrait", "value": { "unit": "PIXEL", "value": { "width": 1080, "height": 1920 } } } ], "interactionModes": [ "hub_fullscreen" ] } } ] } }

Sample Web Application windows configuration:

"displayMode": "ORIENTABLE", "windows": [ { "id": "hubOrientableFullscreen", "displayWindowConfig": { "templateId": "hubFullscreen", "configurations": { "landscape" : { "sizeConfigurationId": "fullscreenLandscape", "interactionMode": "hub_fullscreen" }, "portrait" : { "sizeConfigurationId": "fullscreenPortrait", "interactionMode": "hub_fullscreen" } } }, "aplRendererParameters": { "supportedExtensions": [ "aplext:backstack:10" ] }, "supportedInterfaces": [ "Alexa.Presentation.APL", "TemplateRuntime" ], "zOrderIndex": 0 } ]
Example usage
The following examples show the Hub Orientable configuration, and the Sample Web Application open and in focus with a portrait orientation.
You press and hold the A key.
You: What's the weather like in Seattle?
You release the A key.
Alexa: Right now in Seattle, it's 76 degrees with sun…
A weather card displays full screen for portrait orientation.
You adjust your browser window size to reflect a landscape orientation.
The weather card dynamically updates and re-renders to the landscape orientation.


Resizable windows
If your device implementation of an Alexa client requires windows that might dynamically re-size based on user input or other changing device modalities, consider implementing resizable windows in the Smart Screen Web Application.
The sample applications support resizable windows, and any dynamic content, such as APL, that can adapt visual presentation based on size changes. These windows and dynamic content use the following systems:
- The three key mechanisms discussed in Use display windows for asserting and configuring windows.
- The CONTINUOUS WindowTemplate size configuration type.
- The displayMode RESIZABLE property of the Sample Web Application configuration.
- The Sample Web Application’s implementation of the alexa-smart-screen-window-manager component, and its corresponding API windowManager.updateDisplaySizeToWindows(width, height).
Configuration
The sample Resizable Desktop Application configuration provides you with an example implementation of an Alexa client, such as one you might find in a desktop application. In this configuration, users can change the window size within a defined range.
Visual characteristics window configuration:

{ "type": "AlexaInterface", "interface": "Alexa.Display.Window", "version": "1.0", "configurations": { "templates": [ { "id": "pcFullscreen", "type": "STANDARD", "configuration": { "sizes": [ { "type": "CONTINUOUS", "id": "pcFullscreenContinuous", "minimum": { "unit": "PIXEL", "value": { "width": 960, "height": 480 } }, "maximum": { "unit": "PIXEL", "value": { "width": 1920, "height": 1280 } } } ], "interactionModes": [ "pc_fullscreen" ] } } ] } }

Sample Web Application windows configuration:

"displayMode": "RESIZABLE", "windows": [ { "id": "pcFullscreen", "displayWindowConfig": { "templateId": "pcFullscreen", "configurations": { "landscape" : { "sizeConfigurationId": "pcFullscreenContinuous", "interactionMode": "pc_fullscreen" } } }, "aplRendererParameters": { "supportedExtensions": [ "aplext:backstack:10" ] }, "supportedInterfaces": [ "Alexa.Presentation.APL", "TemplateRuntime" ], "zOrderIndex": 0 } ]
Example usage
The following example shows the Resizable Desktop Application configuration, and the Sample Web Application open and in focus.
You press and hold the A key.
You: Where is the Golden Gate Bridge?
You release the A key.
Alexa: The Golden Gate Bridge is…
An informational card displays for the current size.
You adjust your browser window size to scale the window vertically.
The informational card dynamically updates and re-renders to the current size.


Debug mode
You can run the IPC Server Sample Application in debug mode to troubleshoot your device and run diagnostic tools. If the IPC Server Sample Application is running in debug mode, you see the following message when it first starts.
SDK Version 3.0.0
WARNING! THIS DEVICE HAS BEEN COMPILED IN DEBUG MODE.
RELEASING A PRODUCTION DEVICE IN DEBUG MODE MAY IMPACT DEVICE PERFORMANCE,
DOES NOT COMPLY WITH THE AVS SECURITY REQUIREMENTS, AND COULD RESULT IN SUSPENSION OR TERMINATION OF THE ALEXA SERVICE ON YOUR DEVICES.
Troubleshooting smart screen sample applications
Issue: Can't hear sound
Symptoms
You don't hear any sound.
Try this
Make sure that both your microphone and speakers are working.
Issue: Smart Screen Applications are unresponsive
Symptoms
Alexa isn't responding, the IPC Server Sample Application appears stuck, or error messages appear when you try to speak.
Try this
Refresh the browser window containing the Sample Web Application to reload the application and reconnect to the IPC server.
For more details about how to troubleshoot other common issues, see Troubleshooting AVS Device SDK Common Issues.
Related topics
- Get Started with Alexa Voice Service Device SDK
- AVS SDK IPC Server Sample Application Quick Start
- About the Alexa Smart Screen Web Components
- Alexa Smart Screen Web Components Quick Start Guide
Last updated: Nov 30, 2022