Understand the APL Lifecycle

Once you understand the APL architecture and its technical actors, you are ready to learn how the two main actors — the screen device and the Alexa skill — operate and interact within the lifespan of an APL experience. These actors can take three main types of action to facilitate the experience:

  1. Render new — The skill replaces the entire UI on screen by sending a new APL document template to the screen device. This is the first step the skill takes to start an APL experience, but it can repeat many times within one experience. For instance, when a user taps a button to open the skill's settings menu, the APL document sends a message to the skill on the cloud side. In response, the skill sends a new APL document template to the device, replacing the current UI with a settings page.
  2. Update remotely — Instead of replacing the entire UI on screen, the skill reuses the already rendered APL document and updates some of its elements from the cloud. For example, when a user asks by voice to open the skill's settings menu, the skill instructs the already rendered APL document to reveal a hidden visual component that shows a settings panel as an overlay.
  3. Update locally — A dynamic APL document on screen can also act on its own, without involving the skill at all. For example, when a user taps a button to open the skill's settings menu, the already rendered APL document reveals a hidden visual component that shows a settings panel as an overlay.
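The three action types map onto two skill-side directives of the `Alexa.Presentation.APL` interface, plus commands that run inside the document itself. The following is a minimal sketch of the payload shapes, written as plain Python dicts rather than real SDK calls; the token `"settingsToken"` and the component ID `"settingsPanel"` are hypothetical examples:

```python
# 1. Render new: the skill returns a RenderDocument directive that
#    replaces the entire UI with a new APL document template.
render_new = {
    "type": "Alexa.Presentation.APL.RenderDocument",
    "token": "settingsToken",
    "document": {
        "type": "APL",
        "version": "1.9",
        "mainTemplate": {"items": [{"type": "Text", "text": "Settings"}]},
    },
}

# 2. Update remotely: the skill returns an ExecuteCommands directive that
#    targets the document already on screen (the token must match the one
#    used to render it) and runs APL commands against it, for example
#    revealing a hidden settings panel.
update_remotely = {
    "type": "Alexa.Presentation.APL.ExecuteCommands",
    "token": "settingsToken",
    "commands": [
        {"type": "SetValue", "componentId": "settingsPanel",
         "property": "display", "value": "normal"},
    ],
}

# 3. Update locally: the same SetValue command, attached to a component's
#    onPress handler, runs entirely on the device with no cloud round trip.
update_locally = {
    "type": "TouchWrapper",
    "onPress": [
        {"type": "SetValue", "componentId": "settingsPanel",
         "property": "display", "value": "normal"},
    ],
}
```

Note that the remote and local updates run the exact same `SetValue` command; the only difference is whether the command originates from the cloud or from the on-device document.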

A single step in an APL lifecycle can be illustrated as follows:

APL lifecycle actors decide between rendering or updating APL documents.

For example, to handle an incoming UI request such as a button press, the rendered APL document on a screen device can choose between its two options. It can send an event to the skill, which then decides whether to update the current APL document or render an entirely new one. Or the document can take immediate action, updating its own UI, with or without also sending an event to the skill's cloud endpoint.
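The cloud-side half of this path can be sketched as follows. A `SendEvent` command in the on-screen document produces an `Alexa.Presentation.APL.UserEvent` request at the skill's endpoint, and the skill then picks a directive in response. The argument `"openSettings"`, the tokens, and the simplified request envelope are all hypothetical, and the handler is a sketch rather than real ASK SDK code:

```python
# Attached to a button's onPress, SendEvent forwards its arguments
# to the skill's cloud endpoint.
send_event_command = {"type": "SendEvent", "arguments": ["openSettings"]}

# Simplified shape of the UserEvent request the skill then receives.
incoming_request = {
    "request": {
        "type": "Alexa.Presentation.APL.UserEvent",
        "token": "settingsToken",
        "arguments": ["openSettings"],
    }
}

def handle_user_event(request_envelope):
    """Decide between updating the rendered document and rendering a new one."""
    req = request_envelope["request"]
    if req["arguments"] == ["openSettings"]:
        # Cheaper path: command the document that is already on screen.
        return {
            "type": "Alexa.Presentation.APL.ExecuteCommands",
            "token": req["token"],
            "commands": [{"type": "SetValue", "componentId": "settingsPanel",
                          "property": "display", "value": "normal"}],
        }
    # Fallback: replace the UI with a new template (document body omitted here).
    return {"type": "Alexa.Presentation.APL.RenderDocument", "token": "newToken"}

directive = handle_user_event(incoming_request)
```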

For voice requests, the skill decides between sending instructions to the device that change the already rendered UI, or replacing that UI entirely by sending a new APL document template back to the device.
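One way to frame that decision in code: if the document that can satisfy the request is already on screen (its token matches the one the skill expects), send commands to it; otherwise render a fresh template. This helper and its token values are hypothetical, a sketch of the decision rather than an official API:

```python
def choose_directive(on_screen_token, needed_token, commands, document):
    """Update the rendered document when it is the one we need,
    otherwise replace the UI with a new template."""
    if on_screen_token == needed_token:
        return {"type": "Alexa.Presentation.APL.ExecuteCommands",
                "token": on_screen_token, "commands": commands}
    return {"type": "Alexa.Presentation.APL.RenderDocument",
            "token": needed_token, "document": document}

# The home document is already on screen, so a command suffices.
same_doc = choose_directive(
    "homeToken", "homeToken",
    commands=[{"type": "SetValue", "componentId": "settingsPanel",
               "property": "display", "value": "normal"}],
    document=None)

# A different document is needed, so the UI is replaced.
new_doc = choose_directive(
    "homeToken", "settingsToken",
    commands=[],
    document={"type": "APL", "version": "1.9", "mainTemplate": {"items": []}})
```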

The main reason to consider the APL document lifecycle, and to understand the different choices you can make when handling incoming user requests, is the significant impact these decisions have on UX performance.

Further reading

  • For details about the technical work behind rendering a new APL document on an Alexa screen device, and how to reduce the latency associated with this process, see Optimize APL Document Rendering.
  • For details about the most effective alternative to rendering new APL documents — updating an already rendered APL document that is visible on screen — see Optimize APL Document Updates.
  • For a methodology you can use to decide when to render a new APL document versus update an already rendered one, see Optimize APL Lifecycles.