Announcing Alexa Presentation Language (APL) 2023.2

Mrudula Singampalli Jun 30, 2023

We’re pleased to announce Alexa Presentation Language (APL) 2023.2, which includes several new features that enable better consistency across voice and touch, improved localization, and even more dynamic visual UIs. Enhancements in this release include dynamic entities that give you access to up-to-date UI data right from skill requests, vector graphics with dynamic sources that effortlessly reflect state or context changes, absolute seeking for video playback, and deferred evaluation. We are also excited to announce tools that boost developer productivity, including linting, intelligent suggestions, and our handy, easy-to-digest APL cheat sheets. Additional details about each of these enhancements are included below.

Note: As of June 30, 2023, the new features have been released to all Echo Show family devices excluding Echo Show 1st Gen, Echo Show 2nd Gen, and Echo Spot. APL 2023.2 will be released to other devices soon.

Update: As of Aug 23, 2023, the new features have been released to all Echo Show family devices, including Echo Show 1st Gen, Echo Show 2nd Gen, and Echo Spot.

Do more with the dynamic entities property on components

The entities property of APL components is now dynamic. Your Alexa skill can now be aware of the latest state of your APL document, consistently across voice and touch interactions. Dynamic entities now reflect changes of assigned data variables, delivering the most current state in the visual context of Alexa skill requests during an ongoing APL presentation session. 

For instance, consider the example of a Container with an entities property assigned with the elapsedTime variable. With dynamic entities, the visual context always reflects the current value of elapsedTime, allowing Alexa skills to act upon the most recent state of your APL document. Previously, elapsedTime in the visual context would have always shown 0 as the initial elapsed time value during APL document inflation.

{
    "type": "Container",
    "entities": [
        {
            "id": "elapsedTime",
            "type": "timestamp",
            "value": "${elapsedTime}"
        }
    ]
}

 

Entities property values are available under context > Alexa.Presentation.APL > componentsVisibleOnScreen in Alexa skill requests, but only for the APL components that are visible on the device screen when the user takes an action. Here is how an example skill request would look for the example above:

{
    "version": "1.0",
    "session": { ... },
    "context": {
        "Alexa.Presentation.APL": {
            "componentsVisibleOnScreen": [
                {
                    "position": "384x300+64+133:1",
                    "type": "mixed",
                    "uid": ":6548",
                    "children": [],
                    "entities": [
                        {
                            "id": "elapsedTime",
                            "type": "timestamp",
                            "value": 21290
                        }
                    ]
                }
            ]
        },
        ...
    },
    "request": { ... }
}
 
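On the skill side, reading these values amounts to walking the componentsVisibleOnScreen tree in the request. Here is a minimal sketch in Python, assuming the request arrives as a parsed JSON dictionary; the helper name `get_entity_values` is illustrative, not part of any SDK:

```python
def get_entity_values(request_envelope, entity_id):
    """Collect the current values of all visible entities matching entity_id."""
    apl_context = request_envelope.get("context", {}).get("Alexa.Presentation.APL", {})
    values = []
    # Walk the component tree, since entities may sit on nested children.
    stack = list(apl_context.get("componentsVisibleOnScreen", []))
    while stack:
        component = stack.pop()
        for entity in component.get("entities", []):
            if entity.get("id") == entity_id:
                values.append(entity.get("value"))
        stack.extend(component.get("children", []))
    return values

# Example request trimmed to the fields used above:
request = {
    "context": {
        "Alexa.Presentation.APL": {
            "componentsVisibleOnScreen": [
                {
                    "type": "mixed",
                    "children": [],
                    "entities": [
                        {"id": "elapsedTime", "type": "timestamp", "value": 21290}
                    ],
                }
            ]
        }
    }
}

print(get_entity_values(request, "elapsedTime"))  # [21290]
```

Because entities are now dynamic, the value your handler reads here reflects the document state at the moment the user acted, not the value at inflation time.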

Simplify state changes on your vector graphics

The source property of the VectorGraphic component is now dynamic. This means you can assign a dynamic value or expression to a vector graphic’s source, which is then reloaded whenever a change to the source name or URL is detected. This allows you to conveniently swap vector graphics to reflect state or context changes, such as the displayed charging level of a battery:

{
    "type": "VectorGraphic",
    "scale": "best-fill",
    "fillColor": "white",
    "bind": [
        {
            "name": "batteryLevel",
            "value": 10
        }
    ],
    "source": "https://arl.assets.apl-alexa.com/packages/alexa-icon/1.0.0/icons/ic_battery_${batteryLevel}.json",
    "onPress": [
        {
            "type": "SetValue",
            "property": "batteryLevel",
            "value": "${Math.min(100, batteryLevel + 10)}"
        }
    ]
}

 

This update also applies to the iconName property of the AlexaIcon responsive layout component provided with the alexa-icon import package, as this layout uses the VectorGraphic component internally.

{
  "type": "APL",
  "version": "2023.2",
  "import": [
    {
      "name": "alexa-icon",
      "version": "1.0.0"
    }
  ],
  "mainTemplate": {
    "bind": [
      {
        "name": "batteryLevel",
        "value": 10
      }
    ],
    "items": [
      {
        "type": "AlexaIcon",
        "width": 100,
        "height": 100,
        "iconColor": "white",
        "iconName": "ic_battery_${batteryLevel}",
        "onPress": [
          {
            "type": "SetValue",
            "componentId": ":root",
            "property": "batteryLevel",
            "value": "${Math.min(100, batteryLevel + 10)}"
          }
        ]
      }
    ]
  }
}
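The SetValue expression above clamps the level at 100, so repeated presses walk through the battery icons and then stop. As a quick sanity check of that arithmetic, here is the same update rule sketched in Python (the repeated-press loop is purely illustrative):

```python
def next_battery_level(level):
    # Mirrors the APL expression ${Math.min(100, batteryLevel + 10)}.
    return min(100, level + 10)

# Pressing repeatedly walks the icon name through ic_battery_10 ... ic_battery_100,
# and the source stays at ic_battery_100 once the level is clamped.
level = 10
names = []
for _ in range(11):
    names.append(f"ic_battery_{level}")
    level = next_battery_level(level)

print(names[0], names[-1])  # ic_battery_10 ic_battery_100
```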

 

Streamline media playback control with absolute seeking 

APL’s ControlMedia command can now seek to an absolute playback position in the Video component with the new seekTo option, minimizing the need to calculate relative timestamps and making media control more intuitive. seekTo complements the existing seek option, which sets the playback position relative to the current play timestamp.

For example, suppose you want to seek to a specific time in a video that is 10s long. Previously, you would need to calculate a time offset based on the current playback position. If the current playback position is 2s and you want to seek to the 5s mark, you calculate the offset 5 - 2 = 3s, a relative offset of 3000ms:

{
  "type": "ControlMedia",
  "componentId": "MyVideoPlayer",
  "command": "seek",
  "value": 3000
}

 

You can now use seekTo to seek directly to 5000ms:

{
  "type": "ControlMedia",
  "componentId": "MyVideoPlayer",
  "command": "seekTo",
  "value": 5000
}

 

Both media control options also preserve the current playback state: seeking during ongoing playback continues playing from the new position without needing another play command, and playback that was paused before seeking remains paused at the new timestamp.
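The difference between the two commands comes down to one line of arithmetic. A minimal Python sketch of the values you would pass in each case (the function names are illustrative, not part of the APL API):

```python
def relative_seek_offset(current_ms, target_ms):
    # Offset you would pass as "value" to the legacy "seek" command.
    return target_ms - current_ms

def seek_to_value(target_ms):
    # Value you now pass directly as "value" to "seekTo".
    return target_ms

# Current position 2s, target the 5s mark:
print(relative_seek_offset(2000, 5000))  # 3000
print(seek_to_value(5000))               # 5000
```

With seekTo, your skill no longer needs to track the current playback position just to land on a known timestamp.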

Achieve optimal data-binding through deferred evaluation 

With the new deferred evaluation feature in APL, you can now run the data-binding algorithm at a later point in your document inflation lifecycle, which improves localization in skills, among other use cases.

Data-binding allows you to incorporate user-provided data and styles into your documents. However, it is sometimes necessary to defer evaluation of these resources because values such as data sources or bind variables are not yet available. For example, previously, when building multi-locale skills, developers had to write complex code to handle differences in language or regional variations in phrases:

{
  "type": "APL",
  "version": "2023.2",
  "resources": [
    {
      "strings": {
        "preNameString": "The name is",
        "postNameString": ""
      }
    },
    {
      "when": "${environment.lang == 'hi-IN'}",
      "strings": {
        "preNameString": "",
        "postNameString": "मेरा नाम"
      }
    }
  ],
  "mainTemplate": {
    "items": [
      {
        "bind": [
          {
            "name": "userName",
            "value": "John Doe"
          }
        ],
        "type": "Text",
        "text": "${@preNameString} ${userName} ${@postNameString}"
      }
    ]
  }
}

 

With deferred evaluation, all you need is the #{} expression to tell APL that the expression is a placeholder to be evaluated later. For example, in the following code snippet, we use the #{userName} expression in NAME_STRING. When the Text component mounts, the eval function runs the data-binding expression to produce the intended text.

{
  "type": "APL",
  "version": "2023.2",
  "resources": [
    {
      "strings": {
        "NAME_STRING": "The name is #{userName}"
      }
    },
    {
      "when": "${environment.lang == 'hi-IN'}",
      "strings": {
        "NAME_STRING": "#{userName} मेरा नाम"
      }
    }
  ],
  "mainTemplate": {
    "items": [{
      "type": "Text",
      "bind": {
        "name": "userName",
        "value": "John Doe"
      },
      "text": "${eval(@NAME_STRING)}"
    }]
  }
}
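The mechanics can be mimicked outside APL. The sketch below is only a conceptual analogy in Python, not APL’s actual implementation: locale-specific resource selection happens first while the #{} placeholders are kept verbatim, and substitution happens later, once the bound value is known.

```python
import re

# Locale-specific resource strings with #{...} deferred placeholders,
# mirroring the NAME_STRING resource above.
RESOURCES = {
    "en-US": "The name is #{userName}",
    "hi-IN": "#{userName} मेरा नाम",
}

def pick_resource(lang):
    # Resource selection runs first; placeholders stay unevaluated.
    return RESOURCES.get(lang, RESOURCES["en-US"])

def evaluate(template, bindings):
    # Later, evaluation substitutes the now-available bind values,
    # playing the role of APL's eval() over a #{} placeholder.
    return re.sub(r"#\{(\w+)\}", lambda m: str(bindings[m.group(1)]), template)

template = pick_resource("en-US")
print(evaluate(template, {"userName": "John Doe"}))  # The name is John Doe
```

The key point the analogy captures is the two-phase lifecycle: the locale decides the template shape up front, and the data-binding pass fills it in when the component mounts.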

 

As with previous releases, to use the features announced above, set the version property of your APL documents to 2023.2 and use 2023.2 in any conditional statements in your documents. If you choose not to update, your existing experiences will continue to work as before, but you will not be able to use the new features. More information on versioning is available on our “What’s New” page.

Author easily with powerful new features and integrations

We are happy to announce updates to the APL authoring tool, including auto-complete and linting features that enhance your APL authoring experience from the outset.

Catch and correct syntactical issues in APL documents

Eliminate syntactical errors in your APL documents with the new linting feature. The linter checks for misspelled, missing, or extraneous APL properties and more. It locates and flags syntax errors inline, provides descriptive error messages, and links to the relevant technical documentation for remedial steps. With the linter at your disposal, you can quickly find and correct syntactical issues in your APL documents while authoring them.

Accelerate APL authoring with intelligent suggestions

The authoring tool now supports automatic completion of APL language constructs, suggesting components, commands, event handlers, and other APL properties as you type. This saves you time and effort by reducing the need to refer to the APL technical documentation, and helps you author your APL documents accurately, without syntactical errors.

Unlock your APL potential with time-saving cheat sheets

As an APL developer, have you ever found yourself struggling to remember the syntax or commands for a specific component? Do you wish you had a handy reference guide to help you recall important information about APL while working on Alexa skills or widgets? Look no further because our APL cheat sheets are here to help!

We are thrilled to announce the release of our APL cheat sheets that present the most relevant information about the APL specification and technical documentation in a concise and easy-to-digest format. These cheat sheets serve as a quick reference guide that can improve your productivity and reduce the time it takes to recall important information about APL components and commands, data context, and expression syntax.

Our printable PDF cheat sheets can be accessed from anywhere, whether you print them out to keep handy at your desk or use them directly from your computer. Don’t miss the opportunity to streamline your APL development process: download our cheat sheets today! Visit our website to learn more: APL Cheat Sheets

To use the APL cheat sheets efficiently, we recommend that you be familiar with the basic concepts and structure of an APL document. If you’re new to APL, our technical documentation is a better starting point.

All of these features have been made possible through your continued partnership and feedback. You can learn about all the new features, general updates to APL, and more in our technical documentation. If you need help, come find us and other multimodal developers on Alexa Community Slack or on Twitter (@austinvach, @smrudula, @pkarthikr). We look forward to seeing what you build!
