Editor’s Note: The Alexa Presentation Language became Generally Available on September 25, 2019.
Today we’re excited to announce the next version of Alexa Presentation Language (APL) with support for animations, vector graphics, better tooling, and a design system that makes APL skill development for multiple viewport profiles faster. APL 1.1 is available to Alexa developers in all locales and can be used on Echo Show, Echo Spot, and the new Echo Show 5 right now. In the coming days, we will also roll it out to additional devices, including Fire TVs and Fire tablets. To detect whether APL 1.1 is supported on a device, inspect the supportedInterfaces object in your skill request and look for the maxVersion inside the Alexa.Presentation.APL object.
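For example, a request from a device that already supports APL 1.1 includes an interface entry along these lines. This is a trimmed sketch of the request context, keeping only the fields relevant to the version check; the nesting shown (maxVersion under a runtime object) follows the APL interface as we understand it, so confirm it against the request reference documentation:

"context": {
  "System": {
    "device": {
      "supportedInterfaces": {
        "Alexa.Presentation.APL": {
          "runtime": { "maxVersion": "1.1" }
        }
      }
    }
  }
}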
APL 1.1 comes with a new AnimateItem command. With this command, you can animate the position, scale, rotation, or opacity of any APL component or layout. You can also combine AnimateItem with Alexa Vector Graphics (a subset of the Scalable Vector Graphics standard) to create brand new experiences that will help you keep your customers engaged.
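As a quick illustration, here is a minimal AnimateItem command that fades a component in while sliding it into place. The componentId is a placeholder for whichever component you want to animate; the full skill response later in this post shows commands like this in context:

{
  "type": "AnimateItem",
  "componentId": "myComponentId",
  "duration": 1000,
  "easing": "ease-in-out",
  "value": [
    { "property": "opacity", "from": 0, "to": 1 },
    {
      "property": "transform",
      "from": [ { "translateX": "100vw" } ],
      "to": [ { "translateX": 0 } ]
    }
  ]
}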
In the animation below, the light bulb was created using APL vector graphics, and all the animations were created using the new AnimateItem command. Both animations and vector graphics are fully supported in the skill test simulator built into the developer console, which you can use to test your APL 1.1-based experiences. Here’s an example of what an animated vector graphic and some text look like on an Echo Show 5.
Below is the skill response that was used to create that example. Notice how we use the new environment.aplVersion property to detect support for APL 1.1 before using the new features. On devices that are still on APL 1.0, this APL document will show static text that says “Welcome to APL 1.0”. On devices that support APL 1.1, customers will see the animation shown above. The light bulb icon is a vector graphic that is expressed using the new Alexa Vector Graphics (AVG) component. AVG is a subset of the Scalable Vector Graphics (SVG) specification. We animate the light bulb and some text using a number of AnimateItem commands that are run in parallel using the Parallel APL command.
{ "directives": [ { "type": "Alexa.Presentation.APL.RenderDocument", "token": "lightbulb", "document": { "type": "APL", "version": "1.1", "theme": "auto", "graphics": { "lightbulb": { "type": "AVG", "version": "1.0", "height": 48.0, "width": 48.0, "parameters": [{ "default": "white", "type": "color", "name": "fillColor" }], "items": [{ "type": "path", "fill": "${fillColor}", "stroke": "pink", "strokeWidth": 1, "pathData": "M15.001,15c-0.032,0-0.064-0.001-0.096-0.004c-0.55-0.053-0.953-0.541-0.9-1.091c0.449-4.682,4.189-8.425,8.895-8.9 c0.56-0.052,1.041,0.346,1.096,0.895c0.055,0.55-0.345,1.04-0.895,1.096c-3.759,0.379-6.747,3.365-7.105,7.1 C15.946,14.613,15.51,15,15.001,15z M30.5,43c0-0.553-0.448-1-1-1h-11c-0.552,0-1,0.447-1,1s0.448,1,1,1h11 C30.052,44,30.5,43.553,30.5,43z M30.5,47c0-0.553-0.448-1-1-1h-11c-0.552,0-1,0.447-1,1s0.448,1,1,1h11 C30.052,48,30.5,47.553,30.5,47z M37.906,19.845c-0.54,1.272-1.177,2.482-1.821,3.652c-0.432,0.784-0.879,1.594-1.272,2.4 c-1.158,2.374-1.962,4.901-2.389,7.512c-0.041,0.251-0.075,0.513-0.11,0.779C32.074,35.988,31.676,39,28.304,39h-8.607 c-3.372,0-3.769-3.012-4.007-4.812c-0.036-0.267-0.068-0.528-0.109-0.779c-0.427-2.611-1.227-5.139-2.385-7.513 c-0.394-0.805-0.833-1.616-1.265-2.399c-0.645-1.17-1.296-2.38-1.836-3.652c-1.772-4.171-1.285-8.833,1.284-12.788 C13.964,3.076,18,0.457,23,0.059V0.008L23.938,0l0.006,0.003L25,0v0.059c5,0.398,9.036,3.017,11.622,6.998 C39.191,11.012,39.678,15.673,37.906,19.845z M31.781,28H16.219c0.589,1.651,1.047,3.348,1.331,5.087 c0.044,0.271,0.082,0.553,0.12,0.839C17.98,36.271,18.3,37,19.696,37h8.607c1.396,0,1.716-0.729,2.026-3.074 c0.038-0.287,0.075-0.568,0.12-0.839C30.734,31.348,31.191,29.651,31.781,28z M35.003,8.146C32.592,4.434,28.414,2.082,24.1,2.008 l-0.118-0.002L23.9,2.008c-4.314,0.074-8.492,2.426-10.903,6.138c-2.201,3.389-2.599,7.368-1.091,10.917 c0.5,1.179,1.113,2.291,1.762,3.469c0.443,0.805,0.902,1.638,1.317,2.487c0.157,0.322,0.29,0.654,0.435,0.981h17.16 c0.145-0.327,0.278-0.659,0.435-0.98c0.416-0.85,0.874-1.683,1.317-2.488c0.649-1.178,1.262-2.29,1.762-3.468 C37.602,15.514,37.204,11.535,35.003,8.146z" }] } }, "import": [{ "name": "alexa-layouts", "version": "1.0.0" }], "mainTemplate": { "parameters": [ "payload" ], "item": { "type": "Frame", "width": "100%", "height": "100%", "backgroundColor": "black", "item": { "type": "Container", "width": "100vw", "height": "100vh", "items": [ { "type": "VectorGraphic", "when": "${environment.aplVersion == '1.1'}", "id": "imageId1", "source": "lightbulb", "fillColor": "yellow", "position": "absolute", "width": "30vw", "height": "30vw", "left": "10vw", "top": "30vh", "opacity": 0 }, { "type": "Text", "text": "Welcome", "color": "teal", "textAlign": "center", "fontSize": 38, "id": "textId1", "opacity": "${environment.aplVersion == '1.1' ? 0 : 1}" }, { "type": "Text", "top": "20dp", "text": "to", "color": "grey", "textAlign": "center", "fontSize": 50, "id": "textId2", "opacity": "${environment.aplVersion == '1.1' ? 0 : 1}" }, { "type": "Text", "text": "${environment.aplVersion == '1.1' ? 'APL 1.1' : 'APL 1.0'}", "color": "crimson", "textAlign": "center", "fontSize": 92, "id": "textId3", "opacity": "${environment.aplVersion == '1.1' ? 
0 : 1}" } ] } } } }, "datasources": {} }, { "type": "Alexa.Presentation.APL.ExecuteCommands", "token": "lightbulb", "commands": [{ "type": "Parallel", "when": "${environment.aplVersion == '1.1'}", "commands": [ { "type": "Sequential", "commands": [{ "type": "AnimateItem", "easing": "ease-in-out", "duration": 6000, "componentId": "imageId1", "value": [{ "property": "opacity", "to": 1 }, { "property": "transform", "from": [{ "translateX": "100vw" }, { "rotate": 720 } ], "to": [{ "translateX": 0 }, { "rotate": 0 } ] } ] }, { "type": "AnimateItem", "easing": "ease-in-out", "duration": 6000, "componentId": "imageId1", "value": [{ "property": "opacity", "to": 1 }, { "property": "transform", "from": [{ "translateX": 0 }, { "scale": 1 } ], "to": [{ "translateX": "25vw" }, { "scale": 2.5 } ] } ] }, { "type": "Parallel", "commands": [{ "type": "AnimateItem", "easing": "ease-in-out", "duration": 4000, "componentId": "textId1", "value": [{ "property": "opacity", "to": 1 }, { "property": "transform", "from": [{ "translateX": "100vw" } ], "to": [{ "translateX": 0 } ] } ] }, { "type": "AnimateItem", "easing": "ease-in-out", "duration": 6000, "componentId": "textId2", "value": [{ "property": "opacity", "to": 1 }, { "property": "transform", "from": [{ "translateX": "-100vw" } ], "to": [{ "translateX": "0vw" } ] } ] }, { "type": "AnimateItem", "easing": "ease-in-out", "duration": 8000, "componentId": "textId3", "value": [{ "property": "opacity", "to": 1 }, { "property": "transform", "from": [{ "translateX": "100vw" }], "to": [{ "translateX": 0 }] } ] } ] } ] } ] }] } ] }
Alexa-enabled devices with screens come in different sizes and shapes. As the number of touch points with customers increases across a diverse line of Alexa-enabled devices with screens, we are introducing an updated Alexa Design System to help you create APL skills faster and reach more customers. The design system accelerates your design work, allowing you to create visual experiences for all devices while designing only a few.
For example, if you import alexa-styles into your APL document and use the style textStyleBody to style your text, the size of the text will automatically adapt based on the viewing distance typical for a device. These sizes came out of user experience research we conducted to identify appropriate base sizes for Alexa devices. The animation below shows how the Alexa Design System styles use a larger base size on TVs so that, despite the longer viewing distance, customers can experience text and other visuals at about the same size as they would appear on other devices.
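A minimal sketch of that pattern looks like the document below: it imports the alexa-styles package and applies textStyleBody to a Text component. The package version shown here is an assumption, so check the Alexa Design System documentation for the current one:

{
  "type": "APL",
  "version": "1.1",
  "import": [
    { "name": "alexa-styles", "version": "1.0.0" }
  ],
  "mainTemplate": {
    "items": [
      {
        "type": "Text",
        "style": "textStyleBody",
        "text": "This text scales with the viewport profile."
      }
    ]
  }
}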
In addition to resources and styles, the Alexa Design System for APL comes with new responsive components and templates you can use in your APL documents. These responsive components and templates also make it easy to build for multiple devices by responding to device mode, size, and shape. Examples of responsive components include AlexaButton, a touchable/selectable element with the states needed for touch and TV devices, and AlexaImage, which presents an image; a sketch of AlexaButton follows below.
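As a rough sketch, an AlexaButton from the alexa-layouts package can be declared like this. The buttonText and primaryAction property names reflect the responsive component documentation as we understand it, so verify the exact names and package version before relying on them:

{
  "type": "AlexaButton",
  "id": "orderButton",
  "buttonText": "Order now",
  "primaryAction": {
    "type": "SendEvent",
    "arguments": [ "orderButtonPressed" ]
  }
}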
Responsive templates combine responsive components and primitive APL UI elements (such as ScrollView and Pager) to present a standalone pattern that takes up the entire viewport. Examples of responsive templates include AlexaTextList, which can be used to display a scrolling list of text items, and AlexaBackground, which can be used to render splash screens and backgrounds; a sketch of AlexaTextList follows below. A list of responsive components, templates, and other elements of the Alexa Design System can be found here.
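To illustrate, here is a minimal document that renders an AlexaTextList. The property names (headerTitle, listItems, primaryText) and the alexa-layouts version shown follow the responsive templates documentation as we understand it; treat them as assumptions to confirm:

{
  "type": "APL",
  "version": "1.1",
  "import": [
    { "name": "alexa-layouts", "version": "1.0.0" }
  ],
  "mainTemplate": {
    "items": [
      {
        "type": "AlexaTextList",
        "headerTitle": "Today's specials",
        "listItems": [
          { "primaryText": "Tomato soup" },
          { "primaryText": "Grilled cheese" },
          { "primaryText": "Iced tea" }
        ]
      }
    ]
  }
}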
We plan to continue releasing new responsive components/templates and add support for new viewport profiles as we and our partners release new Alexa devices in the future. To help you build APL documents that support current and future Alexa devices, we have a new guide that shows how to build responsive APL documents using the Alexa Design System here.
With APL 1.1, we have deepened our investment in the APL authoring experience, available here. You can now select elements in the design surface and see them highlighted in the component hierarchy. This makes it easier to associate what you see in the design view with the actual component and update its properties.
In the coming weeks, we plan to release another update to enable dragging and dropping of APL components in the APL authoring experience. Finally, the APL 1.1 authoring tool shares rendering, data binding, and expression evaluation code with the runtime that runs on Alexa devices. This has significantly lowered the number of discrepancies between the authoring experience and what you see on actual devices.
The APL authoring experience and skill test simulator add a new viewport profile called Small Landscape Hub to support the newest addition to the Alexa device family, the Echo Show 5. To make sure customers can still experience your skills while you optimize them for Echo Show 5, the new device will automatically scale your APL responses to fit on its screen. We strongly recommend you optimize your APL for the Echo Show 5. To learn more, see this blog post.
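If you want to target that profile explicitly in a document, one approach is a when clause keyed to the viewport profile resources. The sketch below assumes the alexa-viewport-profiles package and its @hubLandscapeSmall resource cover the Echo Show 5; confirm the package name, version, and resource name against the documentation:

{
  "type": "APL",
  "version": "1.1",
  "import": [
    { "name": "alexa-viewport-profiles", "version": "1.0.0" }
  ],
  "mainTemplate": {
    "items": [
      {
        "type": "Text",
        "when": "${@viewportProfile == @hubLandscapeSmall}",
        "text": "Layout tuned for the Small Landscape Hub profile"
      },
      {
        "type": "Text",
        "text": "Default layout for other viewport profiles"
      }
    ]
  }
}

Because mainTemplate inflates the first item whose when clause evaluates to true, the first Text component is used only on the Small Landscape Hub profile and the second acts as the fallback.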
Once you are done optimizing and have tested your skill with the Small Landscape Hub profile in the simulator or on an Echo Show 5, you can check the box shown below to opt out of the automatic scaling. Next, submit your skill for certification and we will go to work certifying it on the Echo Show 5 for you. If your skill was certified before 3-July-2019, you will need to resubmit it after checking the box, even if you have already optimized your visuals.
With the launch of APL 1.1, we are also expanding the set of transformers that you can use with the SpeakItem and SpeakList commands to have Alexa highlight and speak text blocks and lists of items. Specifically, we are adding a new textToSpeech transformer, which transforms plain text into speech. Starting today, the existing ssmlToSpeech transformer will only accept valid Speech Synthesis Markup Language (SSML) inputs. Any inputs to this transformer must be wrapped in <speak></speak> tags and be valid XML. In addition, the Progressive Response API will also accept only valid SSML as input. If your skill uses invalid SSML or raw text, you will receive an error from the API.
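For example, a datasource along the following lines pairs a plain-text property with the new textToSpeech transformer and an SSML property with ssmlToSpeech. The datasource and property names here are placeholders for this sketch:

"datasources": {
  "bodyTemplateData": {
    "type": "object",
    "properties": {
      "welcomeText": "Welcome to APL 1.1",
      "welcomeSsml": "<speak>Welcome to <emphasis level=\"strong\">APL 1.1</emphasis></speak>"
    },
    "transformers": [
      {
        "inputPath": "welcomeText",
        "outputName": "welcomeTextSpeech",
        "transformer": "textToSpeech"
      },
      {
        "inputPath": "welcomeSsml",
        "outputName": "welcomeSsmlSpeech",
        "transformer": "ssmlToSpeech"
      }
    ]
  }
}

The transformer outputs can then be bound to a component's speech property (for example, "${payload.bodyTemplateData.properties.welcomeTextSpeech}") and played back with SpeakItem.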
If you run into any issues or have questions, tweet me at @aruntalkstech and I’d be happy to take a look, or share your feedback on the Alexa developer forum. Make sure you add the "APL" topic to your post.
We can’t wait to see what you build with the new APL 1.1 features!