Voice technology is poised to deliver truly ambient experiences—that is, experiences where a customer isn’t tethered to a single device or even a single voice service. As passionate as we are about Amazon’s Alexa voice technology, we see a world with plenty of room for other voice services to succeed. That means customers will have the freedom to choose the voice service or services that provide the capabilities they want in ways best suited to their individual needs. We think that’s a win for everyone.
For everyone to win, makers of voice-enabled devices, from smart speakers to smart home appliances to car infotainment systems and beyond, can create multi-agent experiences that give customers the choice and flexibility to use their preferred voice assistant on a given device, easily and seamlessly. Let's explore a few of the reasons and design considerations for creating multi-agent experiences for our customers.
Ambient computing experiences shine in environments where you’re occupied with multiple tasks. Using voice technology, ambient computing can strip away complexity and minimize distractions to make those other tasks more productive and entertaining—whether you are working, cooking, or even navigating traffic in your car.
We’re committed to letting customers use available voice agents simultaneously via dedicated wake words. In addition, these agents should cooperate to provide a delightful experience for customers. The fundamental principle of Amazon’s Voice Interoperability Initiative (VII) is that a customer using voice-enabled devices like smart speakers, screens, and even automobiles should be able to choose which voice services to use. Further, customers should have the freedom to use multiple voice agents simultaneously, selecting their preferred voice service for each interaction. Customers can already experience devices that support multiple voice agents, including Facebook Portal, the Garmin DriveSmart 65, the DT Magenta speaker, LG 2020 and 2021 TVs, and the Vodafone ES smart speaker.
Also, don’t forget to enable universal device commands (UDCs). UDCs are commands a customer can use with a compatible agent to control available device functions, regardless of which agent was used to initiate the action. Some UDCs are device-global and can be handled by each agent on its own, such as increasing or decreasing the device’s audio volume. Cross-agent commands, by contrast, often require the device to share state information so an agent can interpret the request correctly, for example, stopping the alarm on a timer initiated by another agent. UDCs enable customers to interact naturally and instinctively with multi-agent devices.
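To make the cross-agent case concrete, here is a minimal, hypothetical sketch, not a real VII or AVS API, of how a device might keep shared state so that a stop command heard by either agent can act on an alarm the other agent set:

```python
# Hypothetical device-side sketch (not a real VII or AVS API): the device keeps
# shared state about which agent owns the active alarm, so a "stop" command
# heard by either agent can act on it. All names here are illustrative.
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class SharedDeviceState:
    """State the device exposes to every agent it hosts."""
    volume: int = 50
    active_alarm_owner: Optional[str] = None  # e.g. "alexa" or "brand_assistant"


@dataclass
class MultiAgentDevice:
    state: SharedDeviceState = field(default_factory=SharedDeviceState)

    def handle_udc(self, heard_by: str, command: str) -> str:
        # Device-global command: each agent can fulfill this on its own.
        if command == "volume_up":
            self.state.volume = min(self.state.volume + 10, 100)
            return f"{heard_by} raised volume to {self.state.volume}"

        # Cross-agent command: needs shared state to know which agent set the alarm.
        if command == "stop_alarm":
            owner = self.state.active_alarm_owner
            if owner is None:
                return f"{heard_by}: no alarm is ringing"
            self.state.active_alarm_owner = None
            return f"{heard_by} stopped the alarm set by {owner}"

        return f"{heard_by}: unsupported command {command!r}"


device = MultiAgentDevice()
device.state.active_alarm_owner = "alexa"                   # alarm set via Alexa
print(device.handle_udc("brand_assistant", "stop_alarm"))   # the other agent stops it
```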
As we move closer to an ambient future, supporting simultaneous agents on voice-enabled products is just one way device makers can build a more satisfying experience for customers. Amazon also offers the Alexa Custom Assistant (ACA), a solution that lets companies create intelligent assistants that are built on Alexa technology and work in cooperation with Alexa.
Device makers and service providers can use ACA to build intelligent assistants into any capable device, including automobiles and consumer electronics like smart displays, speakers, set-top boxes, home appliances, fitness devices, and more. ACA provides these companies a comprehensive, managed voice solution that reduces the cost and complexity of building one from the ground up. And for their customers, ACA can provide a familiar yet unique experience for setup and operation of the device. Custom skills can be used to control device functions, such as increasing the incline on a treadmill, changing the channel on a set-top box, or starting a robot vacuum. They can also automate and scale customer interactions, such as providing troubleshooting guidance and helping customers learn more about device capabilities. In fact, a number of big-brand companies have already partnered with us to integrate ACA into their products, including FCA/Stellantis (Amazon’s first ACA integration partner), Qualcomm (with its Snapdragon Automotive Cockpit Platform), and Garmin (a Tier 1 supplier of vehicle infotainment systems).
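As a rough illustration of the kind of skill-based device control described above, here is a minimal sketch using the ASK SDK for Python. The SetInclineIntent name, its Incline slot, and the set_treadmill_incline helper are hypothetical, and a production ACA integration involves far more than a single handler:

```python
# Minimal sketch with the ASK SDK for Python. "SetInclineIntent", the "Incline"
# slot, and set_treadmill_incline() are hypothetical names for illustration;
# a real integration defines its own interaction model and device channel.
from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.handler_input import HandlerInput
from ask_sdk_core.utils import is_intent_name
from ask_sdk_model import Response


def set_treadmill_incline(percent: int) -> None:
    # Placeholder for the device maker's own cloud-to-device command channel.
    pass


class SetInclineIntentHandler(AbstractRequestHandler):
    """Handles requests like 'set the incline to five percent'."""

    def can_handle(self, handler_input: HandlerInput) -> bool:
        return is_intent_name("SetInclineIntent")(handler_input)

    def handle(self, handler_input: HandlerInput) -> Response:
        slots = handler_input.request_envelope.request.intent.slots
        incline = int(slots["Incline"].value)
        set_treadmill_incline(incline)
        speech = f"Okay, setting the incline to {incline} percent."
        return handler_input.response_builder.speak(speech).response


sb = SkillBuilder()
sb.add_request_handler(SetInclineIntentHandler())
lambda_handler = sb.lambda_handler()  # entry point when hosted on AWS Lambda
```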
Device makers and service providers can even customize their brand’s assistant with a unique wake word, voice, and capabilities, giving customers the benefits of an intelligent assistant that is an expert in their products and services while seamlessly coexisting and cooperating with Alexa. In short, ACA makes it even easier to support seamless multi-agent experiences.
Need another example of a device maker using tools like these to provide a seamless experience between the user and their devices?
With Alexa Custom Assistant, all sorts of custom experiences can be programmed into devices, enabling a natural dialog between the driver, the custom assistant, and Alexa to fulfill the driver’s request. Alexa’s advanced AI ensures each request is routed to the assistant that can provide the most relevant and delightful experience. For example, if a customer asks Alexa to roll down a car window, or why their check engine light is on, the request will be seamlessly routed to the brand’s assistant. And if a customer asks the brand’s assistant to play an audiobook, the request will be routed to Alexa. Check out this deep dive with an Amazonian who works on creating great experiences for customers specifically in vehicles.
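For intuition only, the toy sketch below shows the general idea of sending vehicle-specific requests to the brand’s assistant and everything else to Alexa; Alexa’s actual routing is driven by its AI, not simple keyword matching like this:

```python
# Toy illustration only: Alexa's real cross-agent routing is handled by its AI,
# not keyword matching. The topic list and agent names are assumptions made
# purely to show the routing idea.
VEHICLE_TOPICS = ("window", "check engine", "tire pressure", "climate")


def route_request(utterance: str) -> str:
    """Return which agent should fulfill the request."""
    text = utterance.lower()
    if any(topic in text for topic in VEHICLE_TOPICS):
        return "brand_assistant"
    return "alexa"


assert route_request("roll down the driver's window") == "brand_assistant"
assert route_request("why is my check engine light on") == "brand_assistant"
assert route_request("play my audiobook") == "alexa"
```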
In June, automotive software and hardware supplier Elektrobit and Continental announced a major milestone in automotive embedded voice experiences: the first in-vehicle integration of Alexa Custom Assistant, a comprehensive solution that lets automakers access Alexa’s advanced AI to create their own branded, intelligent assistants. The collaboration brings together Continental’s automotive electronics and Elektrobit’s software expertise, integrating Alexa Custom Assistant into Continental’s Cockpit High Performance Computer with software and integration services provided by Elektrobit.
By integrating Alexa Custom Assistant into an actual production vehicle, automakers can now truly experience the solution as part of a fully integrated digital cockpit. It gives them insight into the solution’s true value proposition and helps them imagine new, differentiating use cases. By tightly integrating Elektrobit and Continental’s hardware and software with Alexa’s AI and intelligent assistants, automakers can now fast-track the addition of industry-leading, customized, and highly differentiating voice experiences to their vehicles.
Want to know more about how multi-agent experiences are revolutionizing devices that leverage voice technology? Register now for Alexa Live to join the Alexa community of developers, device makers, business leaders, startups, and industry experts on July 21. We can’t wait to see you there! And follow us on Twitter, LinkedIn, and Facebook to join the #AlexaLive conversation online. We’ll dive deeper into multi-agent experiences and other topics specifically for makers of voice-enabled devices, and also share a demonstration of Elektrobit and Continental’s in-vehicle integration of Alexa Custom Assistant.