We are always working to improve the in-cabin experience, and our vision is to allow customers to converse with Alexa as if they were speaking with family or a friend. Today we are taking a step closer to that vision by announcing a new enhancement for Car Control that allows customers to personalize their in-vehicle experience with their own individually tailored voice commands, like “Alexa, my windshield is foggy,” or “Alexa, set the ambient lights to my favorite color.”
The personalized Car Control capability leverages Alexa’s Teachable AI to provide a new level of customization for Alexa in-vehicle experiences. For example, when a customer expresses a feeling like “Alexa, I’m hot,” rather than returning an error, Alexa will infer their intention and ask what action should be taken based on the vehicle’s capabilities, such as: “I can decrease the temperature by 5 or roll down the window. What should I do?” Similarly, customers can teach Alexa personalized phrases to use with familiar voice commands, like “Alexa, set the AC to full blast” or “Alexa, set the seat to my position.” In this case, Alexa will recognize the supported voice command pattern and seek clarification from the customer to learn what “full blast” or “my position” means instead of failing the request. Taught concepts are retained for future use and can be deleted at any time by saying, “Alexa, delete the last thing I taught you” or “Alexa, delete everything I have taught you.” This feature turns the tables on traditional voice commands by allowing customers to define the experience rather than learning specific commands to achieve their desired outcome.
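To make the flow above concrete, here is a minimal sketch of a teaching session: an unfamiliar phrase triggers a clarification prompt, the clarified mapping is retained for reuse, and taught concepts can be deleted one at a time or all at once. All names (`TeachingSession`, `taught_concepts`, and so on) are illustrative assumptions for this sketch, not part of any Alexa API.

```python
class TeachingSession:
    """Toy model of learning what a personalized phrase means and reusing it."""

    def __init__(self):
        # Ordered list of (phrase, action) pairs, oldest first.
        self.taught_concepts = []

    def handle(self, phrase, known_actions):
        """Return a stored action, or a clarification prompt if untaught."""
        for taught_phrase, action in self.taught_concepts:
            if taught_phrase == phrase:
                return action
        # Unfamiliar phrase: ask for clarification instead of failing.
        options = " or ".join(known_actions)
        return f"I can {options}. What should I do?"

    def teach(self, phrase, action):
        """Retain a taught concept for future use."""
        self.taught_concepts.append((phrase, action))

    def delete_last(self):
        """'Alexa, delete the last thing I taught you.'"""
        if self.taught_concepts:
            self.taught_concepts.pop()

    def delete_all(self):
        """'Alexa, delete everything I have taught you.'"""
        self.taught_concepts.clear()
```

For example, the first time the session sees “I’m hot” it returns the clarification prompt; after `teach("I'm hot", "decrease the temperature by 5")`, the same phrase resolves directly to the stored action.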
Mimicking human learning behavior
When humans converse, we recognize the other participants, connect the dots to understand context, adapt to the conversational style of our audience, ask for clarification, and contribute our own suggestions to the discussion. We envision conversations with Alexa having the same intelligence and personality as if speaking with a family member or friend sitting in the adjacent seat. Today’s launch takes us one step closer to realizing this vision. Now, customers can tailor Alexa to their natural way of speaking, personalizing Alexa to suit their everyday vocabulary. When Alexa encounters an unfamiliar voice command, the AI will recognize the opportunity to learn, initiate a teaching session to understand the customer’s preferred outcome, and retain the learning for future use.
Democratizing vocabulary expansion
Making AI assistants understand voice commands is a long, resource-intensive process, which involves teaching spoken language grammar, writing language models to handle fuzziness, and training models to associate voice commands with the desired action. Teaching Alexa new voice commands is more complex still when the customer intent is implied, or when relevant context like the desired setting is missing from the request. Owing to this complexity, vocabulary expansion typically follows the path of first supporting syntax-based voice commands with explicit targeting, then gradually growing in sophistication toward natural-language utterance support. This new capability democratizes vocabulary expansion and brings a step-function change in the agility with which Alexa learns new concepts.
Inherent understanding of colloquial variations
In addition to learning the customer’s custom commands, Alexa will also identify and associate the taught preference with similar-meaning variations of the utterance behind the scenes. For example, when taught what “I’m hot” means, Alexa will automatically associate the same action with variations like “Alexa, I’m boiling,” “Alexa, it is hot in here,” and “Alexa, it is scorching hot today.”
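The idea of generalizing a taught phrase to similar-meaning variations can be illustrated with a toy similarity match. A production system would rely on learned paraphrase or embedding models; the token-overlap (Jaccard) similarity below is purely an assumption made for illustration, and the function names and threshold are hypothetical.

```python
def jaccard(a, b):
    """Token-overlap similarity between two utterances (toy stand-in
    for a learned paraphrase/embedding model)."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb) if (ta | tb) else 0.0

def match_taught(utterance, taught, threshold=0.3):
    """Return the action of the closest taught phrase, if similar enough.

    `taught` is a list of (phrase, action) pairs; returns None when no
    taught phrase is close enough to the new utterance.
    """
    best = max(taught, key=lambda pair: jaccard(utterance, pair[0]),
               default=None)
    if best and jaccard(utterance, best[0]) >= threshold:
        return best[1]
    return None
```

With `taught = [("it is hot in here", "decrease the temperature by 5")]`, the variation “it is scorching hot today” shares enough tokens to resolve to the same action, while an unrelated request like “play some music” does not.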
Reducing friction and cognitive load for the driver
Alexa’s goal for on-the-go experiences is to make driving an easy, stress-free, and enjoyable experience. This feature minimizes the burden of memorizing supported voice commands and thus reduces cognitive load while driving. Customers now have more flexibility in how they interact with Alexa to complete popular tasks while keeping their eyes on the road and hands on the wheel.
What does a typical teaching session look like?
This launch supports two forms of teaching: 1) Declaratives, wherein a customer expresses a feeling or a state; and 2) Slot-concepts, wherein a customer uses a personalized phrase with a supported voice command.
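The two teaching forms can be sketched as distinct data shapes: a declarative maps an expressed feeling to a whole action, while a slot-concept maps a personalized phrase to a value inside an otherwise-supported command. The class and field names below are assumptions for illustration, not part of any Alexa API or the Auto SDK.

```python
from dataclasses import dataclass

@dataclass
class DeclarativeTeaching:
    """Form 1: the customer expresses a feeling or a state,
    e.g. 'Alexa, I'm hot' -> the action the customer chose when asked."""
    feeling: str   # e.g. "I'm hot"
    action: str    # e.g. "decrease the temperature by 5"

@dataclass
class SlotConceptTeaching:
    """Form 2: a personalized phrase inside a supported command,
    e.g. 'set the AC to full blast' -> what 'full blast' maps to."""
    command_pattern: str  # e.g. "set the AC to {setting}"
    slot_phrase: str      # e.g. "full blast"
    slot_value: str       # e.g. "maximum fan speed"
```

Splitting the two forms this way mirrors the distinction in the text: a declarative needs a full action resolved, whereas a slot-concept only needs the unknown slot value clarified, since the surrounding command pattern is already supported.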
The new personalized Car Control capability is now available for all existing and new Car Control cloud integrations using Auto SDK 2.3 and higher. At launch it supports HVAC, wipers, and interior lighting, with additional features and use cases planned for future updates.
Make sure your vehicles are upgraded to Auto SDK 2.3 or above and integrated with Car Control to provide this functionality to your customers.