In August, we released Custom Interfaces, the newest feature available in the Alexa Gadgets Toolkit. Custom Interfaces enable you to connect gadgets, games, and smart toy products with immersive skill-based content—unlocking creative ways for customers to experience your product. Using the Custom Interface Controller, you can design voice experiences that are tailored to your product’s unique functionality.
Want to build with Custom Interfaces, but don’t know where to start? To demonstrate the process for building a prototype, we created our own Alexa-connected musical keyboard using the Alexa Gadgets Toolkit. Alexa lights up a sequence of keys on the keyboard corresponding to a given song. When the user plays that sequence back, Alexa provides feedback on whether the user pressed the right sequence of keys. Here’s a video of the experience:
The prototype for this musical keyboard uses the Color Cycler sample provided in the Alexa Gadgets Raspberry Pi Samples GitHub repository, and builds upon the sample to enable new and unique functionality that teaches people how to play different songs.
The Color Cycler sample uses a single RGB LED and a simple button for the hardware, and uses an Alexa skill to respond to a button press before the experience ends. For the keyboard experience, we needed multiple LEDs to indicate which keys should be pressed, and multiple buttons – one button for each key. Once the new hardware was added, it looked something like this without the keyboard overlay:
As you can see, each LED is aligned to its corresponding button used for each key. With the updated hardware in place, the keyboard can light up when a customer chooses a song from within the skill.
With the hardware assembled, LEDs can be illuminated to teach the customer which keys to press to play the song. When the skill starts, the Enumeration API is used to verify there is a gadget paired to the Echo device. If so, the customer can select a song they want to learn to play. Based on the chosen song, a Custom Directive is sent to the paired Alexa Gadget via a Custom Interface that has been defined. The JSON sent from the skill looks like this:
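The original snippet isn’t reproduced here, but based on the description above, a `CustomInterfaceController.SendDirective` for a chosen song might look something like this (the interface namespace, directive name, and payload field names are illustrative assumptions, not the actual sample code):

```json
{
  "type": "CustomInterfaceController.SendDirective",
  "header": {
    "namespace": "Custom.MusicalKeyboard",
    "name": "PlaySequence"
  },
  "endpoint": {
    "endpointId": "amzn1.ask.device.<gadget-endpoint-id>"
  },
  "payload": {
    "notes": ["C", "D", "E", "C"],
    "noteIntervalMs": 500,
    "startDelayMs": 1000
  }
}
```

The `endpointId` is the gadget endpoint returned by the Endpoint Enumeration API; the `payload` is free-form JSON that only your gadget needs to understand.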
The payload specifies which notes should be played, the time between each note, and a delay that controls when the sequence should start playing. On the gadget side, the payload is parsed and used to illuminate the LEDs in accordance with the song that was chosen.
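As a minimal sketch of that gadget-side handling: the directive payload can be parsed into a timed list of LED actions and then played back. The payload field names, note-to-pin mapping, and `light_led` callback below are assumptions for illustration, not the sample’s actual code:

```python
import json
import time

# Hypothetical mapping from note name to the GPIO pin driving that key's LED.
NOTE_TO_LED_PIN = {"C": 17, "D": 27, "E": 22}

def parse_sequence(payload_bytes):
    """Turn the directive payload into a start delay and (pin, interval) steps."""
    payload = json.loads(payload_bytes.decode("utf-8"))
    interval = payload["noteIntervalMs"] / 1000.0
    delay = payload["startDelayMs"] / 1000.0
    steps = [(NOTE_TO_LED_PIN[note], interval) for note in payload["notes"]]
    return delay, steps

def play_sequence(delay, steps, light_led):
    """Wait for the start delay, then flash each key's LED in order."""
    time.sleep(delay)
    for pin, interval in steps:
        light_led(pin)       # e.g. pulse the LED via RPi.GPIO
        time.sleep(interval)
```

In the Raspberry Pi samples, a handler like this would live in the gadget class that receives the custom directive, with `light_led` wrapping whatever GPIO library drives the LEDs.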
Illuminating the keys to indicate which notes should be hit is only half of the experience. In order to learn the song, the customer must press the keys in the correct order – otherwise, Alexa will adapt the experience accordingly. Using the Custom Interface we defined previously, events from the Alexa Gadget can be sent to the skill, giving Alexa the opportunity to respond and customize the experience.
On the gadget side, two types of events are sent to the skill: an event that lets the skill know the LED sequence has stopped playing and the skill should start listening for key presses, and an event that is sent to the skill on each key press. The event for the individual key press looks like this:
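The original event snippet isn’t shown here, but a key-press event delivered to the skill inside a `CustomInterfaceController.EventsReceived` request might look like this (namespace, event name, and payload field are illustrative assumptions):

```json
{
  "header": {
    "namespace": "Custom.MusicalKeyboard",
    "name": "KeyPressed"
  },
  "payload": {
    "note": "C"
  }
}
```

As with directives, the `payload` is arbitrary JSON defined by your Custom Interface, so it can carry whatever your skill needs to evaluate the press.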
As each key press is sent to the skill, the sequence can be compared to the master sequence stored as a session attribute within the skill. If the customer presses all the right keys, it should match the session attribute, and they can continue through the skill. If they make a mistake, the sequence will not match what’s in the session attribute, and Alexa can jump in to help.
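In the skill’s event handler, that comparison can be as simple as appending each reported note and checking it against the stored target sequence. A sketch, assuming hypothetical session attribute names rather than the skill’s actual code:

```python
def handle_key_press(session_attributes, note):
    """Record a reported key press and check it against the target sequence.

    Returns "correct" while the presses match, "complete" when the whole
    song has been played, and "mistake" as soon as a press is wrong.
    (Attribute names here are assumptions, not the skill's actual code.)
    """
    target = session_attributes["targetSequence"]       # e.g. ["C", "D", "E", "C"]
    played = session_attributes.setdefault("playedSequence", [])
    played.append(note)

    index = len(played) - 1
    if index >= len(target) or played[index] != target[index]:
        return "mistake"    # Alexa can jump in and offer help
    if len(played) == len(target):
        return "complete"   # the whole song matched
    return "correct"        # keep listening for the next key press
```

Because session attributes persist across turns of the skill session, the running `playedSequence` survives between key-press events without any external storage.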
These highlighted elements of the musical keyboard are unique to this type of product. There’s so much more to dive into with Custom Interfaces, Alexa skills, and building an Alexa Gadget that really showcases the capabilities of your product. Check out these resources to start building your prototype: