When the Alexa Skills Kit (ASK) launched in late 2015, developers began building engaging experiences for voice, ranging from simple to innovative. Today, an interdisciplinary team of students from Carnegie Mellon University’s Entertainment Technology Center (ETC) is pushing the boundaries of what we can achieve. Meet Audrey Higgins (writer), Mohammed Tauseef (AWS and Unity integration), Na-Yeon Kim (2D/3D artist), Longyi Cheng (Unity Gameplay programmer), and Shuang You (3D artist).
Their class assignment: build a prototype, in two weeks, of a fully immersive virtual world. Specifically, the team created A.L.Ex.A. (The Assistant Linked Extemporization Array), a VR experience that follows a talkative repair drone destined to help users (or “guests,” as they’re known in the VR world) stranded on the remote Planet 532.
From right to left, Shuang You, Audrey Higgins, Mohammed Tauseef, A.L.Ex.A, Na-Yeon Kim, Longyi Cheng. Illustration created and provided by Na-Yeon Kim.
There are fundamental difficulties associated with building a truly immersive virtual world. The smallest in-world interruptions can impact the guest’s experience. Interruptions can lead to the guest asking questions such as, “Did the bad guy mean to jump there?” or “Was the hero supposed to make noise there?” or “Are the decorations supposed to jut through the walls like that?” Ensuring consistent content and gameplay mechanics can be a difficult task. Creating an uninterrupted experience takes a slew of skill sets from different people who all need to come together to execute a single, unified vision.
The team’s choice to build their world with Amazon Alexa was a no-brainer, according to Audrey Higgins, the writer in the group.
“We liked Alexa because she made our world more immersive, which meant, in turn, we could make the world more immersive for guests and blur the lines between reality and the experience we were creating,” Audrey says.
For the team, creating a truly immersive world meant giving guests choices and turning Alexa into their best friend throughout their experience in the virtual world. This allowed guests to explore the world by simply using their voice.
“We wanted to create an evocative experience to make guests feel something for Alexa throughout the experience,” says Audrey.
“From a technical standpoint, we were excited by Alexa’s ability to interpret a large array of utterances,” says Tauseef. “If the guest said something that we did not plan for, it would still trigger a change of state in the world. Alexa is pretty good at figuring out what someone meant to say, even if they didn’t say it correctly.”
The team built the experience to fully immerse guests in the virtual world. As guests progress, the talkative repair drone helps them navigate Planet 532. The assistant was built using four primary components: Unity, an Alexa skill backed by an AWS Lambda function, Amazon DynamoDB, and Amazon Cognito.
The integration was built in stages to simplify the development process. For each stage, the team first arranged the narrative within Unity and then built the voice interaction model for the Alexa skill, backed by an AWS Lambda function.
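To make the Lambda-backed skill concrete, here is a minimal sketch of such a backend using the ASK SDK for Python. It is not the team’s actual code: the intent name, speech text, and handler names are placeholders chosen for illustration.

```python
# Minimal Alexa skill backend for an AWS Lambda function.
# Sketch only: the intent name and speech text are placeholders.
from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.utils import is_request_type, is_intent_name


class LaunchRequestHandler(AbstractRequestHandler):
    """Greets the guest when the skill is opened."""
    def can_handle(self, handler_input):
        return is_request_type("LaunchRequest")(handler_input)

    def handle(self, handler_input):
        speech = "Repair drone online. How can I help you, guest?"
        return handler_input.response_builder.speak(speech).ask(speech).response


class OpenDoorIntentHandler(AbstractRequestHandler):
    """Handles a hypothetical 'OpenDoorIntent' spoken by the guest."""
    def can_handle(self, handler_input):
        return is_intent_name("OpenDoorIntent")(handler_input)

    def handle(self, handler_input):
        speech = "Opening the airlock now."
        return handler_input.response_builder.speak(speech).response


sb = SkillBuilder()
sb.add_request_handler(LaunchRequestHandler())
sb.add_request_handler(OpenDoorIntentHandler())

# Entry point that AWS Lambda invokes.
handler = sb.lambda_handler()
```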
To build out the narrative for Alexa, the team used Amazon’s open-source editor Interactive Adventure Game Tool and “saved hours of work,” says Tauseef.
“This helped us quickly draw connections between speech stages in the narrative and automatically create the program feed needed to build the Alexa skill,” says Tauseef.
The team used Amazon DynamoDB to store the game state and update Unity as the guest progressed through the game. Each interaction with Alexa updated the state in DynamoDB. Then, using credentials from Amazon Cognito, the Unity client queried DynamoDB every second to refresh the game state. Doing so enabled the team to directly control the player’s state in the world and sustain the illusion of an immersive world. This turned Alexa’s real-time communication with the player into real consequences in the world, according to Tauseef.
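The data flow can be sketched as follows. This sketch uses Python and boto3 to stand in for both sides of the exchange; in practice the Unity client would poll through the AWS SDK for .NET using Cognito-issued credentials, and the table name, key, and attribute names here are assumptions rather than the team’s schema.

```python
# Sketch of the shared game state in DynamoDB (table name and keys are assumptions).
import time
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("GameState")  # hypothetical table name


def record_interaction(session_id: str, stage: str) -> None:
    """Called from the skill's Lambda handler after each Alexa interaction."""
    table.put_item(Item={"sessionId": session_id, "stage": stage})


def poll_game_state(session_id: str, interval: float = 1.0) -> None:
    """Approximates the Unity client's one-second polling loop.

    The real client would run this inside Unity via the AWS SDK for .NET,
    authenticating with an Amazon Cognito identity pool.
    """
    last_stage = None
    while True:
        item = table.get_item(Key={"sessionId": session_id}).get("Item")
        stage = item.get("stage") if item else None
        if stage != last_stage:
            last_stage = stage
            print(f"Game state changed: {stage}")  # e.g. trigger the door animation
        time.sleep(interval)
```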
“For example, when a guest asked a question, a door could or could not open depending on their answer,” says Tauseef. “This lent realness to the experience that had previously been unexplored.”
Looking back, the team was happy to have built their project with Alexa, according to Longyi, the Unity programmer.
“We could have built characters from scratch using other technologies, but that would have required voice acting, sound design, asset creation, rigging, debugging, and many other steps,” says Longyi. “Using Alexa allowed us to create a unique, fun, and immersive experience.”
In March, we announced the Alexa Fund Fellowship program to drive adoption of Alexa in university curricula. In October 2016, we announced the Amazon Alexa Prize, a $2.5 million competition to advance conversational AI. And we’ve taught thousands of students at developer events and hackathons how to build their first Alexa skills. As we often say at Amazon, it’s always day one and there is so much more to do.
The Alexa Skills Kit (ASK) enables developers to easily build capabilities, called skills, for Alexa. ASK includes self-service APIs, documentation, templates, and code samples to get developers on a rapid road to publishing their Alexa skills. Browse through more than 10,000 skills, then start building your own.