Recently, Alexa went into space aboard the Orion spacecraft as part of Artemis I, the first of a series of NASA missions that will pave the way for human exploration to the moon and Mars. Alexa joined the mission as part of Callisto, a technology demonstration payload embedded into NASA’s Orion spacecraft and built in collaboration with engineers from Amazon, Cisco, and Lockheed Martin. In this series, we are exploring how Amazon’s brightest minds have developed innovative solutions to support Alexa’s journey.
Because Artemis I is an uncrewed mission, Amazon worked with partners at Lockheed Martin and Cisco to build a virtual crew experience at NASA’s Johnson Space Center in Houston, Texas. Based in the Mission Control Center, the experience provided remote access to Callisto, and allowed Amazon to simulate interactions between Alexa and future astronauts.
For the mission, the team had to solve a fundamental problem: the near-total lack of Internet connectivity between the spacecraft and the ground.
Typically, Echo devices use on-device keyword spotting designed to detect when a customer says the Alexa wake word. Audio is held in an on-device buffer in temporary memory; after the wake word is detected, the device streams the audio to the cloud for speech recognition and natural language processing.
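The buffering behavior described above can be illustrated with a minimal sketch. This is a hypothetical model, not Amazon's actual firmware: a fixed-size ring buffer holds only the most recent audio frames in temporary memory, and nothing leaves the device until the wake word is spotted.

```python
from collections import deque

# Hypothetical sketch: a fixed-size ring buffer keeps only the most
# recent audio frames; older frames are discarded as new ones arrive.
FRAME_CAPACITY = 50  # e.g. roughly one second of 20 ms frames (assumed)

ring = deque(maxlen=FRAME_CAPACITY)

def on_audio_frame(frame: bytes, wake_word_detected: bool):
    """Append a frame; on wake-word detection, flush the buffer for streaming."""
    ring.append(frame)
    if wake_word_detected:
        # The flushed burst (which includes the wake word itself) would
        # be streamed onward for speech recognition.
        burst = list(ring)
        ring.clear()
        return burst
    return None
```

Because the deque has a fixed `maxlen`, audio heard before the wake word is continuously overwritten rather than retained.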
However, for the Artemis I mission, the ability to connect to the cloud was severely constrained. NASA’s Orion spacecraft uses the Deep Space Network to communicate back to earth.
Samuel Edwin is a senior software development manager at Amazon, managing the technical team that builds Alexa for third-party devices. Samuel has been with Amazon for over eight years and worked on content development for Kindle before joining the Alexa team in 2019.
"Ever since I was a child, I have been fascinated with high-speed travel — be it the Concorde supersonic jets, which I learned about while watching the movie Airport, or space travel from Star Trek. NASA's Artemis mission was very meaningful to me on a personal level, and it was an exciting opportunity to be part of the future of space travel."
In this article, Samuel explains how Amazon engineers used Local Voice Control to circumvent limited Internet connectivity and send Alexa to the moon. While designing these features, Samuel had to think beyond the uncrewed Artemis I mission to future missions, where Alexa could potentially interact with astronauts aboard the spacecraft.
Local Voice Control — and so much more
To accurately respond to questions posed by customers, Alexa uses a combination of complex technologies, including machine learning (ML), speech recognition, and natural language processing.
“The audio first needs to be processed, following which the reply is formed using Alexa’s artificial intelligence,” says Samuel. “The reply is then sent back to the device to answer the user’s query.”
This audio processing and machine learning usually happen in the cloud. However, cloud processing cannot be performed where there is little to no connectivity, such as aboard the Orion spacecraft. To overcome this issue of limited Internet connectivity, Alexa's engineers implemented Local Voice Control to process audio messages in low-connectivity scenarios. Alexa Local Voice Control allows select devices to process voice commands locally, rather than sending information to the cloud.
To strengthen Local Voice Control for the Artemis I mission, Samuel’s team implemented three innovative solutions.
First, the team developed local skills for cabin control, telemetry, and music to respond immediately to requests from mission control.
The cabin control skill, which acts in near real time, is critical for responding to requests from astronauts on future missions. Because the data for these queries is available in the spacecraft’s cabin, Local Voice Control can handle and process them locally, allowing astronauts to control lights and other cabin systems. For example, they might ask, “Alexa, change the cabin lights to blue.”
Samuel’s team developed a telemetry skill to encompass data covering different facets of the spacecraft, including temperatures inside and outside the cabin, available fuel supply, and time left to return to earth. Lockheed Martin provided a Telemetry Query Engine (TQE) library that allows the transmission of telemetry-related information. Alexa then parses the information and converts it into human-readable output. The final step is to render it as speech using a text-to-speech (TTS) engine.
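The parse-and-render pipeline can be sketched as a simple function that turns raw telemetry values into a sentence a TTS engine could speak. The field names and phrasing here are assumptions for illustration; the actual TQE data model is not public.

```python
# Hypothetical sketch: convert raw telemetry (standing in for values
# returned by the Telemetry Query Engine) into a human-readable
# sentence, ready to be rendered by a TTS engine.
def telemetry_to_speech(telemetry: dict) -> str:
    parts = []
    if "cabin_temp_c" in telemetry:
        parts.append(
            f"The cabin temperature is {telemetry['cabin_temp_c']:.0f} degrees Celsius"
        )
    if "fuel_pct" in telemetry:
        parts.append(f"fuel is at {telemetry['fuel_pct']:.0f} percent")
    if "hours_to_earth" in telemetry:
        parts.append(
            f"and {telemetry['hours_to_earth']:.0f} hours remain until return to Earth"
        )
    return ", ".join(parts) + "."
```

The returned string would then be handed to the TTS engine as the final step of the pipeline.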
Alexa engineers also built a music skill to run locally inside of the Orion spacecraft. This skill provides the ability to play audio files from a wide variety of genres on demand.
When Local Voice Control cannot process a request locally, such as a query for a sports score, it is treated as a long-tail query. For these scenarios, the team leveraged the Alexa Client Service (ACS), an intermediate hybrid system that runs in the Mission Control Center for the Orion spacecraft. The query is sent to the Alexa Client Service, processed, and routed to the appropriate skill to enable a fast and accurate response.
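The split between on-board handling and the ACS amounts to a routing decision. A minimal sketch, with illustrative intent names that are assumptions rather than the real taxonomy:

```python
# Hypothetical routing sketch: intents the local skills cover are
# answered on board Orion; anything else is a long-tail query sent to
# the Alexa Client Service (ACS) on the ground.
LOCAL_INTENTS = {"cabin_control", "telemetry", "music"}

def route_query(intent: str) -> str:
    if intent in LOCAL_INTENTS:
        return "local"   # processed by Local Voice Control on the spacecraft
    return "ground"      # forwarded to the ACS in Mission Control
```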
The team leveraged protocols designed for smart devices on earth to overcome the issue of low bandwidth. An Alexa device typically connects to the cloud through a Hypertext Transfer Protocol (HTTP) connection. To reduce latency between the Orion spacecraft and the ground network, Samuel and his team created a Message Queuing Telemetry Transport (MQTT) bridge that connects the ground server to the Orion spacecraft through the Deep Space Network. MQTT is a lightweight and widely adopted messaging protocol that is designed for Internet of Things (IoT) devices.
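Part of what makes MQTT lightweight is its wire format: a publish message carries only a couple of bytes of fixed header plus the topic and payload, far less overhead than an HTTP request. The sketch below encodes a QoS 0 PUBLISH packet per the MQTT 3.1.1 specification; it illustrates the protocol generally and is not the Callisto bridge code, and the topic name is an invented example.

```python
import struct

def encode_remaining_length(n: int) -> bytes:
    """MQTT variable-length encoding: 7 bits per byte, MSB = continuation."""
    out = bytearray()
    while True:
        byte, n = n % 128, n // 128
        if n:
            out.append(byte | 0x80)
        else:
            out.append(byte)
            return bytes(out)

def mqtt_publish_packet(topic: str, payload: bytes) -> bytes:
    """Build a QoS 0 MQTT 3.1.1 PUBLISH packet."""
    topic_bytes = topic.encode("utf-8")
    # Variable header: 2-byte big-endian topic length, then the topic.
    variable_header = struct.pack(">H", len(topic_bytes)) + topic_bytes
    remaining = variable_header + payload
    # Fixed header: packet type PUBLISH (0x3), flags 0 (QoS 0, no retain).
    return bytes([0x30]) + encode_remaining_length(len(remaining)) + remaining
```

A short payload on a short topic comes to only a handful of bytes of framing, which matters on a constrained deep-space link.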
Here’s how everything comes together. When a user aboard Orion asks Alexa a question, the query is streamed as a Real-time Transport Protocol (RTP) payload to the ground server, which is essentially the Alexa Client Service running in Houston’s Mission Control Center.
“Using the ACS, the ground server maintains a specific connection to Alexa running in the Orion spacecraft,” says Samuel. “A reply is produced, which is translated into an MQTT message and sent back to the spacecraft.”
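The RTP framing mentioned above has a compact, well-defined header (RFC 3550). A minimal sketch of packing one, assuming a dynamic payload type of 96 for the audio (an illustrative choice, not a documented Callisto parameter):

```python
import struct

def rtp_header(seq: int, timestamp: int, ssrc: int, payload_type: int = 96) -> bytes:
    """Pack a minimal 12-byte RTP header (RFC 3550): version 2,
    no padding, no extension, no CSRC list, marker bit clear."""
    first = 0x80                   # version=2 in the top two bits
    second = payload_type & 0x7F   # marker=0, payload type in low 7 bits
    return struct.pack(
        ">BBHII",
        first,
        second,
        seq & 0xFFFF,              # 16-bit sequence number
        timestamp & 0xFFFFFFFF,    # 32-bit media timestamp
        ssrc & 0xFFFFFFFF,         # 32-bit synchronization source ID
    )
```

Each audio packet would carry this 12-byte header followed by the encoded voice data, letting the receiver reorder and time the stream.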
Innovating in space to deliver improved experiences on earth
Through the Callisto project, Alexa’s engineers have demonstrated Alexa’s utility for critical space missions and other time-sensitive activities. As the technology evolves, Samuel looks forward to seeing how it can support new initiatives, projects, and applications. In particular, he believes that these innovations can help customers stay connected to Alexa, even in places of little to no connectivity, such as in a tunnel or an elevator.
“Alexa isn’t just for entertainment or basic questions. This technology goes beyond that and can be used to make your work life very simple,” says Samuel. “We can use this solution to spark other innovations that can be used at home and on the go.”