Recently, Alexa went into space aboard the Orion spacecraft as part of Artemis I, the first of a series of NASA missions that will pave the way for human exploration to the moon and Mars. Alexa joined the mission as part of Callisto, a technology demonstration payload that was embedded into NASA’s Orion spacecraft and built in collaboration with engineers from Amazon, Cisco, and Lockheed Martin. In this series, we are exploring how Amazon’s brightest minds have developed innovative solutions to support Alexa’s journey.
Especially exciting for customers around the world, Alexa was able to access real-time telemetry data and respond to thousands of mission-specific questions onboard Orion, including questions like “Alexa, how fast is Orion traveling?” or “Alexa, what’s the temperature in the cabin?”
In this article, we will explore how Amazon’s engineers and scientists made real-time ingestion of telemetry data possible, allowing customers to follow Orion’s mission in near real time simply by saying “Alexa, take me to the moon.”
Bringing Telemetry Data from Space Down to Earth
Lacey Williams is an engineering leader and software development manager at Amazon. In her role at the company, Williams currently leads three engineering teams. Since joining the Alexa team a year and a half ago, she has played a critical role in the integration of Alexa into the Callisto payload on the Orion spacecraft, in addition to developing solutions to deliver near real-time telemetry data from deep space to customers on Earth.
Telemetry is the automatic process of collecting near-real-time data from different systems. Telemetry data can be found in a variety of places in our day-to-day lives, from mobile GPS systems to in-car devices installed by insurance companies. For the Artemis I mission, the Alexa team collected various data points about Orion in near real time, such as data about cabin pressure, distance from Earth, and temperature.
“To be able to ingest telemetry data from space is especially interesting,” says Williams. “We were literally collecting data from out of this world.”
Making near real-time data from space available to Alexa’s customers
Before diving deep into the development process for the Alexa skill, it was important for Williams’s team to identify the types of questions it could answer, a process that required extensive collaboration with NASA.
“We were developing an interactive skill that can answer hundreds of questions about the mission,” says Williams. “We had to work very closely with NASA to understand the kinds of telemetry data that we would have access to.”
Williams and her team identified multiple telemetry data points to surface from Orion to Alexa customers. The process to ingest the telemetry data was powered by Amazon Web Services (AWS) technologies. First, NASA published a file to a bucket on Amazon Simple Storage Service (Amazon S3), an industry-leading object storage service. The Alexa team pulled the file from Amazon S3 every sixty seconds and copied it into a separate internal Amazon S3 bucket.
To automate the retrieval and copying of the file, Williams’s team used AWS Cloud Development Kit (AWS CDK), which accelerates cloud development using common programming languages, and AWS Lambda, a serverless, event-driven compute service. The team also configured Amazon EventBridge, a serverless event bus, to activate an AWS Lambda function once every minute that automatically saves the file from NASA. The NASA and Amazon teams also worked together to test the solution, troubleshoot roadblocks, and develop robust fallback scenarios for situations where the data was corrupted or missing.
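The once-per-minute retrieve-and-copy step can be sketched as a small handler. This is a minimal illustration, not the actual Callisto implementation: the bucket names, object key, and injected in-memory client are all assumptions. In production, the client would be an AWS SDK S3 client and the handler would be invoked on an EventBridge schedule.

```python
# Sketch of the scheduled copy step. Bucket names, the object key, and the
# injected client are illustrative assumptions; in production the client
# would be an AWS SDK S3 client and an EventBridge rule (every minute)
# would trigger this handler as a Lambda function.

NASA_BUCKET = "nasa-telemetry-example"       # hypothetical source bucket
INTERNAL_BUCKET = "alexa-telemetry-example"  # hypothetical internal copy
TELEMETRY_KEY = "orion/latest.json"          # hypothetical object key


def copy_latest_telemetry(s3_client) -> bool:
    """Pull the telemetry file from the source bucket and store an internal copy.

    Returns True if a fresh copy was stored, False if the source object
    could not be read (the skill then keeps serving the last good copy).
    """
    try:
        body = s3_client.get_object(NASA_BUCKET, TELEMETRY_KEY)
    except KeyError:  # stand-in for a missing-object error in a real SDK
        return False
    s3_client.put_object(INTERNAL_BUCKET, TELEMETRY_KEY, body)
    return True


class InMemoryS3:
    """Tiny in-memory stand-in for an S3 client, for illustration only."""

    def __init__(self):
        self.buckets = {NASA_BUCKET: {}, INTERNAL_BUCKET: {}}

    def get_object(self, bucket, key):
        return self.buckets[bucket][key]

    def put_object(self, bucket, key, body):
        self.buckets[bucket][key] = body


# Simulate one scheduled invocation:
s3 = InMemoryS3()
s3.put_object(NASA_BUCKET, TELEMETRY_KEY, '{"velocity_kmh": 39000}')
copied = copy_latest_telemetry(s3)
```

Injecting the client keeps the copy logic testable without AWS credentials; the real handler would differ only in wiring, not in shape.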
As a result of this process, customers had access to the most up-to-date telemetry values through the Alexa skill. Because the teams had accounted for fallback scenarios, the skill was able to provide a response even if an unexpected event occurred.
“Customers could ask questions about Orion at any point in time, and we always need to have an answer,” says Williams. “If, for whatever reason, NASA’s Amazon S3 bucket is unavailable or the file isn’t there, we will always have the latest version of the data that exists.”
Helping people around the world learn about space missions
Alexa’s inclusion in the Artemis I mission will serve as a major leap for ambient intelligence, helping to bring near-real-time data about deep space to people around the world.
“We’ve introduced a really cool user experience on our multimodal devices and speakers to convey information from aboard the rocket,” says Williams.
Williams is optimistic about voice technology’s future use in space, both for communicating information to Alexa customers and supporting astronauts on their missions. Williams foresees a future where astronauts who are constantly on the move will be able to access up-to-date data about their spacecraft, such as information about velocity, cabin temperature, or distance from Earth.
Using Alexa, Williams believes that astronauts will be able to access important information without having to use their hands or step away from their tasks. “Having the ability to ask questions and get near-real-time responses is powerful,” says Williams.
The Artemis I mission is a historic event, marking NASA’s first return to the moon in 50 years. It will pave the way for future space innovations, such as sending astronauts to Mars. The fact that this initiative will put the first woman on the moon is particularly exciting to Williams. “This really is a first for Alexa and for Amazon but also for people, which I think is just beyond exciting,” continues Williams. “We’re honestly scratching the surface with what we’ll be able to do.”