In September 2018, Amazon’s principal solutions architect Philippe Lantin received a call from his manager.
“He said that there was something unique on the horizon, and that their team was being roped into a once-in-a-lifetime opportunity,” says Lantin.
This was no understatement: on the horizon was an opportunity for Amazon to collaborate with Lockheed Martin Space, and integrate Alexa into NASA’s Orion spacecraft. Orion is the first human-rated spacecraft to visit the moon in more than 40 years.
“NASA is trying to engage the public more as we enter this new era of space travel, where we are setting the stage for extra-planetary exploration,” says Lantin. “Given that over 100 million Alexa-enabled devices have already been sold, having Alexa answer questions like ‘Alexa, how far to the moon?’ and ‘Alexa, how fast is Orion going?’ is a great way to get people around the world involved in NASA’s missions.”
Setting up an Echo device on earth is simple: all you need is a Wi-Fi connection and the Alexa app. However, things are far more complicated in space.
“We had several constraints we had to contend with,” says Lantin.
The Alexa team had to operate within a key physical constraint: the shape of the device. The contours of a smart speaker greatly influence its acoustics. To give just one example, the round shape of the Echo Dot offers a full cavity behind the woofer for a better bass response.
However, when it came to NASA’s Orion spacecraft, Alexa’s acoustic engineers had to work with what was provided by Lockheed Martin and NASA.
“We were somewhat limited by the form factor, which was a small briefcase-like enclosure that was 1.5 feet by one foot and about five inches in depth,” says Lantin.
There were other physical constraints. Equipment developed for the mission had to be resilient to extreme shocks and vibrations, be at least minimally resistant to radiation emissions in space, and utilize highly specific and custom-built components such as power and data cables.
Limited Internet connectivity
The team also had to deal with issues related to the lack of Internet connectivity. Typically, Echo devices use on-device keyword spotting designed to detect when a customer says the wake word, holding recent audio in an on-device buffer in temporary memory. After the wake word is detected, the device streams the audio to the cloud for speech recognition and natural language processing.
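A minimal sketch of this buffer-then-stream pattern, using a fixed-length ring buffer. The frame and buffer sizes here are illustrative assumptions, not Alexa’s actual parameters, and the wake-word detector itself is out of scope:

```python
from collections import deque

# Illustrative values only -- not the device's real frame or buffer sizes.
FRAME_MS = 20
BUFFER_MS = 500  # keep roughly the last half second of audio in temporary memory


class WakeWordBuffer:
    """Ring buffer that releases audio only once the wake word is detected."""

    def __init__(self):
        # deque with maxlen silently discards the oldest frames as new ones arrive.
        self.frames = deque(maxlen=BUFFER_MS // FRAME_MS)

    def push(self, frame, wake_word_detected):
        """Append one audio frame; return the buffered audio only on detection.

        `wake_word_detected` stands in for a real on-device keyword spotter.
        Until it fires, nothing leaves the device; afterwards, the buffered
        audio (and subsequent frames) would be streamed to the cloud.
        """
        self.frames.append(frame)
        if wake_word_detected:
            return list(self.frames)
        return None
```

The key property is that audio never leaves the temporary buffer unless the detector fires, which is what keeps the always-listening stage local to the device.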
“However, for the Orion mission, our ability to communicate with the Alexa cloud was severely constrained,” says Lantin. “NASA’s spacecraft uses the Deep Space Network to communicate with earth. The bandwidth available to us on the downlink connection is slightly better than dial-up modem speeds, with latencies of up to five seconds. To further complicate matters, NASA prioritizes traffic for navigation and telemetry as the primary payload – traffic for Alexa was consigned to the secondary payload.”
The team also wanted to demonstrate a fully autonomous experience, one that can be used in future missions where Earth connectivity is no longer a practical option for real-time communications. They used Alexa Local Voice Control to get around the limited internet connectivity. Alexa Local Voice Control allows select devices to process voice commands locally, rather than sending information to the cloud.
Lantin says that while the team was motivated by demonstrating technology leadership and scientific innovation in a very challenging environment, the real motivator was making a difference in the lives of millions of customers at home on earth.
“At Amazon, we take pride in delivering customer-focused science,” says Lantin. “That was a huge motivator for us at every step along the way. Consider the innovations we drove in Alexa Local Voice Control. These improvements will allow people on earth to do so much more with Alexa in situations where they have limited or no Internet connectivity. Think about when you are in a car and passing through a tunnel, or driving to a remote camping site. You can do things like tune the radio, turn on the AC and continue to use voice commands, even if you have a feeble signal or no cellular connection.”
Lantin says that the acoustic innovations enabled for Orion will also translate directly into improved listening experiences for people interacting with the mission on earth.
“We are planning to have celebrities, politicians, STEM students and a variety of other personalities interacting with Alexa,” says Lantin. “And so, we also spent a good deal of time thinking about what people might want to ask Alexa about during the mission.”
The nuances of acoustics aboard Orion
Scott Isabelle is a Solutions Architect at Amazon. Prior to Amazon, Isabelle was a distinguished member of the technical staff at Motorola, where among other projects, he developed systems for enhancing voice quality in mobile devices, methods for generating adaptive ringtones, and a two-microphone system for noise suppression.
“One of the most important things for a voice AI is being in an environment where it is able to pick up your voice,” says Isabelle.
However, this is easier said than done on Orion, where the conical shape of the space capsule and its metallic surfaces result in increased reverberation.
“The voice can keep bouncing around losing very little energy. This wouldn’t happen in a typical room, where soft materials like curtains and sofa cushions can absorb some of the sound. In the capsule, the reverberations off the metal surfaces can distort the frequencies that are critical to automatic speech recognition. This can make it really difficult for Alexa to pick up wake word invocations,” says Isabelle.
Alexa also has to contend with increased noise levels aboard Orion.
The ideal signal-to-noise ratio (SNR) for systems involving intelligent voice assistants is 30 decibels (dB). To place this in context, an SNR of 35 dB is what you would find in a face-to-face conversation between two people standing one meter apart in a quiet room. The noise level onboard the Orion capsule is well in excess of 60 dB – comparable to what you experience in an airplane during takeoff.
To enhance the comfort of astronauts during crewed missions, NASA would ordinarily place acoustic blankets to damp down the reverberation in the hard-walled cabin, and some of the noise created by engines and pumps.
“However, because this is an uncrewed mission we have to work within an environment with more reverberation and noise than we would like,” says Isabelle.
There’s another challenge that results from the lack of humans on board. For Orion, commands to Alexa have to be sent from ground control. The low-bandwidth connections utilized for the transmission can make it challenging to transmit voices at the wide range of frequencies essential for differentiating between sounds.
During a typical phone call, our voice is transmitted in the narrow band, which ranges from 300 Hz to 3,000 Hz. For Alexa to make out individual words in the noisier environment of the space capsule, the voice would have to be transmitted in wideband, with frequencies of up to 8,000 Hz.
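The link between a channel’s sample rate and the frequencies it can carry follows from the Nyquist limit. A minimal illustration, using standard telephony sample rates (these are common conventions, not figures from the mission):

```python
def captured_bandwidth_hz(sample_rate_hz):
    """Nyquist limit: a sampled signal can only represent frequencies
    up to half its sample rate."""
    return sample_rate_hz / 2


# Narrowband telephony samples at 8 kHz, so it tops out at 4 kHz of
# bandwidth (and in practice the usable band is roughly 300-3,400 Hz).
# Wideband speech samples at 16 kHz, reaching the 8 kHz needed to
# preserve the consonant detail that distinguishes similar words.
```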
“Voice commands from mission control are transmitted to Alexa via a speaker,” says Isabelle. “Flight-qualified speakers are typically designed for narrow-band communications. And so for this mission we were required to use a speaker that could operate in the flight environment.”
The team relied on what Isabelle calls “brute force” to overcome these acoustic challenges.
“We designed the speaker playback system to play at extremely loud volumes, which allowed us to increase the SNR to where we wanted it to be.”
The team also took advantage of the physical form factor of Alexa on board to overcome the challenges presented by the noisy environment. The speakers, the light ring and the microphones in the briefcase-like enclosure for Alexa are close to each other, which allows acoustic engineers to overcome some of the obstacles presented by the background noise and reverberation.
Finally, the team deployed two microphones in combination with an array processing algorithm. The latter combined the signals from the two microphones in a way that helps Alexa make sense of the commands being issued from mission control. Because the speakers and microphones are in fixed positions relative to each other – as opposed to a room, where people can be located in any number of locations – the algorithms could be more easily designed to distinguish between speech and the surrounding noise.
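The article doesn’t name the exact array processing algorithm, but a classic two-microphone technique that exploits known, fixed geometry is delay-and-sum. A minimal sketch, assuming the inter-microphone delay (in samples) is known in advance from the enclosure layout:

```python
def delay_and_sum(near_mic, far_mic, delay_samples):
    """Combine two microphone signals by aligning and averaging them.

    Sound from the fixed loudspeaker reaches `far_mic` `delay_samples`
    after `near_mic`. Delaying the near mic's signal by the same amount
    lines the two copies up, so speech from that direction adds
    coherently while uncorrelated noise partially cancels.
    """
    padded = [0.0] * delay_samples + list(near_mic)
    return [(a + b) / 2 for a, b in zip(padded, far_mic)]


# With a unit impulse arriving at the near mic first, the aligned
# average reconstructs the impulse at the far mic's arrival time.
```

Because the loudspeaker never moves relative to the microphones, the delay can be calibrated once, which is exactly the simplification unavailable in a living room where talkers roam freely.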
While the Orion mission will not have any crew members on board, the initial mission will lay the groundwork for Alexa to be integrated into future crewed missions – to the moon, Mars, and beyond. Having Alexa onboard in these future missions would allow crew members to be more efficient in day-to-day tasks, and benefit from the comforts of having Alexa on board such as the ability to play relaxing music and to keep in touch with family and friends back home.
Future crewed missions would have their own unique set of challenges, where Alexa would have to respond to commands from astronauts, who might (literally) be free-floating at multiple points within the capsule. Isabelle and Lantin are already looking forward to overcoming the challenges posed by crewed missions.
“For someone who grew up watching Star Trek, working on this project has been a dream come true,” says Lantin. “It’s great to be able to build the future. But it’s just as exciting to be able to draw on all of this great work, and be able to enjoy all these new Alexa capabilities during my next vacation, and my day-to-day life right here at home.”