Alexa Blogs

Showing posts tagged with Alexa science

June 13, 2019

Stan Peshterliev

A new approach to "active learning", or automatically selecting training examples for machine learning, improves the performance of a natural-language-understanding system by 7% to 9%.

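For readers new to the idea, here is a minimal sketch of pool-based active learning with uncertainty sampling. It illustrates the general concept of automatically picking which examples to label next, not the specific selection method described in the post; the data, classifier, and batch sizes are toy placeholders.

```python
# Toy active-learning loop: train, find the most uncertain unlabeled examples,
# "ask the annotator" for their labels, and retrain.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X_pool = rng.normal(size=(1000, 20))                      # unlabeled pool (toy data)
y_pool = (X_pool[:, 0] + X_pool[:, 1] > 0).astype(int)    # oracle labels

labeled = list(rng.choice(len(X_pool), size=20, replace=False))  # small seed set

for round_ in range(5):
    clf = LogisticRegression().fit(X_pool[labeled], y_pool[labeled])
    probs = clf.predict_proba(X_pool)[:, 1]
    uncertainty = -np.abs(probs - 0.5)        # probability closest to 0.5 = most uncertain
    uncertainty[labeled] = -np.inf            # never re-pick already labeled items
    picks = np.argsort(uncertainty)[-10:]     # request 10 new labels per round
    labeled.extend(picks.tolist())
    print(f"round {round_}: {len(labeled)} labeled, "
          f"pool accuracy {clf.score(X_pool, y_pool):.3f}")
```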

June 11, 2019

Young-Bum Kim

A natural-language-understanding system that includes both a generic model for a language and several locale-specific models improves accuracy by an average of 59% over individual locale-specific models.

June 06, 2019

Arpit Gupta

A new Alexa system adopts a simpler, more scalable solution to the problem of "reference resolution", or tracking references through several rounds of dialogue, by overwriting referring words with their referents.

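As a rough illustration of the rewriting idea (a toy example, not the Alexa system itself), the function below overwrites referring words with entities carried over from earlier turns, so downstream components see a self-contained request; the context dictionary is hypothetical.

```python
# Toy reference resolution by in-place rewriting of referring words.
def rewrite_references(utterance, salient_entities):
    """Replace known referring words with the entity they point to."""
    resolved = []
    for token in utterance.split():
        key = token.lower().strip("?.,")
        resolved.append(salient_entities.get(key, token))
    return " ".join(resolved)

# Entities remembered from earlier turns of the dialogue (hypothetical).
context = {"it": "Bohemian Rhapsody"}

print(rewrite_references("play it again", context))
# -> "play Bohemian Rhapsody again"
print(rewrite_references("add it to my playlist", context))
# -> "add Bohemian Rhapsody to my playlist"
```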

June 05, 2019

Alexa Science Team

At Amazon's re:MARS conference, Alexa VP and head scientist Rohit Prasad announced new AI technology that will enable Alexa to engage in multiturn dialogs that span skills, shifting the cognitive burden of complex task coordination from the customer to Alexa.

May 21, 2019

Viktor Rozgic

The combination of an autoencoder, which is trained to output the same data it takes as input, and adversarial training, which pits two neural networks against each other, confers modest performance gains but opens the door to extensive training with unannotated data. 

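A minimal sketch of how the two ideas can fit together, assuming a simple adversarial autoencoder in PyTorch rather than the model described in the post: the encoder and decoder learn to reconstruct unannotated inputs, while a discriminator pushes the latent codes toward a chosen prior. All dimensions and data here are hypothetical.

```python
import torch
import torch.nn as nn

FEAT_DIM, LATENT_DIM = 40, 16   # hypothetical feature and latent-code sizes

encoder = nn.Sequential(nn.Linear(FEAT_DIM, 64), nn.ReLU(), nn.Linear(64, LATENT_DIM))
decoder = nn.Sequential(nn.Linear(LATENT_DIM, 64), nn.ReLU(), nn.Linear(64, FEAT_DIM))
discrim = nn.Sequential(nn.Linear(LATENT_DIM, 32), nn.ReLU(), nn.Linear(32, 1))

recon_loss = nn.MSELoss()
adv_loss = nn.BCEWithLogitsLoss()
opt_ae = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)
opt_d = torch.optim.Adam(discrim.parameters(), lr=1e-3)

x = torch.randn(32, FEAT_DIM)   # a batch of unannotated features (toy data)

# 1) Reconstruction step: the autoencoder reproduces its own input.
z = encoder(x)
loss_rec = recon_loss(decoder(z), x)
opt_ae.zero_grad(); loss_rec.backward(); opt_ae.step()

# 2) Adversarial step: the discriminator learns to tell prior samples
#    from encoder codes...
z_fake = encoder(x).detach()
z_real = torch.randn_like(z_fake)        # samples from the chosen prior
d_loss = adv_loss(discrim(z_real), torch.ones(32, 1)) + \
         adv_loss(discrim(z_fake), torch.zeros(32, 1))
opt_d.zero_grad(); d_loss.backward(); opt_d.step()

# 3) ...while the encoder is updated to fool the discriminator.
g_loss = adv_loss(discrim(encoder(x)), torch.ones(32, 1))
opt_ae.zero_grad(); g_loss.backward(); opt_ae.step()
```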

May 16, 2019

Ming Sun

Text normalization is the process of converting particular words of a sentence into a standard format so that software can handle them. Breaking inputs into component parts and factoring in syntactic information reduces the error rate of a neural text normalization system by 98%.

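To give a feel for the task itself (the post describes a neural system, not hand-written rules like these), here are a few hypothetical normalization rules that map abbreviations, ordinals, and digit strings to a standard spoken form.

```python
# Toy rule-based text normalization, for illustration only.
import re

ABBREVIATIONS = {"dr.": "doctor", "st.": "street", "etc.": "et cetera"}
ORDINALS = {"1st": "first", "2nd": "second", "3rd": "third", "4th": "fourth"}
DIGIT_WORDS = ["zero", "one", "two", "three", "four",
               "five", "six", "seven", "eight", "nine"]

def spell_digits(digits):
    # Digit-by-digit reading, e.g. "221" -> "two two one".
    return " ".join(DIGIT_WORDS[int(d)] for d in digits)

def normalize(text):
    out = []
    for tok in text.lower().split():
        if tok in ABBREVIATIONS:
            out.append(ABBREVIATIONS[tok])
        elif tok in ORDINALS:
            out.append(ORDINALS[tok])
        elif re.fullmatch(r"\d+", tok):
            out.append(spell_digits(tok))
        else:
            out.append(tok)
    return " ".join(out)

print(normalize("Dr. Smith lives at 221 Baker St."))
# -> "doctor smith lives at two two one baker street"
```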

May 13, 2019

Judith Gaspers

Cross-lingual transfer learning improves the performance of a named-entity recognizer by up to 7.4% versus a system trained from scratch in the new language.

May 03, 2019

Young-Bum Kim

In the past year, we’ve introduced name-free skill interaction for Alexa, which allows customers to invoke skills without mentioning them by name. Y. B. Kim explains how we add new skills to the skill selection model without causing "catastrophic forgetting."

May 02, 2019

Rahul Goel

Transfer learning and a copy mechanism improve the performance of a "semantic parser".

April 25, 2019

Jakub Lachowicz

The ability to build new text-to-speech models with relatively little speaker-specific training data could enable a wide variety of customizable speaker styles.

April 22, 2019

Xing Fan

The wake word provides an acoustic profile that can be used to identify utterances from the same speaker.

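A minimal sketch of the matching step, assuming a fixed-size wake-word embedding is already available (how that acoustic profile is derived is the subject of the post): compare the embedding against enrolled speaker profiles with cosine similarity and accept the best match above a threshold. The embeddings below are random placeholders.

```python
import numpy as np

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_matching_speaker(wakeword_embedding, profiles, threshold=0.7):
    """Return the enrolled speaker whose profile is most similar, if any."""
    best_name, best_score = None, threshold
    for name, profile in profiles.items():
        score = cosine_similarity(wakeword_embedding, profile)
        if score > best_score:
            best_name, best_score = name, score
    return best_name   # None means "no confident match"

# Hypothetical 128-dimensional speaker embeddings, for illustration.
rng = np.random.default_rng(1)
profiles = {"alice": rng.normal(size=128), "bob": rng.normal(size=128)}
query = profiles["alice"] + 0.1 * rng.normal(size=128)   # noisy repeat of Alice
print(best_matching_speaker(query, profiles))             # -> "alice"
```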

April 18, 2019

Ming Sun

Alexa scientists use semi-supervised learning and "pyramidal" neural networks to address the problems of sound identification and media detection.

April 11, 2019

Jun Yang

A novel reconfigurable-filter-bank design enables more precise control of signal waveforms.

April 08, 2019

Quynh Do

Transfer of a model co-trained on intent classification and slot (variable) tagging halved the data required to achieve a given level of performance.

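A minimal sketch of such a co-trained model, with an assumed architecture (a shared encoder feeding one utterance-level intent head and one token-level slot head) rather than the one in the post: the two losses are summed, so gradients from both tasks update the shared encoder.

```python
import torch
import torch.nn as nn

class IntentSlotModel(nn.Module):
    def __init__(self, vocab_size=5000, n_intents=10, n_slots=20, dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.encoder = nn.LSTM(dim, dim, batch_first=True, bidirectional=True)
        self.intent_head = nn.Linear(2 * dim, n_intents)   # utterance-level
        self.slot_head = nn.Linear(2 * dim, n_slots)        # token-level

    def forward(self, token_ids):
        states, _ = self.encoder(self.embed(token_ids))     # (B, T, 2*dim)
        intent_logits = self.intent_head(states.mean(dim=1))
        slot_logits = self.slot_head(states)
        return intent_logits, slot_logits

model = IntentSlotModel()
tokens = torch.randint(0, 5000, (8, 12))    # toy batch: 8 utterances, 12 tokens each
intents = torch.randint(0, 10, (8,))
slots = torch.randint(0, 20, (8, 12))

intent_logits, slot_logits = model(tokens)
loss = nn.functional.cross_entropy(intent_logits, intents) + \
       nn.functional.cross_entropy(slot_logits.reshape(-1, 20), slots.reshape(-1))
loss.backward()   # the shared encoder receives gradients from both tasks
```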

April 04, 2019

Hari Parthasarathi

To make it computationally feasible to train a speech recognizer on a million hours of speech, Alexa scientists used an array of techniques that could generalize to other large-scale machine learning projects.
