Alexa Blogs

Showing posts by Zoey Collier

April 21, 2017

Zoey Collier

Back in January, Alexa shared the keynote stage with two leaders from Acumatica, a leading innovator of cloud ERP and CRM solutions, at the Acumatica Summit 2017. Together, they showed Alexa clearly has a head for business.

[Read More]

March 02, 2017

Zoey Collier

In 2012, brothers Maurice and Marcel Eisterhues built a smartphone app for their father. TorAlarm—German for GoalAlert—had a simple purpose: to help dad keep up with the scores for his favorite football teams. (That’s soccer for readers in the USA.)

What started as a fun project turned into a true opportunity for the two German entrepreneurs. TorAlarm’s popularity grew steadily until, in 2014, the brothers and their father founded a company of the same name. Today, TorAlarm is among Germany’s most popular apps for tracking the scores and schedules of football matches across the country, with over a million users in Germany alone.

When they saw the upcoming launch of Amazon Echo in Germany, Maurice and Marcel knew instantly that voice would be the next step in TorAlarm’s evolution.

“We were both totally amazed when we first saw the Amazon Echo,” says Maurice. “We’re always interested in new technology, so we decided very quickly we wanted to be part of this launch.”

[Read More]

February 23, 2017

Zoey Collier

Andy Huntwork has worked at Amazon for over 10 years, the last three as a principal engineer. He’s developed front-end and back-end services for technologies ranging from websites to payment systems and everything in between. But when the Amazon Echo came out in 2015, he saw a new doorway open. Alexa was an exciting way to bring voice-based experiences to the world, and Andy wanted to be part of it.

“So I joined the Alexa team,” Andy says, “and immediately started playing around with the Alexa Skills Kit (ASK).” Only a few months after Amazon released ASK, Andy and his wife, Laura, created their first skill. The skill recited public domain works, like Abraham Lincoln’s speeches and The Jungle Book, but the Huntworks wanted to build something more interactive and engaging.

Laura recalls wondering, “What would you ask Alexa to open that’s exciting, even magical? Wouldn’t it be fun to open a magic door?”

From that simple idea grew The Magic Door, an adventure with Alexa guiding you through a growing number of original, interactive stories. Today, The Magic Door skill is a sophisticated adventure framework, hosting 10 adventure storylines, 30,000 spoken words, numerous character voices and hundreds of sound effects.

Walking through the Magic Door

[Image: map of The Magic Door]

To enter a faraway land of magical creatures, perplexing riddles and hidden prizes, just say, “Alexa, open The Magic Door.” Suddenly, you’re off on an adventure with Alexa as your personal guide.

[Read More]

February 17, 2017

Zoey Collier

When Amazon first introduced the Echo, Nick Schwab was intrigued. He’d always loved voice commands in his car, but he wasn’t sure he wanted to buy another cool device just yet. Then the Echo Dot came out, and once again, Nick couldn’t resist a good deal. He ordered his own Dot and dug into the Alexa Skills Kit (ASK). Right away, he started working on Bargain Buddy, an Alexa skill to relieve him of his daily hunt for online deals.

Two days after Bargain Buddy was certified, Nick received his Echo Dot in the mail—his first Alexa device. That’s right: he developed, tested and released his first Alexa skill before he even had his first Echo Dot.

That was early in 2016. These days, Nick has become a force to be reckoned with in the Alexa developer community.

[Read More]

January 27, 2017

Zoey Collier

Just Eat has grown a lot since its humble beginnings in a Danish basement in 2001. Now headquartered in London, Just Eat is listed on the London Stock Exchange and is the world’s leading marketplace for online food ordering and delivery. Its goal, simply put, is to revolutionize the way people find, order and enjoy food.

Just Eat is making good on that mission. Today, it connects more than 62,000 restaurants across 100 cuisines in 15 countries, with an audience of over 15 million people.

Craig Pugsley is a principal designer in Just Eat’s Product Research team. He says the UK has a long tradition of delivery and takeout meals. Just Eat’s apps let diners explore exciting new cuisines at nearby restaurants. With menus for over 27,000 restaurants in the UK alone, it’s easy to find a new favorite flavor anytime.

Research quickly showed Pugsley’s team that diners tend to order their favorites again and again. So when Amazon brought Echo and Alexa to the UK, Just Eat saw a new opportunity. The Just Eat Alexa skill would make reordering a tasty new fave even easier, with just a few words:

“Alexa, tell Just Eat to re-order dim sum.”

No phone calls. No fumbling for a smartphone app. And no digging out credit card details. Just quick delivery of your favorite comfort food.

[Read More]

January 18, 2017

Zoey Collier

Earlier in the summer, Ashwin Karuhatty reached out to a group of connected home integration professionals in the Custom Electronics Design and Installation Association (CEDIA). Karuhatty, part of Amazon’s Smart Home business development team, wanted to encourage integrators to develop new Alexa skills for the connected home. CEDIA’s annual conference was an ideal place to start.

[Read More]

December 23, 2016

Zoey Collier


Viewers of Jeopardy!, America’s favorite TV game show, just can’t get enough. For nearly 35 years, Jeopardy! has tested the trivia knowledge of contestants and viewers of all ages. It presents contestants first with answers (clues); contestants then frame their guesses in the form of a question. Home audiences have become so engaged in the game’s play that they often shout out answers at their televisions.

Public fascination with Jeopardy! has led to a long line of off-the-air versions. These started with board games and card games, then electronic versions for game consoles and personal computers, and more recently mobile apps. In all these formats, though, one magical component was always missing. They lacked a way to let players answer out loud (in the form of a question, of course) and have that response validated.

Up until December of last year.

One year ago, on December 30, 2015, Sony Pictures Television launched a new version for Amazon Echo users, Jeopardy! J6, built with the Alexa Skills Kit (ASK). On that day, Alexa became the new host of the first all-voice version of Jeopardy!

The genesis of Jeopardy! J6

Sony Television’s games division wanted a way to keep its Jeopardy! fan base growing and engaged, especially younger audiences. To do that, they created a new online version of the game, called Jeopardy! J6, or “J!6” for short.

The premise of J!6 is simple. In addition to the five clues presented for each quiz category on the show, the writers create a backup sixth clue. Most of these clues are never used on the show. With J!6, however, die-hard Jeopardy! fans can play those extra clues. And since the J!6 categories change along with those on the show, playing online feels like an authentic extension of the episode the player just watched.

Geremie Camara, head of the Games Group at Sony Pictures Television, says using high-caliber material from the show makes the J!6 experience authentic and engaging. However, many online quiz games present a multiple-choice list of possible guesses. Somehow, they never quite feel the same as the show…

Alexa gets an instant green-light

When Alexa came along, Camara said they’d found the missing piece—a way to present the studio’s high-quality material in an authentic, interactive experience.

The idea for the skill first came from an intern’s summer project. After Amazon shipped two Echo devices to Sony’s R&D group, an intern built a rapid prototype in two short weeks. Though it was full of hard-coded clues and questions, Camara said there was no doubt: they were onto something big.

“We were all blown away at how good the Echo and Alexa technology was,” says Camara, “but also at how good it felt for the brand. We worked very closely with Jeopardy!’s Supervising Producer Rocky Schmidt to ensure that the Echo experience would live up to the TV show’s high standards." 

From concept to a quality production

Though an intern did the first pass, a senior engineer built the J!6 skill from the ground up. It is written in Java and runs on AWS Lambda, which makes it simple to manage, scalable, and very lightweight.
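
The post doesn’t share the J!6 source, and the production skill itself is written in Java, but the general shape of an Alexa custom skill on AWS Lambda is compact: receive the request JSON, branch on the request type and intent, and return a response envelope. Below is a minimal Python sketch of that shape; the intent and slot names are invented for illustration, not taken from J!6.

```python
# Minimal sketch of the general shape of an Alexa custom-skill handler on
# AWS Lambda. Illustrative only: the production J!6 skill is written in Java,
# and the intent/slot names below are made up for this example.

def lambda_handler(event, context):
    """Entry point AWS Lambda invokes with the Alexa request JSON."""
    request_type = event["request"]["type"]

    if request_type == "LaunchRequest":
        return build_response("Welcome to the quiz. Here is your first clue.")
    if request_type == "IntentRequest":
        intent = event["request"]["intent"]["name"]
        if intent == "AnswerIntent":  # hypothetical intent name
            slots = event["request"]["intent"].get("slots", {})
            guess = slots.get("Guess", {}).get("value", "nothing")
            return build_response(f"You said: {guess}.")
        return build_response("Sorry, I didn't catch that.")
    # SessionEndedRequest and anything else: close out politely.
    return build_response("Goodbye.", end_session=True)


def build_response(speech_text, end_session=False):
    """Wrap plain text in the JSON envelope the Alexa service expects."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech_text},
            "shouldEndSession": end_session,
        },
    }
```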

[Read More]

December 23, 2016

Zoey Collier


Today’s guest post comes from the Alexa Skills Marketing Team.

This year, developers have created thousands of skills for Alexa, expanding Alexa’s capabilities and offering consumers novel new voice experiences. As the year draws to a close, we're pleased to share the top customer picks for Alexa skills this year. From home automation to bringing zen into your life, here's a list of the most popular, innovative, and entertaining skills that customers enabled in 2016.

Smart Home
Control a wide range of connected devices with smart home skills. Set the temperature in your living room with the SmartThings skill or adjust the lighting with the Wink skill. See all customer picks >

Lifestyle & Fitness
Start your day right with exercise and intention. The Focus Word skill provides an inspirational word or phrase, and the 7-Minute Workout skill lets you get in your exercise quickly and efficiently. See all customer picks >

Music, Movies & TV
Relax at night while listening to Rain and Ocean sounds or catch up on your favorite series, like you can with The Voice skill. See all customer picks >

Education & Reference
Learn more and gain knowledge in new areas of interest with the help of these Alexa skills. Expand your vocabulary with the Daily Buzzword skill or learn all about canines with the Dog Facts skill. See all customer picks >

Food & Drink
Planning dinner is a lot easier with the help of the Campbell’s Kitchen skill. Don’t forget the cocktail, with help from The Bartender skill. See all customer picks >

[Read More]

December 22, 2016

Zoey Collier

[Image: EDF Energy’s Blue Lab with an Amazon Echo]

EDF Energy is one of the UK’s largest energy companies and its largest producer of low-carbon electricity. It produces around one-fifth of the nation's electricity from its nuclear power stations, wind farms, and coal and gas power stations.

Bhavesh Limani is a project manager at Blue Lab, EDF Energy’s innovation accelerator near Brighton in the UK. Launched in 2015, Blue Lab monitors emerging technologies that help shape EDF Energy’s customer experience. One of its primary focus areas is the connected home, including how customers can manage their energy accounts and energy consumption.

When Amazon Echo launched in the United States, it grabbed Blue Lab’s attention. In collaboration with EDF Energy’s R&D UK Centre, the Blue Lab team obtained two Echo units in late 2015. It then began to explore linking voice technology to energy account functionality. Blue Lab wanted to be ready whenever Amazon released Echo and Alexa in the UK.

When Amazon started shipping Echo to UK customers on 28th September, the EDF Energy skill was one of the first UK-specific skills available to them.

From proof-of-concept to an effective VUI design

Over the last few years, EDF Energy has worked to give customers more direct access and control of their energy accounts. They first created an online sales and service portal, followed by smartphone apps for iOS and Android users.

“Our customers expect digital solutions now,” says Stuart Roberts, Head of Digital Operations at EDF Energy. “We used Alexa as an opportunity to develop a voice channel to extend the online account management experience to voice.”

As the EDF Energy project team refined their proof of concept, they identified four use cases to meet core customer needs and provide a stand-out experience:

  • check account balance
  • check when next payment is due
  • check the contract end date
  • submit a meter reading

The EDF Energy team established an initial voice user interface (VUI) framework and collaborated with Amazon to refine the VUI. Investing time up front was key to minimizing changes and risks later in development.
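
Those four use cases map naturally onto four custom intents in the skill’s interaction model. Purely as an illustration, here is a minimal Python sketch of how a handler might dispatch them; the intent names, slot name, and account values are invented, not EDF Energy’s implementation.

```python
# Illustrative only: intent names, the slot name, and the account data below
# are assumptions, not EDF Energy's actual skill code.

ACCOUNT = {  # stand-in for a real account-service lookup
    "balance": "42 pounds 50 in credit",
    "next_payment": "the 1st of the month",
    "contract_end": "the 30th of June",
}

def handle_intent(intent_name, slots):
    """Map each of the four core use cases to a spoken answer."""
    if intent_name == "CheckBalanceIntent":
        return f"Your account balance is {ACCOUNT['balance']}."
    if intent_name == "NextPaymentIntent":
        return f"Your next payment is due on {ACCOUNT['next_payment']}."
    if intent_name == "ContractEndIntent":
        return f"Your contract ends on {ACCOUNT['contract_end']}."
    if intent_name == "SubmitMeterReadingIntent":
        reading = slots.get("Reading", {}).get("value", "unknown")
        return f"Thanks, I have submitted your meter reading of {reading}."
    return "Sorry, I can't help with that yet."

# Example: handle_intent("SubmitMeterReadingIntent", {"Reading": {"value": "12345"}})
```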

“I would say most of our voice interface was well-developed from our first cycle,” says Bhavesh. “The Amazon team was absolutely brilliant in helping us to evaluate the various options.”

[Read More]

December 16, 2016

Zoey Collier

Today's guest post comes from Jim Kresge from Capital One Engineering.

In March 2016, Capital One became the first company to offer its customers a way to interact with their financial accounts through Alexa devices. With the Capital One skill for Alexa, customers can access all of their Capital One accounts in real time, from credit cards to bank accounts to home and auto loans. The skill is highly rated in the Alexa app, with 4/5 stars.

The Capital One team has continued to update the skill since launch, including a recent addition called “How much did I spend?” With the update, Capital One customers can access their recent spending history at more than 2,000 merchants. Customers who have enabled the skill can now ask Alexa about their spending over the past six months, by day, month, or a specific date range, through questions posed in natural language such as:

Q:  Alexa, ask Capital One, how much did I spend last weekend?
A:  Between December 9th and December 11th, you spent a total of $90.25 on your Venture Card.

Q:  Alexa, ask Capital One, how much did I spend at Starbucks last month?
A:  Between November 1st and November 30th, you spent a total of $43.00 at Starbucks on your Quicksilver Card. 

Q:  Alexa, ask Capital One, how much did I spend at Amazon between December 1 and December 15?
A:  Between December 1st and December 15th, you spent a total of $463.00 at Amazon on your Quicksilver Card.
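
Behind questions like these, the skill has to resolve a spoken time period into a concrete date range and then total the matching transactions. The snippet below is a purely illustrative Python sketch of that filtering step, with made-up transaction data rather than anything from Capital One’s systems.

```python
# Illustrative only: hypothetical transactions, not Capital One data or APIs.
from datetime import date

TRANSACTIONS = [
    {"merchant": "Starbucks", "amount": 4.75,  "date": date(2016, 12, 10), "card": "Quicksilver"},
    {"merchant": "Amazon",    "amount": 85.50, "date": date(2016, 12, 11), "card": "Quicksilver"},
]

def spend_between(start, end, merchant=None):
    """Total spend in [start, end], optionally filtered to one merchant."""
    return sum(
        t["amount"] for t in TRANSACTIONS
        if start <= t["date"] <= end and (merchant is None or t["merchant"] == merchant)
    )

print(spend_between(date(2016, 12, 9), date(2016, 12, 11)))             # a whole weekend
print(spend_between(date(2016, 12, 1), date(2016, 12, 15), "Amazon"))   # one merchant, one range
```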

Building the skill was a collaborative effort among the product development, engineering, and design teams at Capital One. In this post, I have the privilege of representing the great work of the entire team and giving a behind-the-scenes look at how the Capital One skill was built.

A Beta is Born

In summer 2015, a group of engineers at Capital One recognized the potential to develop a skill for accessing financial accounts using Amazon Echo. We got together for a hackathon, worked our way through several possibilities, and began building the skill. The Beta version included a server-side account linking mechanism that we built ourselves. We were able to use an enhanced beta version of the Capital One mobile app to provide the account linking interface and created some AWS infrastructure to support it. We then demoed the Beta at the AWS re:Invent conference in October 2015.

Evolving the Beta

Having proved out the Beta version of the skill, we became really driven and focused on building the first skill for Alexa that would enable people to interact with their financial accounts.

We began working on a production version in December 2015, with the goal of delivering a product by March 2016. Working in an iterative design model, we found that coding the skill for Capital One financial accounts was relatively straightforward. But, as with anything game-changing, we realized that what we were attempting involved some things no one had done before. First, we were attempting to integrate sensitive data with Alexa, which no company with a skill on Alexa had done yet. It was also the first time we had built a conversational UI. And the Ask Alexa software was still maturing and evolving as we were building the skill, which meant we needed to be flexible and make adjustments to the code quickly.

We started with the premise that, in the first iteration, Capital One credit card and bank customers could ask Alexa for things like their current account balance, their recent transactions, and when their next bill is due.

Data security is always top of mind for us, as is creating a simple, friction-free experience for customers.

With Amazon, we worked through possible solutions within the Alexa infrastructure to build in a security layer that ensures data integrity while still providing a simple, hands-free experience. In addition to using OAuth to securely link accounts, we added a security solution that involves an in-channel spoken “personal key.” As users set up the Capital One skill and pair their accounts using OAuth, Alexa asks whether they would like to add a “personal key,” a 4-digit personal identification code.
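
The post doesn’t detail how the personal key is stored or checked, but the general idea is a simple gate in the skill’s request handling: no matching key, no account data. Here is a hedged Python sketch of that idea; the helper names and the hashed-storage scheme are assumptions, not Capital One’s implementation.

```python
# Sketch of the personal-key gate only. Storage format, salting, and helper
# names are assumptions for illustration, not Capital One's actual code.
import hashlib
import hmac

def verify_personal_key(spoken_digits, stored_hash, salt):
    """Compare a salted hash of the spoken 4-digit key against the stored hash."""
    candidate = hashlib.sha256((salt + spoken_digits).encode()).hexdigest()
    return hmac.compare_digest(candidate, stored_hash)

def handle_balance_request(session, stored_hash, salt):
    """Only return account data once the spoken key has been verified."""
    spoken = session.get("personal_key")  # digits captured earlier in the dialog
    if spoken is None or not verify_personal_key(spoken, stored_hash, salt):
        return "Please say your four digit personal key first."
    return "Your current balance is ..."  # reached only after the key checks out
```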

In addition to wanting users to be able to control access to their account information, we wanted the language Alexa uses in her conversations with customers to be warm and humorous at times. We learned a lot through testing and are using that feedback as we fine tune tone and wording along the way.

Some Creative Technical Work

We built the Capital One skill using node.js. We host the skill on AWS and use internal APIs to get customer account information. The basic engineering work is straightforward, and the Amazon developer portal documentation makes it easy to learn. Here are a few of the creative technical solutions we added on top of the basic engineering work to help us move fast with high quality:

The Capital One utterance compiler

We created a tool that automatically generates an expansive set of utterances from just a few input parameters. This allows us to avoid maintaining a huge list of individual utterances for our skill. For example, in our "AccountBalance" intent, we have many ways of asking for the balance on an account. To this already long list we then added account types (e.g., checking, savings), and after that, product names (e.g., Venture credit card, Quicksilver credit card). When you incorporate all the different ways customers can ask for their balance across account types and product names, the list of utterances for that intent becomes huge. Our utterance compiler makes it simple to generate and maintain all of them.
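
As a rough illustration of the idea (not Capital One’s actual tool), crossing even a few phrasings with a few account types and product names multiplies quickly. The invented example below emits lines in the classic "IntentName utterance" sample-utterance format:

```python
# Toy utterance compiler: expand a few templates across a few fillers.
# The phrases below are invented for illustration.
from itertools import product

PHRASINGS = [
    "what is the balance on my {}",
    "how much is in my {}",
    "tell me my {} balance",
]
ACCOUNTS = ["checking account", "savings account", "Venture card", "Quicksilver card"]

def compile_utterances(intent_name, templates, fillers):
    """Expand every template with every filler into 'IntentName utterance' lines."""
    return [f"{intent_name} {template.format(filler)}"
            for template, filler in product(templates, fillers)]

for line in compile_utterances("AccountBalance", PHRASINGS, ACCOUNTS):
    print(line)  # 3 templates x 4 fillers = 12 generated utterances
```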

[Read More]

December 06, 2016

Zoey Collier

On November 18, the first episode of The Grand Tour marked the most-watched premiere in the history of Amazon’s video streaming service. British car enthusiasts Jeremy Clarkson, Richard Hammond, and James May returned to the screen for an all-new series of globetrotting adventures. Each episode takes Amazon Prime Video viewers to another exotic location.

For Amazon Alexa users, watching The Grand Tour is only half the fun. Prior to the series premiere, Amazon debuted a companion skill built by PullString, available in the Alexa Store to US and UK customers.

Each Thursday, prior to the show’s Friday airtime, The Grand Tour skill provides a new clue about what to watch for in the upcoming video episode. On Saturday, if viewers are truly “on the tour” and answer three trivia questions correctly, they’ll unlock exclusive video content.

The fun aside, what makes the skill unique is another first: the PullString Platform on which it was developed.

Developing conversational experiences with Alexa

Mike Houlahan, head of PullString’s enterprise partner program, explains that Oren Jacob and Martin Reddy co-founded the company in 2011. The two Pixar Animation veterans’ vision was to build lasting emotional connections between characters and audiences using two-way computer conversations. They noted an absence of professional toolsets for building conversational experiences between a character and its audience, and they set about filling that gap.

The PullString Platform is an all-in-one environment that lets developers and authors create award-winning conversational experiences, like the Lt. Reyes chatbot from Call of Duty and Hello Barbie.

Now, the company makes the power of the PullString Platform available to Alexa developers. “We are very excited to launch The Grand Tour skill,” Houlahan said. “We are simultaneously announcing the availability of PullString for the Alexa Developer Community to build their own Alexa skills.”

The PullString Platform includes:

  • A professional conversation authoring and debugging environment
  • A conversational AI engine to interpret and drive the interaction
  • Text message and bot conversation support
  • A platform to host the experience
  • Direct publishing to the Alexa environment

Learn more about the PullString Platform.

Creating The Grand Tour skill

With the PullString Platform, a creative writer can prototype, develop, test and deploy an entire skill without writing a single line of code. That’s just what Danielle Frimer did.

Frimer is the creative writer who scripted the voice user interface (VUI) for The Grand Tour Alexa skill using PullString. She worked with Amazon Prime Video to get the show’s actors into the recording booth to record dialog, then put it all together using the PullString Platform.

“I am not a developer in any way,” says Frimer. “With the platform, I could focus my attention on the creative aspects of it—the lines, the flow of things, the overall design—not on the underlying nuts and bolts of it.”

The skill’s design mimics the flow of The Grand Tour’s episode rollout. The voice interaction, of course, is peppered with the recorded dialog, making the experience even more engaging.

Frimer says PullString’s templates and documentation give developers a quick start on different types of conversation projects. In all cases, the platform relieves both authors and developers of the complicated logic involved in a complex VUI model.

[Read More]

December 02, 2016

Zoey Collier

Tushar Chugh is a graduate student at the Robotics Institute at Carnegie Mellon University (CMU). There he studies the latest in robotics, particularly how computer vision devices perceive the world around them.

One of his favorite projects was a robot named Andy. Besides having arms, Andy could discern colors and understand spatial arrangement. Andy could also respond to voice commands, like “pick up the red block and place it on top of the blue block.” Andy’s speech recognition, built on a CMU framework, was about to change.

When Amazon came to CMU to give some lectures, it held a raffle drawing. Chugh won the drawing and took home a new Amazon Echo as a prize. Over three days and nights without sleep, he completely integrated Andy and Alexa using the Alexa Skills Kit (ASK).

When he saw Hackster’s 2016 Internet of Voice challenge, he knew he had to enter. And in August 2016, Chugh’s Smart Cap won the prize in the Best Alexa Skills Kit with Raspberry Pi category.

The inspiration and genesis of Smart Cap

According to Chugh, there are about 285 million visually-impaired people in the world. In 2012, he worked on a project to help the visually impaired navigate inside a building. His device, a belt with embedded sensing tiles, won a couple of prizes, including a Wall Street Journal Technology Innovation Award. It was ahead of its time, though, and it wasn’t yet practical to develop the technology into a commercial product.

A lot can change in four years, including Chugh’s discovery of Alexa. Besides dabbling with Alexa and Andy the robot, he has also worked with Microsoft Cognitive Services for image recognition. Chugh now saw a chance to bring a new and better “seeing device” to light.

“When I saw Alexa, I thought we can extend it and integrate [Alexa] as a separate component,” says Chugh. “I talked with a couple of organizations for the blind in India, and they agreed this kind of system would be very, very useful. That was my main motivation.”

Chugh says the hardware for the Smart Cap is basic. He used a Raspberry Pi (RPi), a battery pack, a camera and a cap on which to mount it. As for the software, it included:

  • Alexa Skills Kit (ASK)
  • Amazon Web Services (AWS)
  • DynamoDB
  • Microsoft Cognitive Services (MSCS)
  • Custom Python code on the RPi to interface with the camera and MSCS

The goal was straightforward. A visually-impaired user could ask Alexa what is in front of them. Alexa would vocalize the scene, allowing the person to navigate safely wherever he or she may be.

How the Smart Cap works

How do the pieces all fit together?

Chugh says there are two distinct parts.

First, the image capture and analysis (a rough sketch in code follows this list):

  • As the Smart Cap wearer walks down the street, a Python script on the RPi directs the camera to take pictures every two seconds.
  • Another program sends the image to MSCS using the RPi’s WiFi or phone connection.
  • MSCS returns a text description of the image with relevant keywords.
  • The description is stored on the RPi, then sent to AWS and stored in DynamoDB.
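
The loop below is a rough Python sketch of that capture-and-describe pipeline. The camera helper, the Cognitive Services endpoint and key, and the DynamoDB table layout are placeholders and assumptions, not Chugh’s actual code.

```python
# Rough sketch of the Smart Cap capture loop. Endpoint, key, table name, and
# the camera helper are placeholders, not the project's actual code.
import time
import boto3
import requests

TABLE = boto3.resource("dynamodb").Table("SmartCapScenes")  # assumed table name
VISION_URL = "https://<cognitive-services-endpoint>/vision/v1.0/describe"  # placeholder
VISION_KEY = "<subscription-key>"

def capture_image(path="/tmp/frame.jpg"):
    """Placeholder: on the Pi this would trigger the camera (e.g. via picamera)."""
    with open(path, "rb") as f:
        return f.read()

def describe(image_bytes):
    """Ask Microsoft Cognitive Services for a one-line description of the image."""
    resp = requests.post(
        VISION_URL,
        headers={"Ocp-Apim-Subscription-Key": VISION_KEY,
                 "Content-Type": "application/octet-stream"},
        data=image_bytes,
    )
    captions = resp.json().get("description", {}).get("captions", [])
    return captions[0]["text"] if captions else "nothing I can recognize"

if __name__ == "__main__":
    while True:
        TABLE.put_item(Item={
            "device_id": "smart-cap-1",      # partition key (assumed)
            "timestamp": int(time.time()),   # sort key (assumed)
            "description": describe(capture_image()),
        })
        time.sleep(2)  # one picture every two seconds, as described above
```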

Now comes the Alexa skill (again, a rough sketch follows the list):

  • The wearer says “Alexa, ask Smart Cap to describe the scene” or “Alexa, ask Smart Cap what is in front of me”.
  • The skill uses AWS Lambda to retrieve and parse the latest value from DynamoDB.
  • Alexa responds with the description and keywords via the speaker or bone conduction headphones.
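
The skill side is similarly small: fetch the newest item the capture loop wrote and wrap it in an Alexa response. Again, a rough Python sketch that reuses the assumed table layout from the capture sketch above; the real skill’s code isn’t shown in the post.

```python
# Rough sketch of the skill's Lambda handler: read the latest scene
# description from DynamoDB and speak it. Table/attribute names are the
# same assumptions used in the capture sketch above.
import boto3
from boto3.dynamodb.conditions import Key

TABLE = boto3.resource("dynamodb").Table("SmartCapScenes")

def lambda_handler(event, context):
    result = TABLE.query(
        KeyConditionExpression=Key("device_id").eq("smart-cap-1"),
        ScanIndexForward=False,  # newest item first
        Limit=1,
    )
    items = result.get("Items", [])
    text = items[0]["description"] if items else "I don't have a scene description yet."
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": f"In front of you: {text}"},
            "shouldEndSession": True,
        },
    }
```
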
[Read More]

November 23, 2016

Zoey Collier

In early 2014, Mandy Chan was attending a business conference when she discovered her true calling: emerging technology. A business analyst by training, Mandy started searching for a way she could break into the tech industry, to immerse herself in the technologies disrupting every industry. She researched online, frequented bookstores, and eventually decided to teach herself JavaScript programming.

Barely a year later, Mandy attended the Technica Hack ‘15—and won the JP Morgan Chase prize for best mobile app. For Mandy, that event started a new career. After participating alongside so many eager and helpful programmers, she knew what she wanted to do.

In the 11 months since, Mandy has won three more hackathons, including NY TechCrunch ‘16 and Manhattan AngelHack ’16. But unlike her first contest, all of these prizes were for Alexa skills.

A hackathon skill helping users around the world

Mandy first discovered Amazon Echo while attending a 2016 developer conference in San Francisco. She watched a team working with an Echo, and it instantly appealed to her interests in both back-end software development and artificial intelligence. “It was like having my code right in front of me. I talked to my code, and the code kind of talked back to me,” says Mandy.

The day before TechCrunch New York in May, Mandy dove into all the online Alexa Skills Kit documentation she could find. The next day, nervous but determined, she created the prototype that became the Dr. Speech skill—and won the Best Use of Alexa prize.

Mandy, originally from Hong Kong, wanted to help others improve their pronunciation of challenging words, so she created a skill called Dr. Speech, which helps non-native English speakers pronounce words accurately, thereby giving them more confidence without expensive speech therapy sessions.

Mandy gets tweets from people around the world thanking her for how Dr. Speech has been instrumental in improving their pronunciation, and the skill has also inspired other developers to build self-improvement skills. Similarly, a user of Mood Journal—another of Mandy’s Alexa skills—wrote to say it helped him battle anxiety and depression.

Humbled, Mandy repeats that she loves to write software that helps people. “Every skill I write is an extension of me. Dr. Speech is about improving speech, because I strive to be a great speaker. I never imagined how my skills would have touched so many people.”

[Read More]

November 18, 2016

Zoey Collier

Developers have created thousands of skills for Alexa, expanding Alexa’s capabilities and offering consumers novel new voice experiences. We recently unveiled a new way for customers to browse the breadth of the Alexa skills catalog by surfacing Alexa skills on Amazon.com.

Today we are introducing a new program that allows you to nominate your favorite Alexa skills to be featured in our Community Favorites campaign. Skills that are nominated and meet the selection criteria will be featured in the Alexa app and on Amazon.com in December. This is a great way to help customers everywhere discover new, intriguing and innovative skills on their Alexa-enabled devices.

What’s your favorite Alexa skill? Take a minute to tell us your favorite Alexa skill and help others discover an engaging and innovative skill to try.

 


Get Started with Alexa Skills Kit

Are you ready to build your first (or next) Alexa skill? Build a custom skill or use one of our easy tutorials to get started quickly.

Share other innovative ways you’re using Alexa in your life. Tweet us @alexadevs with hashtag #AlexaDevStory.

 

November 16, 2016

Zoey Collier

Magic mirror, on the wall—who is the fairest one of all?

Probably the most memorable line from Disney’s 1937 classic, Snow White and the Seven Dwarfs, it may soon become a household phrase again. Modern-day magic mirrors are taking a number of forms, from toys to high tech devices offering useful information to their masters. Now, Darian Johnson has taken that concept an enormous step further.

Darian, a technology architect with Accenture, has worked in software solution design for 17 years. Today he helps clients move their on-premise IT infrastructure into the cloud. With a recent focus solely on Amazon Web Services (AWS), it’s only natural that other Amazon technologies like Alexa would pique his interest.

One night, Darian was pondering what he might build for Hackster’s 2016 Internet of Voice Challenge. He was surfing the web when he happened on an early concept of a Magic Mirror and realized he could do even better. He did. In August 2016, Darian’s new Mystic Mirror won a prize in the Best Alexa Voice Service with Raspberry Pi category.

A smarter mirror with the Alexa Voice Service

Darian says his morning routine consists of running between bedroom and bathroom, trying to get ready for work. He doesn’t have an Amazon Echo in either room, but he does have mirrors there. That’s another reason an Alexa Voice Service (AVS)-enabled mirror made sense.

He set his budget at a mere $100. That covered a Raspberry Pi (RPi), a two-way mirror, a refurbished monitor and speaker, some wood planks and a few other assorted items. He determined that his device would:

  • Give the mirror-gazer access to all the skills available through Alexa
  • Provide unique visual capabilities in the mirror face via a custom Alexa skill
  • Display information only for a finite amount of time before it fades away (to make it mystical—and because Darian is light-sensitive when he sleeps)

You can build your own Mystic Mirror using the details on the Hackster site. But it was his software and Alexa that brought it to life.

Darian decided to voice-enable his Raspberry Pi, microphone and speaker with the Alexa Voice Service (AVS). That meant the Mystic Mirror’s master would have access to the built-in power of Alexa and over 4,000 third-party skills, developed using the Alexa Skills Kit (ASK). With just a word, they could control smart home devices, ask for a Lyft ride, play music from Amazon Prime accounts and much more. Best of all, since Alexa is getting smarter all the time, the mirror’s capabilities would constantly evolve, too.

[Read More]
