Explore the future of customer support with generative AI

Staff Writer Apr 09, 2024

The advent of generative AI presents both tremendous potential and considerable challenges for organizations, prompting many to ask where to begin and what to prioritize. In the inaugural session of the Alexa Insider Webinar series, Bret Kinsella, founder and CEO of Voicebot.ai, and Emerson Sklar, Amazon Alexa's Chief Evangelist, explored how businesses can navigate the coming surge in generative AI technologies, outlining both immediate and long-term strategies for success.

Integrating generative AI into customer support

Business leaders curious about the vast capabilities of generative AI are encouraged by Kinsella to start from its fundamental strengths. He highlights the technology's proficiency at analyzing text, which can revolutionize organizations inundated with textual data, whether from extensive documentation, customer interactions in call centers, or the research, reports, and financial records they produce. These organizations stand to gain significantly from integrating generative AI into their operational processes.

Elevating customer experiences with generative AI

Kinsella advocates for an "augment first, automate second" approach when incorporating generative AI tools into customer support frameworks. Despite their sophistication, these tools are not yet viable replacements for human agents across all scenarios. A prudent initial step is to leverage them in enhancing team efficiency and improving customer service quality.

Practical applications include:

Agent assistance: LLMs can put the information and tools agents need at their fingertips during a call, improving interactions with customers. For example, an LLM could suggest follow-up questions mid-call or furnish agents with a comprehensive caller profile for personalized service.

Summarization: Post-call, LLMs can generate concise summaries for agents, cutting the time spent per summary from about ten minutes to roughly ninety seconds, a dramatic gain in operational efficiency. A minimal sketch of this use case follows below.
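To make the summarization idea concrete, here is a minimal sketch of what a post-call summarization step could look like. It assumes access to Amazon Bedrock via boto3's Converse API; the model ID, prompt wording, and transcript handling are illustrative assumptions, not details from the webinar.

```python
import boto3

# Minimal sketch of post-call summarization, assuming Amazon Bedrock access
# via boto3's Converse API. The model ID and prompt are illustrative only.
bedrock = boto3.client("bedrock-runtime")

def summarize_call(transcript: str) -> str:
    """Ask a hosted LLM for a short, agent-ready summary of a support call."""
    response = bedrock.converse(
        modelId="anthropic.claude-3-haiku-20240307-v1:0",  # hypothetical choice
        messages=[{
            "role": "user",
            "content": [{"text": (
                "Summarize this customer support call in 3-5 bullet points, "
                "covering the customer's issue, the steps taken, and the "
                "agreed next steps.\n\nTranscript:\n" + transcript
            )}],
        }],
        inferenceConfig={"maxTokens": 300, "temperature": 0.2},
    )
    return response["output"]["message"]["content"][0]["text"]

# Example usage with a made-up transcript:
# print(summarize_call("Agent: Thanks for calling support. Customer: My ..."))
```

In practice, a summary like this would typically be written back to the agent's ticketing or CRM system rather than printed, but the core call-and-prompt pattern stays the same.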

Kinsella recommends focusing on achievable enhancements through LLMs that can substantially benefit user and employee experiences, especially for those new to this technology.

However, integrating generative AI tools into customer support systems is not without challenges. Organizations need robust policies and governance to oversee the use of these tools. Additionally, fostering partnerships is crucial to overcome knowledge and skill gaps, emphasizing the importance of both technological and business strategies.

Implementing generative AI tools

For businesses poised to adopt generative AI, Kinsella offers several tips:

Start with low-risk options: Because LLMs draw on extensive knowledge bases that can include sensitive information, it's advisable to begin with safer applications so you can monitor tool performance and develop effective policies.

Begin small, think broadly: While starting with manageable, low-risk implementations is wise, planning for long-term, omni-channel integration is essential. Consider the evolving needs and behaviors of your user base and potential future integrations.

Conduct thorough research: Choosing the appropriate LLM for your organization requires careful consideration of your needs, strengths, weaknesses, and goals.

Kinsella and Sklar emphasize viewing generative AI not as a static solution but as a dynamic capability within customer support strategies. Success hinges on management's ability to adapt and integrate generative AI flexibly across the organization.

Watch the webinar to discover how Alexa Smart Properties can position you at the forefront of AI, LLMs, and ambient intelligence advancements.

