Enhancing a Telecom's Virtual Assistant

Scaling the 5G network provider's assistant to support millions of customers.

The telecom, America's Un-carrier, has a nationwide 5G network covering 315+ million Americans. They partnered with Master of Code to improve their AI team's virtual assistant through roadmapping, use case prioritization, and NLU design, with the goal of meeting containment and engagement targets.


Challenge

As more use cases were added to the virtual assistant, user traffic grew to hundreds of thousands of customers each week. To continually scale and optimize the telecom’s virtual assistant, the Master of Code team needed an NLU design solution to address the assistant’s growing needs.

Ambiguity in customer queries

  • Customers often used ambiguous language and expressed multiple intents within one query, which made it difficult for the chatbot to process and understand their requests accurately.
Model performance reporting

  • There was no dedicated space to gather and compare the model performance before and after the updates.
  • Using the K-Fold approach to analyze the model’s performance would take 4-5 hours alone, slowing down the NLU tuning process significantly.
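For context, the kind of K-fold analysis described above can be sketched as follows. This is a stdlib-only illustration, not the telecom's pipeline: the utterances, intents, and the toy token-overlap classifier are all made up, and a real NLU model evaluated this way over millions of utterances is what made each run take hours.

```python
# Minimal sketch of K-fold evaluation for an intent classifier, stdlib only.
# The utterances, intents, and toy classifier are illustrative placeholders.
from collections import Counter

data = [
    ("i want to pay my bill", "pay_bill"),
    ("pay my phone bill", "pay_bill"),
    ("bill payment please", "pay_bill"),
    ("upgrade my phone", "upgrade_device"),
    ("i want a new phone", "upgrade_device"),
    ("can i upgrade my device", "upgrade_device"),
    ("cancel my plan", "cancel_service"),
    ("i want to cancel service", "cancel_service"),
    ("close my account and cancel", "cancel_service"),
]

def predict(utterance, train):
    # Toy classifier: pick the intent whose training tokens overlap most.
    tokens = set(utterance.split())
    overlap = Counter()
    for text, intent in train:
        overlap[intent] += len(tokens & set(text.split()))
    return overlap.most_common(1)[0][0]

K = 3
fold_accuracies = []
for k in range(K):
    test = data[k::K]                       # every K-th example held out
    train = [d for d in data if d not in test]
    correct = sum(predict(u, train) == y for u, y in test)
    fold_accuracies.append(correct / len(test))

print("per-fold accuracy:", fold_accuracies)
```

Each example is held out exactly once, so the averaged score estimates how the model generalizes to unseen utterances; repeating this after every tuning change is what the team needed to make fast.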
Data management

  • The brand had more than 1.5M conversations each month, which meant millions of transcripts had to be analyzed to inform use case and tuning prioritization. Manual review of conversations was not an option.
  • The firm didn’t have the analytics in place to understand trending monthly requests or spikes in new topics around seasonal deals and promotions. This gave little visibility into the scope of queries their NLU was covering.
  • Conversation Designers and Product Managers did not have access to the training data used for the NLU model, which became a blocker for using data to inform use case roadmaps.
Model performance testing

  • Testing batches were stored in CSV files, but there was no convenient way to update them with fresh data, run quick test batches, and get performance metrics back.
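A CSV-driven batch test of the kind described here can be scored with a short script. This is a stdlib-only sketch: the column names (`utterance`, `expected`, `predicted`) and intent labels are assumptions for illustration, not the telecom's actual schema.

```python
# Minimal sketch of scoring a batch test stored as CSV, stdlib only.
# Column names and intents are illustrative assumptions.
import csv
import io
from collections import Counter

batch_csv = """utterance,expected,predicted
pay my bill,pay_bill,pay_bill
bill payment,pay_bill,cancel_service
cancel my plan,cancel_service,cancel_service
upgrade phone,upgrade_device,upgrade_device
"""

rows = list(csv.DictReader(io.StringIO(batch_csv)))

tp, fp, fn = Counter(), Counter(), Counter()
for row in rows:
    if row["expected"] == row["predicted"]:
        tp[row["expected"]] += 1
    else:
        fn[row["expected"]] += 1   # missed the true intent
        fp[row["predicted"]] += 1  # wrongly claimed another intent

for intent in sorted(tp | fp | fn):
    precision = tp[intent] / (tp[intent] + fp[intent] or 1)
    recall = tp[intent] / (tp[intent] + fn[intent] or 1)
    print(f"{intent}: precision={precision:.2f} recall={recall:.2f}")
```

The pain point was not the arithmetic but the workflow: refreshing the CSV with new production utterances and re-running this loop quickly enough to keep up with tuning.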
Scalability challenges

  • The client wanted to expand the chatbot’s capabilities to handle more services, but NLU performance dropped whenever new intents were added, due to conflicts between existing and newly introduced intents.
  • Sampling data for new intents was also laborious, since it required sifting through vast amounts of data to find good training phrases, i.e. the most frequently used and versatile ones.
Sampling synonyms for entity values

  • Customers used a great number of variations of product and service names, with all kinds of mistakes and misspellings that were hard to track.
  • Customers also didn’t know the domain terminology well, often confusing or mixing up the names of services and products in their requests, which made them difficult for the NLU model to interpret.
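One common way to handle misspelled product and service mentions is fuzzy matching against a list of canonical entity values. A minimal stdlib sketch using `difflib` (the product names here are invented, not the telecom's catalog):

```python
# Minimal sketch of mapping misspelled product/service mentions to canonical
# entity values with fuzzy matching, stdlib only. Product names are made up.
import difflib

canonical = ["unlimited plan", "prepaid plan", "home internet", "smartwatch"]

def normalize(mention, cutoff=0.6):
    # Return the closest canonical value, or None if nothing is close enough.
    match = difflib.get_close_matches(mention.lower(), canonical, n=1, cutoff=cutoff)
    return match[0] if match else None

print(normalize("unlimted plan"))    # misspelling still resolves
print(normalize("home internett"))
print(normalize("pizza"))            # unrelated term yields no match
```

The `cutoff` threshold trades recall for precision: lower it and more misspellings resolve, but unrelated terms start matching too.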

What We Did

Recognizing that we needed a top-tier tool to handle this volume of data and effectively optimize the conversational solution, we partnered with HumanFirst.

After successfully onboarding onto the HumanFirst platform, we used their solution to address each pain point in the following ways: 

Data Management

We integrated the telco virtual assistant’s data into HumanFirst Studio, which allowed for automatic, continuous uploads and organized the data into a structured format. This let our AI Trainers easily sort, cluster, and analyze utterances en masse. HumanFirst offered fuzzy search, semantic search, and similarity, uncertainty, and entropy filters, which accelerated the data exploration and labeling process.
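To show what an uncertainty/entropy filter does, here is a stdlib-only sketch: utterances are ranked by the Shannon entropy of the model's intent probabilities, so the most ambiguous ones surface first for labeling. The probability scores below are invented, standing in for real NLU model output.

```python
# Minimal sketch of an entropy filter: surface the utterances the model is
# least sure about so trainers label those first. Scores are made up.
import math

# Hypothetical per-utterance intent probabilities from an NLU model.
predictions = {
    "pay my bill":      {"pay_bill": 0.97, "upgrade_device": 0.02, "cancel_service": 0.01},
    "about my plan":    {"pay_bill": 0.40, "upgrade_device": 0.25, "cancel_service": 0.35},
    "new phone please": {"pay_bill": 0.05, "upgrade_device": 0.90, "cancel_service": 0.05},
}

def entropy(dist):
    # Shannon entropy in bits; higher means more model uncertainty.
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Sort most-uncertain first, forming a labeling queue.
queue = sorted(predictions, key=lambda u: entropy(predictions[u]), reverse=True)
print(queue)
```

Labeling from the top of such a queue concentrates human effort where the model is weakest, which is what makes this kind of filter an accelerator rather than just a report.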

Additionally, the training data sets for intents were easily accessible to review, update, and edit by all members of the Virtual Assistant teams, from Product Managers to engineers and data scientists. This increased visibility allowed for better alignment of our NLU strategy across the team.

HumanFirst helped our team reach productivity gains of 5-10X across key data management tasks.

NLU Development

HumanFirst’s machine-learning-assisted workflows for analyzing customer data significantly sped up our process of collecting data for new intents, discovering topics, identifying potential intent conflicts, and gathering and running batch test data sets. They also allowed the team to easily estimate the current model’s intent coverage across all topics users enquired about via the chatbot.
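Intent-conflict discovery can be illustrated with a simple idea: flag training phrases from different intents that are nearly identical, since those are the examples most likely to confuse the model. A stdlib-only sketch (the phrases, labels, and similarity threshold are illustrative, not how HumanFirst implements it):

```python
# Minimal sketch of flagging potential intent conflicts: training phrases from
# different intents that are nearly identical. Phrases are illustrative.
import difflib
from itertools import combinations

training = [
    ("cancel my subscription", "cancel_service"),
    ("cancel my order", "cancel_order"),
    ("pay my bill", "pay_bill"),
    ("cancel my subscription please", "cancel_order"),  # likely conflicting label
]

conflicts = []
for (a, ia), (b, ib) in combinations(training, 2):
    # Near-duplicate phrases under different intents are conflict candidates.
    if ia != ib and difflib.SequenceMatcher(None, a, b).ratio() > 0.8:
        conflicts.append((a, ia, b, ib))

for a, ia, b, ib in conflicts:
    print(f"conflict: '{a}' ({ia}) vs '{b}' ({ib})")
```

At production scale the same idea is typically run over embeddings rather than string similarity, but the output is the same: a ranked list of intent pairs whose training data overlaps and needs untangling.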

The platform also allowed AI Trainers to easily run cross-validations, store versions, and compare results across multiple cross-validation runs to understand which NLU conflicts existed.

Results

After implementing HumanFirst’s solution, our AI Trainers on the Virtual Assistant team achieved fast, effective results, including:

  • Built out an extensive taxonomy of 140+ topics based on user inputs.
  • Cross-validation and batch testing time was reduced from 4 hours (running locally) to less than 10 minutes on the HumanFirst platform.
  • Intent creation and testing went from 8-10 hours per intent to less than 2 hours.
  • Improved intent accuracy, with Precision scores increasing by 25% to 91%, Recall scores increasing by 30% to 90%, and F1 scores increasing by 35% to 91%.

Together, these efficiencies accelerated the telco’s chatbot tuning efforts by more than 10X.
  • 10X: accelerated chatbot tuning efforts
  • 25+%: increase in Precision scores
  • 96%: reduction in batch testing and cross-validation time
  • 30%: increase in Recall scores
  • 35+%: increase in F1 scores
  • 80%: reduction in intent creation and testing effort

Through HumanFirst, the Master of Code team was able to seamlessly leverage user data to launch more than 40 use cases and 100+ new intents. This created a contextual, feature-rich virtual assistant that drove containment and conversions for millions of customers.
