ChatGPT and its implications for customer service



OpenAI released the beta version of ChatGPT in late November 2022, making it the most powerful natural language processing (NLP) AI model available to date. It quickly went viral, attracting a million users in its first five days.

Will models like ChatGPT completely replace chatbots?

The basic premise of this question is whether large language models (LLMs) such as ChatGPT will change the reputation of chatbots from clumsy, impersonal and flawed to algorithms so capable that (a) human interaction will no longer be needed, and (b) traditional ways of building chatbots are now completely obsolete. We will explore these premises and give our opinion on ChatGPT’s impact on the CX space.

In general, we distinguish between conventional chatbots and chatbots built on generative LLMs, such as ChatGPT.

Traditional chatbots

This category covers most of the chatbots you’ll come across in the wild, from DPD’s delivery status chatbot to customer service chatbots for international banks. Built on technologies such as DialogFlow, IBM Watson or Rasa, they are restricted to a specific set of topics and are unable to respond to input outside those topics (i.e., they are closed-domain). They can only give answers that have been previously written or approved by a human (i.e., they are not generative).
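To make the distinction concrete, here is a deliberately simplified sketch of the closed-domain, non-generative pattern: user input is matched to a known intent and answered with a pre-written response, and anything outside the known topics falls back to a canned message. The intents, keywords and answers are invented for illustration; real platforms such as DialogFlow, IBM Watson or Rasa use trained intent classifiers and dialogue management rather than keyword matching.

```python
# Minimal illustrative sketch of a closed-domain, non-generative chatbot.
# The intents, keywords and answers below are made up for this example.

CANNED_ANSWERS = {
    "delivery_status": "You can track your parcel using the link in your confirmation email.",
    "opening_hours": "Our support team is available 09:00-17:00, Monday to Friday.",
}

INTENT_KEYWORDS = {
    "delivery_status": {"delivery", "parcel", "track", "shipping"},
    "opening_hours": {"hours", "open", "closed", "when"},
}

FALLBACK = "Sorry, I can only help with delivery status and opening hours."


def reply(user_message: str) -> str:
    """Return a pre-written answer for the best-matching intent, or a fallback."""
    words = set(user_message.lower().split())
    best_intent, best_overlap = None, 0
    for intent, keywords in INTENT_KEYWORDS.items():
        overlap = len(words & keywords)
        if overlap > best_overlap:
            best_intent, best_overlap = intent, overlap
    # Closed domain: anything outside the known intents gets the fallback answer.
    return CANNED_ANSWERS.get(best_intent, FALLBACK)


if __name__ == "__main__":
    print(reply("Where is my parcel? Can I track the delivery?"))
    print(reply("Can you write me a poem?"))  # out of domain -> fallback
```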


LLM-based chatbots

They can respond to a wide range of topics (i.e., they are open-domain) and generate responses on the fly instead of selecting answers from a pre-written list (i.e., they are generative). Examples include Google’s Meena, Replika.ai, BlenderBot and ChatGPT.

[Table generated by ChatGPT]

LLM-based chatbots and conventional chatbots serve slightly different purposes. For many CX applications, the open-domain nature of LLMs is less a help than a hindrance when the goal is a chatbot that can answer questions specifically about your product or help a user solve a problem they are experiencing.

Realistically, LLMs will not simply be let loose on the CX domain tomorrow; the process will be much more gradual. The name of the game will be combining the expressiveness and fluency of ChatGPT with the fine-grained control and guardrails of conventional chatbots, and research-focused chatbot teams are best suited to do this.

Where can you use ChatGPT today to build chatbots?

There are many aspects of building and maintaining a chatbot that ChatGPT is not suited to in its current state, but here are a few for which it is already useful:

  • Brainstorming potential questions and answers for a given closed domain, either based on ChatGPT’s training data or conditioned on more detailed information: either OpenAI releases the ability to fine-tune once ChatGPT becomes available through an API, or the desired information is incorporated through prompt engineering. (Disclaimer: it is still hard to know for sure where the information comes from, so this development process will still require a human in the loop to check the output.)
  • Chatbot training: ChatGPT can be used to paraphrase questions a user might ask, in different styles, and even to generate sample conversations, automating much of the training-data creation (see the sketch after this list).
  • Testing and quality control: using ChatGPT to test an existing chatbot by simulating user input shows great promise, especially when combined with human testers. You can specify the topics to be tested at different levels of detail and, as with training-data generation, vary the style and tone it uses.
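As an illustration of the training-data point above, here is a hedged sketch of using a generative LLM to paraphrase a seed question into candidate training utterances for a conventional chatbot. At the time of writing, ChatGPT itself has no public API, so the sketch assumes OpenAI’s chat-completions endpoint with a stand-in model name; the prompt, function name and example question are illustrative assumptions, not a prescribed workflow.

```python
# Hedged sketch: paraphrasing a seed question into training utterances with an
# LLM. Assumes OpenAI's chat-completions API and a stand-in model name, since
# ChatGPT itself is not yet available via API. Prompt and example are made up.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment


def paraphrase_question(seed_question: str, n: int = 5, style: str = "casual") -> list[str]:
    """Ask the model for n paraphrases of a seed question in a given style."""
    prompt = (
        f"Rewrite the following customer question in {n} different ways, "
        f"in a {style} tone. Return one paraphrase per line, no numbering.\n\n"
        f"Question: {seed_question}"
    )
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
        temperature=0.9,  # higher temperature -> more varied paraphrases
    )
    lines = response.choices[0].message.content.strip().splitlines()
    return [line.strip() for line in lines if line.strip()]


if __name__ == "__main__":
    # A human should still review these before adding them to training data.
    for utterance in paraphrase_question("Where is my parcel?"):
        print(utterance)
```

The same pattern can be turned around for testing: instead of paraphrasing known questions, prompt the model to role-play a customer with a given problem and feed its messages to the chatbot under test, with a human reviewing the transcripts.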

We expect the next generation of CX chatbots to still be built on conventional, non-generative technology, but with generative models used heavily in the development process.

Chatbots are set to raise the bar in the current CX space

LLMs’ key impact on consumer expectations will include increased chatbot visibility, greater urgency to include chatbots in CX, an improved reputation for chatbots, and higher standards overall. In other words, chatbots are having their moment!

We’ve all experienced them: clumsy chatbots with very limited dialogue options that respond painfully slowly, if they understand anything at all. While underperforming chatbots are on their way out, standards will now shoot through the roof to avoid this experience, and the transition from human to AI-driven support will continue apace.

A recent report predicts that the share of customer-call center interactions powered by artificial intelligence will increase from 2% in 2022 to over 15% by 2026, then double to 30% by 2031. However, given the rapid adoption and exponential progress in artificial intelligence over the past three to five years, we anticipate that the actual growth will be much greater.

Brands like Lemonaid, Oura, AirBnb, and ExpressVPN have paved the way for excellent 24/7 support, so much so that today’s customers simply expect a seamless experience. The consequences of falling short are no joke: poor customer service can significantly hurt brand retention rates and send potential buyers elsewhere. According to Forbes, poor customer service costs companies a total of $62 billion a year.

Risks of using today’s LLM-based chatbots

ChatGPT is certainly in its hype phase, but there are significant risks in using it as it stands. We believe that most of the current risks stem from ChatGPT’s unpredictability, which creates reputational, branding and legal concerns. While the hype surrounding ChatGPT is understandable, you must not overlook the risks involved, or the importance of choosing the right partner to avoid the pitfalls.

In particular, we see the following risks for large companies that implement LLMs directly into their customer journey:

  1. Brand image damage: sharing offensive content
  2. Misleading customers: sharing false or fabricated content
  3. Adversarial attacks: people deliberately trying to break the chatbot to damage the brand’s reputation
  4. False creativity: users mistaking a “stochastic parrot” for true human creativity or connection
  5. False authority: ChatGPT produces authoritative-sounding text that people are notoriously bad at questioning
  6. Data security, ownership and confidentiality: OpenAI has visibility into and access to all data shared via ChatGPT, opening a huge floodgate of risk in case of a confidentiality breach

In other words: “Just because you can doesn’t mean you should.”

Startups and established organizations will inevitably try to put safeguards and other measures in place to mitigate some of these risks. However, many companies, including many that we work with, still want (or are legally required) to maintain full control over their content. Our legal and FCA-regulated clients are a good example. With generative LLMs like ChatGPT, retaining full control over content is impossible.

When it comes to chatbot development itself, players using open-source stacks like Rasa or Botpress will have an agility advantage thanks to the flexibility and versatility these open systems allow. In the short to medium term, chatbot developers with experience in NLP and LLMs will be the ones to bring this technology to the chatbot market, as they can effectively leverage and fine-tune the models for their (or their clients’) needs and use cases.

In the long term, small businesses will still be better equipped to implement changes quickly than large, established platforms like ChatGPT. However, with the current volatility in the financial markets, we anticipate potential player consolidation over the next 12 to 24 months, with bigger players acquiring smaller ones and, as is common in the chatbot space, customers buying their chatbot providers outright.

Which industries will be the first to implement ChatGPT in their CX processes?

Even though ChatGPT is still in beta and does not yet have an API, tons of exciting use cases have already been posted, mostly via Twitter, including many browser extensions.

As long as ChatGPT remains available to the public (we expect a usage-based pricing model to emerge, similar to previous models such as GPT-3), small players will continue to push the boundaries with cutting-edge apps.

Victoria Albrecht is the co-founder and managing director of Springbok AI.

