Measuring success and satisfaction in chatbots

Chatbots are hot right now. They even have their own awards. But making a great chatbot with awesome UX is not easy.

One of our partners is working on multiple chatbot projects and wants to understand the UX of their chatbots and communicate it to their clients (I can’t say who it is; it’s still under NDA). They tried three different chatbot analytics tools but were unsatisfied with all of them, because none told them what they needed to know.

I offered to run a bit of a hackathon with them to see if we could adapt UXprobe’s UX metrics to chatbots. After a day of work we came up with something very useful!

The intent, measured by task success & failure

With UXprobe we measure the most important thing in a chat: task success and failure (in chatbot lingo: the intent). The point of a chatbot is not to have nice little conversations with customers – no – it’s to let your customers get something done: find out their flight schedule, order a concert ticket, find the nearest tire dealer. The core of UXprobe is the usability metric of task success. We adapted task success and failure to the intents of the chatbot AI, and very quickly we could see how usable the chatbot was. We could determine how often a chat ends in success for the user, how long it takes users to complete a task, and what path they take to get there.
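To make this concrete, here is a minimal sketch of how per-intent success rate and time-to-success could be computed from session logs. UXprobe’s real data model is not public, so the `ChatSession` shape and field names below are assumptions for illustration only.

```python
from dataclasses import dataclass, field

# Hypothetical session record; the real UXprobe schema may differ.
@dataclass
class ChatSession:
    intent: str            # e.g. "book_ticket", "find_dealer"
    succeeded: bool        # did the user complete the intent?
    duration_s: float      # time from first to last message
    path: tuple = field(default_factory=tuple)  # sequence of dialog states

def success_metrics(sessions):
    """Per-intent success rate and mean time-to-success (seconds)."""
    by_intent = {}
    for s in sessions:
        by_intent.setdefault(s.intent, []).append(s)
    report = {}
    for intent, group in by_intent.items():
        wins = [s for s in group if s.succeeded]
        report[intent] = {
            "success_rate": len(wins) / len(group),
            "avg_time_to_success_s": (
                sum(s.duration_s for s in wins) / len(wins) if wins else None
            ),
        }
    return report
```

Grouping by intent first matters: a 90% success rate overall can hide one intent that fails almost every time.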

The point of a chatbot is not to have nice little conversations with customers – it’s to let your customers get something done

The error metrics

We didn’t stop there, though. Next up was incorporating error metrics. When you are developing a chatbot, it isn’t easy to build all the flows and get all the proper entities and synonyms into the chat dictionary. It’s easy for chatbots to get stuck, and that’s what you have to fight against.
By logging every time the chatbot hit a dead end, we could filter and group all the error cases. Looking at only those sessions with errors, we easily found bugs in the chatbot logic and synonyms missing from the dictionary. The end result was improved quality and completeness of the bot.
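The filtering-and-grouping step might look something like the sketch below. It assumes a hypothetical event log where each session is a list of `(speaker, text, matched)` tuples, with `matched` set to `None` when the bot failed to map the user’s message to any intent or entity; that format is an assumption, not UXprobe’s actual API.

```python
from collections import Counter

def dead_end_report(sessions):
    """Count which unmatched user utterances most often cause dead ends,
    so missing synonyms/entities can be added to the dictionary.

    sessions: iterable of sessions, each a list of
              (speaker, text, matched) tuples (hypothetical log format).
    """
    misses = Counter()
    for session in sessions:
        for speaker, text, matched in session:
            if speaker == "user" and matched is None:
                # Normalize so "Tyre shop " and "tyre shop" group together.
                misses[text.lower().strip()] += 1
    return misses.most_common()
```

Sorting by frequency means the top of the list is the cheapest win: the one missing synonym that trips up the most users.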

It’s easy for chatbots to get stuck and that’s what you have to fight against.

The icing on the cake: feedback

And third, since satisfaction surveys are built right into UXprobe, we added some simple user feedback. A quick smiley/frowny survey gave us a lightweight satisfaction metric to track against user success.
Success and satisfaction together are the definition of user experience (UX). Understanding both, in context, is essential.
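Tracking satisfaction against success can be as simple as a 2×2 cross-tabulation. The function below is a sketch under assumed names; the interesting cells are the off-diagonal ones, such as users who completed their task but still left a frowny rating.

```python
from collections import Counter

def success_vs_satisfaction(records):
    """Cross-tabulate task success against a smiley/frowny rating.

    records: iterable of (succeeded: bool, happy: bool) pairs,
             one per chat session (hypothetical format).
    """
    table = Counter((s, h) for s, h in records)
    return {
        "success+happy":   table[(True, True)],
        "success+unhappy": table[(True, False)],
        "failure+happy":   table[(False, True)],
        "failure+unhappy": table[(False, False)],
    }
```

A large "success+unhappy" cell suggests the bot gets the job done but the conversation itself is frustrating, which neither metric would reveal on its own.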


Not bad for a day’s work. We didn’t build UXprobe with conversational interfaces in mind, but it turns out its fundamental UX metrics are useful in any type of UI, chatbots included.

Working on a chatbot and want to understand how good a UX it delivers? Send me an email and we’ll see if we can hack your case too.


ABOUT THE AUTHOR

Paul Davies
Co-founder
paul@uxpro.be

