Hack Day 2019: Jenny the Support bot

The theme for Hackday XI was ‘Talking with Robosaurs’ – a focus on artificial intelligence and communication. We created a team from our crack Support desk squad: Toby Catlin, Tom Brook and Andy Fernandes.

Pre-planning

One issue we sometimes have on the support desk is capturing comprehensive and accurate information on the problem being reported. We wanted to see if a chatbot could improve this experience for our customers. We wanted this bot to be equipped to handle incoming support calls, capture the relevant details and raise an appropriate JIRA ticket for the team to investigate.

Tom found Twilio, who provide tools for natural language processing and text-to-speech. They have a number of APIs, one of which is called Autopilot, which allows you to define tasks and the phrases that will trigger them. In the basic example they define a task “tell_joke”, which would be triggered by inputs such as “hey, tell me a joke” or “make me laugh”. The more phrases you have mapped to a task, the better the model becomes. It supports voice calls via phone, Amazon Alexa and Google Voice, as well as text via SMS, Slack and so on.

This seemed perfect, so Tom spent some time becoming more familiar with the tools and running through the “Getting Started” tutorials, showing the rest of the team how easy it was to create a functional model.

Getting Started with Twilio Autopilot

Hackday starts – First meeting

Our aim for the chatbot was to stay as close to a natural conversation as possible – we really did not want the whole “press 1 for X” dialogue. We therefore needed to construct a conversation that collected the information required to create a Jira issue. We started by looking at the “Create Issue” form we currently use, and stripping it right down to the minimum.

That left us with this list:

  • Name of individual
  • Company
  • Product & version affected
  • Title of problem
  • Description of problem

A task can be set to listen after it has been triggered, with the result stored in a field variable. So we asked who was calling: we created a task called identify_caller and started inputting all the possible responses we could think of, such as “This is {First_Name}” and “Hi, it’s {First_Name} from {Customer}”.
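A task’s behaviour is defined in Autopilot’s Actions JSON. A minimal sketch of an identify_caller-style task might look like this (the question wording and the redirect target are illustrative, not our exact configuration):

```json
{
  "actions": [
    {
      "collect": {
        "name": "identify_caller",
        "questions": [
          {
            "question": "Hi, who am I speaking with?",
            "name": "first_name",
            "type": "Twilio.FIRST_NAME"
          },
          {
            "question": "And which company are you calling from?",
            "name": "customer"
          }
        ],
        "on_complete": {
          "redirect": "task://describe_problem"
        }
      }
    }
  ]
}
```

The collected answers end up in named fields (first_name, customer here) that later tasks and functions can read back.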

Initially, you had to respond with an exact phrase, otherwise it just terminated the call. As we added more and more phrases, it became much more reliable at capturing your name. We then added the ability to check whether you had already stated the company (and if not, ask that question). You can also add synonyms: for example, the voice recognition would often pick up “Caplin” as “Kaplan” or similar, so adding as many of these as we could really improved reliability. There is a query log which shows each stage of the call process and what was understood by the system. This really helped us fill in gaps in the model, making it much more consistent.

Tom and Andy continued to create more tasks, add further samples, and write more complex functions. Unfortunately, Andy lost a lot of time trying to get call forwarding to work – it turned out this was not possible with the beta section of Twilio that we were using. We very much wanted to set our default fallback state to call our manager’s personal mobile, just so it would ring him during the demo and leave angry messages!

The bot would now collect all the required information – provided you said exactly the right things in exactly the right way. So we split up. Toby already had experience with the JIRA API, so he set up a webhook to receive a JSON payload sent from the Twilio API containing all the necessary information. This used Flask, the Python micro-framework, which is lightweight but powerful and has a suitable Jira API library available.
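A minimal sketch of such a webhook, assuming the payload carries the fields we stripped the Create Issue form down to (the route and field names are illustrative, not our exact code):

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/twilio-webhook", methods=["POST"])
def twilio_webhook():
    # Twilio posts a JSON object containing the fields collected during the call.
    payload = request.get_json(force=True)

    # Pull out the details from the stripped-down Create Issue list.
    caller = payload.get("first_name", "Unknown")
    company = payload.get("customer", "Unknown")
    summary = payload.get("title", "Support call")
    description = payload.get("description", "")

    # In the real hack this is where the Jira lookup and issue creation
    # happened; here we just echo back what was understood.
    return jsonify({
        "caller": caller,
        "company": company,
        "summary": summary,
        "description": description,
    })
```

Running it with `app.run()` on the demo laptop gives Twilio a URL to post to.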

Tom wrote a Node.js function in the Twilio API that could be called at the end of the voice chat; this would collect all the information into a JSON object and post it to whatever URL we needed. For the demo this was a laptop.

The first step in creating the Jira issue was to find the user’s Jira account, which was surprisingly hard based on just the information in the call. People are often inconsistent with their names, saying “it’s Tom” when their full name is Thomas, or the Twilio system would hear “Tom” as “Thom”. A wildcard database query would catch most of these, but it really wasn’t good enough, especially with our international customers (we are still not sure how to pronounce the South African name Nkhulang).

To improve this fuzzy searching for the name, we implemented a Levenshtein distance function. This calculates how similar two words are by counting the minimum number of single-character edits (insertions, deletions or substitutions) required to change one word into the other. Therefore “Thom” to “Tom” is only one deletion. In the end, a combination of wild card query plus filter by company normally left a maximum of 10 possible users. The one with the smallest Levenshtein distance on their full name was almost always the one we wanted.
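A straightforward dynamic-programming implementation of that distance function, along the lines of what we used (a sketch, not our exact code):

```python
def levenshtein(a: str, b: str) -> int:
    """Minimum number of single-character insertions, deletions or
    substitutions needed to turn string a into string b."""
    # previous[j] holds the distance between the current prefix of a
    # and b[:j]; we only ever need one row at a time.
    previous = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        current = [i]
        for j, cb in enumerate(b, start=1):
            cost = 0 if ca == cb else 1
            current.append(min(
                previous[j] + 1,         # deletion from a
                current[j - 1] + 1,      # insertion into a
                previous[j - 1] + cost,  # substitution (or match)
            ))
        previous = current
    return previous[-1]
```

`levenshtein("Thom", "Tom")` is 1 (a single deletion), while `levenshtein("Kaplan", "Caplin")` is 2, so ranking the shortlisted users by distance on their full name picks out the intended caller.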

Once we had the Jira username, the rest of the user’s information could be confirmed. Then it was just a case of posting a suitable JSON payload to the Jira API with all of the info collected. There were a few issues getting the format right for custom fields, which burnt a couple of hours.
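The Jira REST API accepts new issues as a POST to /rest/api/2/issue. A sketch of assembling that payload (the project key, issue type and custom field ID below are made up for illustration – custom fields are addressed by opaque IDs like customfield_10004, which is exactly where our formatting time went):

```python
import json


def build_issue_payload(reporter: str, summary: str, description: str) -> dict:
    """Assemble the JSON body for Jira's create-issue endpoint."""
    return {
        "fields": {
            "project": {"key": "SUP"},       # hypothetical project key
            "issuetype": {"name": "Bug"},
            "reporter": {"name": reporter},
            "summary": summary,
            "description": description,
            # Custom fields use opaque numeric IDs; this one is illustrative.
            "customfield_10004": "Raised by telephone support bot",
        }
    }


payload = build_issue_payload(
    "tbrook",
    "Production Transformer crash",
    "Caller reported the production Transformer crashing.",
)
print(json.dumps(payload, indent=2))

# Posting it would look something like this (requires the requests package):
# import requests
# requests.post("https://jira.example.com/rest/api/2/issue",
#               json=payload, auth=("bot-user", "api-token"))
```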

Testing

Early testing did not fill us with confidence, although it did lead to a lot of laughs. Jenny (as we eventually christened our chatbot) interpreted one report of Liberator disconnections as “I love Jesus team disconnecting from other burrito”… This made us want to change our team name to Jesus Burrito!

Other example issue titles included:

  • My liver rights of wine stock.
  • live right across from stop
  • feeling tissue

Improving this was a process of making a call, looking at the query log and then adding as many phrases and synonyms as we could. We added all of our customer companies, along with synonyms for them. If we had more time, it would have been good to add all of the customer names too – this would really help the model determine the caller. The more we talked to it and added data, the better it got at processing our input.

Demo

The demo went surprisingly well. With very little warning, we picked a person from the audience to call our chatbot number. There was an initial hiccup but the second call went all the way through, with our chatbot ending the call with “Holy S*&t! I can’t believe that worked!” This got a solid laugh from all in the room. We showed the Jira page – sure enough it had created a Jira, and it even got an appropriate title: “Production Transformer crash”.

Next Steps

Clearly this platform could be expanded to provide many other features. Right away, we can see that it could be extended to reset a customer password, or redirect to a specific team member without needing to know the extension.

I think everyone was impressed with the progress that was made in 24hrs, but it was clear that we were a long way from being able to use this in production. We would need to spend a lot of time accounting for different conversation paths, and training the model to be more consistent.
