Hi! This time I wanted to investigate the possibilities of using a chatbot to help an e-commerce website answer product-related questions (if possible, even more complicated ones), and also, from a back-end perspective, to track what is happening: how our bot is performing, which products are the most requested, how consumers feel about the products, and of course all of this in real time!
So let’s start from the basics: what is the product that we want to sell?
Since I’m a huge fallout fan and I’m playing Fallout 4 right now, I want to introduce you to the fantastic world of … Nuka Cola!
Yeah, you will love it! The smell, the flavor and the radiation will make it the next most-wanted beverage in the world!
We can have different types of Nuka Cola (Wild, Orange, etc.), and we want to explain the characteristics of each variant to the users of our bot.
I will leverage several technologies, mainly Microsoft-based, but you can achieve the same with many other bot/analytics technologies (check here).
Let’s start with the bot itself, which we will build with the Microsoft Bot Framework.
You need at least: a Hotmail, Live or Outlook.com account, an Azure subscription linked to it, and a developer account for each channel you want to use for your bot (a Facebook developer account, for example).
I reused a VM that already had Visual Studio 2015 on it; after upgrading it to the latest patch level, I installed the Bot Framework Visual Studio project template, and I also installed the Bot Framework Emulator to test the bot locally.
The procedure is well explained here.
Now that we have a bot running locally, we want to add “some intelligence”, right?
The intelligent service that will help us build a bot able to understand human language is LUIS, where we will define what our bot is able to understand and which concepts it is able to distill from a message.
First we need to create a new LUIS application (which will later become a simple API endpoint that we have to call), and we need to define at least three things: intents, action parameters and entities. These concepts can be easily explained with our example:
Intent: I want to understand which type of Nuka Cola (the action parameter) the consumer is interested in.
Action Parameter: the parameter NukaColaType has to match an entity.
Entity: the list of all possible Nuka Cola variants.
So what we will do on LUIS is the following:
- Create an Entity called NukaColaVariants
- Create an Intent called ProductInfoRequest
- Define inside ProductInfoRequest a mandatory Action Parameter called NukaColaType that matches the Entity NukaColaVariants
Here is a screenshot. We then record example utterances such as:
“Tell me more about nuka cola dark” or
“What about nuka cola orange?”
After you record an utterance, you define the intent for it and then highlight the part of the phrase where the action parameter is defined, together with the matching entity.
In this way, utterance after utterance, LUIS automatically learns to answer; more importantly, it will be smart enough to understand something like “how does nuka cola quartz taste like?” even if we never typed this exact utterance into LUIS.
You can also look at LUIS errors and help it understand the phrases it was not able to classify properly.
Once we think our LUIS model is in reasonable shape (I inserted 3-4 different utterances to start), we can publish the LUIS app, and this action will give us a full API URL containing our application ID and subscription key.
Now, bringing all this “intelligence” into our bot program will be super complicated, right? API calls, object mappings, parameters in, parameters out, try/catch...
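Not really: the published LUIS app is just an HTTP endpoint that returns JSON. As a language-neutral sketch (in Python rather than the bot’s C# code), here is roughly what calling LUIS and routing on the top-scoring intent amounts to. The endpoint URL, the handler reply, and the response shape below are illustrative assumptions based on the classic LUIS JSON format, not the exact code of this bot:

```python
import json
import urllib.parse
import urllib.request

LUIS_URL = "https://example-luis-endpoint/application"  # placeholder, use your published URL
APP_ID, SUB_KEY = "your-app-id", "your-subscription-key"  # placeholders

def query_luis(text):
    """Call the published LUIS endpoint and return the parsed JSON."""
    qs = urllib.parse.urlencode({"id": APP_ID, "subscription-key": SUB_KEY, "q": text})
    with urllib.request.urlopen(f"{LUIS_URL}?{qs}") as resp:
        return json.load(resp)

def route(luis_result):
    """Pick the top-scoring intent and dispatch to a reply."""
    intents = sorted(luis_result["intents"], key=lambda i: i["score"], reverse=True)
    entities = {e["type"]: e["entity"] for e in luis_result["entities"]}
    if intents and intents[0]["intent"] == "ProductInfoRequest":
        variant = entities.get("NukaColaVariants", "classic")
        return f"Let me tell you about Nuka Cola {variant}!"
    return "Sorry, I only know about Nuka Cola!"

# A canned LUIS-style response, so the sketch runs without calling the API:
sample = {
    "query": "how does nuka cola quartz taste like?",
    "intents": [{"intent": "ProductInfoRequest", "score": 0.97},
                {"intent": "None", "score": 0.02}],
    "entities": [{"entity": "quartz", "type": "NukaColaVariants", "score": 0.91}],
}
print(route(sample))  # → Let me tell you about Nuka Cola quartz!
```

In practice the Bot Framework SDK hides even this small amount of plumbing behind its dialog model, so you mostly just annotate handlers per intent.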
Our bot now answers amazingly in the emulator, but we want to use it live, right?
So from VS2015 we publish the bot to our Azure subscription, register the bot on the Bot Framework developer portal, and add/configure the channels we want (I opted for Telegram and Facebook).
All the steps are described here.
OK, now we have our bot answering on Facebook and Telegram like a champion, but if this were really a bot answering hundreds of requests per hour, we could not watch all the chats to get an idea of what is happening.
We need some analytics that processes the information in real time and gives us an idea of what is happening.
I created this PowerBI Dashboard:
Let’s see how to do that.
First step: Logic Apps.
What we do here is create a listening endpoint that accepts calls from our bot, calls the Text Analytics API on Azure to extract the sentiment of the text typed by the customer, and finally sends all the information to an Event Hub.
In order to send info from Logic Apps to an Event Hub, you have to deploy the EventHubApi app published here using the magic “Deploy to Azure” button, and then discover the API with the “Show APIs for App Services in the same region” option when you add an action.
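For reference, the event that ends up in the Event Hub only needs the fields that the Stream Analytics query later aggregates on. Here is a minimal sketch of building such a payload; the exact field names are my assumption, chosen to line up with the query shown later (TIMESTAMP BY [timestamp], AVG(score), and the grouping by type and channel):

```python
import json
from datetime import datetime, timezone

def build_event(nuka_cola_type, channel, score):
    """Build the JSON event the Logic App forwards to the Event Hub.
    Field names are illustrative, matching what the analytics query needs."""
    return json.dumps({
        "timestamp": datetime.now(timezone.utc).isoformat(),  # event time
        "NukaColaType": nuka_cola_type,  # entity extracted by LUIS
        "Channel": channel,              # e.g. "telegram" or "facebook"
        "score": score,                  # sentiment from Text Analytics (0..1)
    })

event = build_event("quartz", "telegram", 0.87)
```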
You can start a free trial of the Text Analytics API very simply by clicking this link (provided you are logged into your Azure subscription).
Since it is essentially a simple HTTP endpoint processing JSON inputs and outputs, you can use a simple HTTP connector.
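To make that concrete, here is a sketch of the JSON the HTTP connector sends and what comes back. The shapes follow the v2.0-era Text Analytics sentiment endpoint (a batch of documents in, a 0..1 score per document out); verify them against the current API docs before relying on them:

```python
import json

def sentiment_request(text):
    """Build the JSON body the HTTP connector would POST to the sentiment endpoint."""
    return json.dumps({"documents": [{"id": "1", "language": "en", "text": text}]})

def extract_score(response_json):
    """Pull the 0..1 sentiment score for our single document (1.0 = most positive)."""
    return json.loads(response_json)["documents"][0]["score"]

# A canned response, so the sketch runs without an API key:
canned = '{"documents": [{"id": "1", "score": 0.92}], "errors": []}'
score = extract_score(canned)  # 0.92, i.e. a clearly positive message
```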
A Stream Analytics job will process the info coming from the Event Hub in real time and populate the data in Power BI.
The query we used is the following:

SELECT
    System.Timestamp AS time,
    NukaColaType,
    Channel,
    COUNT(*) AS [Count],
    AVG(score) AS [Score]
FROM
    InputFromHub TIMESTAMP BY [timestamp]
GROUP BY
    NukaColaType, Channel, TumblingWindow(second, 5)
So every 5 seconds we take all the messages, group them by Nuka Cola type and channel, and compute the count of events and the average sentiment score.
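If tumbling windows are new to you, here is a tiny Python simulation of what the job computes: events falling into the same fixed 5-second bucket are grouped by (type, channel) and reduced to a count and an average score. This is purely illustrative, not how Stream Analytics is implemented:

```python
from collections import defaultdict

def tumbling_aggregate(events, window_sec=5):
    """Group events into fixed windows by (type, channel), emitting
    count and average score per group, like the query above does."""
    groups = defaultdict(list)
    for e in events:
        window = int(e["t"] // window_sec) * window_sec  # window start time
        groups[(window, e["type"], e["channel"])].append(e["score"])
    return {k: (len(v), sum(v) / len(v)) for k, v in groups.items()}

events = [
    {"t": 0.5, "type": "quartz", "channel": "telegram", "score": 0.9},
    {"t": 3.0, "type": "quartz", "channel": "telegram", "score": 0.7},
    {"t": 4.0, "type": "orange", "channel": "facebook", "score": 0.2},
    {"t": 6.0, "type": "quartz", "channel": "telegram", "score": 0.5},  # next window
]
stats = tumbling_aggregate(events)
# stats[(0, "quartz", "telegram")] is count 2, average score ~0.8
```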
That’s what we will see in real time on the dashboard.
Why did we add Logic Apps to this? Could we call the Event Hub directly from the bot? Why use the sentiment score only for analytics, when it can and should be used to provide better feedback to the consumer in real time?
All of these, and many others, are great questions. Some answers:
- I like Logic Apps because it exposes one endpoint and, behind the scenes, with zero code, I can create “monster” workflows that do as many things as I want.
- Using the score in real time is a great idea, but I’m still not able to think of a way to have LUIS and sentiment playing together nicely (I probably have to study more).
Now some clarifications related to data and data retention:
- I do not store any user identifier (my scope is only to understand whether the bot is responding well; I do not care who the person actually writing is).
- All the data is stored in Power BI at an aggregated level (as you have seen from the query).
- The detailed data in the Event Hub is cleared automatically every day, without any backup policy.
So, at the end of the day, I only observe stats about bot responses, and I can look into LUIS errors to improve the answers of the bot.