
Where’s the Killer Chatbot?

by John Onorato

Photo by Andy Kelly on Unsplash

Let’s face it:  Most chatbot experiences today are pretty wretched. 

They’re stilted, artificial, and in some cases downright affected.  Natural language processing is still in its infancy, and has a long way to go before it sounds truly “natural.”  Or before it truly understands natural speech, for that matter.

This is due in part to the difficulty of designing a user interface around a conversation, which is non-hierarchical by nature.  When two people talk, the steps don’t always flow neatly from one to the next.  This kind of design is also fundamentally different from either a mobile or a web interface. 

Additionally, we have yet to develop a general-purpose AI which can accept a user’s open-ended input.
It is incumbent on chatbot creators, therefore, to pick out engaging patterns of interaction.  Building on and around these will enable developers to create whole experiences that will delight the users.

So how do we work around the limitations of a conversational UI, knowing the above?


About the UI

Up until now, user interfaces have been crafted for a linear experience, not a random one.  In other words, after the user comes to the page, a specific sequence of events typically happens, at least in terms of ecommerce. 

First they search for an item or two.  Those items are then added to the user’s cart.  They enter payment information, check out and leave the site. 

A chat-based UI is completely different from either a web or a mobile interface.  One of the biggest stumbling blocks is that the customer can start the process from different places.  Say they want to buy tickets for a movie.  One customer might ask a bot “What’s playing around 8pm?”  Another, equally valid starting point is “I want three tickets to Trolls at the Regal on Little Texas Lane and Congress.”

So we see that a big challenge for anyone wanting to design a chatbot is that the path a customer will use to reach their goal (in this case, to purchase tickets) is not known beforehand.  The chatbot has to assist the user and provide the desired answers without needing a discussion to progress in a straight line.
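One common way to handle these unpredictable entry points is slot filling: the bot tracks which details of the order it still needs, and asks only for what’s missing, no matter where the conversation started.  Here is a minimal sketch of that idea (the slot names and prompts are purely illustrative, not any particular platform’s API):

```python
# Minimal slot-filling sketch (hypothetical names): the bot tracks which
# details of a ticket order it still needs, regardless of where the
# conversation starts.
REQUIRED_SLOTS = ("movie", "showtime", "theater", "quantity")

PROMPTS = {
    "movie": "Which movie would you like to see?",
    "showtime": "What time works for you?",
    "theater": "Which theater?",
    "quantity": "How many tickets?",
}

def next_prompt(order: dict) -> str:
    """Ask for the first missing detail, or confirm once all are filled."""
    for slot in REQUIRED_SLOTS:
        if slot not in order:
            return PROMPTS[slot]
    return f"Confirm: {order['quantity']} tickets to {order['movie']}?"

# "What's playing around 8pm?" fills only the showtime slot,
# so the bot asks for the movie next...
print(next_prompt({"showtime": "8pm"}))
# ...while a detailed request fills nearly everything at once.
print(next_prompt({"movie": "Trolls", "theater": "Regal", "quantity": 3}))
```

Because the bot only ever asks for whatever is missing, both of the example openings above lead to the same completed order without forcing the conversation down a single fixed path.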


The AI Factor

Photo by BENCE BOROS on Unsplash

The next big stumbling block for chatbot developers is that a true AI that works on a variety of inputs is still a long way off.  AIs themselves are not especially new, but they are new to the consumer marketplace.  One AI-like construct that bot creators use a lot is the simple linear tree, which forces the user down a predetermined path.  Newer AI routines might also be used, but these are not true AI either.  They simply match patterns against pre-programmed conditions, in an effort to determine a user’s intent.
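This kind of pattern matching can be sketched in a few lines.  The intents and regular expressions below are made up for illustration; the point is that there is no understanding here, only string matching against conditions someone wrote in advance:

```python
import re

# A sketch of the pattern-matching "AI" described above: regexes mapped
# to intents, with no real understanding behind them.
INTENT_PATTERNS = {
    "showtimes": re.compile(r"\bwhat'?s playing\b|\bshowtimes?\b", re.I),
    "buy_tickets": re.compile(r"\b(\d+|one|two|three) tickets?\b", re.I),
}

def match_intent(utterance: str) -> str:
    for intent, pattern in INTENT_PATTERNS.items():
        if pattern.search(utterance):
            return intent
    return "unknown"  # the unexpected input the bot can't handle

print(match_intent("What's playing around 8pm?"))      # showtimes
print(match_intent("I want three tickets to Trolls"))  # buy_tickets
print(match_intent("Do you validate parking?"))        # unknown
```

Anything that doesn’t fit a pre-written pattern falls through to “unknown,” which is exactly the failure mode described below.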

Generally speaking, these work well enough when there is a finite set of ways a user can interact with a bot.  But as developers are finding out, user input can be totally random.  This leads to situations where a bot gets unexpected input it can’t handle.  Without better tools, and a better AI, it’s all a matter of hunt and peck.  Or worse, finding the linguistic needle in a haystack of possibilities. 


The Solution:  Modify, Publish, Iterate, Repeat

So how does a bot developer succeed with the limited tools they have?  The best path is not already defined, given the variety of inputs.  Neither the number of inputs nor their content is known.  There has to be a quick, iterative path to successful completion, and it has to be low-cost as well.  A developer needs to be aware of how their bots are responding to the inputs provided by the user.  With this knowledge, they can then iterate on what is already there.  Any blocks between the user and their goal need to be addressed.

Experience has shown that the best tools for this iterative method are bot-native.  This means they understand the complexity and nuance of a conversational interface, and can translate that complexity into clear metrics.  Conversely, it also means the user is not simply dumped into meaningless dialogues or dashboards.

Marketing teams can use these tools to pinpoint groups of similar users, then connect with them through personalized messages.  Creative and editorial teams can use them to address messaging that may be off-brand or that doesn’t have the desired tone.  Business leaders can use them to provide a detailed picture of their efforts without the use of an engineering team and a data scientist just to “run the numbers.”

It’s important to have a conversational UI that’s easy to understand.  It’s also important to iterate quickly on it.  Teams that can do both will be able to grow differentiated, bot-native arms of their business that leverage the real power behind the conversational interface.


Empathy and Chatbots: Not So Exclusive

by John Onorato

I have a friend who is a salesman in a high-end clothing store.  

I recently asked him how he does it so well.  “Think of it like a sixth sense,” he replied. “I can tell how a person is feeling right when they walk in.  In five seconds or less (usually less), I can tell if a customer is happy, stressed, or sad.”

How does he do it, though?

“I watch the way they walk.  I look at their eyes.  I can tell if they came in to browse, if they have something in mind, or if they want to talk.  And I know just how to respond so I can make my commission.”

Compare this with an experience I had recently with a chatbot created for a national florist.  A different friend had a good experience with it, and encouraged me to try it out.  It took me through my order and was quite efficient about it.  As I was taking out my credit card, it said “Have a colorful, fantastic day!”

Ordinarily this would be considered friendly and perhaps even pleasing.  Of course, I had just spent the better part of the last hour looking through floral arrangements … for a funeral.

Sure, this came from a chatbot hosted on Facebook Messenger.  (edit:  Since this writing, the company has taken funeral arrangements off of the chatbot interface.  I did not contact them, so I do not think there is any causal relationship there.)  It had no idea what actions I might have taken on the company’s website. 

Chatbots are extremely popular right now, though, and more are coming.  Facebook released the chatbot API in April 2016; in June there were over 11,000 chatbots on that platform alone.  As of September there were over 30,000. 

These bots are supposed to represent artificial intelligence.  They don’t.  Right now they offer scripted, highly structured experiences. 

Wouldn’t it have been more appropriate for the florist’s chatbot to wish me condolences, after watching my shopping habits?  This should be a no-brainer for ecommerce folks.  It ought to be easy for a bot to see what I’m doing and respond accordingly.

Of course this still wouldn’t be actual artificial intelligence.  The easiest way to make this happen would be through a script.  But still.  When chatbots actually do get intelligent, things are going to get awfully interesting.
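The scripted version really is trivial.  A sketch might look like this (the page paths and messages are hypothetical, not the florist’s actual system):

```python
# A sketch of the simple script described above: no intelligence,
# just a rule keyed off what the shopper browsed this session.
def closing_message(pages_viewed: list) -> str:
    """Pick a sign-off based on the shopper's browsing history."""
    if any("funeral" in page or "sympathy" in page for page in pages_viewed):
        return "Our condolences. We hope the arrangement brings some comfort."
    return "Have a colorful, fantastic day!"

print(closing_message(["/arrangements/birthday"]))
print(closing_message(["/arrangements/funeral-wreaths"]))
```

One `if` statement would have spared me that jarringly cheery sign-off.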

What happens when a bot can examine a user’s actions, derive their most likely mindset, and be able to respond accordingly?

Perhaps more importantly, what will happen when they can empathize with us?

Understanding the Users

The term “digital body language” refers to a person’s combined digital activity.  My digital body language with the florist chatbot should have prompted an offering of condolences, as opposed to the cheery thanks it did offer.  It’s hugely important to understand what users do online, and not just record what they say.

So why should digital body language be so important to ecommerce vendors and chatbot developers? 

Because digital interactions are based in large part on nonverbal communication, just like the real-world interactions we have every day. 

When interacting with people in the physical world, we continually assess and process thousands of nonverbal cues.  Just a few examples include eye contact, gestures, tone of voice, and facial expressions.  As anyone who has gotten into an argument over text knows, it’s impossible to know what an interaction truly means unless we have access to these signals. 

In the burgeoning age of AI and chatbots, it’s just as important for a website — or a chatbot — to be able to interpret these signals.

Sadly, as important as digital body language is, it is still underutilized in ecommerce.  For the most part, it remains an umbrella phrase covering profile-based personalization and after-the-fact analysis.  Chatbot vendors have attempted to humanize their products, but they have yet to succeed. 

To date they have failed to assess, examine and fully parse the aspect of human communication that’s most powerful and meaningful:  The unspoken.

This is soon to change, however.  Utilizing and exploiting the power of digital body language is hardly science fiction.

Using digital body language

So what’s the breakthrough?  Just as we infer another person’s nonverbal signals in the offline world, innovative customer experience technology can infer the mindset of a customer.  In real time.

With the help of these advanced solutions, it is possible to keep track of real-time digital activities, such as hesitation, click-through rates, scrolling speed, browsing behavior, navigation use, and more.  This allows retailers to stay ahead of the curve and stop using behavioral models based solely on past behavior.  Instead they can capture, utilize and respond to actual current digital behavior.  They can quickly zoom in on the psychological needs of each shopper, in order to be more effective when assisting them with the decision-making and buying process.

Machine learning makes it possible to develop models that assess and categorize the mindset each customer has when they visit the site.  As they assess this per-shopper data, these algorithms can categorize a user’s intent simply by looking at the user’s actions.  Using this knowledge, a brand can then alter its offerings. 
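To make the idea concrete, here is a toy nearest-centroid classifier over behavioral features.  Every number and label below is invented for illustration; a real system would learn these centroids from actual session data rather than hard-coding them:

```python
# Toy mindset classifier (all numbers and labels are made up).
# Features per session: (pages viewed, avg seconds per page,
# fraction of clicks that were "add to cart").
CENTROIDS = {
    "browsing":     (12.0, 45.0, 0.00),  # many pages, lingering, no cart use
    "ready_to_buy": (3.0, 10.0, 0.50),   # few pages, fast, heavy cart use
}

def classify_mindset(features: tuple) -> str:
    """Return the label of the nearest centroid (squared distance)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(CENTROIDS, key=lambda label: dist(features, CENTROIDS[label]))

print(classify_mindset((10.0, 50.0, 0.0)))  # browsing
print(classify_mindset((2.0, 8.0, 0.6)))    # ready_to_buy
```

A production model would be far richer, but the shape is the same: behavioral signals in, a mindset category out, and the site or bot adapts its response to that category.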

Where do chatbots come in?

The answer is simple.  If we can observe a user’s behavior on a website, quantify their mindset, and personalize their experience based on that data, then we can code a chatbot that does the same thing.

Looking at the example with the florist, their chatbot would determine that it should offer me condolences, based on the fact that I was looking at funeral arrangements.  Not only that, but it would also assess the actions I took on the website:  which pages I visited, the movements of my mouse, which items I passed over, which images I lingered on.  It would use this data to infer my mindset as I browse. 

A savvy chatbot would be able to see that I was simply looking at all the choices on the site, and offer to assist me by narrowing down my options.  It would also be able to tell if a more focused user came to the site, ready to buy.  It would then engage a subroutine to help guide them through the process as fast as possible.  It may also be able to tell if a user would be open to suggestions on an order:  For instance, if I might be willing to go with a wreath versus a more traditional arrangement.  Either way, it would then suggest some popular options.

Simply put, a well-coded chatbot would be able to do what my salesperson friend can do with his customers.  It would sense my mindset and be able to react to it.  It would behave in an empathic manner, even if it is not able to empathize in the human sense. 

My friend was not happy to hear about the information in this article.  “Next thing you know,” he said, “chatbots will be able to tell your waist size, just by looking at you.”

That’s just science fiction, though.  For now.
