Empathy and Chatbots: Not So Exclusive

by John Onorato

I have a friend who is a salesman in a high-end clothing store.  

I recently asked him how he does it so well.  “Think of it like a sixth sense,” he replied. “I can tell how a person is feeling right when they walk in.  In five seconds or less (usually less), I can tell if a customer is happy, stressed, or sad.”

How does he do it, though?

“I watch the way they walk.  I look at their eyes.  I can tell if they came in to browse, if they have something in mind, or if they want to talk.  And I know just how to respond so I can make my commission.”

Compare this with a recent experience I had with a chatbot built for a national florist.  A different friend had had a good experience with it and encouraged me to try it out.  It took me through my order quite efficiently.  As I was taking out my credit card, it said, “Have a colorful, fantastic day!”

Ordinarily this would be considered friendly and perhaps even pleasing.  Of course, I had just spent the better part of the last hour looking through floral arrangements … for a funeral.

To be fair, this came from a chatbot hosted on Facebook Messenger; it had no idea what actions I might have taken on the company’s website.  (Edit: since this writing, the company has taken funeral arrangements off of the chatbot interface.  I did not contact them, so I do not think there is any causal relationship there.)

Chatbots are extremely popular right now, though, and more are coming.  Facebook released its chatbot API in April 2016; by June there were over 11,000 chatbots on that platform alone, and by September, over 30,000.

These bots are supposed to represent artificial intelligence.  They don’t.  Right now they offer scripted, highly structured experiences. 

Wouldn’t it have been more appropriate for the florist’s chatbot to offer me condolences, after watching my shopping habits?  This should be a no-brainer for ecommerce folks.  It ought to be easy for a bot to see what I’m doing and respond accordingly.

Of course this still wouldn’t be actual artificial intelligence.  The easiest way to make this happen would be through a script.  But still.  When chatbots actually do get intelligent, things are going to get awfully interesting.

What happens when a bot can examine a user’s actions, derive their most likely mindset, and be able to respond accordingly?

Perhaps more importantly, what will happen when they can empathize with us?

Understanding the Users

The term “digital body language” refers to a person’s combined digital activity.  My digital body language with the florist chatbot should have prompted condolences, rather than the cheery send-off I actually got.  It’s hugely important to understand what users do online, and not just record what they say.

So why should digital body language be so important to ecommerce vendors and chatbot developers? 

Because digital interactions are based in large part on nonverbal communication, just like the real-world interactions we have every day. 

When interacting with people in the physical world, we continually assess and process thousands of nonverbal cues.  Just a few examples include eye contact, gestures, tone of voice, and facial expressions.  As anyone who has gotten into an argument over text knows, it’s impossible to know what an interaction truly means unless we have access to these signals. 

In the burgeoning age of AI and chatbots, it’s just as important for a website — or a chatbot — to be able to interpret these signals.

Sadly, as important as digital body language is, it is still underutilized in ecommerce.  For the most part, it remains an umbrella phrase covering profile-based personalization and after-the-fact analysis.  Chatbot vendors have attempted to humanize their products, and they have yet to succeed.

To date they have failed to assess, examine and fully parse the aspect of human communication that’s most powerful and meaningful:  The unspoken.

This is soon to change, however.  Utilizing and exploiting the power of digital body language is hardly science fiction.

Using Digital Body Language

So what’s the breakthrough?  It might sound like science fiction, but it’s not.  Just as we infer another person’s nonverbal signals in the offline world, customer experience technology can now infer the mindset of a customer.  In real time.

With the help of these advanced solutions, it is possible to track real-time digital activity: hesitation, click-through rates, scrolling speed, browsing behavior, navigation use, and more.  Retailers no longer have to rely on behavioral models built solely on past behavior.  Instead, they can capture, use, and respond to current digital behavior, quickly zeroing in on the psychological needs of each shopper so they can assist more effectively with the decision-making and buying process.
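As a rough illustration, here is a minimal sketch of how those raw signals could be rolled up into per-session features.  The event names, fields, and numbers are hypothetical, invented for this example; they are not any particular vendor’s API.

```python
# A minimal sketch of turning raw page events into behavioral signals.
# Event kinds, fields, and values are hypothetical.
from dataclasses import dataclass


@dataclass
class PageEvent:
    kind: str           # "page_view", "scroll", "click", or "hover"
    timestamp: float    # seconds since the session started
    value: float = 0.0  # scroll distance in px, hover duration in s, etc.


def behavioral_features(events: list[PageEvent]) -> dict:
    """Roll raw page events up into per-session behavioral signals."""
    views = [e for e in events if e.kind == "page_view"]
    scrolls = [e for e in events if e.kind == "scroll"]
    hovers = [e for e in events if e.kind == "hover"]
    clicks = [e for e in events if e.kind == "click"]
    duration = max((e.timestamp for e in events), default=1.0) or 1.0
    return {
        "pages_viewed": len(views),
        "click_through_rate": len(clicks) / max(len(views), 1),
        "scroll_speed": sum(e.value for e in scrolls) / duration,   # px per second
        "hesitation": max((e.value for e in hovers), default=0.0),  # longest hover, in s
    }


if __name__ == "__main__":
    session = [
        PageEvent("page_view", 0.0),
        PageEvent("scroll", 2.0, 800),
        PageEvent("hover", 9.0, 6.5),   # lingering over one arrangement
        PageEvent("page_view", 15.0),
        PageEvent("click", 18.0),
    ]
    print(behavioral_features(session))
```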

Machine learning makes it possible to build models that assess and categorize the mindset each customer brings when they visit the site.  Working from this per-shopper data, such algorithms can categorize a user’s intent simply by looking at the user’s actions.  A brand can then use that knowledge to adjust its offerings.
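Here is a toy sketch of that idea using scikit-learn.  The features, labels, and training rows are invented for illustration; a real system would be trained on a retailer’s own labeled sessions.

```python
# A toy sketch of mindset classification from session features.
# All data here is made up; it only illustrates the shape of the approach.
from sklearn.tree import DecisionTreeClassifier

# Features per session: [pages_viewed, click_through_rate, scroll_speed, hesitation]
X_train = [
    [12, 0.1, 900, 1.0],   # skimming many pages quickly
    [15, 0.2, 750, 2.0],
    [3,  0.9, 120, 0.5],   # went straight to one product and clicked
    [2,  1.0, 100, 0.2],
    [8,  0.3, 300, 7.0],   # lingering and hesitant
    [6,  0.2, 250, 9.0],
]
y_train = ["browsing", "browsing", "ready_to_buy", "ready_to_buy",
           "hesitant", "hesitant"]

model = DecisionTreeClassifier(max_depth=3).fit(X_train, y_train)

# Classify a new visitor's mindset from their current-session signals.
new_session = [[7, 0.25, 280, 8.0]]
print(model.predict(new_session))  # e.g. ['hesitant']
```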

Where do chatbots come in?

The answer is simple.  If we can observe a user’s behavior on a website, quantify their mindset, and personalize their experience based on that data, then we can code a chatbot that does the same thing.

Looking at the example with the florist, their chatbot would determine that it should offer me condolences, based on the fact that I was looking at funeral arrangements.  Not only that, but it would also assess the actions I took on the website: which pages I visited, which I passed over, which images I lingered on, even the movements of my mouse.  It would use this data to infer my mindset as I browsed.

A savvy chatbot would be able to see that I was simply looking at all the choices on the site, and offer to help by narrowing down my options.  It would also be able to tell when a more focused user arrived, ready to buy, and guide them through the process as fast as possible.  It might even be able to tell whether a user would be open to suggestions on an order (for instance, whether I might be willing to go with a wreath instead of a more traditional arrangement) and, if so, suggest some popular options.
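Putting the pieces together, here is a hypothetical sketch of how a chatbot could route its opening message based on the inferred mindset and the category the shopper has been browsing.  The categories, mindset labels, and wording are all invented for illustration; this is not the florist’s actual bot.

```python
def opening_message(mindset: str, category: str) -> str:
    """Pick a greeting that fits what the shopper has actually been doing."""
    # Context first: the tone should match the occasion being shopped for.
    if category == "funeral":
        greeting = "My condolences for your loss. "
    else:
        greeting = "Thanks for stopping by! "

    # Then branch on the mindset inferred from the session's digital body language.
    if mindset == "browsing":
        return greeting + "Would you like help narrowing down the options?"
    if mindset == "ready_to_buy":
        return greeting + "I can take you straight through checkout."
    if mindset == "hesitant":
        return greeting + "Here are a few popular choices other shoppers have picked."
    return greeting + "How can I help today?"


print(opening_message("browsing", "funeral"))
# -> My condolences for your loss. Would you like help narrowing down the options?
```

The point is not the specific wording but that the response is chosen from observed behavior rather than a one-size-fits-all script.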

Simply put, a well-coded chatbot would be able to do what my salesperson friend can do with his customers.  It would sense my mindset and be able to react to it.  It would behave in an empathic manner, even if it is not able to empathize in the human sense. 

My friend was not happy to hear about the information in this article.  “Next thing you know,” he said, “chatbots will be able to tell your waist size, just by looking at you.”

That’s just science fiction, though.  For now.
