The robots are here. Machine learning and predictive analytics are transforming every industry, creating new opportunities and disrupting old models faster than you can say “S-Curve.” More often than not, the decisions we make as consumers are “nudged” for us by algorithms and targeted communication technologies. What if we could talk back?
Enter chatbots: the next wave of human-AI interaction. The evolution of “conversational commerce” will open many avenues for transforming consumer relationships, but these experiences must be designed with caution and scientific expertise. And just as ecommerce opened a new frontier for presenting information to people, conversational commerce will bring new opportunities to test framing and choice architectures that nudge behavior. Lessons from behavioral science will likely spur the adoption of, and comfort with, this technology by helping marketers design for how people really behave.
Maritz is leading the transformation of customer service with behavioral science and technology, especially in the domain of consumer loyalty and reward programs. With over 100 million members engaged in our loyalty programs, Maritz knows a little something about creating lasting consumer relationships. Hint: there’s a lot more to it than rewards. Still, rewards are one crucial part of the habit loop; psychologically speaking, they are a mechanism for reinforcing desired behavior. When a consumer engages with a brand, a desirable reward can leave that customer with a positive association and an inclination to return.
That’s why there is nothing sadder than points left unredeemed. Check the member accounts of your favorite financial, hospitality, or airline providers. Are you leaving earned points on the table? The average US household belongs to 29 loyalty programs, but is actively engaged in fewer than half of them. Technology helps us address this “engagement cliff.”
For example, Maritz recently conducted a pilot program with HSBC that leveraged AI to accurately predict which rewards would be most appealing to individual customers. In an email test of 75,000 loyalty program customers, we found that 70 percent of those who redeemed points chose rewards suggested by the AI algorithm.
Another partner in the hospitality industry leverages beacon technology to proactively suggest reward options to guests, triggered by their geo-location. Imagine sitting poolside and being reminded that you can use points for a bottle of champagne. What’s another room night to a weary business traveler? I’ll take the champagne!
As AI evolves, chatbots are becoming powerful tools that allow organizations to handle certain frontline interactions on-demand and at scale.
Introducing a chatbot, however, can be fraught if consumers are used to interacting with a person. We can draw on behavioral science to design chatbot experiences that delight customers. Some lessons to consider:
1. More Bro, Less Bot: Anthropomorphism is our friend
Making autonomous agents feel more human has great potential to increase trust in, and uptake of, those technologies. For example, Waytz et al. found in a 2014 study that an autonomous vehicle with a name, gender, and voice was rated as having more humanlike mental capacities than a vehicle with the same technical features. Participants reported trusting it more, remained more relaxed during a (simulated) accident, and blamed the vehicle less when the accident was caused by another car.
Furthermore, research suggests that the more humanlike a robot is perceived to be, the less aggressive people tend to be toward it, which may pay off in those infamously frustrating customer experiences that inevitably arise. In a 2015 lab experiment conducted by Kate Darling et al., participants who were asked to strike a robot hesitated significantly longer when it had been introduced with anthropomorphic framing, such as a name and backstory. Thus, people may be more forgiving of Siri or Alexa than they would be of a nameless block of text.
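To make the idea concrete, here is a minimal sketch of how a chatbot persona might be captured as a small piece of configuration that the rest of the conversation logic draws on. The class, field names, and example values are illustrative assumptions, not any real product’s API or an actual Maritz deployment.

```python
from dataclasses import dataclass

@dataclass
class BotPersona:
    """Hypothetical persona settings used to anthropomorphize a chatbot."""
    name: str       # a human name, not "Assistant #7"
    backstory: str  # one line of framing shown at first contact

# Illustrative values only.
CONCIERGE = BotPersona(
    name="Ava",
    backstory="I've helped members book over a million reward nights.",
)

def greeting(persona: BotPersona) -> str:
    # Introduce the bot by name and backstory (anthropomorphic framing),
    # rather than as an anonymous automated system.
    return f"Hi, I'm {persona.name}. {persona.backstory} How can I help today?"

print(greeting(CONCIERGE))
```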
2. Be transparent: Honesty is best – and control is sacred
New technology intended to personalize customer service can sometimes be met with backlash (reactance, in psychological terms). Research on targeted advertising, for example, indicates that people sometimes react negatively to messaging they perceive as unnecessarily personalized. This is where transparency is key. Tiffany Barnett White and her colleagues found that when customers were told why the video store ads they were viewing mentioned their zip code (only customers in that area were eligible for a discount), they were more likely to click on the ad. In this case and others, explicit justification can mitigate the reactance customers feel toward newly personalized experiences. When introducing a new service technology like a chatbot, try alleviating the creep factor with a transparent justification, e.g., “We are using a chatbot because it will make things faster for you.” This is especially true in Maritz’s context of consumer loyalty and reward sites, where trust is higher than on the average ecommerce site.
Along the same lines, it’s important to remind customers that they are in control. Research on Facebook advertising finds that people are more likely to click on targeted ads after they are reminded that they own their data and have the right to change their permissions at any time. In the case of a chatbot, customers should be reminded that they can pass on the bot and wait for a real person.
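As a rough illustration of both principles, here is a sketch of a chatbot opening message that pairs a transparency justification with an explicit opt-out. The message copy, field names, and handoff action are assumptions for the sake of the example, not a specific messaging platform’s API.

```python
import json

def opening_message() -> dict:
    """Build a hypothetical chatbot greeting: explain why the bot exists,
    and keep an always-available path to a human agent."""
    return {
        # Transparency justification: say plainly why a bot is being used.
        "text": (
            "Hi! I'm an automated assistant. We're using a chatbot here "
            "because it can usually answer reward questions faster than "
            "our phone line."
        ),
        # Control: the customer can skip the bot and wait for a person.
        "quick_replies": [
            {"label": "Sounds good, let's go", "action": "continue_with_bot"},
            {"label": "I'd rather wait for a person", "action": "handoff_to_agent"},
        ],
    }

print(json.dumps(opening_message(), indent=2))
```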
3. Don’t overdo the automation – make the bot “show its work”
In their research on operational transparency, Ryan Buell and his colleagues uncovered an intriguing phenomenon: people like to see how the sausage is made. Among other contexts, the researchers demonstrated this with students using a simulated online dating site in the lab. Students filled out input fields describing their dream partner, submitted the forms, and waited for their results to load. In one condition, the dating algorithm produced results immediately. In the other, the computer “showed its work”: it took significantly longer to load the identical results while indicating that it was reviewing potential matches one step at a time, based on a sequence of criteria such as location, height, and interests. Though it took longer, the students preferred the second condition, in which they felt the system put more thoughtful effort into choosing their matches. This preference even carried over to their ratings of the matches, which, though identical in both conditions, were considered superior in the transparent condition.
The lesson here is that just because a service can be automated to produce immediate results doesn’t mean it should be. Thanks to the labor illusion, it can be wise to take the customer’s perspective and insert a signal like, “Hold on, I’m researching that for you,” to ensure they feel you’re putting in some work on their behalf. Whether they are interacting with humans or a bot, customers value effort. Remind them of the work behind the scenes to show you care.
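Here is a minimal sketch of what that signal might look like in a chatbot reply, assuming a generic send() helper for posting messages. The function names, the two-second pause, and the canned reward data are illustrative assumptions, not drawn from an actual chatbot framework or loyalty system.

```python
import time

def send(message: str) -> None:
    print(message)  # stand-in for whatever the chat platform uses to post a reply

def lookup_reward_options(member_id: str) -> list[str]:
    # Canned demo data; a real system would query the loyalty platform here.
    return ["a bottle of champagne", "a spa credit", "late checkout"]

def answer_reward_question(member_id: str) -> None:
    # The lookup itself is effectively instant, but we narrate the steps and
    # pause briefly so the customer sees the work being done on their behalf
    # (the labor illusion).
    send("Hold on, I'm researching that for you...")
    send("Checking your point balance and the rewards available at your hotel...")
    time.sleep(2)
    options = lookup_reward_options(member_id)
    send("Here's what you can redeem right now: " + ", ".join(options))

answer_reward_question("member-123")
```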