A team of researchers at the Georgia Institute of Technology’s Scheller College of Business conducted experimental studies to determine whether positive emotional displays by AI agents improve customer service. The team found that emotive AI is appreciated only when the customer expects it, meaning it may not always be worth the investment for companies.
The research, titled “Bots With Feelings: Should AI Agents Express Positive Emotion in Customer Service?”, was published in Information Systems Research.
The insight comes as AI chatbots continue to take over online commerce, with estimates suggesting that by 2025, 95% of companies will have an AI chatbot.
Han Zhang, the Steven A. Denning Professor in Technology & Management, is one of the study’s authors.
“It is commonly believed and repeatedly shown that human employees can express positive emotion to improve customers’ service evaluations,” Zhang said. “Our findings suggest that the likelihood of AI’s expression of positive emotion to benefit or hurt service evaluations depends on the type of relationship that customers expect from the service agent.”
Conducting Three Separate Studies
The team carried out three studies to better understand emotional AI in customer service transactions, each with a different set of participants. In the positive-emotion conditions, the AI chatbots used positive emotional adjectives such as “excited,” “delighted,” “happy,” or “glad,” and added more exclamation points.
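As a rough, hypothetical sketch of that kind of manipulation (not the researchers’ actual experimental materials), the same service message can be rendered in either a neutral or a positive-emotion style:

```python
import random

# Illustrative only: names, wording, and structure are assumptions,
# not the study's experimental scripts.
POSITIVE_ADJECTIVES = ["excited", "delighted", "happy", "glad"]

def render_reply(base_message: str, positive_emotion: bool) -> str:
    """Return a chatbot reply in a neutral or positive-emotion style."""
    if not positive_emotion:
        return f"{base_message}."
    adjective = random.choice(POSITIVE_ADJECTIVES)
    # Positive-emotion condition: add an emotional adjective and extra
    # exclamation points, the cues the studies manipulated.
    return f"I'm {adjective} to help! {base_message}!"

message = "I have located your order and arranged a replacement"
print(render_reply(message, positive_emotion=False))
print(render_reply(message, positive_emotion=True))
```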
In the first study, involving 155 participants, researchers sought to understand the impact of positive emotion on customer satisfaction. Subjects were asked to respond as if they had ordered an item from a retailer and found it missing; they then encountered agents, either human or bot, exhibiting neutral or positive emotion. The findings revealed that customers responded more favorably to emotional positivity from human agents than from bots, raising questions for businesses about how best to interact with their customers virtually.
The second study revealed that customer expectations can significantly shape reactions to an emotion-expressing bot. The 88 participants, who imagined returning textbooks, were asked to rate their relationship orientation on a scale, and their preferences differed based on whether they were communally or transactionally inclined. The results showed that when customers held more of an exchange mindset, positive-emotion bots actually had a negative impact, highlighting how important it is for companies to understand user intent in order to provide effective engagement.
“Our work enables businesses to understand the expectations of customers exposed to AI-provided services before they haphazardly equip AIs with emotion-expressing capabilities,” Zhang said.
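As a minimal sketch of what acting on that advice could look like, a service platform might gate emotional phrasing on an estimate of a customer’s relationship orientation. The scale, threshold, and function below are illustrative assumptions, not part of the published study:

```python
def should_express_emotion(orientation_score: float,
                           communal_threshold: float = 4.0) -> bool:
    """Decide whether to enable positive-emotion phrasing for a customer.

    orientation_score is a hypothetical 1-7 estimate of how communally
    (vs. transactionally) oriented the customer appears; the scale and
    threshold are assumptions for illustration.
    """
    # The findings above suggest positive emotion is appreciated mainly by
    # communally oriented customers and can backfire for exchange-oriented
    # ones, so enable it only when the estimate leans communal.
    return orientation_score >= communal_threshold

print(should_express_emotion(2.5))  # exchange-minded customer -> False
print(should_express_emotion(6.0))  # communally minded customer -> True
```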
The final study, which involved 177 undergraduate students, looked into why bots expressing positive emotion may not always have the desired effect. Customers do not necessarily expect machines to exhibit emotion and can be put off when a bot displays it too heavily.
Taken together, the results show that it can be challenging for businesses to deploy positive emotion in chatbots, since they do not know an individual customer’s biases and expectations in advance.
“Our findings suggest that the positive effect of expressing positive emotion on service evaluations may not materialize when the source of the emotion is not human,” Zhang said. “Practitioners should be cautious about the promises of equipping AI agents with emotion-expressing capabilities.”