
Why Customer Service Needs Less Emotionally Savvy AI

A little human emotion is good for your company’s customer-facing tech. Too much can make it susceptible to manipulation.



Emotion AI wants to meet people where they are. But sometimes, people aren’t being so honest. 

A customer talking to a chatbot might act more upset than they actually are in hopes of better treatment. The AI then ends up doling out limited resources, like support or refunds, to the wrong people. 

And that makes the AI a less efficient helper. 

“AI has become central to customer service because of its scalability and emerging emotional intelligence. Traditional customer care, through call centers or in-person interactions, faces limits in cost, speed and capacity,” said Yifan Yu, an assistant professor of information, risk and operations management at the McCombs School of Business at the University of Texas at Austin. “AI systems, by contrast, can process millions of customer messages or reviews in real time.”

To explore emotion AI’s efficacy, Yu created a model with McCombs postdoctoral researcher Wendao Xue that’s “focused on how companies, especially in customer service, use emotion-based AI to decide who gets what.” 

The study examined the fallibility of emotion AI, including whether it is being gamed by savvy users or tripped up by its own errors.

A Matter of Nuance

“Firms shouldn’t just plug in emotion AI and assume it will make the service more empathetic,” Yu said. “They have to plan for how people will react to it.”

What good deployment looks like for an enterprise: 

  • Combines technology with smart policy. Firms should build transparent systems and make sure the AI’s decisions “are fair and easy to explain.”
  • Regulates reliance on emotional data. In the “emotion AI paradox,” people believe that a stronger and more precise AI is better, but for customer service, a weaker AI that “doesn’t overreact” might provide fairer treatment. 

The “weaker AI” benefit came as a surprise to Yu: The researchers found it “can sometimes increase social welfare” because it discourages customers from exaggerating their feelings, which would otherwise distract the AI from efficiently addressing genuine levels of concern.

“A moderate level of algorithmic noise can dampen these incentives of emotional misrepresentation, making emotional signaling less manipulative and reducing distortions in the market equilibrium,” Yu said. “‘Weak’ AI can serve as a natural regulator in emotionally charged digital interactions.”
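
That intuition can be seen in a textbook signal-extraction setup. The sketch below is illustrative only, not the researchers’ actual model: it assumes a firm that shrinks a noisy emotion score toward its prior (standard Bayesian weighting) and customers who pay a quadratic cost to fake distress, with both parameters hypothetical.

```python
# Toy illustration of why a noisier ("weaker") emotion AI can dampen
# exaggeration. NOT the McCombs study's formulation: the Gaussian
# prior, quadratic exaggeration cost, and Bayesian shrinkage are
# assumptions made for this sketch.

PRIOR_VAR = 1.0   # variance of true customer distress (hypothetical)
EXAG_COST = 0.5   # quadratic cost of performing emotion (hypothetical)

def signal_weight(noise_var: float) -> float:
    """Weight a Bayesian firm puts on the observed emotion signal.

    With a Gaussian prior and Gaussian noise, the posterior mean
    weights the signal by prior_var / (prior_var + noise_var), so a
    noisier classifier gets discounted more heavily.
    """
    return PRIOR_VAR / (PRIOR_VAR + noise_var)

def best_exaggeration(noise_var: float) -> float:
    """Customer's payoff-maximizing exaggeration.

    Exaggerating by delta buys signal_weight * delta extra attention
    but costs EXAG_COST * delta**2, so the payoff peaks at
    delta* = signal_weight / (2 * EXAG_COST).
    """
    return signal_weight(noise_var) / (2 * EXAG_COST)

for noise_var in (0.0, 0.5, 1.0, 2.0):
    print(f"noise variance {noise_var:.1f}: "
          f"signal weight {signal_weight(noise_var):.2f}, "
          f"exaggeration {best_exaggeration(noise_var):.2f}")
```

In this toy setup, a perfectly precise AI (zero noise) rewards exaggeration at full value, while a noisier one discounts the signal and shrinks the customer’s incentive to perform distress: the “natural regulator” effect Yu describes.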

A human agent can then come in to assess the actual tone and concern at hand. “While AI ensures consistency,” Yu said, “humans excel at contextual and empathetic judgment, recognizing subtleties in tone, irony or complex emotional states that algorithms still struggle with.”

The study sees a human-AI collaboration as the most effective approach, with AI augmenting the experience. That way, Yu said, AI can be the first layer and humans can give more attention to situations that call for “empathy, negotiation or creative problem-solving.” 

“Emotion AI should be viewed as a socio-technical system, not just a technology. Its success depends on how organizations design incentives, regulate data use and account for human strategic behavior,” said Yu. “AI doesn’t operate in isolation. It changes how people express emotions and compete for attention. Recognizing these feedback loops is essential for responsible deployment.”
