
Do We Still Need Human Customer Agents? Yes.


Someone rating customer service
Image: Looker_Studio - stock.adobe.com

One of the stereotypes about people my, um, age is that when we need customer service, we only want to speak to a human agent, primarily because we’re old and cranky and need someone to yell at when something goes wrong. We probably also just don’t get technology.

At Enterprise Connect 2024, I found myself in more than a few conversations where I was assured that, while generative AI technology may not yet be ready for customer-facing scenarios in contact centers, there are business leaders who fully expect it to replace all or almost all human agents… eventually.

This Fast Company article picks up on a survey that sheds some light on how close we are to such a goal, at least from the customer’s perspective. Only 14% of respondents said they’d rather engage with an AI assistant than wait even a minute for a human agent. At the other end of the spectrum, 16% said they’d wait 11 or more minutes to talk to a live agent. Given how often you wind up waiting a lot longer than 11 minutes for a live agent, I’m kind of surprised that number isn’t bigger.

Most enterprises still see Gen AI as too risky for customer-facing contact. At the same time, companies are rightly optimistic about back-end, agent-assist scenarios providing ROI over the next few years, which means you can make a business case without going all-in on getting rid of agents. But should the eventual elimination of human agent positions even be a long-term goal?

I just can’t see it happening, at least until we achieve artificial general intelligence (AGI) and the computers really can think like humans. Because the truth is, we don’t really want to talk to a human so that we can yell at them; we want to reason with them, and Gen AI doesn’t do reasoning.

Not long ago I had an important financial question, and I needed to make sure I got an accurate answer. I would never have considered not talking to a human. Because this issue was important to me, I needed to communicate in a human way: specifically, I needed to explain the situation, get a response that clearly indicated the agent understood what I was asking about, hear their answer, repeat that answer back to them, and then repeat the process with follow-up questions.

I don’t believe that even a highly accurate sentiment analysis tool could fully deal with the situation. It’s not just about how I feel; it’s about what I’m trying to accomplish, and that underlying motivation is what’s causing me to feel the way I do. The term we tend to hear is “empathy,” which to me isn’t quite right, maybe because its connotations relate to emotion. I don’t need the agent to feel my pain; I need them to understand my need and help me meet it. Maybe we’d be better off talking about agents needing to be “perceptive” rather than “empathetic.”

In any case, I don’t see Gen AI achieving this level of perception or alignment with human motivations very soon. Gen AI has good work to do in contact centers and elsewhere without us pushing into areas for which it’s not well suited.