There’s a debate over which is more emotionally draining: working in retail (or hospitality) or working in a call centre. The answer is obvious, however. It’s the call centre. At least dealing with the worst of humanity in person offers the possibility of smacking your dimwitted tormentor upside the head. Call centre agents just have to take the abuse.
Yes, mute buttons exist, but all calls are recorded and it’s easy to get fired for mouthing off in a moment of weakness. Japan’s Softbank reckons it might have the answer, and it involves AI. Specifically, AI designed to ‘cancel emotions’.
My emotions!
Customer harassment (that is, customers who harass service staff) is enough of a problem in Japan (and everywhere else, really) that the country’s Ministry of Health, Labour and Welfare is considering making staff protections mandatory for companies. What form that might take isn’t clear right now, but Softbank is turning artificial intelligence loose on the problem.
The company’s EmotionCanceling Voice Conversion Engine, an in-development product expected to be ready for call centres by the end of 2025, effectively shuts down angry responses by modulating tone and pitch in real time. That won’t do anything for the blood pressure of overly angry customers, but it might blunt the impact of that confident unreasonableness only attained by paying customers who feel slighted and intend to take it out on the nearest helpless target.
Softbank’s system won’t alter any angry statements or change the content of complaints, so errant customers can still cause trauma (that is, get their point across). Enough of an angry tone will also be retained to allow call centre agents to judge that the person on the line is indeed upset. But blowing up at a target whose entire job is to be helpful may prove less satisfying when higher-pitched shrieks or angry shouts are toned down before they make it to the object of aggression’s ears.
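Softbank hasn’t published how its engine works, but real-time pitch modulation itself is a well-worn audio trick. As a rough, hypothetical sketch (not Softbank’s actual method, and missing the vocoder machinery a production voice changer would use to keep duration and timbre intact), every frequency in a waveform can be lowered by a fixed ratio with naive resampling:

```python
import numpy as np

def soften_pitch(signal: np.ndarray, ratio: float) -> np.ndarray:
    """Lower every frequency in `signal` by `ratio` (e.g. 0.8) via
    naive resampling. Illustration only: unlike a real voice-conversion
    engine, this also stretches playback time by 1/ratio, because it
    resamples instead of using a phase vocoder."""
    n = int(len(signal) / ratio)       # stretched output length
    # Each output sample reads the input at `ratio` speed, which
    # scales all frequency components down by the same factor.
    positions = np.arange(n) * ratio
    return np.interp(positions, np.arange(len(signal)), signal)
```

Fed a 440 Hz tone with a ratio of 0.8, this yields a tone around 352 Hz: the shriek gets lower, but what was said is untouched, which matches the behaviour Softbank describes.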
Damping down customer emotion is an interesting measure, but a better one would be filtering out problem customers entirely. Dumping repeat offenders on an AI entity is one option. Firing customers who regularly abuse staff is another. Simply not rewarding entitled behaviour might also go a long way toward protecting workers without having to train an entire AI system to make the angry sound… well, less angry.
1 Comment
As a 911 dispatcher, I could find this very useful, as some of our callers are screaming hysterically and it is almost impossible to hear what they are saying. It would be nice to press a button and have that info filtered to a manageable level, and it would certainly save time getting services to them.