Contact Center Pipeline May 2026 | Page 36

DEEPFAKES

BY RANA GUJRAL, BEHAVIORAL SIGNALS
ILLUSTRATION PROVIDED BY ADOBE STOCK

WHEN THE VOICE ISN’T HUMAN: HOW TO MANAGE THE GROWING DEEPFAKE THREAT.

In 2026, contact centers operate at the intersection of two seismic forces reshaping customer engagement: hyper-scaled AI communications and an equally rapid rise in synthetic voice threats.

These aren’t future risks; they’re reality. Recent industry surveys point to a sharp increase in deepfake voice attacks and identity spoofing.
• 85% of surveyed organizations said they had experienced at least one deepfake-related incident within the previous 12 months (Ironscales).
• Organizations are also reporting attempts to use stolen personal information and cloned voices to bypass security checks and request sensitive account actions.
To protect customer experience (CX), improve authentication, and safeguard brand trust, forward-thinking contact centers are moving beyond traditional analytics.
They are adopting next-generation AI that can read customer behavior and emotions while spotting synthetic or cloned voices in real time. These systems combine conversational insights with fraud detection in a single, integrated platform.
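To make the integrated approach concrete, here is a minimal, hypothetical sketch of how an acoustic cue and a behavioral cue might feed one combined risk score. The spectral-flatness heuristic, the threshold, and the score weights are illustrative assumptions for this example only, not a description of any vendor's actual detector; production systems use far richer models.

```python
import numpy as np

def spectral_flatness(frame: np.ndarray) -> float:
    """Geometric mean over arithmetic mean of the power spectrum.
    Values near 1.0 mean a flat, noise-like spectrum; values near 0.0
    mean energy concentrated in a few tones (e.g., clean voiced speech)."""
    power = np.abs(np.fft.rfft(frame)) ** 2 + 1e-12  # epsilon avoids log(0)
    geometric = np.exp(np.mean(np.log(power)))
    arithmetic = np.mean(power)
    return float(geometric / arithmetic)

def risk_score(frames, behavior_anomaly: float,
               flatness_threshold: float = 0.5) -> float:
    """Fuse one acoustic cue with one behavioral cue into a single score.
    NOTE: the 0.6/0.4 weights and the threshold are illustrative
    assumptions, not tuned or validated values."""
    mean_flatness = float(np.mean([spectral_flatness(f) for f in frames]))
    acoustic_flag = 1.0 if mean_flatness > flatness_threshold else 0.0
    return 0.6 * acoustic_flag + 0.4 * behavior_anomaly

# Usage: noise-like frames score high on the acoustic cue; a pure tone
# (a stand-in for tonal, speech-like audio) scores low.
rng = np.random.default_rng(0)
noise_frames = [rng.standard_normal(512) for _ in range(4)]
tone_frames = [np.sin(2 * np.pi * 440 * np.arange(512) / 16000)
               for _ in range(4)]
print(risk_score(noise_frames, behavior_anomaly=0.2))
print(risk_score(tone_frames, behavior_anomaly=0.2))
```

The design point this sketch illustrates is the one the article makes: keeping acoustic analysis and behavioral analysis in one scoring pipeline, rather than in separate tools, lets either cue raise the overall risk when the other is ambiguous.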
THE RISE OF SYNTHETIC AUDIO THREATS
Voice cloning and synthetic audio are no longer niche technologies. Open-source models and cloud-based tools make it possible for non-experts to generate convincing deepfake voices quickly and inexpensively.
Analysts now consider synthetic audio attacks part of a high-growth category of emerging fraud threats.