CHATBOTS
BY DHWANI SONI, 8X8
ILLUSTRATION PROVIDED BY ADOBE STOCK
YOUR BOT JUST BECAME A LEGAL PROBLEM
HERE'S WHY AND HOW TO FIX IT.
In December 2025, the Federal Trade Commission (FTC) fined Instacart $60 million for trapping customers in automated loops with no way out. A few weeks later, California's AB 578 went into effect, requiring food delivery platforms to provide access to a human when automation fails.
This isn't a food delivery story. It's a contact center story.
The complaint data that drove that legislation exists in our industry too. The regulatory attention landed on food delivery first because that's where the consumer evidence was loudest. It's moving.
The movement is gaining international momentum, with Spain recently passing legislation requiring large companies to answer customer service calls within three minutes and prohibiting the exclusive use of automated systems.
And the operational gaps regulators are targeting – broken escalation paths, missing context, automation that doesn't know when it's failing – are the same ones contact center leaders have been quietly managing around for years.
WHAT TO FIX?
So, let's talk about what to actually fix.
1. Escalation paths
This is simpler than you think, but messier than you'd expect.
AB 578's requirement is straightforward: when automation can't resolve a request, a human must be available.
That ' s it.
The question is whether your operation actually clears that bar. Not in theory but in practice.
• How many steps does it take a customer to reach a human when the bot fails?
• Is that path visible to them, or do they have to fight for it?
Run the flow yourself. If it takes more than two steps from failure to human, you have a gap. If the path isn't clearly surfaced, surface it.
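If your bot or IVR flow can be exported as a simple state graph, the walk-through above can also be checked programmatically. Here is a minimal sketch in Python; the flow states, field names, and step threshold are illustrative assumptions, not the format of any specific platform:

```python
from collections import deque

def steps_to_human(flow, start):
    """Count the fewest transitions from a failure state to a human agent.

    `flow` is a dict of state name -> {"next": [states], "human": bool};
    this shape is hypothetical, standing in for your platform's flow export.
    Returns None if no path to a human exists (a compliance gap in itself).
    """
    queue = deque([(start, 0)])  # breadth-first search from the failure point
    seen = {start}
    while queue:
        state, depth = queue.popleft()
        if flow[state].get("human"):
            return depth
        for nxt in flow[state].get("next", []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, depth + 1))
    return None  # dead end: automation fails with no human reachable

# Illustrative flow: bot fails, reprompts, routes through a menu to an agent.
flow = {
    "bot_failure":  {"next": ["retry_prompt"]},
    "retry_prompt": {"next": ["main_menu"]},
    "main_menu":    {"next": ["agent_queue"]},
    "agent_queue":  {"human": True},
}

steps = steps_to_human(flow, "bot_failure")
print(steps)  # 3 steps from failure to human: more than two, so a gap
```

Running this kind of audit against every failure state in the flow, rather than a single happy path, is what turns "we have an escalation path" into evidence that customers can actually reach it.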
This isn't a complex fix. It's a design choice that was made incorrectly and hasn't been revisited.
MAY 2026