Contact Center Pipeline May 2026 | Page 42

In other words, California is not banning automation; it's codifying that automation cannot be the only path when there is a live dispute about money, service, or responsibility.
Even if you never touch food delivery, this is a big deal. AB 578 is a concrete statutory example of a pattern we're starting to see across channels and sectors (also see Figure 1).
Namely, automation is acceptable, even expected, but only if customers can see what's happening, understand their rights, and reach a human when the stakes are high.
A BROADER "AUTOMATION PLUS HUMAN FALLBACK" TREND
California has already sent other signals in this direction. SB 243, the state's new "companion chatbot" law, targets AI systems that provide human-like, emotionally supportive interactions. It requires:
• Clear disclosure that the user is talking to a chatbot.
• Safety protocols around self-harm and sexual content.
• Additional protections for minors.
The common thread with AB 578? Discomfort with "silent automation": systems that look and feel human but aren't, and that may not have obvious escape hatches when something goes wrong.
At the federal level, the proposed "Keep Call Centers in America Act of 2025" (introduced as S. 2495 in the Senate and H.R. 4954 in the House) pushes the same themes into the broader contact center and outsourcing space.
• Businesses handling customer service would have to disclose the physical locations of their agents at the start of interactions. If the agents are overseas, they must inform customers of their right to transfer to U.S.-based human agents.
[Figure 1]
• For AI or automated systems, companies would have to clearly disclose that automation is being used and offer a transfer to a human agent upon request.
Add in new and pending chatbot transparency laws in states like New York, along with AI and customer experience (CX) bills in jurisdictions such as Maine, Utah, Nevada, and Illinois, and a pattern emerges (also see Figure 2).
The pattern is this: regulators are not trying to freeze customer service in the past, but they do want three things for customers navigating automated experiences: disclosure, agency, and human escalation. Automation is fine as long as it is transparent and always comes with a real path back to a human.
If you run outbound programs, this feels familiar. TCPA and FCC rules already constrain automated voice and AI-assisted calling, requiring consent for many types of calls and texts and giving consumers clear opt-out rights.
The same underlying values are now showing up on the inbound and service side: customers should know when automation is involved, have a say in how far it goes, and be able to reach a human when the interaction affects their money, safety, or legal position.
IMPLICATIONS FOR DIGITAL-FIRST, AUTOMATION-HEAVY CX
For digital-first platforms, AB 578 is a warning shot against "IVR lock-in" and chatbot traps.
If your business model relies heavily on self-service flows, you now need to ask hard questions about where those flows can safely stop and where the law, or simply customer expectation, will demand a human.
Food delivery is the first category explicitly singled out in California. But it’ s not hard to imagine similar rules extending to travel cancellations, subscription renewals, insurance claims, or recurring billing disputes.
From a design perspective, that means mapping your automated journeys with the same rigor you apply to compliance controls.
So you need to ask yourself, and have an answer for, "Where are my customers most likely to contest charges, allege fraud, or raise issues that could escalate to regulators or social media?"
Those points should have prominent, documented pathways to human agents, not just buried "contact us" options.
Outbound teams are affected too. When a refund or complaint triggers follow-up calls or messages (think collection of negative balances, outreach about disputed transactions, or make-good offers), the same customer who just battled your automated gauntlet may be less tolerant of robocall-style outreach.
The safest posture is to treat outbound and inbound as a unified CX and compliance surface. Disclosures, consent, human access, and recordkeeping should be harmonized rather than siloed off to separate teams with different thresholds.