Privacy, AI, and the Great Regulatory Patchwork: What 2025 Means for Your Data Strategy
Last updated: July 2, 2025

So far, 2025 has brought a mishmash of state-level privacy laws, a rising tide of AI governance rules, and exactly zero federal legislation to tie it all together. That means marketers, data leaders, and legal teams are stuck playing compliance bingo—with a board that changes every quarter.
At the Omeda Idea Exchange, our VP of Privacy and Data Governance, Bettina Lippisch, broke down what’s really going on in the world of privacy this year—and what you can do about it. Here’s what we learned.
State Privacy Laws Are Multiplying—and Mismatching

This year alone, five new state privacy laws came into effect in January. More are landing in July, and Maryland’s set to close out the year with its own spin on things in October.
You’d think with that much activity, we’d have a national standard by now. We don’t.
“Half of U.S. companies are now operating under some version of a privacy law. But there’s still no federal regulation,” Bettina said. “And what’s coming isn’t guaranteed.”
The ADPPA (American Data Privacy and Protection Act) fizzled out. Now the APRA (American Privacy Rights Act) is in the pipeline—but given the political climate, it could go either way.
The result? You’re managing compliance state by state. And if you work across regions, it’s more like managing compliance 50 ways at once.
Privacy, Security, and AI Governance Are No Longer Separate Conversations
Just a few years ago, privacy was mostly seen as a legal or IT responsibility. Today, it’s a cross-functional conversation.
“At this year’s IAPP Global Privacy Summit, it was clear—privacy, data security, and AI governance are converging,” Bettina said. “They’re no longer siloed disciplines. You need to collaborate across your teams.”
Privacy is no longer just a checkbox in your tech stack or a line item in a contract. It’s tied to your marketing strategy, your audience engagement, your ad targeting, your product roadmap—even your AI prompts.
And if your privacy program is still just a policy PDF in a shared folder, it’s probably time for a rethink.
AI Changes the Privacy Game—In Ways We’re Still Figuring Out
Here’s the tension: most privacy laws emphasize data minimization. Collect only what you need. Don’t keep what you won’t use.
Meanwhile, AI needs more data—sometimes a lot of data—to work well. Training large language models (LLMs), building recommendation engines, segmenting audiences… it all runs on volume.
So which wins?
“It’s up to your organization to solve that paradox,” Bettina said. “You need to find a middle ground between privacy compliance and AI performance.”
Oh, and by the way—data you use with AI still counts as data and may contain Personal Information, including:
- What you feed into your model (training data)
- The prompts your teams enter
- What the AI spits back out (especially if it includes PII or proprietary info)
Bottom line: If AI is part of your stack, your privacy program has to evolve to cover AI. Fast.
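To make that concrete, here's a minimal sketch of one way to treat prompts as governed data: scrub obvious personal identifiers before they ever reach a model. The regex patterns and the `redact_pii` helper below are illustrative assumptions, not a production-grade PII detector.

```python
import re

# Illustrative patterns only: real PII detection needs far broader coverage
# (names, addresses, account numbers) and ideally a dedicated scanning tool.
PII_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "phone": re.compile(r"\b(?:\+?1[\s.-]?)?\(?\d{3}\)?[\s.-]?\d{3}[\s.-]?\d{4}\b"),
}

def redact_pii(text: str) -> str:
    """Replace likely personal identifiers with placeholder tokens."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[REDACTED_{label.upper()}]", text)
    return text

prompt = "Summarize the renewal history for jane.doe@example.com, phone 312-555-0199."
print(redact_pii(prompt))
# -> Summarize the renewal history for [REDACTED_EMAIL], phone [REDACTED_PHONE].
```

The same idea applies on the way out: scan model responses before you log, store, or reuse them.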
Consumers Don’t Trust AI. But You’re Already Using It.
The Smart Advertising Services market, driven by AI-powered ad tech, is projected to hit $2.2 trillion by 2030.1 Yes, trillion.
That means nearly every marketer, publisher, and media brand is already dabbling in—or fully dependent on—AI to optimize ad targeting, personalize content, and automate decisions.
The problem? Only about one-third of consumers trust AI to behave fairly or transparently.2
“What does that mean for your ad programs?” Bettina asked. “Or for how you segment subscribers and serve content?”
Even if the math checks out on your end, if the experience feels opaque or creepy to users, it’s a trust issue. And trust is harder to rebuild than it is to earn.
AI Governance Is No Longer Optional
We’re no longer in the “wait and see” era. Regulators are moving. Fast.
Here’s what’s on the radar:
- EU AI Act: Companion to GDPR, focused on AI transparency, bias, and safety.
- Colorado AI Act: Coming in 2026 with strict requirements for AI use.
- California: Has provisions that apply to AI under existing privacy law.
- A looming APRA: If passed, it would add even more federal oversight to the mix.
“It’s not just about doing the right thing anymore. There will be enforcement,” Bettina warned. “The FTC is watching. And we’re already seeing lawsuits related to biased offers and advertising.”
If your team is using AI to drive targeting, offers, or automation, it’s time to ask some hard questions:
- What data are you using to train and test?
- Does it include PII or IP?
- Do your users understand the risks and best practices?
- Do you understand what the AI is actually doing and how to prevent unintentional outputs?
No shame if the answer to that last one is “not really.” You’re not alone. But it’s time to start digging deeper.
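One lightweight way to start answering those questions is to keep a simple AI use-case register. The sketch below is a hypothetical example (the field names and the use case are ours, not from the presentation) of what a reviewable entry might capture.

```python
# A lightweight AI use-case register: one way to capture answers to the
# questions above so they're documented and reviewable, not tribal knowledge.
ai_use_register = [
    {
        "use_case": "audience_segmentation",         # hypothetical example
        "training_data_sources": ["subscriber_profiles", "engagement_events"],
        "contains_pii": True,
        "pii_basis": "consent",                      # notice/legal basis covering this use
        "user_guidance_published": True,             # do users know the risks and rules?
        "output_review_process": "quarterly bias and accuracy review",
        "owner": "data-governance@yourcompany.example",
    },
]

# Flag entries that need attention before the next review cycle.
for entry in ai_use_register:
    if entry["contains_pii"] and not entry["user_guidance_published"]:
        print(f"Review needed: {entry['use_case']}")
```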
Remember…
More data = more accountability = more risk
BUT ALSO
More data = more insights = better customer experiences
The key is to find that balance through a collaborative data, privacy, and AI governance program that unites your organization around customer trust.
What You Can Do—Even in a Fragmented Regulatory World
If your head’s spinning, take a breath. Here’s the good news: there are common threads across most laws and best practices.
Focus on these two pillars:
- Business obligations (what you must do)
- Consumer rights (what your users can expect)
When in doubt, build your program around the gold standards—like GDPR, CCPA, and CPRA. Most state laws follow similar patterns and consumer rights.
Then, make it a habit:
- Review your privacy strategy and notices at least once or twice a year
- Loop in your legal, security, and AI stakeholders
- Create clear internal policies for how AI tools should (and shouldn’t) be used
- Treat AI output as governed data—because it is
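On that last point, here's a minimal sketch of what "treating AI output as governed data" could look like in practice: wrap each model response in the same provenance, classification, and retention metadata you'd attach to any other dataset. The field names and the 90-day retention default below are assumptions for illustration, not a standard schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

@dataclass
class GovernedAIOutput:
    """A model response wrapped in the metadata we'd attach to any governed
    dataset: where it came from, why it exists, and when it expires."""
    content: str
    source_model: str
    purpose: str                  # why this output was generated
    may_contain_pii: bool         # flag outputs that need review before reuse
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    retention_days: int = 90      # align with your existing retention policy

    @property
    def expires_at(self) -> datetime:
        return self.created_at + timedelta(days=self.retention_days)

record = GovernedAIOutput(
    content="Suggested subject line: 'Renew before your rate changes'",
    source_model="internal-llm-v1",   # hypothetical model name
    purpose="email_subject_suggestions",
    may_contain_pii=False,
)
print(record.expires_at.isoformat())
```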
“You don’t need to reinvent your entire strategy every month,” Bettina said. “But you do need to keep checking in.”
Wrapping Up: Data Is Data—Even When It’s AI
Whether it’s subscriber info, training data, or chatbot output, if it includes personal or sensitive data, it belongs under your privacy umbrella. The lines are blurry. The laws are changing. But the fundamentals still apply.
And here’s the real takeaway: Privacy isn’t just about risk anymore. It’s about trust. If your audience can’t trust how you use their data—human or AI-powered—they’ll go elsewhere.
Watch the full presentation
Want help navigating the chaos? Omeda helps audience-driven businesses build smart, privacy-first data strategies so you can adapt as fast as the laws.
1 Smart Advertising Services – Global Strategic Business Report. ResearchAndMarkets.com
2 2024 Edelman Trust Barometer Global Report.