Friday, Mar 13, 2026

Businesses Beware: Top Data Privacy Threats in 2026


PETER K. JACKSON
Attorney, Data Strategy & Privacy
[email protected] | (310) 785-6803
GreenbergGlusker.com

Data privacy risk is reaching critical mass in 2026 as “zombie” privacy claims, tougher state laws, and everyday AI use converge to create significant liability for companies. Here are some of the top data privacy threats to be mindful of in 2026 and beyond.

ZOMBIE PRIVACY CLAIMS ARE STILL UNDEAD

“Zombie” laws, privacy statutes that predate the 21st century, remain among the biggest privacy threats today. While newer laws like the California Consumer Privacy Act (CCPA) and its peers can be enforced only by the government, these zombie laws open the door to class-action lawsuits and fixed, per-person statutory damages. Claims under laws like the 1960s-era California Invasion of Privacy Act (CIPA) argue that the common web tools on your website constitute surveillance devices or wiretap your visitors.

If your company operates online and hasn’t faced a zombie claim, your luck may soon run out. To date, producing the audit trails and visual artifacts needed to support a claim has been a laborious process for the small group of law firms responsible for most filings. New software tools can now scan websites and generate that evidence automatically.

As the floodgates open, courts may finally put up barriers. Until then, taking proper mitigation steps can reduce your risk profile without sacrificing the tools that power your sales & marketing.

OUTBOUND AI REQUESTS CREATE INPUT PERIL FOR PRIVACY & IP

By now, many companies license paid AI tools, but standard license terms and an internal AI Use Policy are far from airtight. An AI tool’s contractual commitment not to train on your data is not a confidentiality obligation. Most businesses send sensitive and proprietary input to an unknown destination. Hosting models in a dedicated cloud can manage that threat. Beyond that, employees may be using unlicensed or personal AI tools to meet their needs, creating risks the company isn’t even aware of.

Using AI to develop products or to discuss legal or compliance issues is particularly risky. Conversations and usage records are discoverable and unlikely to be privileged, and by default, many services keep them indefinitely.

DON’T MESS WITH TEXAS (OR CONNECTICUT)

About 20 states now have a disclosure-and-opt-out privacy regime like the CCPA’s. Early on, most states watered down the CCPA’s requirements, but the big-tech backlash of the last few years has produced strong rules in surprising states like Texas.

Texas law kicks in when a business ceases to be a “small business” under the U.S. SBA definition (and sells something “consumed by” Texans). In some industries, that threshold is well under the $26.5m in trailing-year revenue the CCPA requires.

Texas and a few other states require opt-in consent to process data for previously undisclosed purposes, or to sell or share sensitive data, like precise geolocation. An email update about your new Privacy Policy may not cut it.

AGE VERIFICATION GROWS UP

State efforts to broaden age-verification requirements have grown beyond pornography, and some now kick in if you offer user-to-user messaging. Meanwhile, state privacy laws require consent to sell or share kids’ data and increasingly set the upper age limit at 16 (CA) or 18 (DE, FL). If your products or services attract a teen audience and you haven’t considered verification measures, it’s time to revisit your approach.

RISK ASSESSMENTS REACH OUR SHORES

State privacy laws require businesses to document internal risk assessments before exposing consumer data to risky processing. Texas has joined California in requiring these cost-benefit analyses before “sharing” data online (i.e., using AdTech). As states with newer laws ramp up enforcement, targeted companies should expect higher fines and longer scrutiny if they can’t produce their paperwork.

CCPA SECURITY AUDITS AND AI BURDENS LOOM

Not to be outdone, California’s latest privacy regulations carry new compliance burdens for businesses that:

• maintain 250,000 Californians’ data, or 50,000 Californians’ sensitive data, triggering annual cybersecurity audits; or

• use AI to help reach decisions with legal or similarly significant effects, triggering pre-use notices and opt-outs.

A cybersecurity audit can be internal, and audits under existing standards suffice. The AI decision-making standard isn’t as broad as it may seem. But be sure to determine whether and when you need to comply.


Peter K. Jackson is an attorney focusing on data strategy and privacy at Greenberg Glusker LLP.

Learn more at greenbergglusker.com.
