2025 Q4: Top Updates in Privacy and AI
The following insights reflect Jackie’s personal analysis of the recent updates. While we hope you find them helpful, they are not legal advice. Reading this does not create an attorney-client relationship. You should not act on this information without seeking professional counsel.
AI:
Ensuring a National Policy Framework for AI
Adding to the climate of uncertainty and general confusion around AI legislation in the US, in December, the President signed an Executive Order ("EO") titled "Ensuring a National Policy Framework for AI."
The EO directs the Attorney General to create an AI Litigation Task Force to challenge state AI laws that are “inconsistent with” the policy of the EO (focused on US “global AI dominance” through “minimally burdensome” policy).
It also tasks the Secretary of Commerce with preparing an evaluation of state AI laws to identify "onerous" ones, and it threatens to withhold federal broadband funding (BEAD) from states with such laws. Note, though, that the EO does not, in itself, invalidate state AI laws.
In the lead-up to the EO, in November, a coalition of 36 State Attorneys General sent a letter to Congress opposing any federal moratorium on state AI regulation. They argued that states must remain free to protect citizens from AI-generated deepfakes, scams, and discriminatory hiring algorithms.
***Take-away: While the Executive Order could signal that enforcement of AI laws will be minimal or non-existent, state AI laws are still on the books and state regulators are poised to enforce (and fight back). We will likely see this play out in the policy wrestling ring for a while. A wise course would be to monitor developments while continuing to prepare for enforcement.
New York SB-8420A
Been using (or thinking of using) AI-generated performers in your advertising content? If you haven’t already, you may need to post some disclosures, friends! In December, New York’s governor signed SB-8420A into law.
It requires “conspicuous” disclosures for advertisements using an AI-generated “synthetic performer.” There are some key exceptions, so check into whether one might apply.
The law goes into effect in June 2026. (Caveat! See the point above about the Executive Order: should this law be deemed "onerous," its fate may be up in the air.)
***Take-away: If you are using or plan to use AI-generated people in your ads, evaluate whether disclosures may be necessary.
RAISE Act
Frontier AI models were a focus of state-level legislation in Q4 (and before Q4 as well!). New York, in particular, has been busy: in December, the state also enacted the Responsible AI Safety and Education Act ("RAISE Act").
The RAISE Act regulates “large developers,” which are those that train at least one “frontier model” and have spent an aggregate of over $100 million in compute costs to train them. The Act prohibits models that present an “unreasonable risk of critical harm.”
It also imposes documentation and disclosure requirements. New York joins California in regulating frontier AI models: California passed its own frontier AI law, the Transparency in Frontier Artificial Intelligence Act ("TFAIA"), at the end of September, and it includes similar requirements (we mentioned it briefly in our Q3 recap).
The NY RAISE Act is set to go into effect on January 1, 2027, though amendments are pending. (The CA TFAIA went into effect January 1, 2026.)
***Take-away: Developers of "frontier AI models" will need to prepare to comply with the transparency and documentation requirements. Those deploying frontier models should consult the disclosures and documentation provided by the developers and make sure their use doesn’t run afoul of the intended-use parameters.
Privacy:
Maryland Online Data Privacy Act
Only one state privacy law went into effect in Q4. Which state? (Hint: We mentioned it in our Q3 recap!) Big reveal: This quarter, it was Maryland’s turn to join the ever-increasing group of US states with privacy laws on the books.
On October 1, 2025, the Maryland Online Data Privacy Act (“MODPA”) went into effect. Maryland’s law has a fairly low applicability threshold, so it’s likely to apply where others may not.
A few key features to note about the MODPA:
It requires that controllers limit the collection of personal data to what is "reasonably necessary and proportionate" to provide the requested product or service.
Perhaps its most unique feature is that it prohibits the processing of sensitive personal data unless it’s “strictly necessary” to provide or maintain a product or service requested by the consumer.
It also prohibits the sale of sensitive personal data and the targeting of advertising to minors under 18. It does provide for a 60-day cure period that sunsets on April 1, 2027.
***Take-away: Check whether the MODPA’s applicability threshold captures your business. If it does, pay special attention if you process any sensitive personal data, and be sure to perform the appropriate assessment to confirm that the personal data you collect meets the "reasonably necessary and proportionate" standard.
California Delete Act
On the heels of adopting the finalized versions of regulations on ADMT, cybersecurity audits, and risk assessments in Q3, the California Office of Administrative Law approved regulations to implement the Delete Act, which cover how consumers can submit requests through the Delete Request and Opt-Out Platform (“DROP”).
Data brokers definitely should not DROP the ball heading into 2026! See the enforcement section of this Q4 recap for more on the upcoming DROP-related enforcement activity.
***Take-away: If you’re a data broker, begin preparing now to comply with the DROP requirements.
California AB 56
Protecting children’s privacy remains a priority, but how it’s done is all over the map, literally and figuratively! A few quick Q4 highlights:
(A) In October, California enacted AB 56, which requires "health labels" for "covered platforms," primarily social media platforms. The new law requires that a "black box warning" be displayed to users under 18 when the user’s access meets certain timing requirements. The law specifies the content of the warning, which includes a statement that "social media is associated with significant mental health harms." (For the curious, a rough sketch of a time-based warning trigger follows this section’s take-away.)
The law goes into effect on January 1, 2027.
(B) Texas’s App Store Accountability Act, SB 2420, was set to go into effect on January 1, 2026, but in December was enjoined by the US District Court in the Western District of Texas following a challenge by the Computer & Communications Industry Association (CCIA). SB 2420 (adopted in May 2025) required app stores to verify user age and report age signals to developers. App stores had begun rolling out developer guidance in response.
Several other states have passed their own app store accountability laws, which means we haven’t seen the last of this!
(C) Australia’s Online Safety Amendment (Social Media Minimum Age) Act 2024 went into effect in December. The law requires certain “age-restricted” social media platforms to block users under 16. The law comes with significant monetary penalties for non-compliance.
According to the eSafety Commissioner, the government’s aim with the law is to protect young Australians from potential harm. The legislation remains controversial due to concerns regarding digital autonomy and the potential for tech-savvy minors to bypass the restrictions.
***Take-away: Children’s and minors’ privacy continues to be a hot area for legislation, but some contraction may be on the horizon. Evaluate your processing of personal data for individuals under the age of 18 and be ready to navigate increasing challenges, including age verification requirements.
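Curious what a timing-based warning might look like under the hood? Here’s a minimal TypeScript sketch of a usage timer driving an AB 56-style warning, combined with a simple under-18 check. The threshold, the names, and the logged text are illustrative assumptions only; the statute specifies the actual timing rules and required warning language.

```typescript
// Minimal sketch of a time-based warning trigger for an AB 56-style
// "black box warning." The 3-hour threshold, the names, and the warning
// text are placeholders, NOT taken from the statute.

const WARNING_THRESHOLD_MS = 3 * 60 * 60 * 1000; // placeholder threshold

interface SessionState {
  userAge?: number;       // from your age-assurance flow, if collected
  dailyUsageMs: number;   // cumulative usage for the current day
  warningShown: boolean;  // whether today's warning has been displayed
}

function shouldShowWarning(s: SessionState): boolean {
  const isMinor = s.userAge !== undefined && s.userAge < 18;
  return isMinor && !s.warningShown && s.dailyUsageMs >= WARNING_THRESHOLD_MS;
}

function tick(s: SessionState, elapsedMs: number): SessionState {
  const next = { ...s, dailyUsageMs: s.dailyUsageMs + elapsedMs };
  if (shouldShowWarning(next)) {
    // In a real product, render the statutorily specified warning
    // conspicuously; console.log stands in for the UI overlay.
    console.log("Displaying required mental-health warning to minor user.");
    next.warningShown = true;
  }
  return next;
}
```

In practice, the cumulative-use clock, the reset rules, and the warning copy would all need to track the statute’s actual text (and survive app restarts), so treat this as a starting point for the compliance conversation, not the finish line.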
Europe’s Digital Omnibus Package
The wintry winds of change, ushering in a promise of “modernisation” and “simplification,” have begun gusting across Europe. In November, the European Commission introduced the Digital Omnibus Package. What the heck is this mysterious “Digital Omnibus” (and didn’t Europe just pass a whole bunch of data laws not that long ago)?!
The Digital Omnibus Package consists of two parts:
(1) The Digital Omnibus Regulation proposal that applies to the GDPR and cookie requirements (among others), and
(2) The Digital Omnibus on AI Regulation proposal, which applies to the EU AI Act. (For context, note that the Digital Omnibus Package is one of 10 total Omnibus packages being introduced by the European Commission, with the stated focus of simplification and implementing administrative cost savings.)
Together, these proposals would make important changes to Europe’s data laws. Some key proposed changes worth noting:
(a) narrowing the scope of what’s considered to be personal data under the GDPR;
(b) new exceptions to cookie consent requirements;
(c) delaying the effective date of the AI Act provisions for high-risk AI systems; and
(d) allowing legitimate interest to be a valid legal basis for AI training.
While proponents focus on simplification and better enabling business in Europe, critics say the Omnibus guts many of the hard-fought, long-negotiated consumer protections established under existing laws.
We’re in the early stages, and the Digital Omnibus proposals will be subject to negotiation and potential revision during the legislative process. The timeline for finalizing the regulations is unclear; estimates range from Q3 2026 to mid-2027, depending on the source.
***Take-away: Be aware of the key proposed changes and start considering how they may impact your business. Continue to monitor the proposals as they evolve so you’re ready to adapt your compliance approach where the changes could work in your favor.
Enforcement:
Sling TV and Dish Media Settlement
It’s no surprise that California continued its active enforcement streak in Q4! The CA Attorney General’s Office was busy wrapping up the year with two significant enforcement actions.
(1) In October, the CA AG announced a $530,000 settlement with Sling TV and Dish Media for CCPA violations. This was the result of a coordinated investigative sweep in 2024. The CCPA violations cited by the AG in its complaint included failure to provide an easy-to-use mechanism to opt out of sales and sharing, and insufficient privacy protections for children, particularly the failure to obtain affirmative “opt-in” consent for targeted advertising by users under the age of 16.
(2) In November, the CA AG announced a $1.4 million settlement with Jam City, Inc., a mobile app gaming company, for CCPA violations. As in the Sling TV action, the claims against Jam City focused on failure to provide an in-app method to opt out of sale and sharing, and failure to obtain affirmative "opt-in" consent for targeted advertising for users 13 to 16 years old. (Are we seeing a theme here, folks?!)
***Take-away: Take a close look at your sale/sharing opt-out mechanism(s) in light of these actions and determine if there are any potential gaps. If you offer a mobile app, make sure users can opt out through the app (not just your website). A streamlined, easy-to-find, and easy-to-use opt-out mechanism that’s accessible from wherever the consumer is using your service is critical – make sure that yours meets these standards. Monitor and maintain age gates for access to your services. If you have users between the ages of 13 and 16, default targeted advertising to “off” and ensure a clear opt-in mechanism is in place.
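For the product and engineering folks, here’s a minimal TypeScript sketch of what honoring these obligations might look like in a client: treating a Global Privacy Control signal (a real browser property, navigator.globalPrivacyControl, also sent as the Sec-GPC request header) or a stored opt-out as an opt-out of sale/sharing, and defaulting targeted advertising off for known under-16 users absent opt-in. The UserProfile shape and function names are illustrative assumptions, not any regulator’s prescribed API.

```typescript
// Minimal sketch: honoring opt-out signals and the CCPA's under-16
// opt-in rule when deciding whether to serve targeted advertising.
// The profile shape and names are illustrative assumptions.

interface UserProfile {
  age?: number;                   // from your age gate, if collected
  optedOutOfSale: boolean;        // stored opt-out (website or in-app)
  optedInToTargetedAds: boolean;  // affirmative opt-in, needed for 13-15
}

function resolveAdMode(user: UserProfile): "targeted" | "non-targeted" {
  // 1. Treat a Global Privacy Control signal as a valid opt-out of
  //    sale/sharing (guarded so this also compiles outside a browser).
  const gpcSignal =
    typeof navigator !== "undefined" &&
    (navigator as any).globalPrivacyControl === true;
  if (gpcSignal || user.optedOutOfSale) return "non-targeted";

  // 2. CCPA: consumers under 16 need affirmative opt-in consent before
  //    sale/sharing (under 13 requires parental consent), so default
  //    targeted ads OFF without that consent.
  const isUnder16 = user.age !== undefined && user.age < 16;
  if (isUnder16 && !user.optedInToTargetedAds) return "non-targeted";

  return "targeted";
}
```

The key design point these settlements underline: the same decision logic has to be reachable from every surface where you collect data, including the mobile app itself, not just the website’s footer link.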
CPPA Activities (CalPrivacy)
Of course, we can’t talk about the California AG’s efforts without mentioning the CPPA’s activities from the past quarter (that’s the California Privacy Protection Agency, or as they’re calling themselves on their website, “CalPrivacy”).
In connection with the newly adopted DROP regulations, in November, the CPPA announced that it is launching a “Data Broker Enforcement Strike Force” within its enforcement division. (I’m imagining a group of menacing-looking individuals sporting dark glasses and FBI-style jackets that say “CPPA” or “DROP” on them.) According to the CPPA, the Strike Force will provide “additional resources to combat potential violations.”
Not long after, the Strike Force brought its first Delete Act enforcement action against a Nevada marketing firm, ordering it to pay $56,600 in fines and unpaid fees for failure to register as a data broker.
***Take-away: If you are, or think you might be, a data broker handling California consumer data, evaluate whether you need to register (and register if you think you do). Folks, this can include businesses that create inferences to profile consumers. To quote Michael Macko, the head of enforcement at the CPPA, “We will scrutinize any business that walks and talks like a data broker to make sure it’s registered, and we will continue to examine businesses that create inferences about consumers to profile them.” Now that’s a “mic DROP” (I couldn’t resist).
Consortium of Privacy Regulators
We’ve been hearing about collaboration among state regulators on privacy law enforcement at multiple privacy conferences since this spring. In October, the CPPA announced that two additional states, Minnesota and New Hampshire, had joined the Consortium of Privacy Regulators, a bipartisan effort of multiple US state enforcers initially announced in April.
The addition of MN and NH brings the total number of states participating in the Consortium to nine.
***Take-away: This development further emphasizes the importance of compliance with all applicable state privacy laws and of having your compliance ducks in a row. If one regulator wants to investigate you, it is very likely that there are others coming along for the ride. This is enforcement to the ninth degree.
FTC Enforcement
For those who have been thinking that the FTC is no longer going to be doing much in the enforcement arena, au contraire! In December, the FTC announced an enforcement action against Illusory Systems Inc. (d/b/a Nomad) for failing to implement adequate security measures, resulting in a $186 million loss. According to the FTC, Nomad claimed that its services were “security-first.”
Despite that claim, the company failed to implement adequate security incident response processes and secure coding practices, failures that ultimately led to the loss. The FTC’s complaint focuses on events that occurred in 2022.
The FTC’s proposed order prohibits the company from making misrepresentations about its security practices, requires the company to implement a comprehensive information security program, and requires the company to obtain biennial assessments of its security program. On top of that, Nomad is required to refund consumers who have not already been refunded.
***Take-away: While this is a pretty standard enforcement action from the FTC, it signals that enforcement is still alive and well and that the FTC is still focused on adequate security measures.
NOYB Cross-App Data Tracking
Last but not least, the pressing question: how did Max Schrems and NOYB close out their 2025? By focusing on cross-app data tracking practices, with an LGBTQ-rights lens, of course!
In December, NOYB filed a complaint with the Austrian data protection authority against TikTok, AppsFlyer, and Grindr over Grindr user data shared with TikTok via AppsFlyer.
The details? An individual’s use of Grindr was being shared with TikTok to draw conclusions about the individual’s sexual orientation and sex life (special categories of data under GDPR Article 9, mind you). NOYB argues that Grindr and AppsFlyer had no legal basis under the GDPR (and met no special condition under Article 9) for sharing the data with TikTok, and that the apps never obtained the individual’s consent.
This cross-app data sharing activity was discovered through a data subject access request. NOYB filed a separate complaint against TikTok for failing to provide a full response to the access request.
According to NOYB, TikTok provided only limited data that it deemed the most “relevant” in response to the request, and the individual was unable to access information about what data was being processed for what purposes after repeated requests.
***Take-away: Know your data flows, especially when it comes to sensitive data. If you’re going to process this data, be up-front about it, provide all data in response to an access request, and know what data is flowing in and out and for what purposes. Or be prepared for NOYB to bring a complaint, because they will find out, and strongly recommend an “effective, proportionate and dissuasive fine.”