2025 Q3: Top Updates in Privacy and AI

Overview

AI:

  • In August, the GPAI provisions of the EU AI Act went into effect.

  • California finalized its long-awaited regulations on automated decision-making technology (ADMT).

  • Illinois and California both enacted new AI legislation in Q3, with Illinois restricting AI use in therapy in August and California regulating developers of “frontier AI models” (think the truly massive LLMs) in September.

  • Anthropic changed the terms of certain consumer offerings to allow training on user data by default.

Privacy:

  • Tennessee and Minnesota are the latest states with privacy laws going into effect. Maryland’s law, with its restrictions on sensitive personal information, went into effect October 1 (we’ll cover it in Q4’s recap).

  • In addition to the ADMT regs, California also approved the finalized regulations on risk assessments and cybersecurity audits. 

  • The EU-U.S. Data Privacy Framework (DPF) was upheld against a recent legal challenge.

  • The EU Data Act went into effect.

Enforcement:

  • California regulators brought two significant CCPA enforcement actions involving, among other violations, failures to honor opt-outs and to include CCPA-required contractual provisions. The CPPA imposed a $1.35 million fine on Tractor Supply, and the CA AG imposed a $1.55 million penalty on Healthline Media.

  • The CPPA and AGs of Colorado and Connecticut are joining forces for an enforcement sweep related to opt-out preference signals. 

  • Connecticut undertook its own enforcement action, reaching an $85,000 settlement with TicketNetwork.

  • Disney agreed to a $10 million FTC settlement to resolve COPPA violations.


AI

Q3 saw existing AI legislation enter into force, with new laws and regulations continuing to emerge.   

  • Tighten up those laces on your hiking boots, folks–we’ve just passed another milestone on our trek up the EU AI Act implementation mountain! On August 2, 2025, the provisions related to General Purpose AI (“GPAI”) models went into effect. Note that there’s a grace period for GPAI models already on the market before that date; those models aren’t required to comply for another 24 months. We’re now on our final climb to the summit: most of the remaining EU AI Act provisions take effect in August 2026. ***Takeaway: Check whether you might be a provider of GPAI models under the Act and, if so, which provisions apply to you. Then start getting your transparency and compliance documentation in order, including technical documentation and information about training data. And start prepping now for when the full scope of the Act takes effect next year–it may be a trek, so start training early.

  • At long last, the California ADMT regulations have been finalized! Were you holding your breath in eager anticipation? Well, go ahead and exhale–and then take another deep breath as you dive into the new regs. At the end of September, the California Office of Administrative Law approved the finalized automated decision-making technology (ADMT) regulations. Some quick highlights: the new regs require businesses using ADMT to provide consumers with a pre-use notice and the right to opt out, among other requirements. The regs generally take effect January 1, 2026, with some exceptions. ***Takeaway: Assess whether you might be using ADMT and, if so, begin implementing the measures needed for compliance. Plan ahead so your notices are updated in time to accommodate the pre-use notice requirement.

  • Multiple states, including Illinois and California, passed new laws regulating AI use. In August, Illinois enacted the Wellness and Oversight for Psychological Resources Act, which restricts the use of AI systems in therapy and psychotherapy settings to supportive and administrative tasks only, with licensed professionals retaining full responsibility. In September, California passed SB 53, the Transparency in Frontier Artificial Intelligence Act (“TFAIA”), a broad transparency law requiring developers of “frontier AI models” (models trained using more than 10^26 floating-point operations, or FLOPs, of compute) to publish frameworks detailing how they assess and mitigate systemic risks. The law also requires transparency reports and the reporting of “critical safety incidents.” For a rough sense of scale on that compute threshold, see the back-of-envelope sketch after this list. ***Takeaway: While these specific laws may not directly impact your company, it’s worth watching the broader trend of emerging AI laws that restrict the use of AI in certain industries and require greater reporting and transparency. Be prepared to back up your development and/or use of AI models and systems with documentation and risk assessments as the trend continues.

  • Is training by default the official new norm? In August, Anthropic revised its Consumer Terms to state that all future user data (chats and coding sessions) may be stored for up to five years and used to train its Claude models unless the user opts out. This marks a shift away from the previous deletion-by-default stance and reflects AI developers’ increasing focus on obtaining training data. It also raises privacy concerns about the extended retention period and echoes the “pay or consent” issues seen in targeted advertising. ***Takeaway: If your company or personnel use Claude under the Free, Pro, or Max plans, these changes apply to you. Consider re-evaluating the scope and purposes of your use and, at a minimum, exercising the opt-out or moving to an enterprise plan.
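
For a rough sense of what TFAIA’s 10^26 threshold means in practice, here is a minimal back-of-envelope sketch. It relies on the commonly cited approximation that dense-transformer training compute is roughly 6 × parameters × training tokens; the threshold constant reflects the statute as described above, while the model size and token count are purely illustrative assumptions, not figures for any real model.

```python
# Back-of-envelope sketch (illustrative only): estimate whether a training run
# would cross TFAIA's 10^26-operation "frontier model" compute threshold.
# Uses the common approximation: training FLOPs ~= 6 * parameters * training tokens.

TFAIA_THRESHOLD_FLOPS = 1e26  # SB 53's compute threshold for "frontier" models


def estimated_training_flops(parameters: float, training_tokens: float) -> float:
    """Rough dense-transformer training compute via the 6 * N * D rule of thumb."""
    return 6 * parameters * training_tokens


# Hypothetical example: a 1-trillion-parameter model trained on 20 trillion tokens.
estimate = estimated_training_flops(parameters=1e12, training_tokens=20e12)
print(f"Estimated training compute: {estimate:.2e} FLOPs")
print("Crosses the 10^26 threshold?", estimate > TFAIA_THRESHOLD_FLOPS)
```

The point is simply that the threshold targets the very largest training runs; most companies using or fine-tuning models will fall well below it, though keeping documentation of training compute is a sensible habit as these laws proliferate.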

Privacy

New requirements continue to come into force, with state and European laws entering into effect and additional California regulations winning approval.

  • It’s time again to play “name the state privacy laws that went into effect last quarter”! (Cue the cheesy theme music.) Ready for the big reveal? That’s right: the privacy laws in Tennessee and Minnesota went into effect in Q3. We covered Tennessee in Q2’s recap, so we’ll take a look at Minnesota this time. Some highlights: the Minnesota Consumer Data Privacy Act (“MN CDPA”) provides the standard consumer privacy rights and requires opt-in consent before selling the personal information of consumers between 13 and 16 or sharing it for targeted advertising. It has a 30-day cure period that sunsets July 31, 2026. It exempts small businesses and excludes the data of individuals acting as job applicants or employees. Heads Up: Maryland’s privacy law is up next; it took effect October 1 and is unique in its strong restrictions on processing sensitive personal information. More to come next quarter! ***Takeaway: Determine whether either or both of these newly effective laws apply to you. For Minnesota, pay special attention if you process the personal information of minors aged 13 to 16.

  • Friends, this is a landmark month in the realm of privacy regulations. The California OAL didn’t approve just the ADMT regs, certainly not! It also approved the new regulations on cybersecurity audits and risk assessments. What should you know? For cybersecurity audits, businesses that meet the specified threshold (based on revenue and the volume of personal information processed, sold, or shared) must complete an independent audit, produce a report, and annually certify to the CPPA that the audit was completed. Businesses that meet the threshold and generate over $100 million in gross annual revenue report first, starting April 1, 2028; businesses at lower revenue levels report by April 1, 2029, or April 1, 2030, depending on their revenue (the lower the revenue, the later the date). Risk assessments are required when a business engages in processing activities that present a “significant risk to privacy.” The regs list the elements each assessment must include (such as the specific purposes of processing and the safeguards applied), along with information on the individuals who provided the assessment. The big thing to know is that businesses must report information about completed risk assessments to the CPPA, including a sworn attestation, under penalty of perjury, that the business conducted the assessment; the CPPA can also request the full assessment. Risk assessments completed in 2026 and 2027 must be reported by April 1, 2028. ***Takeaway: Check whether you meet the thresholds for completing audits and risk assessments. If you do, start preparing now, as you’ll need to report to the CPPA and be ready to back up your report. It will likely take time to build and implement audit and risk assessment processes if you don’t already have them; if you do, evaluate how to leverage your existing processes to comply. And mark your calendars for April 1, 2028!

  • Have you been wondering–hey, what’s the latest on the EU-U.S. Data Privacy Framework? We have an update for you! Quick refresher (cue the wavy lines indicating a flashback moment): in Q1’s recap, we talked about uncertainty and upheaval around the DPF resulting from Trump’s firing of members of the Privacy and Civil Liberties Oversight Board (“PCLOB”). In 2023, a member of the French Parliament, Philippe Latombe, brought a legal challenge seeking to annul the DPF. This September, the EU General Court dismissed the challenge. The court’s ruling, however, focused on the validity of the DPF at the time the initial adequacy decision was reached. What does this mean? For now, the DPF lives on to see another day (or at least wasn’t a non-starter from the get-go). It also means further challenges are likely: Latombe has already appealed the decision, and there remains the possibility of a challenge to the ongoing adequacy of the DPF, given recent developments with the PCLOB. ***Takeaway: For now, maintain your DPF certification with the Department of Commerce if you have one. Stay on the lookout for future developments, and know that the validity of the DPF will likely remain in question for the foreseeable future.

  • Across the pond, in September, most of the provisions of the EU Data Act (Regulation (EU) 2023/2854) went into effect. Which of the many recent European laws is the Data Act, you might ask? It’s the law designed to provide greater access to, and cross-party sharing of, data from connected devices. It applies to manufacturers and users of IoT products, and it applies in addition to (not in place of) the GDPR. It imposes data access requirements and also enables “cloud switching” between data processing services. ***Takeaway: If your business offers connected devices or related services, consider whether the Act applies. Be prepared to make data accessible to users or other service providers, depending on the scope of your obligations. This may mean making operational and even technical updates.

Enforcement

It’s a U.S. enforcement frenzy this quarter. State regulators are ramping up enforcement activity, and fines are increasing. And the trend of enforcing children’s privacy requirements continues. Focus now on ensuring that opt-outs work and that you have the right contractual terms in place, in case regulators come your way.

  • Yes, we’re entering the stormy season in California, and that includes enforcement storms. California regulators brought two landmark enforcement actions in Q3. In July, the CA AG announced a settlement with Healthline Media LLC for violations of the CCPA (and the Unfair Competition Law), imposing a $1.55 million civil penalty, the highest to date. The cited violations included breaching the CCPA’s purpose limitation principle by disclosing sensitive health information (article titles revealing specific health conditions) for targeted advertising purposes, and failing to honor sharing opt-outs, including those made via opt-out preference signals. In September, the CPPA announced its decision against Tractor Supply Company, imposing a $1.35 million fine and requiring changes to the company’s business practices. The CPPA cited the company for failing to honor consumer opt-out requests, including via opt-out preference signals, and for failing to include the required CCPA language in its contracts. Are we seeing some themes here? ***Takeaway: Check and re-check your sale/sharing/targeted advertising opt-out mechanisms frequently to ensure they’re working correctly and actually result in an opt-out. Be sure you’re honoring and actually effectuating opt-out preference signals. And make sure you’re including the necessary CCPA language in your contracts.

  • What’s that saying? Regulators that enforce together get stronger together? Maybe that’s not a saying now, but it will be! In September, the “power regulatory trifecta” of the California Privacy Protection Agency (CPPA) and the Attorneys General of Colorado and Connecticut announced a joint investigative sweep targeting potential noncompliance with the Global Privacy Control (GPC) universal opt-out signal. The regulators kept telling us that they’re collaborating, and now we’re seeing it in action! ***Takeaway: Yes, the regulators clearly mean business when it comes to honoring Global Privacy Control signals. Review your internal procedures to ensure GPC signals are being honored and opt-outs are actually implemented. Test and retest regularly (for one simple way to check for the signal, see the sketch after this list)!

  • Connecticut is earning achievement badges for its privacy-related activities in Q3 as well. In July, the Connecticut Attorney General reached a settlement with TicketNetwork, Inc., imposing an $85,000 fine for violations of the CTDPA arising from the company’s repeated failure to cure deficiencies in its privacy notice. And, as we mentioned in Q2’s recap, the state adopted amendments to the CTDPA that expand the law’s applicability. ***Takeaway: Connecticut might look lovely this autumn, but stay alert and keep your compliance with the privacy law current. If you haven’t already, check whether your company now falls within the CTDPA’s scope, and review your privacy notice to make sure it meets the law’s requirements.

  • Labels matter, at least to the FTC, and particularly with respect to children’s content. The U.S. Federal Trade Commission (“FTC”) announced a $10 million settlement with Disney over alleged violations of the Children’s Online Privacy Protection Act (“COPPA”) for allowing the nonconsensual collection of children’s data from “kid-directed” YouTube content. According to the FTC, Disney failed to mark certain YouTube videos directed to children as “Made for Kids” (“MFK”). The issue stemmed from Disney setting the designation at the channel level rather than labeling and monitoring individual videos. ***Takeaway: If you post or publish content that could be considered directed at children, review it and your labeling and settings frequently, especially when posting on a third-party platform.

  • We couldn’t conclude the quarterly recap without checking in on Max Schrems and the NOYB. What’s he been up to? Eradicating spam ads in his Gmail inbox, apparently! In September, the NOYB announced that the French data protection authority, the CNIL, had issued a decision against Google, imposing a fine of €325 million based on a NOYB complaint. The complaint claimed that Google should have obtained consent before sending users unsolicited advertising emails directly to their inboxes. The CNIL agreed, and its decision also requires Google to implement measures to stop showing these ads in Gmail absent user consent. ***Takeaway: Don’t mess with the NOYB–oh wait, you knew that already! In all seriousness, before you think of creative ways to deliver marketing messages, consider whether consent might be needed and plan accordingly.
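
Since the regulators’ sweep centers on the Global Privacy Control signal, here is a minimal sketch of one way to check for it on incoming requests. It assumes the browser transmits the preference as the HTTP header Sec-GPC: 1 (browsers also expose it to page scripts as navigator.globalPrivacyControl); the function and variable names below are hypothetical stand-ins for your own request handling and opt-out logic, not any particular framework’s API.

```python
# Minimal sketch: detect a Global Privacy Control (GPC) signal on a request.
# Assumes the user's browser sends the preference as the header "Sec-GPC: 1".


def has_gpc_signal(headers: dict[str, str]) -> bool:
    """Return True if the request headers carry a GPC opt-out signal."""
    # HTTP header names are case-insensitive, so normalize before comparing.
    normalized = {name.lower(): value.strip() for name, value in headers.items()}
    return normalized.get("sec-gpc") == "1"


# Hypothetical usage with a stand-in for a real request object:
request_headers = {"Sec-GPC": "1", "User-Agent": "ExampleBrowser/1.0"}
if has_gpc_signal(request_headers):
    # Treat the signal like any other opt-out of sale/sharing: suppress the
    # data flows and record the opt-out, just as you would for a clicked link.
    print("GPC detected: suppress sale/sharing and log the opt-out.")
```

A check like this, run with and without the header set, is also easy to fold into automated tests, which is one way to make the “test and retest regularly” advice stick.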
