
Alerts and Updates

California Leads the Way in Consumer-Facing AI Regulation with Automated Decisionmaking Technology Rules

October 7, 2025



As an important development in U.S. AI regulation, California enacted its automated decisionmaking technology (ADMT) rules in September 2025. These are the first enacted, broadly scoped, consumer-facing AI governance rules in the country. They offer opt-out rights and logic disclosures for AI-driven significant decisions affecting consumers. The rules took effect on October 1, 2025, with compliance required by January 1, 2027, for covered businesses that use ADMT in significant decisions before that date.

Background

California has long led the nation in privacy regulation. Because privacy is a core component of AI governance, the California Privacy Protection Agency (CPPA) was authorized under the California Privacy Rights Act of 2020, which amended the California Consumer Privacy Act (CCPA), to regulate AI systems that use personal data to make significant decisions. After years of drafting and public input, the CPPA finalized the ADMT rules as part of a rulemaking package that also includes new provisions on cybersecurity audits and risk assessments.

Within the package, the ADMT rules stood out for their early conceptual drafting, heightened public scrutiny and substantive revisions, driven by the rise of generative AI in late 2022 and the urgency of addressing opaque, consumer-facing algorithmic decisions. By leveraging its existing privacy authority, the CPPA sidestepped legislative gridlock and delivered the first enforceable privacy-based AI governance regulations.

Scope of the ADMT Rules

The ADMT rules apply to CCPA-covered entities: for-profit entities doing business in California that collect personal information and meet one of the statutory thresholds. In its final form, ADMT is defined as "any technology that processes personal information and uses computation to replace human decisionmaking or substantially replace human decisionmaking." In practice, the computation typically involves machine learning as a form of AI.

The ADMT rules are triggered when the ADMT is applied to make “significant decisions.” Significant decisions include financial or lending services, housing, education enrollment or opportunities, employment or independent contracting opportunities or compensation, and healthcare services. Therefore, examples of applying ADMT to make significant decisions include using AI tools to automatically screen and reject applicants for employment, school admission, credit or housing.

Compliance Obligations by Stage

The figure below summarizes how a business should comply with the rules at each stage of ADMT use. Overall, businesses must notify consumers before any use of ADMT, explain how it works and provide meaningful opt-out and appeal rights. In doing so, a business honors the consumer rights afforded by the ADMT rules.

Specifically, as illustrated in the figure:

Before applying ADMT, businesses must (1) conduct a pre-use risk assessment to demonstrate understanding of what the ADMT is doing, what kinds of personal information it processes and what the foreseeable impacts are on consumers. The business must also (2) provide pre-use notice to all consumers that describes at least the business purpose, inputs, logic, outputs, final outcomes and the opt-out and appeal processes of the ADMT, as well as how decisions would be made in the absence of ADMT. Moreover, businesses must (3) establish opt-out mechanisms that provide two or more easy-to-use methods to submit requests, such as an online form accessible via the pre-use notice and a designated email address.

While applying ADMT, businesses must (4) honor opt-out requests by ceasing to apply ADMT to make significant decisions for the requesting consumers. They must also (5) respond to access requests concerning significant decisions about specific consumers by clarifying the purpose, logic, parameters, outputs, final outcomes or future uses of the ADMT.

After applying ADMT, businesses must (6) handle appeals of significant decisions affecting specific consumers through a human reviewer, who must analyze the ADMT output alongside other relevant information and may revise the significant decision as appropriate. Furthermore, they must (7) perform risk assessments periodically, or in response to material changes in processing activities, to ensure that the business's use of ADMT functions as intended and does not unlawfully discriminate.

Comparison and Implications

The ADMT rules establish safeguards for consumers affected by significant decisions by requiring businesses to be transparent, careful and accountable in their use of ADMT. Covered businesses should begin familiarizing themselves with the compliance steps involved in developing or acquiring any ADMT, recognizing that compliance will position them to serve consumers responsibly while leveraging the technology.

Compared with the closest state analogue, Colorado's AI law governing high-risk AI systems used in consequential decisions, the ADMT rules offer more extensive consumer rights and specifically cover profiling rooted in privacy concerns. The ADMT rules may thus serve as a model for other states seeking to regulate AI in ways that ensure fairness, transparency and accountability for consumers.

For More Information

If you have any questions about this Alert, please contact Agatha H. Liu, Ph.D., any of the attorneys in our Artificial Intelligence Group or the attorney in the firm with whom you are regularly in contact.

Disclaimer: This Alert has been prepared and published for informational purposes only and is not offered, nor should be construed, as legal advice. For more information, please see the firm's full disclaimer.