In September 2025, the California Privacy Protection Agency (CPPA) adopted a series of additions to the regulations supporting the California Privacy Rights Act (CPRA). These new rules require businesses to protect California consumers when they deploy automated decisionmaking technology (ADMT), with effect from January 1, 2027. In this article, we look at the implications of the rules, their impact on common use cases, and how institutions can prepare.
The CPPA defines ADMT as “any technology that processes personal information and uses computation to replace human decisionmaking or substantially replace human decisionmaking.”1 The new rules apply when a business uses ADMT to make a ‘significant decision’ that “results in the provision or denial of financial or lending services, housing, education enrollment… or healthcare services.”
The rules set out a range of new obligations such as risk assessments, pre-use notices, consumer opt-out, data access requests and appeals. Pre-use notices must use plain, specific language, describing any alternatives available to consumers who decline automated decision-making.
The transparency and control mechanisms of the CPRA ADMT rules represent a major shift from traditional ‘notice-and-consent’ models toward proactive governance and consumer empowerment. Many financial institutions will be subject to the updated regulations, given the expansion of algorithmic decisionmaking in areas such as loan approval, fraud detection and credit scoring.
The CPRA exempts financial institutions from its requirements where the collection, processing and disclosure of personal data is limited to ‘nonpublic personal information’ (NPI), because that information is already regulated under the federal Gramm-Leach-Bliley Act (GLBA). NPI includes any personally identifiable financial information that a financial institution obtains about an individual in connection with a financial product or service, as long as that information is not otherwise publicly available. This covers information provided by a consumer to a financial institution, information derived from transactions, and any other information obtained while providing a financial product or service.
Most ADMTs used by financial institutions collect and process NPI. However, they often also require additional information that falls outside the scope of NPI, such as geolocation data, marketing profiles, behavioral analytics and web tracking. When designing their ADMT compliance solutions, financial institutions should map each system’s data requirements to determine whether the information falls within the boundaries of the GLBA exemption.
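By way of illustration, the sketch below screens a use case’s data inputs against an assumed set of NPI categories and flags anything that may fall outside the GLBA exemption. The category names and the simple two-bucket classification are assumptions made for illustration only; actual NPI determinations require legal and compliance review.

```python
# Illustrative only: the category names below are simplifying assumptions,
# not legal definitions of NPI under GLBA.
NPI_CATEGORIES = {"application_data", "account_history", "transaction_data"}

def outside_glba_exemption(data_inputs: set[str]) -> set[str]:
    """Return the data categories a use case relies on that are not NPI and
    may therefore bring it within scope of the CPRA ADMT rules."""
    return {category for category in data_inputs if category not in NPI_CATEGORIES}

# Example: a credit-scoring model that blends transaction history with web tracking
flagged = outside_glba_exemption({"transaction_data", "web_tracking"})
print(flagged)  # {'web_tracking'} -> hybrid data set, likely outside the exemption
```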
Furthermore, the updated CPRA regulation explicitly defines rules for any ADMT that makes financial decisions about consumers, so in this regard financial institutions cannot rely on the GLBA exemption to shield them from the new obligations.
Common financial sector ADMT use cases
Several use cases straddle the GLBA exemption line. In the table below, we present various examples alongside a brief characterization of the potential impact of the CPRA’s new rules and some selected actions for financial services firms to consider.
These use cases demonstrate that as financial institutions increase their use of AI tools in their servicing operations, GLBA provides less air cover under the new CPRA regulations. Because many decisions combine automated decisionmaking with hybrid data sets, these use cases will fall squarely within the scope of the new regime.
We anticipate that California will serve as a model for other states, as they define their own legal and regulatory obligations for the use of AI across the financial services sector. Financial institutions should consider treating the California obligations as a foundation for establishing robust AI governance and transparency. Building a strong foundation now will enable organizations to more easily pivot to fit different legal frameworks as state regulations continue to evolve.
Preparing for CPRA’s new ADMT rules
To fulfill the upcoming requirements, financial institutions must shift from a minimalist notice-and-consent model to a full-fledged operational governance framework with documented oversight and greater transparency around the use of automated decision systems.
Well before January 1, 2027, financial institutions will need to set a series of preparations in motion. For example, businesses will need documented evidence of compliance, starting with a comprehensive ADMT/decisioning inventory that flags each system’s risk level and compliance obligations.
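As a purely illustrative sketch of what such an inventory entry could capture, the fields below are assumptions rather than a prescribed schema; each institution will define its own.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

@dataclass
class ADMTInventoryEntry:
    """One row in an illustrative ADMT/decisioning inventory (field names are assumptions)."""
    system_name: str
    business_owner: str
    decision_type: str                      # e.g. "credit scoring", "fraud detection"
    significant_decision: bool              # does it result in a 'significant decision'?
    data_categories: list[str] = field(default_factory=list)
    uses_non_npi_data: bool = False         # relies on data outside the GLBA exemption?
    risk_tier: str = "unassessed"           # e.g. "high", "medium", "low"
    pre_use_notice_published: bool = False
    opt_out_channels: list[str] = field(default_factory=list)
    appeal_channel: str = ""
    last_risk_assessment: Optional[date] = None

    def in_scope(self) -> bool:
        # Flag systems that make a significant decision and are not fully
        # covered by the GLBA exemption.
        return self.significant_decision and self.uses_non_npi_data

entry = ADMTInventoryEntry(
    "loan-approval-admt", "Consumer Lending", "credit scoring",
    significant_decision=True,
    data_categories=["transaction_data", "web_tracking"],
    uses_non_npi_data=True,
)
print(entry.in_scope())  # True -> risk assessment and pre-use notice required
```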
They will also need to set up consumer-facing privacy solutions, designed and tested against the relevant use cases. For example:
- Pre-use notice to consumers – This should explain the use of AI for decisionmaking and set out consumers’ rights.
- Frictionless consent opt-out – This should offer at least two easy methods to opt out of ADMT use, aligned with how the business normally interacts with customers. These might include an online form or clearly labeled link (e.g. ‘Opt-out of Automated Decisionmaking Technology’) on the website or app; and another accessible opt-out channel, such as by email or in-person form filling. Cookie banners do not qualify, as they relate to data collection, not decisionmaking.
- Appeals mechanism – This is the process that allows consumers to challenge, or request human review of, an automated decision that has had a significant impact on them. It might run via Data Subject Access Request (DSAR) forms or a published phone number that connects the customer with a customer service agent (a simple request-routing sketch follows this list).
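At a very high level, the sketch below illustrates how opt-out and appeal requests received through these channels might be recorded and routed for action. The channel names, queues and data fields are assumptions for illustration, not requirements drawn from the regulation.

```python
import uuid
from datetime import datetime, timezone

# Illustrative request router; channel names and queues are assumptions.
opt_out_register: list[dict] = []
human_review_queue: list[dict] = []

def handle_consumer_request(consumer_id: str, request_type: str, channel: str) -> dict:
    """Record an ADMT opt-out or appeal and route it to the right workflow."""
    record = {
        "request_id": str(uuid.uuid4()),
        "consumer_id": consumer_id,
        "type": request_type,              # "opt_out" or "appeal"
        "channel": channel,                # e.g. "web_form", "email", "phone"
        "received_at": datetime.now(timezone.utc).isoformat(),
    }
    if request_type == "opt_out":
        opt_out_register.append(record)    # downstream systems must stop ADMT use
    elif request_type == "appeal":
        human_review_queue.append(record)  # a qualified human reviewer picks this up
    else:
        raise ValueError(f"Unknown request type: {request_type}")
    return record

handle_consumer_request("C-1001", "opt_out", "web_form")
handle_consumer_request("C-1002", "appeal", "phone")
```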
Financial institutions should also put in place ADMT-related governance policies and defined procedures to cover new and existing ADMT use cases – including risk assessments and pre-launch compliance reviews – as well as response procedures for consumer inquiries, opt-outs or automated decision appeals. They will also need to formulate a process to update, test and revalidate systems periodically or when material changes occur.
Compliance will mean revisiting the definition of roles and responsibilities for managing new ADMT deployments, the introduction or modification of automated decisionmaking systems and for executing the associated risk and compliance assessments required before deployment. It will also mean documenting risk assessments and compliance reviews within a formal register, together with mitigation strategies and plans.
Vendor contracts must be updated to require cooperation in fulfilling the new ADMT obligations. Each institution will need to put in place audit-ready logs, staffing, dashboards, human review chains and contingency plans for ADMT system failures or known model errors such as hallucinations.
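For illustration only, an audit-ready record of an automated decision might capture fields along the lines sketched below; the schema, the hashing of inputs and the field names are assumptions rather than a mandated format.

```python
import hashlib
import json
from datetime import datetime, timezone
from typing import Optional

def log_decision(log: list[dict], system: str, model_version: str,
                 consumer_id: str, inputs: dict, outcome: str,
                 human_reviewer: Optional[str] = None) -> dict:
    """Append an illustrative audit-ready record of an automated decision."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "system": system,
        "model_version": model_version,
        "consumer_id": consumer_id,
        # Hash the inputs so the record is tamper-evident without storing raw NPI in the log.
        "inputs_sha256": hashlib.sha256(
            json.dumps(inputs, sort_keys=True).encode()
        ).hexdigest(),
        "outcome": outcome,
        "human_reviewer": human_reviewer,  # populated when a decision is escalated for review
    }
    log.append(entry)
    return entry

audit_log: list[dict] = []
log_decision(audit_log, "loan-approval-admt", "v2.3.1",
             "C-1001", {"income": 85000, "score": 710}, "approved")
```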
Finally, institutions will need an agreed course of action in the event of problems. For example, they will need to think through playbooks for responding to CPPA challenges or consumer complaints – and make sure that these playbooks can lean on defensible documentation.
How Capco can help
Capco’s Data Security & Privacy capability is uniquely positioned within the firm’s broader Data & Analytics consulting practice to help financial services and energy sector clients answer challenges such as:
- Developing AI inventories and assessing related risks
- Improving data management practices and data security posture
- Exploring graph technologies to modernize data analytics
- Architecting a privacy program to meet regulatory obligations
- Implementing cookie or ADMT banners and universal consent and preference management solutions.
Capco can help embed governance, risk and compliance frameworks that are scalable, explainable and aligned to evolving standards such as the EU AI Act, NIST, ISO and state-specific regulations. Capco partners with best-of-breed technology vendors to automate and integrate data management and privacy operations capabilities – helping our clients balance marketing objectives with regulatory obligations.
References
1 California Privacy Protection Agency, California Consumer Privacy Act Regulations on Automated Decisionmaking Technology (ADMT), adopted September 2025. https://cppa.ca.gov/regulations/ccpa_updates.html