Top 3 CCPA Compliance Priorities for Employers: Cybersecurity Audits, Privacy Risk Assessments, and ADMT Obligations

By Jennifer Ruehr

New California Consumer Privacy Act (CCPA) regulations took effect on January 1, 2026, and may have a significant impact on California employers. Additionally, CalPrivacy disclosed in its 2025 Report that it is ramping up the audit unit within its Enforcement Division to assess businesses’ compliance with the CCPA. If your organization is a “business” subject to the CCPA and collects, uses, or discloses personal information relating to California applicants, employees, contractors, or other workers, the updated regulations add three major workstreams that affect your compliance program today:

Cybersecurity Audits - where processing presents a “significant risk” to consumers’ security

Privacy Risk Assessments - where processing presents a “significant risk” to consumers’ privacy

Automated Decision-making Technology (ADMT) Obligations - when ADMT is used for “significant decisions,” including employment decisions

This post summarizes these requirements for California employers and highlights practical steps you can take now to help your organization stay compliant.

Cybersecurity Audit Requirements

The updated regulations create a formal cybersecurity audit regime for businesses whose processing presents a “significant risk” to consumers’ security. While this is not “HR-only,” employers should take special care because workforce data is often sensitive data and tends to be widely distributed across vendors and internal systems.

Triggering thresholds
The cybersecurity audit obligation applies where your business meets any of the following “significant risk” thresholds:

  • The business derived 50% or more of its annual revenues from selling or sharing personal information in the preceding calendar year; or

  • The business has annual gross revenue exceeding the CCPA threshold (currently $26,625,000) and, in the preceding calendar year, processed either:

o   (A) the personal information of 250,000 or more [California] consumers or households; or

o   (B) the sensitive personal information of 50,000 or more [California] consumers.

When evaluating whether your business meets these thresholds, count all California residents whose personal information is processed, not just workforce members.

Audit requirements

Your audit must assess “how the business’s cybersecurity program: protects personal information from unauthorized access, destruction, use, modification, or disclosure; and protects against unauthorized activity resulting in the loss of availability of personal information.” While your business may have performed some form of cybersecurity audit already, you will need to ensure that it meets the specific requirements under CCPA, including:

Audit independence and qualifications
Audits must be performed by a qualified, objective, independent auditor using accepted professional procedures/standards. The auditor may be internal or external, but independence requirements apply (including independence from those responsible for the cybersecurity program).

Evidence-based audit (not just management assertions)
Audit findings must rely primarily on evidence (documents reviewed, testing performed, interviews conducted), not primarily on management assertions, and must address the 18 controls described in the regulations. Additionally, the report must include the titles of up to three qualified individuals responsible for the cybersecurity program, a certification by the auditor, and the auditor’s detailed findings.

Record retention
Businesses (and auditors) must retain documents relevant to each audit for at least five years after completing the audit.

Audit certification and submission to CalPrivacy

You must submit required certifications to CalPrivacy no later than April 1 of the year following the audit period, in accordance with the phased deadlines described below.

Timing and phased deadlines
The regulations phase in initial compliance deadlines based on annual gross revenue, with initial certification/submission deadlines:

April 1, 2028 – businesses whose 2026 annual gross revenue exceeded $100M as of January 1, 2027; audit period of January 1, 2027, through January 1, 2028

April 1, 2029 – businesses whose 2027 annual gross revenue was between $50M and $100M as of January 1, 2028; audit period of January 1, 2028, through January 1, 2029

April 1, 2030 – businesses whose 2028 annual gross revenue was less than $50M as of January 1, 2029; audit period of January 1, 2029, through January 1, 2030

Audits and certifications are required annually thereafter.

Employer takeaway: Even if Security “owns” this, HR systems (HRIS, ATS, payroll, benefits, timekeeping, monitoring tools, identity/access management for workforce apps) are typically in-scope systems for audit scoping and evidence gathering. Employers should expect increased diligence around access controls, segmentation, vendor oversight, and incident response readiness as audits approach.

Privacy risk assessment requirements

Effective on January 1, 2026, the updated regulations now require privacy risk assessments to be completed where processing presents a “significant risk” to consumers’ privacy. In the employment context, this can be triggered by common HR and talent technology practices, especially where analytics, profiling, monitoring, or AI are used.

When a risk assessment is required (employment examples)

A risk assessment is required for processing activities that present a “significant risk” to consumers’ privacy. The CCPA regulations describe these “significant risk” activities as:

  • Selling or sharing personal information
    Workplace example: using third-party tags/trackers or data-sharing arrangements tied to recruiting sites, careers pages, or employee-facing portals in a way that qualifies as “selling” or “sharing” for cross-context behavioral advertising.

  • Processing sensitive personal information
    Workplace example: collecting government IDs, precise location, certain health-related information, union membership, biometrics, private employee communications, information about covered dependents known to be under the age of 16, or other sensitive categories beyond narrow administrative purposes.

  • Using ADMT for significant decisions
    Workplace example: A “significant decision” includes decisions that result in the provision or denial of employment or independent contracting opportunities or compensation, including:

    • Hiring

    • Allocation/assignment of work and compensation

    • Promotion

    • Demotion, suspension, or termination

  • Automated processing used to infer or extrapolate traits such as intelligence, ability, aptitude, performance at work, economic situation, health (including mental health), personal preferences, interests, reliability, predispositions, behavior, location, or movements, based on systematic observation in a workplace/education setting.
    Workplace example: productivity scoring, “attrition risk” predictions, behavioral scoring from monitoring data, or performance “potential” scoring based on observation/telemetry.

  • Automated inference/extrapolation of the above traits based on a consumer’s presence in a sensitive location (with limited delivery/transportation carve-outs)

  • Training an ADMT for significant decisions  

Employee/contractor sensitive PI exception

A risk assessment is not required for processing sensitive personal information related to employees and contractors for core administrative purposes, such as administering compensation payments, determining/storing employment authorization, administering benefits, providing legally required reasonable accommodations, or wage reporting.

Timing, updates, and submissions

  • Risk assessment requirements came into effect on January 1, 2026.

  • Businesses must be prepared to provide required information to CalPrivacy under the regulations’ submission framework (including initial submissions due by April 1, 2028, for the first risk assessments described in the rules).

  • Risk assessments also need to be revisited and updated per the regulations’ governance approach when processing materially changes.

Employer takeaway: The risk assessment requirements are similar to other privacy assessment requirements from non-US laws, including DPIAs under the EU GDPR/UK GDPR. Review your current risk assessment practices to determine how to integrate HR use cases. If this is a new requirement for your organization, consider developing a shared intake/triage process for “new HR tech,” “new monitoring/analytics,” and “new AI use cases” that can be reviewed by key stakeholders including HR, Privacy, Security, and Procurement. And, make sure to identify someone in the organization who is responsible for ensuring the risk assessments are completed and the necessary information is submitted to CalPrivacy annually.

ADMT obligations for employers (effective January 1, 2027)

When ADMT obligations apply

The ADMT obligations apply when a business uses automated decision-making technology (ADMT) to make a “significant decision” about a consumer. Employment is explicitly within the definition of significant decisions (as described above).

As a practical matter, employers should evaluate whether tools used in recruiting, performance management, and workforce management are functioning as ADMT for significant decisions—especially where outputs (e.g. scores, ratings, recommendations, etc.) are used without meaningful human involvement.

Core obligations

Where ADMT is used for significant decisions, the regulations include obligations such as:

Pre-use notice: businesses must provide notice before ADMT processing occurs, with specific content expectations set out in the regulations

Opt-out rights: consumers may opt out of ADMT, subject to exceptions described in the regulations; in some cases, providing a method to appeal to a human reviewer may be part of an exception/alternative path

Right of access: businesses must respond to requests for information about ADMT used for significant decisions

Employer operational implications

  • HR must be able to explain what tools are used, for which decision points, what data inputs are used, and what the outputs do (and do not) mean.

  • Employers should also coordinate with vendors: if a vendor’s tool is used for significant decisions, the vendor’s documentation and cooperation often become critical to meeting notice, access, and risk assessment obligations.

Employer Takeaway: To achieve compliance efficiently, employers should: (1) conduct an inventory of relevant systems, (2) identify which use cases involve “significant decisions,” (3) implement notice and request-handling procedures, (4) ensure meaningful human oversight where appropriate, (5) thoroughly document associated risks and safeguards, and (6) implement procedures to monitor deployed ADMT and receive internal and external reports of unexpected results.

Practical employer checklist

There is a lot to do to ensure your organization is complying with the new regulations. Here’s how to get started:

1) Start building an inventory of “workplace AI / analytics / monitoring”
Include ATS screening/ranking, assessments, interview analysis, scheduling, productivity scoring, attrition prediction, compensation modeling, and workforce optimization.

2) Map use cases to help classify risk
Identify sensitive PI, systematic profiling sources, inference outputs, location data tracking, data sharing/selling risk, and training uses.

3) Identify key stakeholders to help develop and implement a CCPA workplace governance program
Define who approves new HR tech uses, who owns risk assessments, and who owns ADMT notices and rights requests. Engage with IT and security teams to ensure HR use cases are considered in cybersecurity audit reviews.

4) Initiate risk assessments
Review your existing privacy assessment template (or create one) to ensure it captures: use cases, business purpose, categories of PI/SPI, privacy notices/policies, vendors, ADMT logic information, retention periods (and other operational elements), benefits, privacy risks, safeguards, stakeholders involved, and approvals.

5) Prepare for cybersecurity audit readiness
Engage with key IT and security stakeholders to align HR systems with security controls and evidence requirements; ensure vendor contracts and controls support audit needs.

6) Operationalize worker rights requests
Ensure HR and Privacy teams can respond to CCPA requests that touch workforce systems—especially ADMT-related access and opt-out requests.

Timeline snapshot:

  • Jan 1, 2026: Updated CCPA regulations take effect; risk assessments begin for significant-risk processing

  • Jan 1, 2027: ADMT requirements begin for significant decisions (including employment/compensation decisions)

  • Apr 1, 2028: Early risk assessment submissions/attestations due; earliest cybersecurity audit certifications due for the largest revenue tier

  • Apr 1, 2029 / Apr 1, 2030: Additional phased cybersecurity audit deadlines for lower revenue tiers

Now is an ideal time to scope your workforce data ecosystem and pinpoint the HR and talent technologies and workflows most likely to trigger cybersecurity audits, risk assessments, or ADMT obligations. Doing so will help Privacy, Security, HR, and Procurement align on a clear, phased roadmap well ahead of CalPrivacy’s audit and enforcement ramp-up. Consider scheduling an internal working session this quarter to identify key stakeholders and governance program owners, prioritize the highest-risk use cases, and set near-term milestones for risk assessments and audit readiness. You can also reach out to your Hintze contact if you would like to discuss a practical, phased compliance roadmap tailored to your workforce systems and highest-risk HR use cases.

Hintze Law PLLC is a Chambers-ranked and Legal 500-recognized, boutique law firm that provides counseling exclusively on data protection including AI, privacy, and data security. Hintze attorneys and data consultants support technology, advertising, media, fintech, health, biotech, ecommerce, and mobile industries.

Jennifer Ruehr is Co-Managing Partner at Hintze Law PLLC and co-chair of the firm’s Workplace Privacy Group, Cybersecurity and Breach Response Group, and the Artificial Intelligence and Machine Learning Group. Jennifer advises a diverse range of clients, including leading technology, automotive, airline, telecommunication, publishing, and media companies to innovative start-ups. She provides business-focused, actionable legal advice on privacy program development, AI risk management, and complex data agreements.
