By Sam Castic
This past year has been a busy one for privacy leaders and professionals, and the pace of change underscores that reactive approaches to new laws, regulations, and enforcement actions are not effective ways to build or scale privacy programs. Laws and risks will continue to evolve, and strategically planning and evolving existing privacy programs may be the best way to keep them effective. Below are ten privacy program areas that plans should focus on in the first half of 2024 to help programs adapt to new and emerging risks and US requirements:
1. Pixel and Tracking Technology Governance and Management. Four pixel and tracking technology areas to focus on in H1 2024 include your company's: (1) governance approach, (2) ability to respond to signals, (3) expansion of opt-out rights to relevant states, and (4) technologies used.
For governance, check in on how existing processes are working to maintain an up-to-date understanding of the specific third-party pixels and trackers your company uses, how their purposes are classified (including based on contract terms specifying how those parties can use the data they receive), and what steps must be followed before new ones are added. If you use a cookie preference management solution that scans and helps classify trackers, that's great, but make sure these governance practices are working so that the solution can do its job. Take a look too at what data your company's responsible stakeholders are configuring these technologies to pass, as this can help keep your practices in line with your privacy notice and disclosures, and mitigate the growing risks from increased regulator scrutiny of these technologies and private lawsuits under federal and state privacy and wiretap laws.
Validate that your preference management solutions are properly responding to third-party signals to opt out of sales or targeted advertising. Adjustments may be needed to address new requirements in California and Colorado. For example, when the stay on the California Privacy Protection Agency's (CPPA) regulations ends in March, California's mandate to honor the Global Privacy Control gains additional enforceable requirements for how companies must respond to signals and associate them with known customers. By July 1, companies may need to honor the Global Privacy Control, OptOutCode, and Opt-Out Machine signals if their respective applications to the Colorado Attorney General are approved.
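On the technical side, the Global Privacy Control surface is small: the signal arrives as a `Sec-GPC: 1` request header and is exposed to page scripts as `navigator.globalPrivacyControl`. The TypeScript sketch below is a minimal, hypothetical illustration of detecting the signal and treating it as an opt-out; `applyOptOut` and `associateWithKnownCustomer` are assumed placeholders for your preference management and customer record integrations, not real APIs.

```typescript
// Minimal sketch of detecting the Global Privacy Control (GPC) signal.
// Per the GPC specification, participating browsers send a "Sec-GPC: 1"
// request header and expose navigator.globalPrivacyControl to page scripts.
// applyOptOut and associateWithKnownCustomer are hypothetical placeholders
// for whatever your preference management solution actually provides.

declare function applyOptOut(scope: 'sale_or_share' | 'targeted_advertising'): void;
declare function associateWithKnownCustomer(customerId: string): void;

// Client-side: check the DOM property before firing sale/share trackers.
function gpcEnabledInBrowser(): boolean {
  // The property is not yet in all TypeScript DOM typings, hence the cast.
  return (navigator as any).globalPrivacyControl === true;
}

// Server-side: check the request header (shown framework-agnostically,
// with header names normalized to lowercase).
function gpcEnabledInRequest(headers: Record<string, string | undefined>): boolean {
  return headers['sec-gpc'] === '1';
}

// Illustrative flow: treat the signal as an opt-out of sale/sharing and,
// where required, associate it with a known customer so it persists.
function handleRequest(headers: Record<string, string | undefined>, customerId?: string): void {
  if (gpcEnabledInRequest(headers)) {
    applyOptOut('sale_or_share');
    if (customerId) {
      associateWithKnownCustomer(customerId);
    }
  }
}
```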
If you've taken a state-specific approach to allowing opt-outs of targeted advertising or pixel/tracker-based "sales" of personal data, adjust your preference management solution and processes so that residents of other states can exercise these rights as well, including in Utah by the end of this year, and in Tennessee, Florida, Oregon, and Texas by July 1.
With third-party cookies increasingly restricted by browsers, check in to make sure your program has an up-to-date understanding of the methods and technologies used to send site and app user data to third parties. Validate that the preference management solution and supporting processes you have for opt-outs of sales and targeted advertising cover all the technologies used to pass data for these purposes, whether by cookie, pixel, app SDK, server-to-server integration, or otherwise; one way to check coverage is sketched below. If you've de-scoped some of these technologies based on assessments that laws don't apply to them, confirm that analysis still holds true--for example, the new draft EDPB guidelines may expand what the ePrivacy Directive applies to.
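One way some teams validate this kind of coverage is to route every data-passing mechanism through a single consent gate, so a technology added later can't bypass the opt-out logic unnoticed. The sketch below assumes a hypothetical `getConsentState` function exposed by your preference management solution; the purposes and URL are illustrative only.

```typescript
// Illustrative consent gate: every mechanism that passes user data to a
// third party checks one shared decision, so coverage doesn't drift as
// cookies give way to pixels, app SDKs, or server-to-server integrations.
// getConsentState is a hypothetical stand-in for your preference manager.

type Purpose = 'sale' | 'targeted_advertising' | 'analytics';

declare function getConsentState(purpose: Purpose): boolean;

// Gate client-side tags and pixels on the shared consent decision.
function loadThirdPartyScript(src: string, purpose: Purpose): void {
  if (!getConsentState(purpose)) return; // opted out: never load the tag
  const el = document.createElement('script');
  el.src = src;
  el.async = true;
  document.head.appendChild(el);
}

// Gate server-to-server event forwarding on the same decision, so
// non-cookie technologies aren't silently out of scope.
function sendServerSideEvent(endpoint: string, purpose: Purpose, payload: unknown): Promise<Response> | undefined {
  if (!getConsentState(purpose)) return undefined;
  return fetch(endpoint, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(payload),
  });
}

// Usage (URL is illustrative only):
loadThirdPartyScript('https://adtech.example/pixel.js', 'targeted_advertising');
```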
2. Flows and Uses of Biometric, Health, and Wellness Data. If your company provides products or services related to fitness, wellness, health, gender-affirming care, pregnancy, or dietary restrictions, you may be dealing with "health data" that will soon be regulated by strict new privacy laws in Washington and Nevada, and that is newly regulated in Connecticut. Your company also may be in scope if it deals with precise location data of people visiting stores or locations providing these products or services, or with biometric data, which has a widening definition that includes certain photographs, voice recordings, and keystroke or gait patterns or rhythms. With the Washington and Nevada laws going into effect at the end of March, and Washington's My Health My Data Act (MHMDA) having a broad private right of action, make sure your program has appropriately determined whether these laws apply based on an understanding of: (1) what products and services your company provides that could be health-related; and (2) what other data your company processes that could be construed as health data.
If your company is in-scope for these laws, make sure key compliance processes are in place to: know what health data your company and its vendors process; limit processing to the narrow necessary purposes or meet consent requirements for other purposes; cease "selling" any health data unless you can comply with the challenging authorization requirements; address stricter access and deletion rights including by revisiting any old risk-based decisions based on prior laws; review and limit your own employees' access to health data; and stand up a consumer health privacy policy.
3. Privacy Impact Assessments and Data Protection Assessments. There are three areas to check with your privacy assessment program: (1) triggers, so data protection assessments (DPAs) are conducted when required by new state requirements; (2) DPA process updates to cover what must be assessed under the forthcoming CCPA regulations, along with related approval and reporting processes; and (3) confirmation that DPA and privacy impact assessment (PIA) processes take into account the considerations in the soon-to-be-enforceable CCPA regulations.
Existing comprehensive privacy laws require a DPA to be conducted and documented in a number of circumstances, including when sensitive personal data is processed. On July 1, Oregon's law goes into effect and adds new types of sensitive personal data, including national origin, transgender or nonbinary status, and status as a crime victim (which Connecticut's amended definition also now reflects), so make sure your existing processes identify when this data is involved so that a DPA can be conducted.
The CPPA is considering risk assessment regulations that would require assessments in the circumstances where other states require DPAs, and in additional ones. Specifically, it is considering requiring them in certain cases when personal information will be used to train AI models. The draft regulations have a number of specific questions and topics that must be addressed in documented assessments; while these are similar to what the Colorado regulations require, there are some differences that should be considered. Assessments would also need to be re-reviewed and updated when there are material changes to processing activities and, even absent changes, every three years; this means new processes may be needed to track and trigger re-assessments on this cadence.
Two of the most significant new requirements would be company board reporting and submission of assessments to the CPPA. First, assessments would need to be presented or summarized to the company's board of directors (!) or, for companies without a board, to the highest-ranking executive responsible for related oversight. If this requirement remains in the final regulations, privacy program leaders should think strategically about the processes needed to finalize and surface privacy risk assessments to the board, and they may have an opportunity to bring additional visibility to the topics they work on to senior leaders at their company. Second, abridged forms of assessments may need to be submitted to the CPPA annually, and the highest-ranking executive responsible for oversight would need to make an annual certification. The certification would state that the company has complied with the assessment requirements, that the executive has personally reviewed the assessments, and that in-scope personal information processing occurred only after the assessments were completed. Privacy programs may need significant additional processes to support a company executive in making these certifications, though the current draft gives companies two years from finalization of the regulations before this requirement becomes effective.
While not yet a requirement for PIAs or DPAs, it would be a good idea to validate that PIA processes consider the data minimization and processing purpose limitations contained in the enacted-but-stayed CCPA regulations. These go into effect in March and include detailed guidance and requirements on what processing purposes are reasonably expected and compatible with the context of processing, and on what processing is reasonable and appropriate for those purposes.
4. AI and Automated Decisionmaking Assessments. Privacy programs may already have DPA processes to evaluate AI-related uses of personal data when they amount to profiling or automated decisionmaking under the state privacy laws. If those processes consider all the detailed requirements in Colorado's profiling regulations, they should be a good foundation from which to adapt your processes to the CCPA regulatory requirements for profiling and automated decisionmaking once those are finalized. Based on the latest draft, adjustments may be needed to evaluate the following topics to make sure AI uses stay compliant: (1) how specific disclosures with functioning opt-out mechanisms will be provided before certain AI uses occur; (2) whether opt-out rights need to be provided and, if so, how; (3) the processes that will be used to maintain and provide the extensive information that must be disclosed in response to access requests regarding certain uses of AI; and (4) planned methods to inform individuals when certain decisions are made using AI or other automated decisionmaking technology.
The CPPA's draft risk assessment regulations also have a number of specific requirements for what must be included in an assessment of automated decisionmaking technologies that differ from what the Colorado regulations require. The draft would also require documented assessments to be based on privacy consultations with stakeholders external to your company, or to explain why those consultations did not occur and the safeguards implemented to address the privacy risks of proceeding without them. The first half of 2024 might be a good time to start socializing that choice with your company's AI governance stakeholders.
5. Data Subject Rights. If you haven't looked at your data subject rights fulfillment processes recently, planning a review and potential refinement for H1 2024 would be a good idea. Four areas to focus on include: (1) existing rights fulfillment processes; (2) modifications or new processes needed for health data laws; (3) rights changes if your company discloses data to third parties; and (4) potential new rights for AI and automated decisionmaking.
Check to make sure existing rights fulfillment processes are working as intended, and consider whether any refinements are needed for the CPPA's CCPA regulations that become enforceable in late March. These regulations have detailed requirements about how certain data subject rights must be responded to and fulfilled, including for: (i) communicating deletion requests to service providers and third parties; (ii) addressing the right to correct; (iii) responding to requests to limit use or disclosure of sensitive personal information; and (iv) honoring opt-out preference signals, including associating those signals with known people and persisting preferences for purposes of opting them out of sales or sharing (a simple sketch of persisted preferences follows). They also include obligations service providers have independently when deletion requests are communicated to them by a business. Note that the CPPA is considering additional regulation amendments that would impact how individual rights requests must be responded to, including requiring measures to keep data deleted or corrected, and informing consumers that they can complain to the CPPA or AG when requests are not fulfilled.
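To make "persisting preferences" concrete, here is a minimal sketch of what a durable opt-out record keyed to a known customer might look like, assuming a hypothetical preference store; every name in it is illustrative rather than drawn from the regulations.

```typescript
// Hypothetical shape for a persisted opt-out preference, so a signal
// received in one session keeps being honored for a known customer in
// later sessions and on other devices. All field names are illustrative.

interface OptOutRecord {
  customerId: string;                                  // known-customer identifier
  scope: 'sale_or_share';                              // what the opt-out covers
  source: 'gpc_signal' | 'web_form' | 'email_request'; // how it was received
  receivedAt: string;                                  // ISO-8601 timestamp
}

// In-memory stand-in for a durable preference store.
const optOuts = new Map<string, OptOutRecord>();

function persistOptOut(record: OptOutRecord): void {
  // "Last write wins" is illustrative; your rules may need to prevent an
  // opt-out from being silently overwritten by a later opt-in.
  optOuts.set(`${record.customerId}:${record.scope}`, record);
}

function isOptedOut(customerId: string): boolean {
  return optOuts.has(`${customerId}:sale_or_share`);
}
```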
If your company is in scope for laws like Washington's MHMDA or Nevada's health privacy law, you may be leveraging or modifying existing data rights fulfillment processes to address obligations under those laws. If you do, plan to revisit scoping and risk decisions so they track the different requirements of these laws, and the private litigation risks for non-compliance. For example, deletion exceptions are much narrower under the MHMDA, and there may be elevated risk if unstructured health data is not addressed in response to data subject requests.
If your company discloses personal data to any third parties, update your individual rights procedures so that you can inform requestors of the specific third parties your company discloses personal data to, where you are required to do so. Oregon's comprehensive privacy law will require this by July 1. Washington's MHMDA will require this at the end of March, along with contact information for each recipient, for both third parties and affiliates that your company discloses or sells personal data to.
Finally, have the CPPA's automated decisionmaking regulations on your radar, and consider doing some initial exploration of how the access and opt-out rights could be accommodated for your company's existing and planned AI and automated decisionmaking uses. The current draft imposes access rights with many specific details that must be disclosed to people upon request. It also has several areas where companies may need to let people opt out of automated decisionmaking that go beyond requirements in other states, which tend to focus only on uses involving legal or similarly significant effects. Specifically, the draft considers extending opt-out rights to topics including: (1) employer uses involving employees, contractors, job applicants, or students, including tracking computer use, productivity, or other matters; (2) profiling in publicly accessible places, such as via Wi-Fi, Bluetooth beacons, video recording, audio recording, or other methods; (3) profiling for behavioral advertising; (4) profiling children under the age of 16; and (5) processing personal information to train automated decisionmaking technology.
6. Opt-ins for Sales of Sensitive Data. If your company "sells" any sensitive personal data, determine whether it is in scope for laws that will take effect requiring consent before "selling" such data, such as the Florida and Texas privacy laws. Note too that the FTC is becoming more active in this space, with enforcement actions and warnings in health, financial, and other contexts stating that express consent is required before sharing certain sensitive personal data for unrelated purposes (which sales under the state laws would likely constitute). If your company is in scope, update data collection and preference management processes to obtain and maintain compliant opt-in consent where required. The recommendations above on pixel and tracking technology governance and management will also help confirm that pixels and other technologies your company allows are configured not to "sell" sensitive data to third parties.
7. Customer Journeys and User Interfaces. Planning a proactive review of customer-facing user interfaces in H1 2024 may be a good use of time in light of the CCPA regulations on "dark patterns" and symmetry in choice becoming enforceable. With many of the state privacy laws regulating "dark patterns," and some "dark patterns" being self-evident to site visitors and service users, this is an easy place for regulators to investigate in connection with enforcement sweeps. Consider scoping a review of the locations where personal data and related opt-in consents are collected, individual rights can be requested or exercised, and privacy-related preferences are registered. Comparing what you find against a simple checklist of practices that look like dark patterns--drawn from FTC guidance and the state privacy laws and regulations--may be a scalable way to approach the review.
8. Privacy Notices. Unless your company completed a privacy notice review in the second half of 2023, plan one for the first half of 2024. Annual reviews and updates are required by privacy laws like the CCPA, and the CCPA regulations that become enforceable at the end of March have additional requirements for privacy notices that you may have deferred while those regulations were stayed. If your company is required to disclose individual rights metrics under the CCPA, review the CPPA regulations for new details that must be disclosed, and plan to provide these in your privacy notice or otherwise. It would also be a good idea to see what your privacy notices say about biometric and health data, as stating that your company processes such information may suggest it is in scope for the Washington MHMDA and other new state consumer health privacy laws. If your company is in scope, make sure you have a plan for releasing a compliant consumer health privacy policy.
9. Customer and Vendor Contracts and Processes. There are two vendor and customer contract areas that should be explored: (1) template updates; and (2) assurance that contracts have appropriate privacy provisions.
If your company has not already done so, update business customer and vendor contracting templates and processes to include the additional provisions required by the CCPA regulations that go into effect at the end of March. Further requirements for contracts with service providers and contractors are being considered in connection with the CPPA's cybersecurity audit regulations and privacy risk assessment regulations.
Template updates may also be desirable to reflect recent privacy developments and emerging risks. For example, if your company relies on or accepts the EU-US Data Privacy Framework for cross-border data transfers, verify this is reflected in your templates and processes. If your company says it does not allow vendors to use personal data they process on its behalf to train AI models, or otherwise doesn't want them to, consider how this is addressed in your contracting processes. The discussion draft CPPA automated decisionmaking regulations suggest the CPPA is considering requiring companies to let people opt out of having their personal data used for AI model training, so starting to think through contractual approaches to accommodate or limit the scope of this potential obligation in the first half of 2024 may save some work down the road.
For assurance, consider checking in on the status of projects to make required contract updates to legacy contracts that may have been de-prioritized due to resource or time limits. It also may be prudent to do some basic validation that processes and controls are working to obtain required contractual terms with vendors that process personal data on your company's behalf. The illustrative CCPA enforcement examples on the AG's website note several actions where deficient contracts were at issue, and in light of the clear requirements in the statute and regulations, it may be challenging to defend gaps that result in non-compliant contracts to a regulator. If your company is covered by the Washington MHMDA, make sure all vendors dealing with health data have some form of contract in place that makes them a "processor" under the law, unless you plan to comply with the strict requirements for "sharing" such data with third parties; data you may have deemed too low-risk and non-sensitive to justify personal data processing contract terms carries a different risk calculation under a law with a broad private right of action.
10. Internal Policies and Standards. Consider whether any changes to internal policies or standards are needed to address changes in law or emerging privacy risks. For example, with recent amendments to the CCPA definition of sensitive personal information, the Washington MHMDA and Nevada health privacy law, and differing definitions across the comprehensive state privacy laws, do your company's internal data classification standards appropriately classify all of the different types of personal data and sensitive personal data? Are policies and standards on the use of AI in applications involving personal data current and sufficient? Based on the recommendations above, consider planning a review of other internal policies and standards that relate to your privacy program and your ability to achieve privacy compliance and objectives, including for: (1) vendor contracting and risk management; (2) privacy impact and data protection assessments; (3) pixels and tracking technologies on sites, apps, and services; (4) personal data management and governance; and (5) responsible AI and automated decisionmaking.
Privacy programs can be enhanced and right-sized to achieve objectives while navigating new and emerging risks and US requirements. By focusing on the ten privacy program areas above in the first half of 2024, privacy leaders can take a strategic, rather than reactive, approach to keeping their programs effective, compliant, and successful.
A version of this was published by Law360 on January 1, at https://bit.ly/4aK19oa.
Sam Castic is a partner with Hintze Law with 15 years of global privacy and cybersecurity experience.
Hintze Law PLLC is a Chambers-ranked, boutique privacy firm that provides counseling exclusively on global data protection. Its attorneys and privacy analysts support technology, ecommerce, advertising, media, retail, healthcare, and mobile companies, organizations, and industry associations in all aspects of privacy and data security.