Workplace Privacy – 5 Things I’m Keeping in Mind for 2025

By Jennifer Ruehr

Many of us are returning to work this month with New Year’s resolutions, predictions, and lists top of mind and top of inbox.  As I turn back to work, I’m thinking ahead to how U.S. laws and regulations will impact my clients from a workforce perspective.  Here’s what I’m focused on right now: 

  1. Fair Credit Reporting Act 

  2. State law AI requirements 

  3. Biometrics in the workplace 

  4. Genetic data risk 

  5. Workplace monitoring 

Fair Credit Reporting Act (FCRA) 

In October, the Consumer Financial Protection Bureau (CFPB) issued “Consumer Financial Protection Circular 2024-06” detailing its position on employers’ use of “background dossiers, algorithmic scores, and other third-party consumer reports about workers” and compliance with FCRA. 

Employers are generally familiar with FCRA with respect to conducting background checks.  However, this new Circular identifies other activities that could trigger FCRA compliance, including using third-party tools or providers to “…monitor workers’ sales interactions, to track workers’ driving habits, to measure the time that workers take to complete tasks, to record the number of messages workers send and the quantity and duration of meetings they attend, and to calculate workers’ time spent off-task through documenting their web browsing, taking screenshots of computers, and measuring keystroke frequency.”  

Employers should review their current use of third-party tools and providers to determine if they have additional FCRA obligations, especially if the employer (or its provider on the employer’s behalf) intends to analyze worker data to generate worker assessments or productivity scores, or to determine an employee’s risk to the employer. 

State Law AI Requirements 

A couple of states already have workplace AI laws in place, including Illinois and Maryland (both related to AI and video interviewing).  And New York City implemented Local Law 144 to govern the use of automated employment decision tools. 

In 2024, we saw new legislation from states that will have an impact on employers who use automated decision-making tools within their businesses. Illinois passed HB3773 (Limit Predictive Analytics in Employment), which amends the Illinois Human Rights Act to address the use of AI in the workplace and takes effect on January 1, 2026. Colorado recently passed the Colorado AI Act, which will come into effect February 1, 2026. The California Privacy Protection Agency is moving forward with finalizing its rules on risk assessments and automated decision-making under the California Consumer Privacy Act.  And California’s Civil Rights Council is nearing finalization of its own updated regulations for automated decision-making activities in the workplace. 

While there is some variation in the specific requirements, employers will need to be prepared to identify where they are using automated decision-making technologies in the workplace that are a primary or substantial factor in decision-making, especially in these areas: 

  • Hiring 

  • Allocation of work, training, salary, assignment, compensation, or benefits 

  • Promotions 

  • Demotion, discipline, suspension, or termination 

  • Other conditions of employment 

This will help employers prepare for the coming requirements which include: 

  • Notice 

  • Opt-out (with some exceptions) 

  • Audits 

  • Risk assessments 

  • Adequate contracting practices 

  • Incident response plan and procedure updates (e.g., Colorado requires employers to notify the Attorney General within 90 days of discovering algorithmic discrimination) 

Biometrics Laws 

Illinois tends to be the state law most companies focus on, and for good reason.  The Biometric Information Privacy Act (BIPA) has frequently been applied against employers.  Texas and Colorado also have biometric privacy laws in place.  Colorado recently passed an amendment to the Colorado Privacy Act that is directly applicable to employers.  And the Colorado Attorney General recently released updated regulations addressing employer requirements, which include details on the necessary “Biometric Identifier Notice.” 

Employers will want to assess their current biometric notices to determine if any updates are necessary to meet Colorado’s requirements.  

Genetic Privacy

Illinois also has a law related to genetic privacy, the Genetic Information Privacy Act (GIPA).  In addition to GIPA, the EEOC enforces the federal Genetic Information Nondiscrimination Act (GINA).  Both laws prohibit employers from requiring employees to disclose their genetic information to the employer (subject to limited exceptions).  

Similar to BIPA, GIPA permits a private right of action, with heavier penalties than BIPA ($15,000 for each intentional violation and $2,500 for each negligent violation).  And litigation is on the rise, especially related to pre-employment (or during-employment) medical testing for either employment generally or employer-based life insurance.  The litigation typically stems from a provider asking for family medical history during the medical exam.

Employers will want to review their (or their providers’) practices to determine whether family medical history questions are included in their questionnaires. 

Employee Monitoring

Employee monitoring is becoming more complex for U.S. employers due to changes in technology, the law, and the workforce, especially with the shift to remote work. There are many benefits to monitoring technologies, including protecting corporate data and networks, analyzing productivity and workspace use to help improve efficiency, and supporting training and coaching.  But if not implemented correctly, these tools can create significant business risks, such as anti-eavesdropping and wiretapping exposure, and the risk of violating existing federal and state laws protecting worker organizing and workers with disabilities, as well as other anti-discrimination laws.  

In addition to state regulators, federal regulators are paying attention.  The Equal Employment Opportunity Commission (EEOC) just released guidance on the use of wearables in the workplace and previously advised on the use of AI in the workplace. The CFPB released its Circular (see #1 above) on how the use of monitoring tools and AI could trigger FCRA. The National Labor Relations Board (NLRB) released a memo warning employers that the use of electronic monitoring could interfere with “Section 7” rights. And the FTC’s 2022 advance notice of proposed rulemaking specifically addressed questions related to employer “surveillance.”  

When implementing any monitoring software, employers will want to consider many of the items discussed in this post, including AI and biometrics.  It is also a good idea to consider how much of the data is necessary on an identifiable basis, when it is needed, and who should have access.  

Next Steps 

There are a few steps companies can take now to address some of these risks.   

  • Reach out to your contacts within your HR, Sales, and IT teams and keep notes on the tools they use and why they use them.   

  • Consider drafting guidelines for these teams on implementing new technologies, especially those that will use AI, record or transcribe employees’ communications, and/or monitor employee activity, and identify use cases for when additional privacy review will be required.   

  • Keep a checklist of triggers for risk assessments, audits, notices, and other legal requirements so when these activities cross your desk, you have a quick way to assess whether a deeper review is required.   

  • Finally, be curious!  If you are internal to the company, you are likely subject to many of these tools and processes as part of your own work.   

Jennifer Ruehr is Co-Managing Partner at Hintze Law PLLC and co-chair of the firm’s Workplace Privacy Group, Cybersecurity and Breach Response Group, and the Artificial Intelligence and Machine Learning Group.

 

Hintze Law PLLC is a Chambers-ranked and Legal 500-recognized, boutique law firm that provides counseling exclusively on privacy, data security, and AI law. Its attorneys and data consultants support technology, ecommerce, advertising, media, retail, healthcare, and mobile companies, organizations, and industry associations in all aspects of privacy, data security, and AI law.