Maryland Legislature Advances Age Appropriate Design Code to Governor’s Desk

On April 6th, the Maryland Legislature passed the Maryland Age-Appropriate Design Code (“MD AADC”), sending the bill to Governor Moore’s desk. If signed, the MD AADC would take effect on October 1, 2024, with data protection impact assessments required by April 1, 2026.

Like Connecticut before it, MD AADC seeks to adopt many of the protections that exist in the California AADC while avoiding the pitfalls that have resulted in a preliminary injunction against the California AADC. Below, we address the scope and high-level requirements of the bill before diving into the significant differences between the MD AADC and the California AADC.

Scope

MD AADC applies to “covered entities” offering online products that are either accessed by or reasonably likely to be accessed by children under 18 (hereinafter, we’ll refer to children under 18 as “minors”).

Covered entities are legal entities that:

  1. Are organized for the profit or financial benefit of their shareholders or other owners;

  2. Collect consumers’ personal data (or use another entity to collect such data on their behalf);

  3. Determine the purposes and means of processing consumers’ personal data, alone or jointly with affiliates or subsidiaries;

  4. Do business in the State of Maryland; and

  5. a. Have annual gross revenues in excess of $25 million;

    b. Annually buy, receive, sell, or share the personal data of at least 50,000 consumers, households, or devices, alone or in combination with their affiliates or subsidiaries, for the covered entity’s commercial purposes; or

    c. Derive at least 50% of their annual revenues from the sale of consumers’ personal data.

Covered entities include (1) entities that control or are controlled by a business that shares a name, service mark, or trademark that would cause a reasonable person to understand that two or more entities are commonly owned; and (2) a joint venture or partnership composed of businesses in which each has at least a 40% interest in the joint venture or partnership.

Like the CCPA definition of “business,” at first blush, Maryland’s “covered entities” may appear to include only for-profit businesses. However, the wording of “organized for the profit or financial benefit of its shareholders or other owners,” leaves open the possibility that some not-for-profit organizations—such as trade associations—may be pulled into scope.

The scope is limited to online products accessed by or reasonably likely to be accessed by minors. Whereas Connecticut’s youth online privacy and safety updates to its data protection law eschewed this concept in favor of a knowledge standard (that is, it applies to products and services where a company has actual knowledge of minor users or “willfully disregards” that its users are minors), Maryland largely adopts California’s “likely to be accessed by children” standard, where children are consumers under 18. However, Maryland adds a more traditional constructive knowledge criterion. That is, under Maryland’s bill, an online product is “reasonably likely to be accessed by children” (i.e., minors) when:

  1. The product is “directed to children” as defined under COPPA;

  2. The product’s audience is determined, based on competent and reliable evidence regarding audience composition, to be routinely accessed by a significant number of minors;

  3. The product is substantially similar to a product that is routinely accessed by a significant number of minors;

  4. The product features advertisements marketed to minors;

  5. The covered entity’s internal research findings determine that a significant portion of the product’s audience is composed of children; or

  6. The covered entity knows or should have known that a user is a child.

Although we still don’t know what a “significant” minor audience looks like in practice, these factors, taken together, may be applied by analogizing to the COPPA factors and COPPA enforcement actions (albeit, with an aged-up lens to capture content, language, celebrities, and advertising appealing to teens), as well as more traditional “constructive knowledge” precedent.

High Level Requirements

Overall, MD AADC’s substantive requirements are nearly identical to California’s AADC requirements. They include: increased documentation requirements; transparency requirements; data minimization and restrictions on processing; and product design changes.

Data Protection Impact Assessment Documentation

Like California’s DPIA requirement, Maryland requires in-scope companies to prepare DPIAs by April 1, 2026, for any existing online products that are likely to be accessed by minors, and to complete DPIAs for products launched after that date. There is, however, no deadline by which DPIAs for these post-April 1, 2026, products must be completed.

The DPIAs must identify the purpose of the online product and how the product uses minors’ data. The DPIAs must further assess whether any of the following results in material risks to minors:

  1. Any data management or processing practices of the online product that could lead to minors experiencing or being targeted by contacts;

  2. Any data management or processing practices of the online product that could lead to minors participating in or being subject to conduct;

  3. Any data management or processing practices of the online product that could lead to minors becoming party to or exploited by a contract through the online product;

  4. The online product uses system design features to increase, sustain, or extend the use of the online product (including automatic playing of media, rewards for time spent, and notifications);

  5. Any collection and processing of minors’ personal data (including how the data is processed and for what purposes) by the online product;

  6. Any data management or design practices that are revealed by data collected to understand the experimental impact of the product (including how the data collected reveals the data management or design practices);

  7. Algorithms used by the online product; and

  8. Any other factor that may indicate the online product is designed in a manner inconsistent with the “best interests” of minors.

The material risks that must be evaluated are:

  1. Reasonably foreseeable risks of material physical or financial harm to minors;

  2. Reasonably foreseeable and extreme psychological or emotional harm to minors;

  3. Highly offensive intrusions on minors’ reasonable expectations of privacy; and

  4. Discrimination against minors based on race, color, religion, national origin, disability, gender identity, sex, or sexual orientation.

The DPIAs must further include a description of steps that the covered entity has taken and will take to comply with the duty to act in a manner consistent with the best interests of children.

Companies may use existing DPIAs (and add the relevant assessments above to those DPIAs), including any DPIAs created to comply with the UK’s Age Appropriate Design Code, but must maintain the DPIA and update it within 90 days of any material change.

Within 5 days of a written request, companies must provide a list of their DPIAs to the Consumer Protection Division of the Maryland Office of the Attorney General and provide the complete DPIA within 7 days of receiving a written request (with a possible 7-day extension).

Transparent Communications to Minors

Companies are required to provide privacy information, terms of service, policies, and community standards concisely, prominently, and in clear language suited to the age of the children likely to access the online product. This is likely more important for teen users, as communications regarding data processing practices for children under 13 are more often directed to parents and guardians.

Further, the bill would require covered entities to first notify minors and their parents or guardians before allowing any person other than a parent or guardian to monitor a minor’s online activity. This may include the monitoring or “tracking” done by advertising and analytics pixels, similar technologies, and screen readers, and it could also include signals to other users when a minor user is “online” or “active.”

Notably, however, covered entities are not required to alert minors when they are monitored or tracked by their parent or guardian.

Data Handling Should Be Governed by the Best Interests of Minors

Throughout the MD AADC, companies are directed to process minors’ personal data in a manner consistent with the “best interests of children.” We’ll discuss the meaning of “best interests of children” in the comparison with California below, but it is important to note that the concept is the guiding duty of care required of covered entities. Accordingly, this bill would require covered entities to default minors’ privacy settings to a high level (unless they can show a lower setting is in minors’ best interests). For covered entities that offer tiered privacy settings to consumers on an opt-out basis, this effectively creates a right of minors to opt into lower levels of privacy.

It further restricts covered entities from:

  1. Processing minors’ personal data that are not reasonably necessary to provide an online product that the child is actively and knowingly engaged with;

  2. Processing minors’ personal data for purposes other than for which the personal data was collected (often referred to as “secondary uses”);

  3. Processing precise geolocation data (within a radius of 1,750 feet) by default unless the data are (a) strictly necessary and (b) processed only for the limited time it’s necessary to provide the online product;

  4. Profiling (which is automated processing of personal data to evaluate, analyze, or predict certain characteristics) a minor by default unless (a) profiling has appropriate safeguards to ensure it is consistent with the best interests of children, and (b) profiling is either necessary to provide the requested online product with which the minor is actively and knowingly engaged or the entity can demonstrate a compelling reason that the profiling is in the best interests of children;

  5. If age estimation is used (and it is not required), processing any personal data for the age estimation that are not reasonably necessary to provide the online product; and

  6. In making a determination of whether an online product is “reasonably likely to be accessed by children,” collecting or processing personal data beyond what is reasonably necessary to make that determination.

Product Design Changes

To round out the MD AADC requirements, there are a few direct product design changes that covered entities are required to incorporate:

  1. As stated above, monitoring or tracking by a person other than a minor’s parent or guardian will need some kind of notice, which may involve in-product notice disclosures and/or a signal indicator that the minor’s activity is being monitored.

  2. Companies must provide “prominent, accessible, and responsive” tools to help minors or guardians exercise their privacy rights and report concerns.

  3. Companies must provide an obvious signal for the duration that precise geolocation is being collected. For example, this might be an icon that is displayed, “lights up,” or changes color while precise geolocation is collected.

  4. Companies must avoid “dark patterns” that cause minors to provide personal data beyond what is reasonably expected to provide the online product, circumvent privacy protections, or take any action the covered entity knows or has reason to know is not in the best interests of children.

Notable differences between MD AADC and the California AADC

In response to the preliminary injunction against California’s AADC, Maryland has made a handful of crucial changes in its version of the AADC. These changes primarily focus on definitions and scoping. We’ve touched on some (e.g., an additional criterion for “reasonably likely to be accessed by children”), but we’ll dive a little deeper here.

No Age Estimation or Content Moderation

The first notable change is that this bill does not require any age assurance and, in fact, limits the processing of any personal data for age estimation to data that are reasonably necessary to provide the online product. The preliminary injunction against California’s AADC was issued, in part, because the court questioned the constitutionality of “age estimation,” which would require companies to collect more data (and potentially more sensitive data) from minors than necessary to provide the product, thereby counteracting the state’s interest in promoting youth privacy. Here, Maryland not only avoids requiring age estimation, but limits data used for age estimation to the pool of data covered entities already have or intend to collect.

In addition, whereas California requires companies to enforce all of their policies (including content moderation policies), a requirement that helped prompt legal challenges over conflicts with federal law, the MD AADC attempts to harmonize with federal law. It explicitly does not require companies to “monitor or censor third-party content or otherwise impact the existing rights and freedoms of any person.”

Clarified Definitions to Avoid Vagueness Challenges

Although Maryland retains a duty to process minors’ information consistently with the “best interest of children,” it has offered a much-needed definition for the term:

A covered entity’s use of the personal data of children or the design of an online product in a way that does not:

  1. Benefit the covered entity to the detriment of children; and

  2. Result in (a) reasonably foreseeable and material physical or financial harm to children; (b) severe and reasonably foreseeable psychological or emotional harm to children; (c) a highly offensive intrusion on children’s reasonable expectation of privacy; or (d) discrimination against children based on race, color, religion, national origin, disability, gender identity, sex, or sexual orientation.

Maryland also defined “dark patterns.” (AUTHOR’S NOTE: “dark patterns” carries some racist/colorist connotations. When we use the term, we are quoting from the legislation, but otherwise advocate for more affirming and conscientious language.) In Maryland, “dark patterns” are any practice identified by the FTC as a dark pattern. In practice, then, “dark patterns” will likely remain practices that are likely to result in FTC deceptiveness claims.

Greater Protections for DPIAs

Unlike California, MD AADC would codify attorney-client privilege and work-product protections for DPIAs, in addition to treating DPIAs provided to the Maryland Office of the Attorney General as confidential and not subject to public disclosure under Maryland’s Public Information Act.

Enables Stealth Parental Monitoring but Provides Companies Flexibility

Neither California nor Maryland would require companies to allow parental monitoring or tracking. If a product does provide this feature to parents and guardians, California requires companies to provide minors an obvious signal that a parent or guardian is monitoring the minor’s online activity or location. Maryland, on the other hand, lets companies decide whether to provide such a signal.

Cure Periods Are More Difficult to Obtain

To obtain Maryland’s 90-day cure period, companies must be in substantial compliance with the statute’s substantive provisions, whereas California requires only substantial compliance with the DPIA provisions.

Conclusion

Violations of the AADC will be subject to enforcement under Maryland’s Consumer Protection Act and considered an “unfair, abusive, or deceptive trade practice.” Civil penalties range from $2,500 per affected minor for negligent violations to $7,500 per affected minor for intentional violations.

The passage of this bill should alert companies that states are continuing to push an online product safety agenda and will experiment with new legislation and more traditional legal concepts to find online safety schemes that are compatible with constitutional protections.

Charlotte Lunday is a Senior Associate at Hintze Law with expertise in COPPA, FERPA, and online safety.

Hintze Law PLLC is a Chambers-ranked, boutique privacy firm that provides counseling exclusively on global data protection. Its attorneys and privacy analysts support technology, ecommerce, advertising, media, retail, healthcare, and mobile companies, organizations, and industry associations in all aspects of privacy and data security.