In mid-to-late April 2026, Roblox settled with Alabama, Nevada, and West Virginia over allegations that its interactive gaming platform exposed children and teens to harmful content and predatory users. The three separate settlement orders will require Roblox to pay a total of $33 million to the three states and to implement robust privacy and online safety controls. Note that, at this time, Alabama, Nevada, and West Virginia have not provided draft complaints, and the press releases do not identify the specific laws Roblox allegedly violated that led to the settlements.
Roblox faces similar scrutiny in Iowa, Kentucky, Nebraska, Tennessee, Texas, Florida, Louisiana, and Los Angeles County, as well as suits from private plaintiffs. Those complaints generally allege that Roblox violated the states’ consumer protection laws because it misrepresented, and had insufficient, controls related to chat safety and content moderation.
Roblox will be required to take the following measures under the Alabama, Nevada, and West Virginia orders. While not all of these requirements appear in each state’s order, many are similar, and we describe them collectively. Note also that, at this time, the settlement order with West Virginia is not yet available; only a summary of its measures is provided in the press release issued by the West Virginia Attorney General.
Conducting heightened age assurance prior to accessing chat functionality: Roblox currently requires all users to self-report their age prior to accessing the platform. Roblox will be required to take heightened age assurance measures before enabling users to access chat functionality (Roblox began rolling out age checks via facial age estimation and ID verification late last year). Roblox also agreed to continue its practice of behavioral monitoring to assess whether a user’s actual age differs from the age they self-reported or that was estimated.
Implementing chat safety measures: The orders impose detailed obligations on when adults can communicate with minor users, requiring parental consent for users under 13 and other approval mechanisms for users under 16 (the Nevada order also seeks to protect users 16 to 17 years of age, obligating Roblox to “take steps” to address harms they may encounter). Roblox will also be required to surface an alert about the dangers of communicating with strangers each time a user under 18 enters a private chat with another user. To help law enforcement act on illegal interactions on the platform, Roblox agreed not to encrypt messages between minor users and other users.
Providing an age-gated minor experience: To address minors’ access to inappropriate or unsafe content, Roblox will be required to create a default minor-safe experience for users under 16 and for users whose age has not been verified. The orders will require parental consent for users under 16 to access more mature experiences.
Assigning content maturity ratings for experiences: Roblox will be required to assign a content maturity rating to each experience, or to ensure that developers publishing experiences on its platform do so. Roblox will need enforcement mechanisms in place to moderate developers that do not accurately represent their experiences and will be required to publish (among other public reporting requirements) the “statistics and measures” taken to address developer violations of Roblox policies.
Maintaining parental controls: The orders do not require Roblox to ensure that parents have accounts linked to their minors’ accounts; however, Roblox is required to take steps to increase the adoption of linked accounts. The settlement orders also obligate Roblox to maintain parental controls (e.g., Alabama requires controls for limiting how much time a minor spends on the platform and for setting spending limits).
Restricting personalized advertising and push notifications: The orders restrict Roblox from serving personalized advertising on the platform to children under 16, although it is unclear whether this is a full prohibition or whether parental consent can be provided for these users to receive ads. The orders also set detailed requirements for when push notifications and other notifications can be sent to children under 16.
Conducting a public awareness campaign: The order with Nevada will require Roblox to conduct a multi-media public awareness campaign about online safety, including its parental controls and age assurance practices.
Key Takeaways: These orders follow a broader trend of increased scrutiny of child and teen online safety and of experiences that appeal to minors, especially those that permit engagement between users. As stated in the Alabama Attorney General’s press release, the settlement with Roblox “sends a clear message to every platform operating in this space” that regulators will continue to “aggressively enforce” child and teen online safety.
The obligations in the settlement orders resemble many of the obligations placed on online operators under the new age-appropriate design and minor privacy and safety laws in Colorado, Vermont, Nebraska, Arkansas, and New York, particularly around communication limitations between users, parental controls, limits on personalized advertising, and when notifications can be sent. The orders also introduce net new obligations on Roblox not specifically required under any current law, such as the bans on encrypted messages with minors imposed by Alabama and Nevada.
With the new COPPA regulations having taken effect in April; the newly effective and soon-to-be-effective state laws (New York, effective June 20, 2025; Colorado, effective October 1, 2025; Nebraska, effective January 1, 2026, with additional amendments taking effect July 17, 2026; Arkansas, effective July 1, 2026); and other recent state enforcement activity related to child and teen online safety under consumer protection laws, we expect continued regulatory enforcement activity in this space.
If you haven’t already, assess whether your organization’s online user experience is likely to be considered appealing to children and teens, or to have known users who are children and teens. Organizations offering such experiences should devote significant resources and time to compliance in this increasingly complex space, including, potentially, developing processes and mechanisms to obtain parental consent, building or leveraging third-party tools for age assurance, and implementing vendor and third-party management processes and tools to ensure that data isn’t shared or sold in a manner that triggers additional consent and other obligations.
Hintze Law PLLC is a Chambers-ranked and Legal 500-recognized boutique law firm that provides counseling exclusively on data protection, including AI, privacy, and data security. Hintze attorneys and data consultants support the technology, advertising, media, fintech, health, biotech, ecommerce, and mobile industries.
Emily Litka Sanford is a Senior Associate at Hintze Law PLLC. She focuses her practice on global privacy and emerging AI laws and regulations. She regularly counsels on risk during product development, the development and operationalization of privacy programs, the preparation of data protection impact assessments, and the development of internal privacy policies and processes.
