Social Media Regulation Wave Sweeps Across the US

Massachusetts plans to ban social media entirely for children under 14, in what would be one of the toughest youth protection laws in the United States. Competing regulatory models are spreading across the country.

No access to social media for young people; instead, more data collection by corporations and governments. Photo: Simon McGill/Getty Images/ChatGPT

In the debate over protecting minors online, the US state of Massachusetts is turning to one of the toughest instruments yet proposed in the United States. On 8 April 2026, the House of Representatives in Boston passed a bill by 129 votes to 25 that would deny children under 14 access to social media altogether. Teenagers aged 14 and 15 would only be allowed to open or retain accounts with verifiable parental consent. At the same time, platforms would be required to build an age verification system covering both existing and new users.

The Senate still has to approve the measure, while the attorney general has already been tasked with drawing up implementing rules by 1 September. According to the plans, the package would come into force as early as 1 October this year.

The proposal is controversial because it does not affect minors alone. In order to filter out children and teenagers, platforms would in effect have to determine and verify the age of all users. That is the core of the criticism raised by civil liberties groups, privacy advocates and parts of the digital economy. They warn that such a model could ultimately rely on identity documents, biometric procedures or other sensitive forms of proof.

The very broad definition of social media in the draft law is also causing concern. Critics note that, under a strict interpretation, the scope could extend beyond traditional networks such as Instagram, TikTok or Snapchat. Platforms hosting user-generated content such as YouTube, Roblox or even Wikipedia could also be affected. In practice, age gates and access controls would spread across large parts of the internet.

Focus on Youth Protection

The law would also require platforms to disclose how many users pass age verification, how many are denied access for lack of proof and how often access is granted after appeals. It would further prohibit platforms from sharing information on the sexual orientation or gender identity of minors, reflecting how sensitive lawmakers consider the data environment to be.

Supporters justify the measure with youth protection, pointing to real risks of addiction and the now well-documented psychological strain associated with excessive social media use. The Massachusetts House of Representatives explicitly cites harmful content, addictive algorithms and distraction in schools.

At the same time, it remains unclear how a system can protect minors without forcing adults and teenagers into comprehensive identification.

In practice, the approach would create extensive databases of highly sensitive user data. Who would ultimately have access to that information remains an open question. Like most comparable laws in other states, the bill is largely silent on data security and how it would be guaranteed. The proposed penalties, meanwhile, are strikingly high: media reports from Boston cite fines of up to $5,000 per individual violation.

A Nationwide Wave

Massachusetts is not alone in pursuing this course. Rather, it is part of a wave that has now spread across almost the entire country. According to the National Conference of State Legislatures, at least 40 states and Puerto Rico introduced more than 300 bills related to children and social media during the 2026 legislative session. A key element in many of these initiatives is age verification or parental consent. Massachusetts would therefore not be an outlier, but among the most far-reaching, with a clear ban for those under 14 and a comprehensive verification infrastructure.

The House of Representatives in Boston, Massachusetts. Photo: Matt Stone/MediaNews Group/Boston Herald via Getty Images

A similarly strict approach can be seen in Florida. In March 2024, Governor Ron DeSantis signed House Bill 3 (HB 3) into law. It prohibits children under 14 from creating or maintaining accounts on covered platforms. Those aged 14 and 15 may use them only with parental consent.

At the same time, Florida emphasizes the protection of anonymous speech online, without explaining how that commitment can be reconciled with growing monitoring and control. The underlying political logic is similar to that in Massachusetts, although the legal architecture differs and places greater emphasis on specific platform features such as infinite scroll and autoplay. These functions, common across most platforms, are widely seen as particularly harmful and addictive for children and teenagers because they encourage endless consumption of content.

In late November 2025, a federal appeals court allowed the state to enforce the law on a preliminary basis, even though the constitutional dispute has not yet been finally resolved. A dissenting opinion by some judges noted that, in the end, age controls could affect all users, not just minors.

Laws Blocked by Courts

Arkansas and Louisiana illustrate how quickly such laws can run into constitutional limits. Arkansas moved rapidly with its legislation, requiring age verification for all account holders and explicit parental consent for minors. The law referred to digital IDs and official documents as possible verification methods. However, a federal court permanently blocked the measure in spring 2025.

In Louisiana, Act 456, known as the Secure Online Child Interaction and Age Limitation Act, was introduced in 2023. It likewise required social networks to verify the age of account holders and obtain parental consent for minors. In December 2025, a federal judge ruled that the law could not be enforced because it imposed disproportionate restrictions on civil liberties. Both cases show that while the political will is strong in many states, legal challenges often fail on First Amendment grounds and because of unclear statutory design.

Constrained by Technology and Courts

Utah is pursuing a different, though no less stringent, approach. Under its current child protection law, platforms must determine users’ ages in order to identify minors. Stricter privacy settings would then automatically apply. Search engines would not be allowed to index their profiles, while functions such as autoplay, infinite scroll and push notifications would be disabled. Parents would also be given tools to supervise their children’s profiles, including time limits and usage controls.

For now, enforcement has been suspended because it remains unclear whether the policy can be implemented technically without collecting large amounts of data, including reliable location data for all users.

Utah therefore represents a second strand in the American debate. The focus is less on a strict ban for specific age groups and more on a tailored regime for minors that directly affects product design. This raises significant data protection concerns, as both age and location would have to be reliably established to prevent circumvention through VPNs or proxy servers. Without access to GPS data, enforcement would be difficult, which in turn implies storing detailed location histories. Utah highlights the extent of data collection such regulations could trigger.

Parental Responsibility as the Key

In Georgia, the attorney general is currently defending Senate Bill 351 (SB 351) against legal challenges. According to the state, the law simply requires parental consent before platforms can enter into user agreements with children aged 15 and under.

Texas has taken a broader approach with the Securing Children Online through Parental Empowerment (SCOPE) Act, which extends beyond traditional social networks to cover digital services more generally while emphasizing parental responsibility. Providers must offer enhanced protections for minors, limit data collection, restrict targeted advertising and provide parents with control tools. Parents are also granted the right to request access to their children’s personal data or to demand the deletion of profiles.

The Texan model does not impose a ban on social media for teenagers. Instead, it strengthens parents in their supervisory role and provides a framework for overseeing digital services used by children. It not only restricts data collection but also enshrines a right to deletion in law. However, Texas likewise requires companies to verify users' ages, so the same data protection concerns arise here as elsewhere, despite the comparatively balanced approach.

Not least because of the differing approaches across states, the decision in Massachusetts is being closely watched nationwide. The state’s House of Representatives combines three policy strands that are usually separate: a ban for the youngest users, parental consent for older teenagers and a technical infrastructure for age verification affecting all users. Supporters see this as a decisive step to protect children in a digital environment they view as a health and educational risk.

Critics argue that it shifts the cost of youth protection onto the privacy of the entire population. Unlike the Texan model, they also see it as weakening the role of parents.

Judicial and Security Concerns

Another trend is emerging across several states. Not every law withstands judicial scrutiny, and it is often only through court challenges that measures are halted as unlawful or unconstitutional. Beyond legal concerns, questions of data protection and security remain central. Wherever large volumes of data are collected, there is always a risk that sensitive information could fall into the hands of criminals.

Thus, protecting minors becomes a double-edged sword when the state goes too far in its appetite for data collection. The Texan model suggests that strengthening the legal position of parents vis-à-vis tech companies may offer the most effective form of youth and data protection, while minimizing interference with the freedoms of all citizens.