Concerns about social media causing damage to teens have prompted some lawmakers to call for rules around age verification and parental consent. Photo / Antonio Guillem, Shutterstock
Social media platforms are struggling to navigate a patchwork of US state laws that require them to verify users’ ages and give parents more control over their children’s accounts.
States including Utah and Arkansas have already passed child social media laws in recent weeks, and similar proposals have been put forward in other states, such as Louisiana, Texas and Ohio.
The legislative efforts are designed to address fears that online platforms are harming the mental health and wellbeing of children and teens amid a rise in teen suicide in the US.
But critics — including the platforms themselves, as well as some children’s advocacy groups — argue the measures are poorly drafted and fragmented, potentially leading to a raft of unintended consequences.
One senior staffer at a large tech company who leads its state legislative policy described the patchwork of proposals as “nightmarish [and] nonsensical, if not Kafkaesque”.
“Being able to prepare for this with confidence is a Herculean task,” the person said, describing it as an “engineering lift”. The person added that their legal teams were thrashing out how to interpret the various rules and their associated risks.
There is a growing body of research linking heavy use of social media by children and teens to poor mental health, prompting demands to better protect children from toxic content.
Republican Utah state representative Jordan Teuscher, who was the House sponsor of the state’s bill, said that it was created in response to a number of studies showing “some really devastating effects of social media” on teens.
“We strongly believe that parents best know how to take care of their own children. It was parents coming to us saying ‘I need help’,” he said of the decision to introduce the legislation, which is set to come into force in March 2024.
The Utah law requires social media platforms to verify the age of all state residents and then obtain parental consent before allowing under-18s to open an account. In addition, platforms must grant parents access to those accounts, and they are banned from showing minors ads or targeted content.
Governments and regulators around the world are racing to introduce legislation, with both the UK’s Online Safety Bill and the EU’s Digital Services Act compelling social media companies to shield children from harmful content.
In the US, senators Marsha Blackburn, a Republican, and Richard Blumenthal, a Democrat, have introduced a new federal proposal, the Kids Online Safety Act, which would place a duty of care on platforms to protect children. Earlier this year, Republican senator Josh Hawley also introduced a bill that would enforce a minimum age of 16 for social media users.
Social media platforms and experts agree that federal laws would be the most effective way to impose a uniform nationwide standard. But in the meantime, the smattering of state laws now emerging has forced the platforms to scramble to adapt.
States taking action on the issue have diverged into “two lanes”, said Zamaan Qureshi, the co-chair of a youth coalition advocating for safer social media for young people.
In one, several Democratic-led states, such as California, have been focused on regulation that aims to “force technology companies to make design changes to their products to better protect minors”, he said. In the other, a greater number of Republican states have focused on the role of parents.
One common theme among the Republican state lawmaking efforts is a requirement for the platforms to carry out age verification for all users.
This also paves the way for a second requirement in some states: platforms must get consent from a parent or guardian before allowing under-18s on their apps and, in some cases, give those parents access to their child’s accounts.
Parental consent issue perplexes
Given a lack of specificity in the drafting of the measures, the platforms have been left perplexed about how to gather parental consent, according to multiple people familiar with the matter. They are weighing whether this might be a simple check-box exercise or whether it will require companies to collect a copy of a birth certificate, for example.
Academics and advocacy groups have also raised questions around free speech and the privacy of the children the laws are designed to protect. And certain state rules might leave LGBT+ children whose families do not support them particularly vulnerable, Qureshi warned.
“What an active parent means is very different for each child or each young person,” he said.
The age verification mandate poses big challenges for the companies. Vetting for age, which typically involves requesting ID or estimating age through face-scanning technology, will result in underage users being removed from the platforms, in turn hitting advertising revenue.
If ID is the main method of verification, critics warn that not all minors have access to official identification. And age estimation remains an inexact science.
For instance, Arkansas, whose legislation comes into force in September, has ordered platforms to use third parties to verify ages, raising concerns about whether there are enough tools to manage the demand.
Yoti, a small British provider of age verification technology, is already used by Meta’s Instagram and Facebook Dating, Meta has said. TikTok is also weighing using the technology, according to two people familiar with the matter.
One of the biggest companies offering age verification technology is MindGeek, the owner of pornography sites Pornhub and RedTube, according to two tech policy staffers.
In the meantime, social media platforms, including Meta and Snap, have begun pushing the idea that age verification should be handled by the app stores where they are downloaded or at the device level — on an Apple iPhone, for example.
Meta said it had already developed more than 30 tools for teens and families, including parental supervision tools. “We’ll continue evaluating proposed legislation and working with policymakers on these important issues,” a spokesperson said.
Snap, which has also developed parental controls, said it was in discussions with industry peers, regulators and third parties about how to address the age verification challenge. TikTok said it believed “industry-wide collaboration” was needed to address the issue.
Still, some children’s advocacy groups argue the focus of the legislation is misplaced.
“The theme is putting it on parents and giving parents more rights ... It’s saying the platforms don’t need to change,” said Josh Golin, executive director of non-profit Fairplay.
“Really, what we think we should focus on is making platforms safer and less exploitative of kids.”