Earlier this month, the Senate questioned Meta CEO Mark Zuckerberg, Snapchat CEO Evan Spiegel, TikTok CEO Shou Zi Chew and many other social media executives on their commitment to children’s safety.
The hearing sent the message that the U.S. government is concerned with children’s safety on the internet and is holding social media companies accountable for it. That accountability, however, may be hard to enforce.
Regulating social media companies is one of the most difficult tasks for state or federal lawmakers. If lawmakers actually want to protect children, they need to shift the blame to the companies that create the phones or the parents who buy them.
Over the last three years, Arkansas, California, Ohio and multiple other states have begun to pass legislation aiming to secure the internet for children.
Lawmakers in these states have met resistance to all of these laws from NetChoice, the main lobbying group for high-profile tech companies.
NetChoice represents more than 30 tech companies, including Meta, TikTok, X and Snapchat, and its mission is “to make the internet safe for free enterprise and free expression.”
To that end, it has decided to challenge any law aimed at limiting free expression on the internet, and it believes the newest children’s safety laws are exactly that.
Take Arkansas for example. Last April, the state government passed the Social Media Safety Act (SB396), which requires parental consent before a child creates a social media account. It also requires companies to correctly verify the age of their account holders through government identification.
We often give up some liberties for safety, and if it is to protect children, it might be worth it, but NetChoice disagrees. They challenged the law less than three months after it was passed on the grounds that it “undermines the First Amendment by requiring that Arkansans hand over sensitive and private information to be able to use digital communication services.”
Laws in Utah, California and Ohio have met similar challenges from NetChoice, which raises the question: are states taking on a battle they can’t win?
Diana Jordan, a lecturer on communication law and policy at the University of Miami, believes they might be.
“These laws can be seen as a violation of First Amendment rights because by not letting them (children) use social media, or telling them how to use it, you’re effectively limiting their First Amendment right to free speech,” said Jordan.
Children’s right to free speech was established in the Supreme Court case Tinker v. Des Moines, and this right, as Jordan mentioned, is exactly what NetChoice and other lobbying groups claim the government is limiting.
NetChoice also benefits from the fact that legal precedent is on its side.
“States can always give more protections to its citizens but they can never take away,” said Jordan. “States are claiming they are protecting the children, but it can easily be flipped to show the government overstepping.”
Jordan makes it clear that in any of these lawsuits, states will be fighting an uphill battle, especially when trying to take away citizens’ liberties.
There are obviously other players present in children’s navigation of the internet, most importantly their parents and the companies that sell them their phones, but lawmakers have placed the blame on social media companies. In doing so, they have neglected to hold accountable those who enable children to use the internet in the first place.
Parents are the easiest party to blame for a child being on the internet. They provide children with a phone, internet or telephone service, and permission to use it however they want. All parents have the option to install parental controls and monitor their children’s screen time, but many don’t.
According to Pew Research, only 40% of parents use parental controls to monitor and block their teens’ online activities, and only 39% frequently talk to their children about what is appropriate to view online. While these numbers may seem decent, parents can do much more to monitor their children.
Prominent phone companies like Apple and Google also have a role to play in protecting children. Since their products host the social media apps under fire, they are well positioned to regulate them. One option, suggested by the Wall Street Journal Editorial Board, is for app stores to require parents to approve their children’s downloads.
Another solution could be for companies like Apple and Google to confirm a user’s identity before allowing them to download apps with age requirements. Most social media apps require users to be over the age of 13 but rarely verify it.
Internet regulation for children is clearly not off the table, but the current route lawmakers are taking is a dead end. Regulating the companies that control children’s phones, or working to teach parents about internet safety, may be less sweeping, but these are far more viable ways to protect children on the internet.