Opinion | How Changing One Law Could Protect Kids From Social Media
Parenting has always involved anxiety and guilt, but parents in the age of social media increasingly confront a distinct kind of helplessness. Their children become the unwitting subjects of a vast experiment in human social formation, building habits and relationships in an environment designed above all to maximize engagement in the service of advertisers.
It’s not that social media has no redeeming value, but on the whole, it is no place for kids. If Instagram and TikTok were physical places in your neighborhood, you would probably never let your teenagers go there alone. Parents should have a similar say over their children’s presence in these virtual spaces.
We may have the vague impression that this is impossible, but it isn’t. There is a reasonable, legitimate and effective tool our society could use to empower parents to confront the dangers of social media: We should raise the age requirement for social media use and give it real teeth.
It may surprise most Americans to learn that there is an age requirement at all. But the Children’s Online Privacy Protection Act, enacted in 1998, prohibits American companies from collecting the personal information of children under 13 without parental consent, or from collecting more personal information than they need to operate a service aimed at children under 13. In practice, this means children under 13 cannot have social media accounts, since the platforms’ business models all depend on collecting personal data. Technically, then, the major social media companies require users to be at least 13 years old.
But that rule is widely ignored. Nearly 40 percent of American children between the ages of 8 and 12 use social media, according to a recent survey by Common Sense Media. The platforms generally let users simply claim that they are old enough, and they have no incentive to make lying difficult. On the contrary: As an internal 2020 Facebook document leaked to The Wall Street Journal makes clear, the social media giant is particularly eager to attract “tweens,” whom it sees as “a valuable but untapped audience.”
Quantifying the dangers involved is a challenge for researchers, and there are certainly those who argue that the risks are exaggerated. But there is evidence that exposure to social media causes serious harm to younger and older children alike. The platform companies’ own research suggests as much. Internal documents from Facebook — now known as Meta — regarding teenagers’ use of its Instagram platform reveal genuine alarm. “We make body image issues worse for one in three teen girls,” researchers noted in one leaked slide. The documents also point to a potential link between regular social media use and depression, self-harm and even suicide.
TikTok, which is also immensely popular with teens and young adults, along with other social media platforms, has likewise been linked to body image problems, muscle dysmorphia, a Tourette’s-like syndrome, sexual exploitation and deadly stunts. More familiar problems like bullying, harassment and conspiracy theories are also often amplified and exacerbated by the platforms’ mediation of children’s social lives.
Social media offers benefits to young people, too. They can find connection and support, explore interests and exercise their curiosity. In responding to reporting on its own research, Facebook noted that it had found that by some measures Instagram “helps many struggling teens with some of the hardest issues they face.”
Restricting access to the platforms would therefore come at a real cost. But, as Jonathan Haidt of New York University has put it, “The preponderance of the evidence now available is disturbing enough to warrant action.” Some teenage social media users see the problem, too. As one of the leaked Meta slides noted, “Young people are acutely aware that Instagram can be bad for their mental health yet are compelled to spend time on the app for fear of missing out on cultural and social trends.”
That balance of pressures needs to change. And as the journalist and historian Christine Rosen has noted, preaching “media savvy” and limiting screen time are not enough.
Policymakers can help. By raising the minimum age in the Children’s Online Privacy Protection Act from 13 to 18 (while letting parents grant a verified exemption for their children, as the law already permits), and by requiring effective age verification and imposing meaningful penalties on the platforms, Congress could give parents a powerful tool to push back against the pressure to be on social media.
Reliable age verification is possible. As the policy analyst Chris Griswold has proposed, the Social Security Administration (the agency that knows exactly how old you are) “could provide a service through which an American could type his or her Social Security number into a secure government website and receive a temporary, anonymized code via email or text,” much like the two-factor authentication methods commonly used by banks and retailers. With that code, a platform could confirm your age without obtaining any other personal information about you.
Some teenagers would try to cheat, of course, and enforcement of the age requirement would be imperfect. But the draw of the platforms is a function of network effects: Everyone wants to be on them because everyone else is. The age requirement would only have to be broadly effective to be transformative, because it is far easier to stay off the platforms when most of your peers are off them too.
Real age verification would also help to more effectively limit access to online pornography, a vast, dehumanizing scourge that our society inexplicably pretends it can do nothing about. Here, concerns about freedom of expression, whatever their merits, surely do not apply to children.
It may seem strange to address the challenge of children’s social media use through online privacy protections, but that route actually offers some distinct advantages. The Children’s Online Privacy Protection Act already exists as a legal mechanism. And its framework would allow parents to opt in on behalf of their children if they wished. It might be a cumbersome process, but parents who feel strongly that their children should be on social media could permit it.
This approach would also address a core problem with the social media platforms. Their business model — in which users’ personal information and attention are the essence of the product the companies sell to advertisers — is central to why the platforms are designed in ways that encourage addiction, aggression, bullying, conspiracism and other antisocial behaviors. If the companies wanted to create child-friendly versions of social media, they would need to design platforms that do not monetize user data and engagement in that way, and that are therefore untethered from those incentives, and then let parents see what they think.
Parental empowerment is ultimately the key to this approach. It was a mistake to let children and teenagers onto these platforms in the first place. But we are not powerless to correct that mistake.
Yuval Levin, a contributing Opinion writer, is the editor of National Affairs and the director of social, cultural and constitutional studies at the American Enterprise Institute. He is the author of “A Time to Build: From Family and Community to Congress and the Campus, How Recommitting to Our Institutions Can Revive the American Dream.”