Citing a Mental Health Crisis Among Young People, California Lawmakers Target Social Media

A kid lying in bed at night looking at a phone.
(Moment/Getty Images)

Karla Garcia said her son's social media addiction started in fourth grade, when he got his own computer for virtual learning and logged on to YouTube. Now, two years later, the video-sharing site has replaced both schoolwork and the activities he used to love, like composing music or serenading his friends on the piano, she said.

"He just has to have his YouTube," said Garcia, 56, of West Los Angeles.

Alessandro Greco, now 11 and a soon-to-be sixth grader, watches videos even when he tells his mom that he is starting homework, making his bed, or practicing his instrument. When she confronts him, she said, he gets frustrated and says he hates himself because he feels like watching YouTube isn't a choice.

Alessandro tells her he just can't pull himself away, that he is addicted.

"It's vicious; they've taken away my parenting ability," Garcia said. "I can't beat this."

Some California lawmakers want to help Garcia and other parents protect their children's mental health by targeting website elements they say were designed to hook kids, such as personalized posts that grab and hold viewers on a specific page, frequent push notifications that pull users back to their devices, and autoplay functions that provide a continuous stream of video content.

A boy stands outside and smiles at the camera.
Alessandro Greco became addicted to YouTube when he was 9 and watches videos instead of doing homework, making his bed, or practicing piano, says his mother, Karla Garcia. The West Los Angeles boy, now 11, tells her that he can't pull himself away. "He can't stop," she says. "No matter what I try, it isn't going to get him to stop."

Two complementary bills in the state legislature would require websites, social media platforms, or online products that children use or could use to eliminate features that can addict them, harvest their personal information, and promote harmful content. Those that don't comply could face lawsuits and hefty fines. One of the measures would impose penalties of up to $7,500 per affected child in California, which could amount to millions of dollars.

Federal lawmakers are pursuing similar measures that would target features that foster addiction. One would require online platforms to provide tools to help parents track and control their children's internet use. The measures were approved by a U.S. Senate committee July 27.

"We have to protect kids and their developing brains," said California Assembly member Jordan Cunningham (R-San Luis Obispo), a lead author of both bills and a father of four children, at a committee hearing in June. "We need to end Big Tech's era of unfettered social experimentation on children."

But Big Tech remains a formidable foe, and privacy advocates say they are concerned one of the California measures could increase data intrusions for everyone. Both bills have cleared the state Assembly, but whether they will survive the state Senate is unclear.

Tech companies, which wield immense power, say they already prioritize users' mental health and are making efforts to strengthen age verification mechanisms. They are also rolling out parental controls and prohibiting messaging between minors and adults they don't know.

But these bills could violate companies' free speech rights and require changes to websites that can't realistically be engineered, said Dylan Hoffman, executive director of TechNet for California and the Southwest. TechNet, a trade association for tech companies including Meta (the parent company of Facebook and Instagram) and Snap Inc. (which owns Snapchat), opposes the measures.

"It's an oversimplified solution to a complex problem, and there isn't anything we can propose that will alleviate our concerns," Hoffman said about one of the bills, which specifically targets social media.


Last year, the U.S. surgeon general, Dr. Vivek Murthy, highlighted the nation's youth mental health crisis and pointed to social media use as a potential contributor. Murthy said social media use among teenagers had been linked to anxiety and depression even before the stress of covid-19. Then, during the pandemic, he said, teenagers' average amount of non-academic screen time leaped.

"What we're trying to do, really, is just keep our kids safe," Assembly member Buffy Wicks (D-Oakland), another lead author of the California bills and a mother of two children, said at the June committee hearing.

One of Cunningham and Wicks' bills would require all online services likely to be accessed by a child, which could include most websites, to minimize the collection and use of personal data for users younger than 18. This includes setting default privacy settings to the maximum level unless users prove they are 18 or older, and providing terms of service agreements in language a child can understand.

Modeled after a British law, the measure also says companies should "consider the best interests of children when designing, developing, and providing that service, product, or feature." That broad phrasing could allow prosecutors to target companies for features that are detrimental to children. This could include incessant notifications that demand children's attention or suggestion pages, based on a child's activity history, that could lead to harmful content. If the state attorney general determines a company has violated the law, it could face a fine of up to $7,500 per affected child in California.

The other California bill would allow prosecutors to sue social media companies that knowingly addict minors, which could result in fines of up to $250,000 per violation. The original version would also have allowed parents to sue social media companies, but lawmakers removed that provision in June in the face of opposition from Big Tech.

Together, the two California proposals attempt to impose some order on the largely unregulated landscape of the internet. If successful, they could improve kids' health and safety, said Dr. Jenny Radesky, an assistant professor of pediatrics at the University of Michigan Medical School and a member of the American Academy of Pediatrics, a group that supports the data protection bill.

"If we were going to a playground, you'd want a place that had been designed to let a child explore safely," Radesky said. Yet in the digital playground, "there's a lot less attention to how a child might play there."

Radesky said she has witnessed the effects of these addictive elements firsthand. One night, as her then-11-year-old son was getting ready for bed, he asked her what a serial killer was, she said. He told her he had learned the term online when videos about unsolved murder mysteries were automatically recommended to him after he watched Pokémon videos on YouTube.

Adam Leventhal, director of the University of Southern California Institute for Addiction Science, said YouTube recommendations, and other tools that mine users' online history to personalize their experiences, contribute to social media addiction by trying to keep people online as long as possible. Because developing brains favor exploration and pleasurable experiences over impulse control, kids are especially susceptible to many of social media's tricks, he said.

"What social media offers is a highly stimulating, very fast feedback," Leventhal said. "Any time that there is an activity where you can get a pleasurable effect and get it fast and get it when you want it, that increases the likelihood that an activity could be addictive."

Rachel Holland, a spokesperson for Meta, explained in a statement that the company has worked alongside parents and teens to prioritize kids' well-being and mitigate the potential negative effects of its platforms. She pointed to a variety of company initiatives: In December 2021, for example, it added supervision tools on Instagram that allow parents to view and limit kids' screen time. And in June, it started testing new age verification tactics on Instagram, including asking some users to upload a video selfie.

Snap spokesperson Pete Boogaard said in a statement that the company is protecting teens through steps that include banning public accounts for minors and turning location-sharing off by default.

Meta and Snap declined to say whether they support or oppose the California bills. YouTube and TikTok did not respond to multiple requests for comment.

Privacy groups are raising red flags about the measures.

Eric Null, director of the privacy and data project at the Center for Democracy and Technology, said the provision in the data protection bill that requires privacy agreements to be written in age-appropriate language would be nearly impossible to implement. "How do you write a privacy policy for a 7-year-old? It seems like a particularly difficult thing to do when the child can barely read," Null said.

And because the bill would limit the collection of children's personal information, but still require platforms that children may access to gather enough details to verify a user's age, it could increase data intrusions for all users, he said. "This is going to further incentivize all online companies to verify the age of all of their users, which is somewhat counterintuitive," Null said. "You're trying to protect privacy, but actually you're now requiring a lot more data collection about every user you have."

But Karla Garcia is desperate for action.

Thankfully, she said, her son doesn't watch violent videos. Alessandro prefers clips from "America's Got Talent" and "Britain's Got Talent" and videos of one-hit wonders. But the addiction is real, she said.

Garcia hopes legislators will curtail the tech companies' ability to continually send her son content he can't turn away from.

"If they can help, then help," Garcia said. "Put some sort of regulations on and stop the algorithm, stop hunting my child."

This story was produced by KHN, which publishes California Healthline, an editorially independent service of the California Health Care Foundation.

