SALT LAKE CITY — Tweens and teens spend as much as nine hours a day scrolling through social media, gaming, online shopping, video chatting and texting on their cellphones.
And an increasing amount of evidence suggests all that screen time is taking a toll on their mental health.
"The statistics are clear we've got a generation of young people that are the most distressed, anxious, depressed and tragically suicidal than any generation in our history," said Rep. Chris Stewart, who was recently named co-chairman of the bipartisan Mental Health Caucus in Congress.
The rise in anxiety and depression, he says, can be almost directly correlated to when Facebook bought Instagram in 2012 and began marketing initially to girls and then boys as young as 9. The Chinese app TikTok, he said, was designed as "emotional heroin" for young people.
"We just think we've got to do something," he said.
Stewart, a Republican, believes he has a solution to the mental health crisis among adolescents: Make it illegal for social media platforms to provide access to children under 16. He intends to introduce legislation that would make social media companies responsible for age verification of their users.
The law wouldn't displace parents' decisions about their children's social media use, but it would help them shield their children from something harmful, he said.
"The government is involved with regulating when my children can drink, when they can smoke, when they can drive," Stewart said. "We think society has a responsibility to protect young people and government should help in protecting them."
Since 2000, the federal Children's Online Privacy Protection Act has required websites and online services to get parental consent before collecting data from children under 13. But the law is rarely enforced. Stewart's bill would essentially raise that age to 16.

Stewart said he expects social media companies will "hate this," but that he's willing to take their arrows "if we can do some good here."
"They know if they can get someone addicted to social media at 9, they've got them for the rest of their lives," he said.
Meta, which owns Facebook, Instagram and WhatsApp, didn't have a position on Stewart's yet-to-be-filed legislation Monday, but pointed to steps it has taken to protect young people, including age-appropriate default settings, tools to encourage teens to spend time away from Instagram, and continuing to bring age verification to the platform.
"We have the same goals as policymakers," according to Meta. "We have long advocated for clear industry standards in areas like age verification, and developing experiences that are age-appropriate."
NetChoice, a tech industry group that includes Meta, Google, TikTok and Twitter, says education for both parents and children is the answer, not the "heavy-handed" government regulation Stewart is proposing.
Such laws are not only unenforceable but violate the First Amendment, said Carl Szabo, NetChoice vice president and general counsel.
Also, he said, there's a reason Congress set the age at 13 in the federal law. There's an emotional and social difference between a 13-year-old and a 15-year-old, who typically can drive a car, attends high school and is becoming less dependent on parents.
"This is well-intentioned. I think parenting in the 21st century is incredibly challenging," Szabo said of Stewart's proposal. "Now is there something that could be done? One-hundred percent."
Szabo pointed to Florida and Indiana lawmakers considering legislation to require social media education in schools. The materials, he said, would be presented not only to kids but to their parents.
"Let's see how that goes first," he said.
The better approach, Szabo said, is not to try to replace parents, as California has done with its Age-Appropriate Design Code Act.
Modeled off standards in the United Kingdom, the California law requires the highest privacy settings to be turned on by default for minors. It also says that online services targeting kids under 18 must assess the risk of harm to those users that could come from potentially harmful messages or exploitation. It's set to take effect in July 2024.
"California has stepped in between parents and their teenagers," Szabo said.
NetChoice sued California over the law, arguing it violates the First Amendment. "There's a First Amendment right for teenagers. There's a First Amendment right for the internet," he said.
Stewart said his legislation has Democratic co-sponsors and his initial talks with the White House have been encouraging.
In an op-ed in the Wall Street Journal last week about Big Tech "abuses," President Joe Biden said Democrats and Republicans can find common ground on protection of privacy, competition and children.
"Millions of young people are struggling with bullying, violence, trauma and mental health. We must hold social-media companies accountable for the experiment they are running on our children for profit," the president wrote.

A Pew Research Center survey found 95% of 13- to 17-year-olds have access to a smartphone.
Between 2009 and 2017, the number of eighth graders using social media every day rose from 46% to 78%, and the time high school students spent online doubled. Common Sense Media estimates that children ages 8 to 12 spent five and a half hours a day on screens in 2021, and teens ages 13 to 18 spent nearly nine hours a day, according to research compiled by the Institute for Family Studies and the Wheatley Institute at Brigham Young University.
A study by the two institutes found that teens who devoted more than eight hours a day to screen time were about twice as likely to be depressed as peers who spent less time on screens.
Anxiety, depression and teen suicide have surged since the mass adoption of smartphones around 2010, especially among girls, according to University of Virginia sociologist Brad Wilcox, a fellow of the Institute for Family Studies and the American Enterprise Institute, and Riley Peterson, an undergraduate in religion and sociology at Baylor University.
Depression among teen girls more than doubled, from 12% in 2010 to 26% today. Emergency room visits by teen girls for self-inflicted injuries nearly doubled over the same period. And suicide among teen girls has risen to a 40-year high, Wilcox and Peterson wrote in a recent Deseret News piece.
"We can't just turn away from it. We can't just ignore it. We can't just pat them on the back and say 'hey, you'll feel better' and ignore it," Stewart said.
Stewart's bill would give states the authority to file a civil action on behalf of their residents if a social media platform violates the regulations. It would also give parents the right to sue on behalf of their children, and it would allow the Federal Trade Commission to impose fines for violations.
Seattle public schools recently sued the companies behind Instagram, Facebook, Snapchat, TikTok and YouTube, claiming the platforms are largely responsible for a major decline in young people's mental health.
Szabo said there's a simple reason that the social-emotional state of not only teenagers but all Americans is at an all-time low. "It has to do with being locked down in our homes for two years," he said, referring to the COVID-19 pandemic. "That seems to be hand-waved away."
The only lifeline kids had was through technology, he said.
"It seems silly to lay the blame at the feet of technology even though it seems to be an easy answer," Szabo said. "Society goes through this every time we have a new technology."
