Attorneys general from 44 U.S. states and territories have urged Snap and TikTok to strengthen parental controls on their platforms, telling the social media giants that they must do more to protect kids online.
In a letter sent to Snap and TikTok lawyers this week, the bipartisan group of AGs said the social media platforms don’t “effectively collaborate” with third-party parental control apps that can monitor online content, and argued that the companies have gaps in their content moderation policies, particularly with direct messaging and sexual content.
“[We] write to express our concern that your companies are not taking appropriate steps to allow parents to protect their kids on your platforms,” the letter said. “We ask that you conform to widespread industry practice by giving parents increased ability to protect their vulnerable children.”
Social media companies such as Culver City-based TikTok and Santa Monica-based Snap have faced mounting pressure to address concerns that their social media apps are harmful to children and adolescents. California lawmakers are considering multiple bills to set guardrails for kids’ online lives, including one that would allow parents to sue social media giants for addicting children to their apps. Federal lawmakers in Washington have grilled tech executives about child safety, while state attorneys general are investigating social media giants over how their design, operations and promotional features could be bad for young users.
In a statement, a Snap spokesperson said the company is currently developing tools that will provide parents with more insight into how their children are engaging on Snapchat and allow them to report troubling content.
“We absolutely understand the concerns of parents who want more insight into what their teens are doing on Snapchat and, most importantly, who they’re talking to,” the Snap spokesperson said. “We look forward to providing these tools in the coming months.”
A spokesperson for TikTok noted to dot.LA that the video-sharing app deploys protective features that allow parents to manage their kids’ screen time and has announced steps to combat harmful content on its platform.
“We build youth well-being into our policies, limit features by age, and empower parents with tools and resources to customize their TikTok experience to their unique needs and circumstances,” the TikTok spokesperson said in a statement. “We appreciate that the state attorneys general are focusing on the safety of younger users, and we look forward to engaging with them on our existing features like Family Pairing and ideas for innovation in this area.”
In their letter to Snap and TikTok, the AGs argued that the companies’ current community guidelines do not generally apply to private messaging. In addition, they said some of the companies’ internal parental control settings can be changed or bypassed to still allow inappropriate content. They cited videos showing sexual content and drug use on Snap’s Discover page and TikTok’s For You feed.
The attorneys general also encouraged the companies to collaborate with third-party parental control apps to monitor the content on their respective platforms. They did not name any specific parental control apps, but argued that they could alert parents and schools to messages and posts that could be harmful.
“On other platforms where these apps are allowed to operate appropriately, parents have received notifications of millions of instances of severe bullying and hundreds of thousands of self-harm situations, showing that these apps have the potential to save lives and prevent harm to our youth,” the letter said.
Children’s safety is among a growing list of social media concerns that also includes privacy and misinformation. The attorneys general’s letter seemed to acknowledge as much.
“This letter is not intended to address all concerns with your social media platforms,” it said.
Update, April 4: This story has been updated to include a statement from TikTok.