A bipartisan group of state attorneys general filed a joint lawsuit against Meta, alleging that Facebook’s parent company knowingly used addictive features in its apps, harming children’s health. Lawmakers who are often at odds united to demand answers from Meta’s leadership about its impact on minors.
“Look, we have a tremendous amount of evidence and information that’s been developed that shows that Meta knowingly has designed its products in a way to maximize its ad revenue by addicting young teenagers onto its products,” District of Columbia Attorney General Brian Schwalb said.
“They hid from this committee and from all of Congress evidence of the harms that they knew was credible,” Connecticut Sen. Richard Blumenthal said.
“They are deliberately misleading parents about what’s on their platform,” Missouri Sen. Josh Hawley said. “They are deliberately misleading parents about the safety of their children online.”
Now, Meta has announced plans to expand safety measures for children and teens on its social media platforms. The goal is to make it harder for young users to come across sensitive content.
The company will implement restrictive settings on the accounts of teens and children, preventing users from searching “sensitive topics” and prompting teens to update their privacy settings.
In a blog post, Meta said Facebook and Instagram will hide search results for content related to suicide, self-harm, eating disorders, and nudity. Teens can still make posts on these subjects but won’t see them in their feed or stories, even if shared by someone they follow.
Meta aims to automatically place all teens under the most restrictive content control setting. These changes follow a whistleblower’s testimony to a Senate panel in November, stating that Meta knew harmful content was present on its platforms and company executives were taking no action.
“As a parent, I took the work personally,” Arturo Bejar, a former Meta employee, said. “By the time I left in 2015, I thought the work was going in the right direction. A few years later, my 14-year-old daughter joined Instagram. She and her friends began having awful experiences, including repeated unwanted sexual advances and harassment. She reported the incidents to the company, and it did nothing.”
Meta says the new update should be complete within a couple of weeks, just in time for CEO Mark Zuckerberg’s child safety testimony on Capitol Hill.