Federal lawmakers have introduced bipartisan legislation, the Kids Online Safety Act, aimed at making social media use safer for children. The bill follows months of Congressional grilling of big tech companies, including Meta, about their safety measures for younger users.
The push for this legislation began after a former Facebook employee turned whistleblower released internal documents showing how Facebook and Instagram can be toxic for teen girls, a finding the company disputes. The documents shed light on how the platforms' algorithms promote posts that provoke extreme reactions and serve unsafe advertising to children. If passed into law, the act would require platforms to offer parental controls and give users the option to opt out of algorithm-driven recommendations.
The act would also require social media platforms to provide minors with options to protect their privacy. Currently, the platforms collect information, use it to target kids and, according to Sen. Richard Blumenthal (D-CT), “drive destructive content to them.”
“Not only do they know that their destructive content is driving kids down these rabbit holes to eating disorders and self-harm, even suicide, but they are profiting from it,” he said.
Sen. Marsha Blackburn (R-TN) said social media platforms need to be forced to make changes.
“We have realized even as we’ve asked over the past many years for social media to take some responsibility, that they did not step up and do that,” she said.
And in what may be the bill's most significant safety measure, platforms would be required to take steps to prevent harms such as eating disorders, substance abuse, and the risk of suicide.
To make sure these safeguards stay in place, the bill would require social media platforms to undergo annual safety assessments.