Social media platforms know the harm they do to children. Kids spend huge portions of their days in front of screens. About 40 percent of kids between the ages of 9 and 12 use Instagram daily, despite current law ostensibly restricting social media use for people younger than 13. Researchers, including Big Tech’s own internal researchers, continue to confirm what American parents already know firsthand: social media use is a key driver of the current mental health crisis among children and teenagers, including growing rates of suicidal ideation and self-harm. Over 80 percent of parents support strong federal measures to protect children on social media. No wonder 41 states and Washington, D.C. just sued Meta (formerly Facebook) for its “scheme to exploit young users for profit.”
The problem, of course, is that reducing this harm might also reduce Big Tech’s revenues. Addressing parents’ concerns requires policy intervention, which is why Senators Marsha Blackburn (R-Tenn.) and Richard Blumenthal (D-Conn.) introduced the Kids Online Safety Act (KOSA). KOSA gives parents and children new tools to mitigate or avoid some of social media’s harmful features. It also creates a duty of care that legally obligates platforms to act in the best interests of the minors using them. To meet this obligation, platforms will need to take reasonable measures against the most alarming harms caused by their products, including sexual exploitation; promotion of narcotics and alcohol; encouragement of serious mental health disorders, including eating disorders, suicidal behavior, anxiety, and depression, as defined by the best available medical evidence; and deliberate aggravation of addiction to social media itself.