Safe Social Media

Protect Children on Social Media

Restrict minors' use of social media: prohibit minors from uploading images for public display, and prohibit platforms from displaying inappropriate content to minors and from using algorithms to serve them recommended content or targeted advertising.

Many of social media’s core features, from public display of private information to algorithmic targeting to easily available harmful content, are especially dangerous for children. Widespread and public image-sharing enables the social comparison pressures that drive so many of social media’s mental-health harms, while creating a permanent and public record that exposes children to predators and other risks. Algorithmic recommendations designed to maximize user engagement and platform revenue rely on aggressive data collection, invasions of privacy that minors rarely realize they are permitting, and manipulation of their attention. Platforms display harmful and inappropriate content to children, from pornography and depictions of abuse to glorification of illicit-drug use, eating disorders, and self-harm. Even when children are not looking for such content, algorithms regularly deliver it to them.

The United States should regulate children’s access to social media, taking the same steps to provide a safe environment online that we take for granted as necessary in the physical world. The most straightforward way to address social media’s dangerous features is simply to prohibit and penalize them. Social media platforms should be responsible for ensuring that minors’ accounts are restricted to “private” settings that allow access to the user’s content, especially images, only to approved family and friends. The same indecency restrictions that apply on the airwaves should apply to content that children access online. Policymakers need not dictate how to prevent the display of harmful content to children (though measures restricting algorithmic operations would be useful). The law simply needs to establish clear and economically meaningful consequences for failure. Silicon Valley prides itself on rapid innovation; with sufficiently stiff fines at stake for every failure, that innovation will be channeled quickly toward protecting kids. Finally, platforms should be prohibited from targeting advertising at children, which will help to protect privacy and reduce the incentive to attract and addict children to the platforms in the first place.