
Surgeon General: For Our Kids’ Safety, Social Media Platforms Need a Health Warning

One of the most important lessons I learned in medical school was that in an emergency, you don’t have the luxury of waiting for perfect information. You assess the available facts, you use your best judgment, and you act quickly.

The mental health crisis among young people is an emergency — and social media has emerged as an important contributor. Adolescents who spend more than three hours a day on social media face double the risk of anxiety and depression symptoms, and the average daily use in this age group, as of the summer of 2023, was 4.8 hours. Additionally, nearly half of adolescents say social media makes them feel worse about their bodies.

It is time to require a surgeon general’s warning label on social media platforms, stating that social media is associated with significant mental health harms for adolescents. A surgeon general’s warning label, which requires congressional action, would regularly remind parents and adolescents that social media has not been proved safe. Evidence from tobacco studies shows that warning labels can increase awareness and change behavior. When asked if a warning from the surgeon general would prompt them to limit or monitor their children’s social media use, 76 percent of people in one recent survey of Latino parents said yes.

To be clear, a warning label would not, on its own, make social media safe for young people. The advisory I issued a year ago about social media and young people’s mental health included specific recommendations for policymakers, platforms and the public to make social media safer for kids. Such measures, which already have strong bipartisan support, remain the priority.

Legislation from Congress should shield young people from online harassment, abuse and exploitation and from exposure to extreme violence and sexual content that too often appears in algorithm-driven feeds. The measures should prevent platforms from collecting sensitive data from children and should restrict the use of features like push notifications, autoplay and infinite scroll, which prey on developing brains and contribute to excessive use.

Additionally, companies must be required to share all of their data on health effects with independent scientists and the public — currently they do not — and allow independent safety audits. While the platforms claim they are making their products safer, Americans need more than words. We need proof.
