Meta, the parent company of Facebook and Instagram, has announced that it will restrict the type of content teenagers can see on its platforms as part of an effort to make the social networks safer for young users. The company consulted experts in adolescent development, psychology, and mental health to design the new content limitations, which aim to provide a more age-appropriate experience for teenagers. The controls are intended to make it harder for teens to view and search for sensitive content, including material about suicide, self-harm, and eating disorders, and to direct them to support resources instead.
The new restrictions apply to users under the age of 18 and will be rolled out on both Facebook and Instagram over the coming months. The goal is to prevent teenagers from encountering harmful content in their feeds and Stories, and the company is also placing teens into the most restrictive content-control settings on both platforms. The changes come amid growing pressure from regulators and lawmakers in the United States and Europe, who have accused Meta of building addictive features into its platforms that harm the mental health and well-being of young users.
The company faces a lawsuit filed by the attorneys general of 33 states, which accuses Meta of promoting addictive and harmful features on its apps and of unlawfully collecting personal data from underage users without parental permission. Meta has denied these claims, saying it has introduced more than 30 tools to support teenagers and their parents and that it is working to ensure teens have safe, age-appropriate experiences across its apps. According to a Pew Research Center survey, 59 percent of U.S. teens report using Instagram regularly, while only 33 percent use Facebook, a figure that has declined from previous years.