Meta, the parent company of Facebook and Instagram, has announced plans to implement more stringent content controls for teenagers on both platforms.
The company made the disclosure in a recent blog post, saying it will place young users in the “most restrictive” content settings as part of its commitment to a safer online environment.
The social media giant said certain types of content will be hidden, and searches for specific terms will be restricted within the Search and Explore features.
Notably, content related to sensitive topics such as suicide, self-harm, and eating disorders will become less accessible and will not be recommended.
This initiative is part of Meta’s ongoing effort to make its platforms age-appropriate and safe for teenagers, guided by experts in adolescent development, psychology, and mental health.
In a statement, Meta emphasized its dedication to providing tools and resources to support teenagers and their parents.
The company has spent over a decade developing policies and technology to address rule-breaking content and sensitive issues.
“We want teens to have safe, age-appropriate experiences on our apps,” Meta wrote in the blog post.
“Now, when people search for terms related to suicide, self-harm, and eating disorders, we will start hiding these related results and will direct them to expert resources for help.”