In response to increasing regulatory pressure to safeguard teenagers from harmful content, Meta announced on Tuesday that it would implement stricter content control settings for all teens on Instagram and Facebook.
The measures include restricting additional search terms on Instagram, making it harder for teenagers to find sensitive content related to suicide, self-harm, and eating disorders in features such as Search and Explore.
Meta is automatically enrolling teenagers in the most restrictive content control settings on both Instagram and Facebook. Initially applied only to new teen users when they joined the platforms, the setting is now being extended to teens with existing accounts.
Additionally, to encourage regular check-ins on safety and privacy settings, Instagram is introducing new notifications targeted at teenagers. These notifications prompt teens to consider adopting more private settings for a safer online experience. By choosing “Turn on recommended settings,” teens can update their preferences with a single tap. This automatically adjusts their settings to restrict reposting, tagging, mentions, and inclusion in Reels Remixes. The update also ensures that only their followers can send them messages and that offensive comments are better hidden.
Expected to roll out over the next few weeks, these changes aim to create a more “age-appropriate” experience.
“Take the example of someone posting about their ongoing struggle with thoughts of self-harm. This is an important story, and can help destigmatize these issues, but it’s a complex topic and isn’t necessarily suitable for all young people. Now, we’ll start to remove this type of content from teens’ experiences on Instagram and Facebook, as well as other types of age-inappropriate content. We already aim not to recommend this type of content to teens in places like Reels and Explore, and with these changes, we’ll no longer show it to teens in Feed and Stories, even if it’s shared by someone they follow,” Meta said in a blog post.
Meta faces intensified scrutiny in the United States and Europe amid allegations that its apps are addictive and contribute to a youth mental health crisis.
In October, the attorneys general of 33 U.S. states, including California and New York, filed a lawsuit accusing Meta of misleading the public about the dangers of its platforms. The European Commission has also sought information on how Meta protects children from illegal and harmful content.
The company’s recent changes, however, drew criticism from a former employee who testified before the U.S. Senate, saying that Meta’s efforts fell short of addressing the concerns and that the platforms still lack effective tools for teens to report harassment.
As Meta contends with these regulatory challenges, it continues to compete with TikTok for younger users, raising the stakes for how social media platforms that target teenagers handle safety.