Meta introduces Instagram Teen Accounts amid rising privacy concerns

On Tuesday, Meta announced specialized Teen Accounts on Instagram, designed to strengthen privacy protections and parental controls. The move is part of Meta’s broader strategy to limit teenagers’ exposure to harmful content and to respond to growing regulatory and public scrutiny.

The new Teen Accounts come with default privacy settings applied automatically to Instagram users under 18. Most notably, these accounts default to private, meaning only people a teen follows or is already connected to can message or tag them. Sensitive content filters are also set to their strictest levels to shield teens from inappropriate or harmful material.

To further empower parents, Meta is rolling out a suite of parental control options. These tools will enable parents to monitor their children’s activity on the platform, restrict who can interact with them, and even limit how much time they spend on the app. For teens under 16, changing any of the default settings will also require parental consent, giving families an extra layer of control.

The announcement comes at a crucial time for Meta. The company, alongside other major social media platforms such as TikTok and YouTube, faces mounting lawsuits. Many of these cases, filed on behalf of children and school districts, argue that the addictive design of these platforms exacerbates mental health issues, including depression and anxiety, particularly among young users. Several studies have echoed these concerns, linking heavy social media use to elevated stress and cognitive difficulties in teenagers.

The regulatory landscape is also shifting. Last year, 33 U.S. states sued Meta, accusing the tech giant of downplaying the potential dangers of its platforms, and lawmakers continue to push for stricter rules. In July, the U.S. Senate passed two key pieces of legislation, the Kids Online Safety Act and the Children and Teens’ Online Privacy Protection Act, which aim to hold social media companies accountable for the well-being of their youngest users.

This latest rollout follows Meta’s decision three years ago to shelve plans for Instagram Kids, a dedicated version of the app for children under 13. That project faced fierce opposition from lawmakers and advocacy groups who raised concerns about child safety and privacy. The company’s new approach appears more responsive to those concerns, focusing on strengthening protections within the main Instagram platform itself.

Meta plans to roll out the new Teen Accounts over the next 60 days in key markets, including the U.S., UK, Canada, and Australia. A wider rollout covering the European Union is scheduled for later in the year, with global availability expected in January 2025.

“We understand parents’ concerns, and that’s why we’re reimagining our apps for teens with new Teen Accounts. This new experience is designed to better support parents, and give them peace of mind that their teens are safe with the right protections in place. Teens will also get access to a new feature, made just for them, that lets them select topics they want to see more of in Explore and their recommendations so they can focus on the fun, positive content they love,” Meta said in a blog post announcing the new feature.

Written by Maya Robertson
