Meta expands teen safety measures and tightens protections for child-focused accounts

Meta is rolling out new safeguards for Instagram aimed at better protecting teenagers and accounts that primarily feature children, following ongoing concerns about child exploitation and teen safety on social media platforms.

For teen users, new tools are coming to direct messages (DMs) that provide more context about who they’re interacting with. Teens will now see additional details, such as when an account was created, along with quick-access safety tips and a combined option to block and report an account in one step. The changes are intended to help young users spot potential scams or suspicious contacts more easily.

Meta’s internal data shows that teens are already using these tools. In June alone, teens blocked over 1 million accounts and reported another million after receiving safety prompts. The platform’s nudity filter, which blurs potentially explicit images in DMs and is on by default, also remains widely used: 99% of users, including teens, have kept it enabled. Last month, around 40% of blurred images stayed blurred, meaning recipients chose not to view them.

In addition to teen-focused updates, Meta is tightening rules for adult-managed accounts that mainly showcase children—such as family influencer pages or accounts run by parents or managers for child creators under 13. These accounts will automatically have the strictest messaging settings enabled to prevent unsolicited contact, and Instagram’s Hidden Words feature will be turned on to filter out offensive or inappropriate comments.

Meta says it will also limit the ability for potentially suspicious adults—like those blocked by teens—to discover or engage with these accounts. These changes build on earlier steps, like stopping child-focused accounts from receiving gifts or subscriptions. Accounts affected by these new measures will see notifications at the top of their Instagram feed prompting them to check their privacy settings.

Earlier this year, Meta’s teams removed roughly 135,000 Instagram accounts for leaving sexual comments or requesting explicit content from child-focused profiles. An additional half a million linked accounts across Instagram and Facebook were also taken down. Some of these removals were shared with other tech companies via industry coalitions to limit cross-platform abuse.

These updates come as regulators and child safety advocates continue to push for stronger protections for minors online. Instagram’s policy requires users to be at least 13, but adult-run pages for kids under 13 are allowed when clearly managed by an adult. If Meta finds a child is running the account themselves, it will remove it.

Meta says the new safeguards for teens and child-focused accounts will roll out in the coming months as the company tries to balance platform engagement with better safety measures for vulnerable users.

Written by Sophie Blake
