TikTok tightens beauty filter use for teens to address mental health concerns

TikTok is introducing significant changes to its platform aimed at safeguarding the mental well-being of its teenage users. In the coming weeks, the company will restrict users under 18 from accessing certain beauty filters that subtly alter appearances, making their effects hard to detect. Filters such as the popular Bold Glamour smooth skin, enhance facial features, and create a polished look, effects that have raised alarms among mental health advocates. To provide clarity, TikTok will also expand filter descriptions to specify what changes each one applies.

The decision was announced during TikTok’s safety forum at its European headquarters in Dublin and represents a broader effort by the platform to improve safety and well-being among its younger audience.

The restrictions will not impact filters meant for entertainment or humor, like those adding animal ears or exaggerated features. Dr. Nikki Soo, TikTok’s Safety and Well-being Public Policy Lead for Europe, confirmed that these measures will be rolled out globally. The decision comes on the heels of a report by Internet Matters, a children’s online safety non-profit, which highlighted how beautifying filters contribute to unrealistic beauty standards. According to the report, young users often fail to recognize altered images, leading to distorted perceptions of reality and heightened social pressures to conform to idealized appearances.

To further its commitment to user safety, TikTok will introduce additional support resources in 13 European countries, connecting users who report harmful content such as self-harm, harassment, or hate speech to local helplines. The platform, with over 175 million monthly active users in Europe, continues to prioritize user safety as a core value. Christine Grahn, TikTok’s European Public Policy Head, emphasized on LinkedIn that fostering a safe environment is key to ensuring users feel comfortable and authentic on the platform.

In another move to enhance safety, TikTok is testing machine-learning tools to better detect accounts created by users under 13, the platform’s minimum age requirement. The company revealed that it removes about six million accounts annually for failing to meet this age threshold. Users whose accounts are flagged incorrectly will have the opportunity to appeal the decision.

Written by Jordan Bevan
