One-Third of Teenagers Have Witnessed Real-Life Violence on TikTok – Research

Approximately one-third of teenagers aged 13 to 17 encountered real-life violence on the popular social media platform TikTok over the past year, according to research conducted by the Youth Endowment Fund, a charity backed by the Home Office.

The survey, which polled 7,500 teenagers, indicated that 25% had witnessed similar content on Snapchat, 20% on YouTube, and 19% on Instagram. The most common type of violent material across all platforms was footage of fights, seen by 48% of respondents.

Threats of physical harm were witnessed by 36%, while 29% viewed content involving the carrying, promotion, or use of weapons. A further 26% of teenagers saw posts showing or encouraging harm to women and girls.

Among the respondents, 27% said the platform they were using had suggested the violent material, while only 9% stated they had deliberately sought it out. Half of the teenagers reported seeing it on someone else’s feed, and a third said it had been shared with them.

Jon Yates, executive director at the Youth Endowment Fund, emphasized the need for social media companies to address the issue, stating that it is unacceptable to expose children to violent content. He argued that such content can fuel tension and contribute to unhealthy attitudes, particularly towards girls and women.

In response to the findings, a TikTok spokesperson asserted that the platform removes or age-restricts violent or graphic content, often before it receives any views. The spokesperson also pointed to tools that let parents customize content and safety settings for their teens’ accounts.

A Snapchat spokesperson highlighted their commitment to removing violent content immediately and creating an environment that limits the potential for harmful content to go viral. 

YouTube, in turn, emphasized its strict policies against violent content, stating that more than 946,000 videos were removed in the second quarter of 2023. The research underscores the ongoing challenges social media platforms face in moderating content and protecting young users from exposure to harmful material.

In a recent blog post, Antigone Davis, Meta’s Global Head of Safety, underscored the importance of parental approval for teens’ app downloads. Meta has expressed strong support for potential federal legislation that would require parental consent for users under the age of 16 to download apps.

Back in March, TikTok announced a 60-minute daily screen time limit for users under the age of 18. Like other social media companies, including Meta and Snap, it has also introduced several limitations for teens, such as restrictions on having a public account, receiving direct messages, and commenting under videos.

Written by Maya Robertson
