68% of British Teens Fear Leaving Home Due to Violence Seen on TikTok

Written by Kathrine Frich

Nov. 29, 2024, 2:59 PM CET

Nearly 68% of teens who had seen violent content online reported going out less often


Social media has become a central part of life for young people, shaping how they interact with the world and each other.

While it offers opportunities for connection and learning, it also exposes children to harmful content that can deeply affect their mental health and sense of safety, according to Ziare.

A recent study in the United Kingdom sheds light on the disturbing impact of violent content on teenagers, revealing how it contributes to anxiety, fear, and changes in behavior.

Go Out Less Often

The research, conducted by the Youth Endowment Fund (YEF) and reported by The Guardian, surveyed over 10,000 British teenagers aged 13 to 17.

The findings are alarming. Nearly 68% of teens who had seen violent content online reported going out less often due to anxiety and fear triggered by what they had watched.

Additionally, 80% of those who had viewed videos involving weapons said they felt "less safe in their neighborhood."

The study also found that one in four teenagers had witnessed acts of violence on social media, such as stabbings, fights, gang clashes, and other brutal events.

Platforms like TikTok, X (formerly Twitter), Facebook, Snapchat, and Instagram were identified as common sources of such content.

This exposure has had a profound psychological impact on many teens: one in three reported seeing weapons online, even though only 5% said they personally carried one.

Jon Yates, the executive director of YEF, called for action from social media companies, particularly TikTok and X, stating:

“Anyone working for TikTok or X should feel guilty reading this. You should feel guilty… and then turn that guilt into action to do something about it.”

Social media companies have defended their policies.

Meta, which operates Facebook and Instagram, highlighted its efforts to remove content that incites or facilitates violence.

Between April and June, Meta said it had taken action on nearly 15 million posts featuring violent or graphic material, with 99% flagged before users reported them.