
Teenagers exposed to 'horrific' content online - and this survey reveals the scale of the problem
27/03/25, 12:15
Over half (55%) of the Year 10 students surveyed had seen sexually explicit or violent content that was inappropriate for their age, with some saying it appeared unprompted and "pops up randomly".
Teenagers are routinely seeing inappropriate violent or sexual content, "doom-scrolling" and being contacted by strangers online, according to an exclusive survey.
More than 1,000 young people aged 14 to 17 at schools in Darlington told us what they see and experience on apps commonly used by teenagers.
Their answers raise troubling questions about whether the government and tech companies are doing enough to protect children online, amid a growing debate among parents and campaigners about how far to restrict children's access to smartphones and social media.
Of those surveyed, 40% spent at least six hours a day online - the equivalent of a school day. One in five said they spent upwards of eight hours a day on their phones.
Some of the findings in the under-16 group were striking, including that 75% had been contacted by strangers through social media and online gaming.
Over half (55%) of the Year 10 students, aged 14 to 15, had seen sexually explicit or violent content that was inappropriate for their age.
Concerningly, half of them (50%) said this content always or usually came up on social media apps without them searching for it - suggesting it is being driven by algorithms.
Doom-scrolling is the act of spending an excessive amount of time online consuming negative news or social media content, often without stopping.
The survey is a snapshot of teenagers in one UK town, but its findings resonate more widely.
The teenagers said they wanted their voices to be heard in the debate about online safety. While they did not favour a social media or smartphone ban, many wanted tougher controls on the content they see.
When asked whether social media companies should do more to protect under-16s from seeing explicit or harmful content, 50% were in favour and 14% against.
'It's quite horrific'
We were invited to film a focus group of under-16s from different schools discussing the results at St Aidan's Academy in Darlington, hosted by Labour MP Lola McEvoy, whose office carried out the research.
Jacob, who is 15, said among the things he had seen on social media were "gore, animal abuse, car crashes, everything related to death, torture".
He said: "It's quite horrific. A lot of the things that I've seen that I shouldn't have, have not been searched by me directly and have been shown to me without me wanting to.
"Most of this stuff pops up on social media, Instagram Reels, TikTok, sometimes on YouTube.
"It's like a roulette, you can go online and see entertainment, because there's always a risk of seeing racism, sexism and 18+ explicit content."
"After school, the only time I take a break is when I'm eating or talking to someone. It can turn into addiction," he said.
He also said inappropriate content was unprompted. "I've seen a varied spectrum of things - sexually explicit content, graphic videos, gory photos and just upsetting images," he added.
"Mostly with the violence it's on Instagram Reels, with sexually explicit content it's more Snapchat and TikTok."
Parents 'can't tackle this alone'
Ms McEvoy described the findings as "shocking" and said "the safety of our children online is one of the defining issues of our time".
"Parents and teachers are doing their best, but they can't tackle this alone," she added.
"We need enforceable age verification, better content controls, and more age-appropriate functions to ensure children can go online without fear."
The Online Safety Act, which was passed by MPs in October 2023, is intended to protect users - particularly children - from illegal and harmful content.
It is being implemented this year, with tough fines coming this summer for platforms that fail to prevent children from accessing harmful and age-inappropriate content.
A private member's bill debated by MPs earlier this month proposed raising the internet "age of consent" for giving data to social media companies from 13 to 16, but it was watered down after the government made clear it would not support the move.
Snapchat, Instagram and TikTok were contacted for comment, but did not provide an on-the-record response to the teenagers' accounts.
The companies insist they take issues of safety and age-appropriate content seriously.
Instagram is rolling out Teen Accounts, which it says will limit who can contact teenagers and the content they can see.
Snapchat and TikTok say on their websites that accounts for under-16s are set to private.
Technology Secretary Peter Kyle said: "As the testimonies from these young people show, for too long harmful content has been easily accessible online, often reaching children in places they should feel safe and even when they aren't seeking it out.
"This week, key protections of the Online Safety Act came into force meaning platforms must take action to protect users from illegal material, and by the summer additional protections will stop children being exposed to harmful material like abusive misogyny and age-inappropriate content such as pornography.
"We are committed to creating a safe online environment where children can explore without fear and parents have confidence their children can be safe online.
"I expect these online safety laws will help achieve this, but they are the foundation, not the end of the conversation, and we are prepared to go further to keep our children safe."