Social media feeds children quack health advice, soft porn and toxic masculinity


Children as young as 13 are fed harmful material 

A newspaper investigation has revealed the disturbing extent to which children as young as 13 are being fed harmful material on social media platforms.

The Telegraph found youngsters are being targeted with false health advice, links to pornography and explicit self-harm content.

The report comes as a crucial vote on the Online Safety Bill, a piece of legislation aiming to police the internet, is set to take place in Parliament this week. In its current form, the Bill would see tech giants face higher fines if they fail to protect youngsters from harmful online content. But Tory backbenchers have revolted and are pushing for social media bosses to face prison.

A Telegraph journalist created eight TikTok and Instagram accounts set as 13- and 15-year-olds and found that all of the accounts were fed inappropriate content.

Shockingly, TikTok’s algorithms served up a “menu of violence and crimes” to one of the boys, without the user searching for such material.

On Instagram, the accounts were very quickly followed by sexualised models encouraging them to watch their “private shows”.

Internet safety campaigner Baroness Beeban Kidron said the Telegraph report showed that “social media platforms were wilfully disregarding harms to children”.

“These platforms have fuelled anxiety and depression among a staggering number of young people, and in some cases cost them their lives”, she added.

The investigation found that after searching using terms such as “New Year New Me” and “health 2023”, the user was fed content about crash diets and “colon cleanses”. A “juice for vaginal health” was promoted to an apparent 13-year-old girl.

On TikTok the reporter found that promotions for “anxiety-depleting supplements” were widespread, along with before-and-after body comparison clips, workout regimes and diets promoting fast weight loss.

The teen boys’ accounts on Instagram were targeted with “testosterone-boosting” products and content depicting crime scenes, accidents and violence, such as murder investigations and knife attacks.

When it came to self-harm material, The Telegraph reports that Instagram blocked most of it, but on TikTok its reporter was able to access alarming videos within minutes.

A Meta spokesperson, speaking for Instagram, said its rules and restrictions have been developed in consultation with safety experts. Teens’ accounts are automatically set to private, and teens are sent notifications encouraging them to take breaks and explore different topics, it added.

It said it has recently launched controls to further limit the amount of sensitive content teens see, that it has never permitted material promoting suicide, self-harm or eating disorders, and that it does not show weight-loss content to under-18s.

TikTok said it had removed inappropriate content flagged by The Telegraph and claimed the reporter’s reported experience “does not reflect genuine behaviour or viewing experiences of real people”.

News from elsewhere this week:

More TikTokers are taking the stand on children they should’ve taken in the first place – Indy100

Is a self-harm game really being shown to kids on Roblox? – Snopes

Teaching online safety in schools – GOV.UK

Safe Gaming: What parents need to know – Psychology Today

What is YouTube Kids? Everything you need to know about the video platform for kids – Make Use Of


Gooseberry Planet is offering 20% off all new purchases in January. Visit our website for more details.
