Snapchat blocks far fewer children from its app than TikTok


TikTok deletes around 2 million accounts a year and Snapchat just 700

Snapchat is blocking only a fraction of the underage users that TikTok is, data shows.

Snapchat is removing dozens of children in Britain from its platform each month, compared with roughly 180,000 removed by rival TikTok, Reuters reports.

Most of the popular social media platforms have a minimum age requirement of 13, including Instagram, TikTok and Snapchat. The restrictions aim to protect the privacy and safety of young children.

Ahead of the Online Safety Bill, which aims to protect social media users from illegal and harmful material and is currently progressing through parliament, the communications regulator Ofcom asked TikTok and Snapchat how many suspected under-13s were blocked from their platforms in a year.

Between April 2021 and April 2022, TikTok said it had deleted around 2 million suspected underage accounts, an average of approximately 180,000 a month.

In the same period, Snapchat disclosed that it had removed around 60 accounts per month, or just over 700 across the year.

A spokesperson for Snap, the owner of Snapchat, told Reuters the figures misrepresented the scale of work it does to tackle the issue of underage users. The representative declined to provide additional context.

“We take these obligations seriously and every month in the UK we block and delete tens of thousands of attempts from underage users to create a Snapchat account,” the Snap spokesperson said.

Snapchat blocks users from signing up with a date of birth that puts them under the age of 13. It is unclear what protocols are in place to remove underage users once they have set up an account on the app.

Currently, it is the responsibility of social media firms to set age limits, but this is not regulated. The Online Safety Bill aims to ensure that platforms are legally bound to enforce their own terms and conditions, in this case through age-verification technology.

As the bill stands, companies may be fined up to £18 million, or 10 per cent of their annual global turnover, whichever is higher, if Ofcom judges them to have failed in their duty to protect children from damaging content online. Social media bosses could also face up to two years in prison.

Reuters reports that Ofcom said assessing how video-sharing platforms are making their apps safer for children remained a key focus and that it would report its findings later this year.

Last year, Ofcom reported that 60 per cent of children aged between eight and 11 had at least one social media account, often created by using a false date of birth on sign-up. Snapchat was the most popular app for underage users, the regulator found.

