Growth in Grooming, Screen Time Statistics, App Risks & More



The number of recorded online grooming crimes has reached a record high in 2021, according to an NSPCC report.

The information was obtained through a series of Freedom of Information requests submitted by the children’s charity to 42 police forces. The data showed that Sexual Communication with a Child offences recorded from April 2020 to March 2021 rose by 9% compared with the previous year, and by around 70% compared with the same period in 2017/18.

The NSPCC also found that almost half the online grooming offences related to Facebook-owned social media platforms Instagram, Facebook Messenger and WhatsApp, while Snapchat was another platform commonly used.

The popularity of the platforms with children, as well as the potential to exploit risky design features, was cited among the reasons the platforms were used by offenders.

The NSPCC has called for further investment by Facebook in technology to identify abuse on its platforms alongside its plans to introduce end-to-end encryption – a feature which can be exploited by abusers to avoid detection. The charity has also called for social media companies to be obliged to name an executive responsible for child safety who is personally liable for any flaws in the design of sites that put children at risk of harm.

“We know that it’s really daunting for parents to see figures like this,” Hannah Ruschen, the NSPCC’s online safety policy and public affairs officer, said in an interview with the Evening Standard.

“What’s really important is that parents and carers need to have open conversations with their children regularly about the activity they’re doing online.”


Adults in the UK spent an average of 5 hours and 40 minutes watching TV or videos each day in 2020, according to a study of national media habits by government regulator Ofcom.

The coronavirus lockdowns and restrictions are thought to be responsible for the increase in viewing time, which rose by around three quarters of an hour compared with the same study in 2019.

The average UK adult spent almost three hours per day watching live TV and roughly 1 hour per day watching streaming services. Younger people aged 16-34 spent only an hour watching live TV per day but spent over 90 minutes watching streaming services and over 70 minutes watching YouTube videos, according to the study.

The number of UK households with a streaming service subscription reached an all-time high in 2020, climbing to 31 million subscriptions, up from 20 million in 2019.

Deflated by news about the pandemic, many Netflix users in the UK turned to comedy shows as a means of escapism: the number of minutes of comedy watched nearly doubled in 2020, to almost 1 hour per day.


Apple allows children to access adult-only apps, such as sex and BDSM apps, on its App Store, according to a new investigation by the Tech Transparency Project (TTP).

The TTP investigation involved creating an Apple account using the date of birth of a 14-year-old. Using the account, TTP investigators were able to download apps marked as 17+ from the App Store, such as “KinkD: Kink, BDSM Dating Life.”

In total, 37 such apps were available to download despite the age profile of the user account, raising concerns about the effectiveness of child safety measures Apple had put into place on its devices.

“The results undermine Apple’s promise that its App store is a ‘safe place for kids’ and that it rejects apps that are ‘over the line – especially when it puts children at risk’,” TTP concluded in its report into the investigation.

“Taken together, these review failures create an ecosystem that is much more dangerous for minors than advertised.”


The Information Commissioner’s Office (ICO) Age Appropriate Design Code (aka the Children’s code) is to come fully into force on the 2nd of September.

The code, which was launched in September 2020, is designed to protect children’s data online when they use apps, games, connected toys and news services. It applies to social media platforms, search engines, online marketplaces and other online services that children are able to access.

A number of social media companies, including Facebook, Google, Instagram and TikTok, have already made updates to their child privacy and child safety policies in accordance with the provisions set out in the ICO code.

“Ultimately the Children’s code will help industry innovate to ensure that the best interests of the child are a primary concern online and built into the design from the beginning,” said ICO Executive Director Stephen Bonner. “This will grow the trust between online services, children, parents and society.”

Our Safeguarding and Prevent Continuing Professional Development E-learning courses have now been updated for the new school year. Visit our website for more details.
