Apple Plans Paused, Age-Appropriate Design Code Has Impact & More


APPLE PAUSES IMPLEMENTATION OF CHILD PROTECTION TOOL

Tech giant Apple has announced a pause in the testing phase of a new tool designed to protect against child exploitation.

The tool, which was first announced in August, was set up to scan Apple devices for evidence of child sexual abuse by cross-referencing content against a database of known images maintained by the National Center for Missing and Exploited Children (NCMEC).

The announcement was met with a backlash from many Apple users, with campaigners claiming that the new tool would represent a breach of personal privacy and security.

“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” the company said in a statement.

The company has yet to announce when the new software will be installed across its devices.

AGE-APPROPRIATE DESIGN CODE TAKES EFFECT

As Gooseberry reported in last week’s newsletter, the provisions of the Age-Appropriate Design Code (AADC) came into effect on 2 September.

The AADC, established by the UK’s independent Information Commissioner’s Office, sets out a list of 15 standards that businesses are expected to follow to provide younger users with safer online experiences. It applies to “information society services likely to be accessed by children,” including apps, online marketplaces, social media platforms and online games.

The Code has been welcomed by campaign groups as an important step in protecting children’s safety. However, some have said that the measures do not go far enough to deliver a truly safe online experience for children.

“There remain significant challenges around how companies can accurately verify users’ ages. On top of this, age verification of adult websites not hosting user-generated content is missing completely from the Online Safety Bill,” said Susie Hargreaves OBE, Chief Executive of the Internet Watch Foundation (IWF).

“We would like to see this included, as there is increasing evidence that this is normalising sexualised behaviours in children and preventing them from forming healthy relationships.”

Since the announcement of the new code, many social media companies have already made changes to their platforms to strengthen safety features for children. Some of the changes include:

Instagram

In July, Instagram announced a number of new measures aimed at protecting teenagers from unwanted attention from suspicious adults.

As part of the new measures, any new accounts created by under-16s will be set to private by default, while teens with existing accounts will be sent notifications about the benefits of switching to a private account.

Adult accounts whose activity is considered suspicious, for example those that have already been blocked by a teen user, will also face limits on their ability to engage with teenage accounts.

The platform had previously banned adults from sending messages to teenagers who were not already following them.

TikTok

On Wednesday, TikTok Europe announced an expansion of its family pairing tool to help parents and guardians understand how to support their teenage children when they use the platform.

The family pairing tool lets parents link their accounts with their teenagers’ accounts, giving them control over their children’s app settings, such as how much time they can spend on the app each day, what content they can view and who can send them messages.

The expanded features include advice for parents and guardians, developed in collaboration with teenagers, on best practices to help younger people stay safe online while building trust with them and respecting their privacy.

YouTube

YouTube has introduced a number of new measures affecting the online experiences of its younger users.

As part of the measures, the platform has restricted the advertising seen by young people to content that is more age-appropriate, and videos uploaded by under-18s will be set to private by default.

YouTube has also introduced bedtime reminders and disabled ‘autoplay’ by default for young people, in a move designed to reduce the amount of time they spend on the platform.

While the policy changes are only required to apply to UK users, YouTube, TikTok and Instagram have all rolled out their new measures globally, highlighting the impact that the AADC has had on the worldwide regulatory landscape for children’s safety online.


Our Safeguarding and Prevent Continuing Professional Development E-learning courses have now been updated for the new school year. Visit our website for more details.
