Safety Improvements at Snap, Online Safety Bill Progresses & More



Snapchat owner Snap Inc has introduced a set of new measures for developers using its Snap Kit platform, as part of a bid to improve child safety online.

As part of the new measures, anonymous messaging will be banned, and friend-finding apps will be limited to users over the age of 18.

The announcement comes after a series of lawsuits faced by Snap related to experiences of online bullying affecting teen users of the platform.

Snap Kit is a set of tools that allows other app developers to incorporate features from the Snapchat app into their own services.

Snap Kit developers have been given 30 days to comply with the new measures.


The Online Safety Bill began its passage through parliament last week after months of discussions, consultations and amendments.

Two additional provisions were introduced to the Bill as it was announced, in a bid to strengthen the existing proposals.

First, executives who fail to cooperate could face prosecution and a jail sentence within the first two months of the Bill becoming law, instead of after two years. Second, senior managers at tech companies will be criminally liable for “destroying evidence, failing to attend or providing false information in interviews with Ofcom.”

The announcement follows criticism from a number of campaign groups and children’s charities that the proposed Online Safety Bill did not go far enough to protect children and vulnerable people online.

Digital Secretary Nadine Dorries said: “Since taking on the job I have listened to people in politics, wider society and industry and strengthened the Bill, so that we can achieve our central aim: to make the UK the safest place to go online.”


Black Twitch streamers are urging Twitch executives to take more action to deal with racism on the platform.

The campaigners have written an open letter addressed to Twitch CEO Emmett Shear, in which they call for improved algorithmic content moderation on the platform to provide a greater voice to Black streamers, and for the company to conduct a racial equality audit to identify areas of bias and discrimination on the platform and in the wider company.

It comes after 20,000 Twitch users signed a petition, #TwitchDoBetter, which called for a “more humane and less hostile internet for Black people.”

In the letter the streamers said: “While Twitch claims to be ‘listening to creators and building solutions to help stop unwanted harassment,’ it still has not offered a timeline for when these demands will be met, if at all.”

“Twitch has shown repeatedly that it is not willing to be transparent and cares more deeply about its public image than the safety of Black streamers.”


Instagram owner Meta has introduced a range of new parental supervision tools to help parents and guardians manage their children’s experiences on the platform.

The new ‘Family Center’ will allow parents to view how much time their children spend on the platform, and to be notified when their child reports anyone.

They will also be able to set limits on how much time teens spend using the platform.

“This is just one step on a longer path — our vision for Family Center is to eventually allow parents and guardians to help their teens manage experiences across Meta technologies, all from one central place,” Head of Instagram, Adam Mosseri, said in a blog post.

“We appreciate the input from experts who help us deepen our understanding of this area, so we can continue protecting teens, supporting families and preserving all the good that young people derive from the internet.” The tools will also be rolled out to Meta’s VR platform in May.

Gooseberry Planet’s Play product, comprising lesson plans, slides, digital workbooks and online games, is currently on special offer. Visit our website for more details.
