Online Safety Bill, Cybersecurity Test, Diversity Survey & More
The U.K. government has published a draft legal text of its much-anticipated Online Safety Bill.

The Bill is aimed at protecting children from online harms as well as tackling online and social media abuse, by forcing websites and other online service providers to take responsibility for identifying and removing harmful user-generated content.

Under the Bill's provisions, U.K. communications regulator Ofcom would gain the power to issue a “technology warning notice” to providers it believes are failing to meet their obligations by not removing harmful content. The regulator would also be able to seek a court order to inhibit the activities of providers it deems noncompliant.

Exemptions have been put in place for certain types of content, including emails and text messages, as part of an effort to safeguard principles of free speech.

Online safety for children has become an area of particular concern since the beginning of the coronavirus pandemic, given the increased time children have spent online unsupervised. According to data compiled by Europol, activity relating to child sexual abuse and exploitation rose significantly in 2020, including an increase in referrals about child sexual abuse material from the National Center for Missing and Exploited Children to Europol. In addition, public reports of child sexual abuse material online to the Internet Watch Foundation grew more than 50% between March 23 and July 9, 2020, compared with the same period in 2019.

The government announcement follows a consultation that ran from April to July 2019, to which over 2,400 technology companies, children’s charities, SMEs and government organisations responded, and the government’s full response to those proposals, published in December 2020.

The government proposals have not received unreserved support from the private and voluntary sectors. A report by the NSPCC published in March suggested that the provisions in the Bill contain “a number of significant weaknesses which need to be addressed.” In particular, the children’s charity urged the government to better address the cross-platform nature of risks to children. According to the charity, companies must be required to have “processes in place to share data on offending behaviour, and on highly agile and constantly evolving threats.”

However, the government has hailed the Bill as a “milestone in the Government’s fight to make the internet safe” which will “put an end to harmful practices.”

“Today the UK shows global leadership with our groundbreaking laws to usher in a new age of accountability for tech and bring fairness and accountability to the online world,” said Oliver Dowden, Secretary of State for Digital, Culture, Media and Sport.

“We will protect children on the internet, crack down on racist abuse on social media and through new measures to safeguard our liberties, create a truly democratic digital age.”


Workers at West Midlands Trains were promised a bonus for their efforts during the pandemic in a staff email, only to find that the email was sent as part of a cybersecurity test. 

Employees who clicked on the rail operator’s email were sent a follow-up message: “This was a test designed by our IT team to entice you to click the link and used both the promise of thanks and financial reward.” 

Following condemnation of the initiative by rail unions, a West Midlands Trains spokesperson told the Guardian: “The design of the email was just the sort of thing a criminal organisation would use – and thankfully it was an exercise without the consequences of a real attack.”


The National Cyber Security Centre (NCSC) has launched a new workforce diversity survey together with KPMG to tackle shortcomings on diversity and inclusion in the cyber security industry.

This year’s survey seeks to capture new data on diversity, including new benchmarks on disability and neurodiversity, to improve the understanding of diversity characteristics in the workforce.

Last year’s survey found that the sector has higher representation of women and LGB professionals than the wider technology industry, but that more than two in five black cybersecurity professionals had experienced discrimination in the workplace.


The Cyberspace Administration of China, the country’s cybersecurity watchdog, has identified at least 33 mobile apps that collect excessive personal data, including data unrelated to the services they provide, stored without user consent.

The apps include those from Baidu, Tencent and e-commerce giant Alibaba. The companies in question have been given 10 days to rectify their data collection practices or face financial penalties.

Gooseberry Planet offers two e-learning Continuing Professional Development (CPD) courses – Safeguarding CPD (based on Keeping Children Safe in Education) and the Prevent Duty CPD. Click here for more details.
