The safety features depend on engaged parents
Meta has announced new parental control tools across Instagram, Facebook and Messenger – but child protection organisations argue the new measures provide little protection for youngsters who are most vulnerable to exploitation.
The new features include a parental supervision hub in Messenger, which will be available in Meta’s Family Centre and will be rolled out in the US, UK and Canada.
However, experts warn that when safety features depend on engaged parents, children who lack consistent supervision remain at risk. Vulnerable youngsters can include those in the child welfare system.
Critics say the move shifts responsibility for keeping young users safe from the tech giant to parents.
Rani Govender, senior child safety online policy officer at the National Society for the Prevention of Cruelty to Children (NSPCC), told The Guardian: “An approach to safety that puts the onus on parents and carers is not enough on its own. Many young people may not be able to speak to a parent about online concerns, particularly children in care.
“Many parents will not have the technical knowledge or time to supervise their child’s social media use.”
Earlier this year The Guardian revealed that Meta is failing to detect or report the use of Facebook and Instagram for child trafficking. It also uncovered how traffickers use Messenger to communicate when buying and selling children.
In June, Meta announced it had set up a taskforce to investigate Instagram’s role in the distribution and sale of child sexual abuse material.
However, Meta is in the midst of cutting 21,000 jobs in 2023, including its content moderator teams.
Sophie Voegel, a spokesperson for Meta, told the newspaper: “Far from replacing them, our parental supervision tools are intended to complement our existing safeguards to help protect teens from unwanted contact. These include defaulting teens into private accounts when they sign up to Instagram, preventing people over 19 from sending private messages to teens who don’t follow them and preventing adults who have shown potentially suspicious behaviour from finding, following and interacting with teen accounts.”
Meta’s new safety tools also allow parents and guardians to see how much time their teen spends on Messenger, receive updates on changes to their contact lists and privacy settings, and be notified if their teen blocks another user (provided the child allows this notification).
Parents can also now see which types of users – friends, friends of friends, or no one – can message their teens, though they can’t alter this setting.
On Instagram DMs, Meta is testing a feature that requires users to send an invite and receive permission before they can message someone who doesn’t follow them.