A National Society for the Prevention of Cruelty to Children (NSPCC)/YouGov survey has found that 33% of U.K. adults support using end-to-end encryption on social media and messaging services, but support jumps to 62% if encryption is rolled out only once tech firms can ensure children's safety is protected.
Private messaging is where most child sexual abuse happens online, and end-to-end encryption means only the devices communicating have the ability to decrypt and read the messages. While this is useful for privacy, it also presents risks for child safety and means abuse can go unnoticed online.
A major NSPCC roundtable event attended by the U.K. Home Secretary, Priti Patel, will bring together child protection, civil society and law enforcement experts from the U.K., U.S., Canada, Ireland, and Australia. The event aims to reset the debate by showing how end-to-end encryption takes away platforms' ability to find abuse in private messaging, and how this can be avoided.
Currently, major tech firms use a range of technologies to identify child abuse images and detect grooming and sexual abuse in private messages. But Facebook's proposals to bring end-to-end encryption to Facebook Messenger and Instagram would make these tools useless, with an estimated 70% of global child abuse reports lost. In 2018 these reports resulted in 2,500 arrests and 3,000 children being safeguarded in the U.K.
NSPCC says the debate around end-to-end encryption has increasingly become an 'either/or' argument skewed in favor of adult privacy over the safety and privacy rights of children. However, the latest polling suggests the public supports balancing the safety of children with maximizing the privacy of all users, including children who have been sexually abused.
- More than half (55%) of U.K. adults believe the ability to detect child abuse images is more important than the right to privacy
- Nearly a third (32%) think they are equally important
- Only 4% say privacy should be prioritized over safety
- 92% support social networks and messaging services having the technical ability to detect child abuse images on their sites
- 91% support a technical ability to detect adults sending sexual images to children on their services
At the upcoming roundtable event, the NSPCC will share new research and analysis about the implications of end-to-end encryption for child protection and call for tech firms to refocus their approach through safer design features and investment in technology.
One proposal is that design features which could increase the risks end-to-end encryption poses to children should be examined, such as Facebook algorithms that suggest children as friends to adults, or plans to auto-delete messages on WhatsApp.
“Private messaging is the frontline of child sexual abuse but the current debate around end-to-end encryption risks leaving children unprotected where there is most harm,” said NSPCC Chief Executive Sir Peter Wanless. “The public wants an end to rhetoric that heats up the issue but shines little light on a solution, so it’s in firms’ interests to find a fix that allows them to continue to use tech to disrupt abuse in an end-to-end encrypted world.
“We need a coordinated response across society, but ultimately government must be the guardrail that protects child users if tech companies choose to put them at risk with dangerous design choices.”