NSPCC: Paedophiles in England and Wales using VR tech



For the first time, police data obtained by the National Society for the Prevention of Cruelty to Children (NSPCC) has documented the use of virtual reality headsets by paedophiles in England and Wales. According to the NSPCC’s Freedom of Information Act request from police forces in England and Wales, 30,925 offences involving obscene images of children were reported in 2021–22, the highest number ever recorded and a 66 percent increase over the last five years. Virtual reality technology was mentioned eight times in the police crime reports from those years.

Snapchat was the app most commonly used to share child abuse images, with 4,293 mentions in police crime reports, surpassing Instagram (1,363 mentions), Facebook (1,361) and WhatsApp (547). Roxy Longworth, who was 13 when she was contacted by an older boy on Facebook who coerced her into sending intimate images to him on Snapchat, said he sent them on to his friends and that she was then blackmailed and manipulated into sending more images, which were shared on social media.

Sir Peter Wanless, chief executive of the NSPCC, said: “These new figures are incredibly alarming but reflect just the tip of the iceberg of what children are experiencing online. We hear from young people who feel powerless and let down as online sexual abuse risks becoming normalised for a generation of children.” He called on the government to create the post of child safety advocate, saying it would be inexcusable if, in five years’ time, the country were still playing catch-up with pervasive abuse that had been allowed to proliferate on social media.

The NSPCC also called on Meta to pause plans to roll out default end-to-end encryption of Facebook and Instagram messenger services, which they said would make it impossible “to identify grooming and the sharing of child sexual abuse images.” Professor Alexis Jay, who chaired the seven-year inquiry into child sexual abuse, said: “The latest statistics suggest that the age at which children become victims is getting younger. The number of sexual abuse offences, recorded by the police, where the victim was a child under the age of 4 has risen by 45 percent in recent years.” Jay also specifically mentioned new technology, such as a product that enables “bare hands tactility”, which allows sexual predators to feel and be felt by their victims, without the need for physical contact.

A government spokesman said: “Protecting children is at the heart of the Online Safety Bill and we have included tough, world-leading measures to achieve that aim while ensuring the interests of children and families are represented through the Children’s Commissioner. Virtual reality platforms are in scope and will be forced to keep children safe from exploitation and remove vile child abuse content. If companies fail to tackle this material effectively, they will face huge fines and could face criminal sanctions against their senior managers.” A Meta spokesman added: “This horrific content is banned on our apps, and we report instances of child sexual exploitation to the National Centre for Missing & Exploited Children. We lead the industry in the development and use of technology to prevent and remove this content, and we work with the police, child safety experts and industry partners to tackle this societal issue. Our work in this area is never done, and we’ll continue to do everything we can to keep this content off our apps.”
