The UK's Children's Commissioner is calling for a ban on AI deepfake apps that create nude or sexual images of children, according to a new report. It states that such "nudification" apps have become so prevalent that many girls have stopped posting photos on social media. And though creating or uploading child sexual abuse material (CSAM) is illegal, the apps used to create deepfake nude images remain legal.
"Children have told me they are frightened by the very idea of this technology even being available, let alone used. They fear that anyone — a stranger, a classmate, or even a friend — could use a smartphone as a way of manipulating them by creating a naked image using these bespoke apps." said Children’s Commissioner Dame Rachel de Souza. "There is no positive reason for these [apps] to exist."
De Souza pointed out that nudification AI apps are widely available on mainstream platforms, including the largest search engines and app stores. At the same time, they "disproportionately target girls and young women, and many tools appear only to work on female bodies." She added that young people are demanding action against the misuse of such tools.
To that end, de Souza is calling on the government to introduce a total ban on apps that use artificial intelligence to generate sexually explicit deepfakes. She also wants the government to create legal responsibilities for GenAI app developers to identify the risks their products pose to children, to establish effective systems for removing CSAM from the internet, and to recognize deepfake sexual abuse as a form of violence against women and girls.
The UK has already taken steps to ban such technology by introducing new criminal offenses for producing or sharing sexually explicit deepfakes. It also announced its intention to make taking intimate photos or videos without consent a criminal offense. However, the Children's Commissioner is focused more specifically on the harm such technology can do to young people, noting that there is a link between deepfake abuse and suicidal ideation and PTSD, as The Guardian pointed out.
"Even before any controversy came out, I could already tell what it was going to be used for, and it was not going to be good things. I could already tell it was gonna be a technological wonder that's going to be abused," said one 16-year-old girl surveyed by the Commissioner.
In the US, the National Suicide Prevention Lifeline is 1-800-273-8255 or you can simply dial 988. Crisis Text Line can be reached by texting HOME to 741741 (US), 686868 (Canada), or 85258 (UK). Wikipedia maintains a list of crisis lines for people outside of those countries.
This article originally appeared on Engadget at https://www.engadget.com/cybersecurity/uk-regulator-wants-to-ban-apps-that-can-make-deepfake-nude-images-of-children-110924095.html