Deepfake abuse to generate nude photos of kids on the rise


Full story

A newly released report by Thorn, a tech company that works to combat the spread of child sex abuse material, shows that deepfake abuse is becoming all too common, driven in part by the rise of artificial intelligence.

Thorn attributes the trend to the wide availability of cheap “undressing” apps, along with other user-friendly, easily accessible AI-powered tools that can create believable deepfake nude photographs.

In a recent survey of more than a thousand students ages 9 to 17, Thorn found 11% of them had friends or classmates who used AI to generate deepfake nudes of classmates. Another 10% of students declined to answer the question.

The survey also found that almost 1 in 4 kids (24%) ages 13 to 17 said they had been sent or shown an actual nude photo or video of a classmate or peer without that person’s knowledge. Only 7% of the students surveyed admitted that they had personally shared a nude photo or video without the other person’s knowledge.

Sharing real nudes remains a sizable problem among teens: almost 1 in 3 (31%) teenagers surveyed said the practice is normal for people their age, though only 17% admitted to sharing nudes of themselves.

Thorn noted this trend coincides with a rise in sextortion among young people. About 6% of those surveyed said someone had blackmailed them by threatening to share their nude photos unless they sent money or did something else the blackmailer asked.
