A recent survey by Thorn found that 84% of young people aged 13–20 recognize deepfake nudes as abusive. The consensus comes as teens increasingly face the trauma of AI-generated fake nudes, which are typically created with free or low-cost “nudify” apps or web tools that “undress” innocent photos of victims. The fakes are then circulated among peers at school or online.

Despite growing awareness among youth, the issue has caused chaos in schools, with some administrators failing to address the problem adequately. In one Pennsylvania case, a school shut down after its head allegedly ignored reports that a student had created fake nudes targeting nearly 50 female students.

Adults, meanwhile, appear far less troubled: a 2023 survey of 1,522 US male deepfake porn users found that 74% felt no guilt about viewing such content, suggesting deepfake pornography has become normalized among adults.
Source: arstechnica.com
