AI-Generated Sexual Imagery: Teen Girls Using ‘Nudification’ Tools at Same Rate as Boys


A recent study indicates that adolescent girls are engaging with artificial intelligence-powered “nudification” apps – tools that create sexualized images from uploaded photos – at a rate comparable to that of boys. This finding challenges the assumption that this behavior is primarily driven by male users. The research, published in PLOS One, surveyed 557 English-speaking teenagers aged 13-17 in January 2025 and revealed that 55% had created such images, while 54% had received them.

Widespread Use and Non-Consensual Sharing

The study’s results are significant because they reveal how normalized this technology has become among teenagers. Over one-third of respondents reported being victims: similar percentages said sexualized images had been made of them without consent or shared without their permission. Approximately 1 in 6 teens, both male and female, admitted to frequently using these tools to visualize how they would appear in sexualized content.

This trend is notable because it underscores how rapidly AI-driven image manipulation has become integrated into adolescent digital culture. The implications are not yet fully understood, but the study suggests that this behavior is no longer isolated to a specific demographic.

Why Girls Are Participating

While the research did not directly explore motivations, experts offer several possible explanations for girls’ participation. One theory links it to the prevalence of “try-on” filters for clothing and makeup, which may normalize similar AI-driven image manipulation. Another factor is potential coercion from male peers, with girls feeling pressured to create or share explicit content to fit in or maintain social standing.

Dr. Linda Charmaraman, a specialist in youth digital wellbeing, notes that teenagers are at a vulnerable stage of development where social acceptance is paramount. “When you combine that time of development with AI, it can bring further risks,” she explains, suggesting that peer pressure and status seeking may drive usage.

The Risk of Unintentional Creation of Illegal Material

A key concern is that teens often don’t realize they may be creating child sexual abuse material (CSAM) when using nudification tools on images of minors. Even when imagery is shared consensually between adolescents, the legal implications remain serious. Furthermore, predators actively seek out such content, using AI to generate sexualized images from publicly available photos and, in some cases, leveraging them for sextortion.

What Parents Should Know

The study suggests that parents should assume their children will encounter these tools and should have open, non-judgmental conversations about the risks. Prohibition-only approaches are unlikely to be effective, as teens may view AI-generated content as a natural part of exploring their sexuality. Instead, experts recommend regular discussions about teens’ digital lives, fostering open communication so that distressing incidents such as non-consensual sharing can be addressed promptly.

Policy Implications and Tech Company Responsibility

Researchers propose a multi-pronged approach to mitigate risks. One suggestion is to educate teens about bystander intervention: speaking up when peers plan to create non-consensual imagery. Another calls for tech companies to adopt a “duty of care” standard, providing tools that allow minors and parents to manage digital experiences, including disabling certain features and protecting personal information.

Ultimately, this study underscores the need for greater awareness and proactive measures to address the evolving landscape of AI-generated sexual imagery among adolescents. The widespread use of these tools demands a comprehensive response from parents, educators, policymakers, and tech companies alike.