AI Deepfakes in Independent Schools: Prevention & Response

I recently spoke with Evan Harris, president of Pathos Consulting Group, about the alarming rise of AI-generated deepfake sexual abuse in schools. Drawing on his classroom and administrative experience, along with his studies in Tech Ethics at Stanford, Evan offers crucial insights for independent schools facing these emerging digital threats.

A youtube thumbnail featuring images of Bridget Johnson and Evan Harris and the hook, "Parents, Teachers Beware: AI Destroying Lives"

The Scope of the Problem

Recent data from the organization Thorn shows that 11% of students aged 9-17 say they know classmates who have used AI to create fake nude images of other students; an additional 10% declined to answer the question. While the tools enabling this abuse are relatively new, several indicators point to a troubling trend:

  • Year-over-year doubling of sextortion cases according to FBI crime statistics
  • Presidium’s annual report tracking a doubling of youth-to-youth incidents
  • Increased media attention, including coverage on 60 Minutes and at the State of the Union

What’s particularly concerning is that approximately 90% of victims are female, while 90-95% of perpetrators are male. As Evan noted, “We have a boy problem.” This issue reflects deeper concerns about how we educate young men and their attitudes toward women.

Legal Landscape for Independent Schools

The recent passage of the federal Take It Down Act creates important protections by:

  • Criminalizing the creation, distribution, or threat to create/distribute non-consensual images (deepfake or real)
  • Requiring social media platforms to comply with takedown orders within 48 hours
  • Allowing victims to take perpetrators to civil court
  • Establishing a federal law that creates a baseline for all states

This legislation makes it clear that creating a non-consensual deepfake image of a classmate may constitute a felony – even when the perpetrator is a minor. This elevates the seriousness with which schools must address these incidents.

Crisis Management Protocol

When an incident occurs, schools should follow a clear protocol:

  1. Establish reasonable suspicion – This is all you need for mandatory reporting, which should happen immediately.
  2. Activate your crisis management team – Have predefined roles, responsibilities, and communication channels.
  3. Notify parents before students – Contact the victim’s parents first so they can support their child.
  4. Maintain discretion – Protect the victim’s privacy to avoid further shame and isolation.
  5. Handle evidence properly – The Take It Down Act includes safeguards for school employees preserving evidence in good faith, but consult with law enforcement about proper protocols.

Disciplinary Approach and Support Systems

When addressing incidents, Evan recommends that school policies treat the following as equivalent:

  • Creating images and distributing them
  • Real images and fake images

For disciplinary responses, Evan suggests: “If a student has committed a felony and sexually abused another student on your campus, the result of that should probably be expulsion.” During the investigation process, suspension may be appropriate as “a victim should not be forced to share a space with the person that’s abused them.”

However, disciplinary measures are only part of the solution. Supporting victims requires a comprehensive approach:

  • Include a trauma-informed mental health counselor in your response team
  • Provide academic grace and flexibility
  • Give the victim agency in the recovery process by asking what they need
  • Recognize the various impacts: academic, social, emotional, reputational, legal, and financial

Policy Development for Independent Schools

Independent schools should explicitly address AI-generated deepfakes in their policies. Evan recommends:

  • Place the policy alongside rules for sexual abuse, not just in the technology/acceptable use section
  • Use broad, tech-neutral language that can adapt to rapidly changing technology
  • Explicitly ban the creation, distribution, or threat to create/distribute real or deepfake non-consensual images
  • Clarify that consent to create an image is not the same as consent to distribute it
  • Include specific examples to illustrate prohibited behaviors

Prevention Through Education

Prevention requires educating students, parents, and staff:

  • Provide age-appropriate education about consent and healthy relationships
  • Develop media literacy programs that address AI tools and their potential misuse
  • Meet boys where they are with messaging that resonates with them
  • Create open communication channels where students feel comfortable reporting concerns

For parents seeking guidance, Evan has created A Parent's Guide to Non-Consensual Intimate Images, which includes:

  • Clear definitions of terms like NCII (non-consensual intimate images) and sextortion
  • Simple actions families can take to protect children online
  • Conversation starters for parents to discuss these topics with their children

Looking Ahead

As AI technology continues to evolve, independent schools must stay vigilant about emerging threats like algorithmic bias in admissions and hiring decisions. The key to addressing these challenges is proactive planning rather than reactive responses.

Even taking small steps toward protection and preparation is significantly better than doing nothing. At minimum, schools should:

  1. Have an explicit policy addressing AI-generated deepfakes
  2. Develop basic response protocols
  3. Incorporate education about digital citizenship into existing programs
  4. Build relationships with local law enforcement before incidents occur

The landscape of digital threats is constantly changing, but with proper preparation, independent schools can protect their students while maintaining their educational mission.

Check out Evan’s “A Parent’s Guide to NCIIs”

Listen to the full podcast

Bridget Johnson's Signature

Bridget Johnson, Founder, Deans' Roundtable

Want more support as a student life professional?

Look no further than the Deans' Roundtable Community

  • Network with a vast directory of student life professionals like yourself
  • Gain a multitude of professional development opportunities to be the best version of yourself
  • Gather expert advice on the important questions you need answered

Bridget Johnson, a former associate executive director, has worked in education for much of her career, primarily in independent schools and nonprofits. As a former dean of students and director of special programs, she has helped schools expand their offerings while maintaining their core values. Bridget now works as the founder of the Deans’ Roundtable and an independent consultant helping educational institutions implement data-driven strategies that support their unique missions.
