Awareness · Rights · Resources

Image-Based Sexual Violence: Recognize It, Stop It

Online networks enable the distribution of non-consensual intimate imagery at an unprecedented scale. This page explains the scope of the problem, your legal rights, and where to get help right now.


01 — The Scale

Not Isolated Cases — A Global System

In March 2026, a months-long CNN undercover investigation exposed an international online network where men coach each other to drug their partners, assault them while unconscious, film the abuse, and evade law enforcement. On the platform Motherless.com alone, reporters found over 20,000 videos in the "sleep" category, accompanied by Telegram groups offering live-streams of assaults and trafficking sedatives.

This is not a fringe phenomenon. It represents a global escalation of image-based sexual violence that researchers and hotlines have documented for years — one that is now being amplified by generative AI, encrypted messaging, and a lack of platform accountability.

424K: Reports to the Internet Watch Foundation in 2024 alone — a record (IWF Annual Report 2024)
+1,325%: Surge in AI-generated child sexual abuse material, 2024→2025 (NCMEC CyberTipline)
22,275: Reports to the UK Revenge Porn Helpline in 2024 — its highest ever (UK RPH)
480M+: Downloads of so-called "nudify" apps worldwide (Tech Transparency Project)

02 — Forms of Abuse

What Image-Based Sexual Violence Looks Like

Non-Consensual Intimate Images (NCII)

Photos or videos taken without knowledge or consent, or created consensually but shared without permission. The common label "revenge porn" is misleading: this is not pornography. It is violence.

AI-Generated Deepfakes ("Nudify")

Apps and websites that use generative AI to turn ordinary photos into synthetic nude imagery. Students at roughly 90 schools across 28 countries have been affected. The images are fabricated; the harm to victims is real.

Sextortion (Sexual Extortion)

Perpetrators blackmail victims with intimate images — real or fabricated — demanding money, more images, or sexual acts. Boys aged 14–17 are disproportionately targeted. In the US alone, at least 36 young people died by suicide following such extortion.

Drug-Facilitated Assault with Documentation

The pattern exposed by CNN: partners are drugged, assaulted while unconscious, and filmed. Online networks share substance names, dosages, concealment methods, and stream assaults live for payment — typically in cryptocurrency.


03 — Legal Framework

Laws That Protect You

Legal protections against image-based sexual violence have expanded significantly in recent years. Below is a snapshot of key laws — this is not legal advice, and laws vary by jurisdiction. If you need help, contact a lawyer or one of the organizations listed below.

| Jurisdiction | Key Law / Regulation | What It Covers |
| --- | --- | --- |
| European Union | Digital Services Act (DSA) | Requires platforms to remove illegal content promptly; fines up to 6% of global revenue. First enforcement fine: €120M against X (Dec 2025). |
| European Union | Directive 2024/1385 | First EU-wide criminalization of cyber violence including NCII, deepfakes, cyberstalking, and cyberflashing. Transposition deadline: June 2027. |
| Germany | § 201a, § 184k, § 184b StGB | Criminalizes unauthorized sharing of intimate images, upskirting, and child sexual abuse material. Civil remedies via §§ 823, 1004 BGB. |
| United Kingdom | Online Safety Act 2023 | Platforms must proactively address illegal harms including NCII. Ofcom enforces with fines up to 10% of qualifying worldwide revenue. |
| United States | TAKE IT DOWN Act (2025) | Federal criminalization of NCII including deepfakes. Platforms must remove flagged content within 48 hours. Signed into law May 19, 2025. |
| Australia | Online Safety Act 2021 | eSafety Commissioner can issue removal notices for NCII with civil penalties. First deepfake penalty: AUD 343,500 (2025). |

04 — Immediate Steps

What to Do Right Now

If you or someone you know is affected, these steps can help — even before contacting a lawyer or support organization.

Preserve Evidence

Take screenshots showing the URL, date, username, and content. Save chat logs. Important: If the images depict a minor, do not download or save them yourself — note the URL only and report it directly to authorities.

Use Hash-Based Removal Tools

StopNCII.org (adults 18+): Creates a digital fingerprint (hash) on your device — your images never leave it — and blocks re-uploads across Meta, TikTok, Reddit, Pornhub, Snap, Bumble, and more.
Take It Down (minors at time of image): Works similarly, run by NCMEC. Partners include Meta, TikTok, Snap, and Pornhub.
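The privacy model behind both tools rests on one-way hashing: a fingerprint is computed from the image on your own device, and only that fingerprint — which cannot be reversed into the image — is shared with platforms. As a rough illustration (StopNCII's own matching reportedly uses perceptual hashes such as PDQ, which also survive resizing and re-encoding; the plain cryptographic hash below is a simplified sketch of the idea, not their implementation):

```python
import hashlib

def fingerprint(path: str) -> str:
    """Compute a one-way fingerprint of a local file.

    Illustrative only: the file is read locally and never uploaded,
    and the resulting hex digest cannot be reversed back into the
    image. Real NCII-matching services use perceptual hashing, which
    tolerates re-encoding; SHA-256 here just shows the privacy model.
    """
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large video files don't need to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()
```

Platforms that partner with these services compare uploads against the submitted fingerprints and block matches, without ever receiving the original image.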

Report to the Platform

Most platforms have dedicated NCII reporting forms. Google also offers a simplified removal form for non-consensual intimate images in search results.

File a Police Report

Contact your local police, an online reporting portal, or emergency services. A formal report creates a legal paper trail and can unlock further remedies like injunctions.

If You're Being Extorted: Don't Pay

The unanimous advice from the FBI, BKA, and Interpol: do not pay, do not delete evidence, block the account, save everything, and report immediately. Payment almost never stops the extortion.

You are not alone, and this is not your fault.

Shame belongs to the perpetrator, never to the victim. Effective tools, specialized support, and legal pathways exist today. The first step — reaching out — is the hardest. Everything below is free and confidential.


🚨  In immediate danger? Call 911 (US), 999 (UK), 112 (EU), or your local emergency number.

05 — Help & Resources

Where to Get Support

Hotlines & Counseling

📞

RAINN (US)

National Sexual Assault Hotline — free, confidential, 24/7. Also offers online chat.

1-800-656-4673
rainn.org →
📞

Revenge Porn Helpline (UK)

Specialist support for intimate image abuse, including help getting images removed.

revengepornhelpline.org.uk →
📞

Hilfetelefon (Germany)

Violence Against Women Hotline — free, 24/7, in 18 languages. Also covers digital violence.

08000 116 016
📞

HateAid (Germany/EU)

Specializes in digital violence including NCII and deepfakes. Free legal counseling; in select cases, covers court costs.

hateaid.org →
📞

eSafety Commissioner (Australia)

Government body that can issue takedown orders for NCII. Handles complaints directly.

esafety.gov.au →
📞

Childline / Nummer gegen Kummer

For children and young people. UK: 0800 1111. Germany: 116 111. Austria (Rat auf Draht): 147. Switzerland (Pro Juventute): 147.

Removal & Prevention Tools

🛡️

StopNCII.org

For adults (18+). Create a hash of your intimate images locally — they never leave your device — and block re-uploads across major platforms including Meta, TikTok, Reddit, and Pornhub.

stopncii.org →
🛡️

Take It Down (NCMEC)

For anyone who was under 18 when the image was created. Same hash-based approach. Partners include Meta, TikTok, Snap, OnlyFans, and Pornhub.

takeitdown.ncmec.org →
🔍

Google NCII Removal

Request removal of non-consensual intimate images from Google Search results. Supports batch submissions since February 2026.

Google Support →

Reporting Illegal Content

🏛️

NCMEC CyberTipline (US)

Report child sexual exploitation online. Used by law enforcement worldwide.

report.cybertip.org →
🏛️

IWF (International)

Report child sexual abuse imagery found anywhere online. Anonymous reporting available.

report.iwf.org.uk →
🏛️

jugendschutz.net (Germany)

German hotline for reporting child sexual abuse material and other harmful online content.

jugendschutz.net →
🏛️

Internet Beschwerdestelle (Germany)

Joint reporting center of eco and FSM for illegal online content. Part of the INHOPE network.

internet-beschwerdestelle.de →
🏛️

Stopline (Austria)

Austrian INHOPE hotline for reporting child sexual abuse material and extreme content.

stopline.at →
🏛️

Click and Stop (Switzerland)

Swiss reporting platform for child sexual abuse material.

clickandstop.ch →