Imagine finding out that someone shared a private photo of you, without your consent, on the internet. Or worse, that they used AI to create a fake image of you that looks real. It’s scary, invasive, and unfortunately, more common than many people realize.
But there’s some important news: a new law called the TAKE IT DOWN Act was just passed by Congress, and it’s about to become the first federal law in the U.S. that specifically protects people from the nonconsensual sharing of intimate images, including AI-generated content.
So, What Does the Law Actually Do?
The law has two main goals:
1. Makes it a crime to share private, intimate images (real or fake) of someone without their permission. This means people who do this can face criminal charges.
2. Requires websites and platforms to remove those images within 48 hours of being asked by the person affected.
Why This Matters for Survivors
This law gives survivors, especially those who haven’t always had access to justice, a new way to protect themselves. Here’s how it helps:
● It works across all 50 states, so survivors don’t have to navigate confusing local laws.
● It creates a clearer process for getting harmful images taken down.
● It includes AI-created images, which are becoming more common.
● It demands a fast response (platforms must act within 48 hours).
But There Are Some Limitations
While this law is a big step forward, it’s not perfect. Some things to be aware of:
● There might be loopholes for offenders. If someone appears in the image alongside you (like an ex-partner), they may not be held accountable, even if you never agreed to have it shared.
● It’s not clear yet how well the law will be enforced or how easy it will be for survivors to get help if platforms don’t cooperate.
What Should Survivors Do if This Happens to Them?
If someone shares or threatens to share an intimate image of you without consent, here’s what you can do once the law is active:
1. Gather Evidence
○ Save links, screenshots, and anything else that shows where the image is online.
2. Prepare a Report
○ State that you believe the image was shared without your consent.
○ Include links, descriptions, your contact info, and your signature.
3. Send the Report
○ Each platform will have its own way to submit (email, form, etc.).
4. Wait for a Response
○ Platforms have 48 hours to take action.
○ If they don’t, you may be able to report them to the FTC.
Other Resources Survivors Can Use
If this law doesn’t provide the help you need, there are other options:
● StopNCII.org: Helps prevent private images from being reuploaded online.
● Take It Down (by NCMEC): A tool specifically for children and teens to remove harmful images.
● Reporting on platforms: Sites like Facebook, Instagram, X (Twitter), and TikTok have their own systems for reporting image-based abuse.
● Cyber Civil Rights Initiative: Offers legal help, emotional support, and a crisis helpline: 1-844-878-2274
At Turning Point, we know that laws can help, but they aren’t the whole solution. Survivors still face real challenges, especially those from marginalized communities. That’s why we’re here: to help people understand their options, get support, and stay safe.
If you or someone you know is experiencing image-based abuse, you’re not alone. We’re here to listen, support, and help you take back control. Turning Point Hotline: (815) 338-8081 | turnpt.org
Credits:
Lexie Zeppos, Outreach and Prevention Advocate, MHP, BS, CDVP
Lexie graduated from St. Lawrence University in 2023 with a Bachelor of Science in Psychology and a minor in Public Health. She has been working as an Outreach and Prevention Advocate at Turning Point for the past 1.5 years. She obtained her Domestic Violence Professional certification in 2025 and has also completed 40-hour Sexual Assault training. Lexie hopes to pursue a Master’s in Public Health in the coming years.
Editor: Cameron Schott, Marketing & Outreach Associate