
Expert Answers: What Is The Take It Down Act and How It Relates to Cybercrime

June 26, 2025

Dr. Rachel McNealey

*Content warning: this article mentions revenge pornography, sextortion, and sexual assault.


The Take It Down Act, recently signed into law with bipartisan support, strengthens existing protections against the distribution of non-consensual intimate and explicit images and makes it illegal to distribute such material when it has been generated by artificial intelligence (AI).

Dr. Rachel McNealey, an Assistant Professor in the School of Criminal Justice and a Center Associate with the MSU Center for Cybercrime Investigation & Training, answers questions about the act and how it may help law enforcement combat these online sex crimes.


What is “Revenge Porn,” and how do “Deepfakes” relate to it?

“Revenge porn” is one kind of image sharing that falls under the umbrella of Non-Consensual Intimate Images (NCII). This kind of NCII is often motivated by a desire to “get back” at someone, such as a former partner. However, many people are affected by NCII from people other than former partners, and these behaviors are not always motivated by “revenge”.

This widespread problem has only been accelerated by the advent of image-generating AI programs: images can now be created and used against someone who has never actually taken or sent an intimate image of themselves. Deepfakes also factor into sextortion in this way, as fabricated images can be leveraged against a person to demand money or additional intimate images. The proliferation of “nudifying” apps means that someone can input an image of a fully clothed person and create an intimate image based on their likeness. This development is of particular concern for parents who publicly share images of their children on social media, exemplified by cases like that of a former Mississippi teacher who created intimate images of his students from publicly available pictures.


The Take It Down Act makes it illegal to knowingly publish or threaten to publish sexually explicit images and videos without consent. How does this relate to Sextortion?

The Take It Down Act applies both to authentic images and to those generated using AI. The law is a significant step toward addressing AI-related harm, because many existing state-level NCII laws do not include provisions for AI-generated imagery. An important caveat is that the statute only applies to AI-generated images deemed “indistinguishable from an authentic visual depiction”. How closely the legal system will interrogate whether an image is “realistic enough” remains to be seen.

By making it illegal to publish or threaten to publish intimate images, the Take It Down Act codifies federal punishment for a range of behaviors, including revenge porn and sextortion. Importantly, the act explicitly states that consent to create an image does not imply consent to distribute it. This means that even if an individual has willingly shared an intimate image or video with someone, publishing it without their explicit consent is now a crime. The act defines “publishing” as distribution of images through social media, websites, and mobile applications. Additionally, distribution of any NCII in interstate commerce is codified as a federal offense regardless of the medium used.


The law also requires any website on which revenge porn or deepfakes are posted to remove the material within 48 hours. Do you have any insight into how sites will comply with this regulation?

The act states that internet platforms must respond to reports of NCII and remove the imagery within 48 hours. This quick turnaround will likely require many smaller internet platforms to implement automated reporting systems that take down content without review. Many feel that this is worth the risk, and that it is better to be safe than sorry when dealing with NCII. However, many experts worry about unintended consequences: if automated systems take down any content reported as NCII without review, the report function could be abused in bad faith to remove content and posts that have nothing to do with NCII. In an era where social media is highly politicized and misinformation runs rampant, proponents of internet freedom see the act as a potential vehicle for insidious motives such as censorship.


How does Sextortion relate to Revenge Porn and Deepfakes?

Sextortion, threatening to publish intimate images unless the victim provides money or additional images, is explicitly made a federal crime under the act, punishable by up to two years in cases with adult victims and up to three years in cases with victims who are minors.

Young women are most often affected by reported instances of revenge porn and deepfakes, particularly women who have experienced domestic violence in intimate relationships. However, sextortion incidents are becoming more common, particularly against young boys. NCII abuse is concerningly prevalent and can be perpetrated against people of any age or gender.


Is there any insight into why perpetrators engage in revenge porn, deepfakes, and sextortion?

Revenge porn and NCII are commonly committed by a person seeking power over the subject of the images. This includes, but is not limited to, an intimate partner following a breakup; research also shows that some people engage in NCII to gossip or engage in bullying against another person. Research on sextortion shows that many offenders are financially motivated, but some engage in sextortion for sexual reasons. Sextortion is also sometimes perpetrated against a former partner after a breakup to coerce the victim back into the relationship or to continue communication.


Are there steps we can take to help prevent the dissemination of revenge porn and deepfake material? If someone is a victim of revenge porn, deepfakes, or sextortion, what should they do?

Although the problem of deepfakes and NCII is daunting, there are things the average person can do to combat it.

If you ever see images online that you know or suspect were posted without someone’s permission, report them to both the online platform and the police. As the threat of deepfake AI imagery grows, people should also be cautious about the images they post online of minors and of individuals who cannot otherwise consent.

It is of utmost importance that anyone who uses the internet knows what can be reported as a crime. In multiple instances, young victims of sextortion have been manipulated by offenders into thinking that they themselves will be punished. Children and teens in particular should be made aware that the moment someone threatens to send/post/share an intimate image without their permission, this can be reported to the police as a crime.

Most of all, the stigma around NCII victimization and around sharing one’s own intimate images must be addressed. As long as victims fear being shamed for having shared their own images, incidents will go unreported. Awareness is our greatest tool for preventing these cybercrimes, ensuring that incidents are addressed, and connecting victims with the best avenues and resources.

If you or someone you know has experienced any of the crimes described above, the incidents should be reported to the police and the Internet Crime Complaint Center (IC3). Additional resources can be found here: