How effective will the Senate-passed bill, S. 4569, the Take It Down Act, which would criminalize the publication of non-consensual intimate imagery (NCII), be?
08.06.2025 00:43

Probably close to zero. For two reasons:

First, it will likely be held unconstitutional.

Second, it just doesn't do much in terms of actual criminalization. Most people who experience these problems will get nothing from federal law enforcement. The feds just don't have the resources.

To be clear, the bill is not about “non-consensual intimate imagery”. At least, not what you might think that phrase means. It is a phrase defined in the bill—which I'll come back to in a moment.

The TAKE IT DOWN Act is really about what its letters say:

“Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act”

Eye-rolling, right?

The phrase “non-consensual intimate imagery” is targeted toward using AI to create and publish lewd imagery of a real person. (Though there is a section in the act that really just covers intimate images that aren't deepfakes at all.)

So, simply publishing intimate photos is a federal crime, under certain conditions. There's also a takedown procedure in the bill, and I don't really see that as being an issue. States have made publishing intimate images a crime and a tort. But because computer networks fluidly cross state boundaries, state-level enforcement gets complicated and messy when you try to do what many victims really want—remove the offending images. So a federal takedown procedure is probably a good idea to facilitate that.

But the part about deepfakes… it probably isn't going to fly. Not because the idea of deepfakes isn't troubling, but because of the way the law is actually written.

For example, it would become a criminal act to publish a photo like this—unredacted.

You could publish the photo in a magazine. You could put it on a billboard. But you couldn't use an AI tool to generate the image—that would be a federal crime.

So in effect, there's going to be a First Amendment problem with the scope of what is covered. The act has the potential to criminalize otherwise protected speech, only because it's on a computer network. And that's going to be a big problem.

Don't get me wrong. Given what most of the bill covers, large sections of it will still be enforceable. But they're also a little vague, and that's going to cause some issues. People who create those kinds of images to harass or embarrass or humiliate or whatever? I don't feel bad for them in the slightest, and this law would probably punish them successfully.

However, I'm a little more worried about how the vagueness is going to be used offensively to punish conduct that isn't really intended to be covered by this. I can think of a few strategic ways it could be used to threaten people over relatively benign conduct.

What could the feds (Cruz is the sole sponsor, I see) have done? If they had focused more on the take-down notice provisions and on the rights of privacy and publicity, the bill would probably be stronger as a whole.