The laws regarding a lot of this stuff seem to ignore that people under 18 can and will be sexual.
If we allow people to use this tech on adults (which we really shouldn’t), then we have to accept that people will use the same tech on minors. It isn’t even necessarily pedophilia in all cases (such as when the person making them is also a minor)*, but it’s still something that very obviously shouldn’t be happening.
* we don’t need to get into semantics. I’m just saying it’s not abnormal (the way pedophilia is) for a 15-year-old to be attracted to another 15-year-old in a sexual way.
Without checks in place, this technology will INEVITABLY be used to undress children. If the images are stored anywhere, then these companies will be storing/possessing child pornography.
The only way I can see to counteract this would be to invade the privacy of users (and victims) to the point where nobody using them “legitimately” would want to use it…or to just ban them outright.
such as when the person making them is also a minor
I get the point you’re trying to make. But minors taking nudes of themselves is illegal in a lot of places, because it’s still possession.
And that’s still a bit messed up. It’s a felony for a teen to have nude pictures of themselves, and they’ll be registered sex offenders for life and probably ineligible for most professions. Seems like quite a gross overreaction. There needs to be a lot of reform in this area, but no politician wants to look like a “friend” to pedophiles.
It does seem a bit heavy handed when the context is just two high schoolers tryna smash.
That’s a lot of words to defend fake child porn made out of photos and videos of actual children.
Reading comprehension not a strong suit? Sounds to me they’re arguing for protections for both adults AND minors.
Words is treacherous bastards
I don’t always be like that but sometimes it do
That’s about the right amount of words to completely ignore the sentiment of a statement so you can make a vapid holier-than-thou statement based on purported moral superiority.
Have you tried actually reading what they said instead of just making shit up?
No reason not to ban them entirely.
The problem is enforcing the ban. Would it be a crime to have access to the software, or would they need to catch the criminals with the images and video files? It would be trivial to host a site in a country without legal protections and make the software available from anywhere.
Would it be a crime to have access to the software, or would they need to catch the criminals with the images and video files?
Problem with the former is that it would outlaw any self-hosted image generator. Any image generator is capable of being used for deepfake porn.
Perhaps an unpopular opinion, but I’d be fine with that. I have yet to see a benefit or possible benefit that outweighs the costs.
For some reason I thought it was mainly to protect Taylor Swift, with teen girls being the afterthought.
This is probably not the best context, but I find it crazy how fast the government will get involved if it involves lewd content, while children are getting murdered in school shootings and gun control is just a bridge too far.
I think they act faster on those matters because, aside from being a very serious problem, they also have a conservative agenda.
It’s very easy to say: “LOOK, WE ARE DOING THIS TO PROTECT YOUR CHILDREN FROM PEDOPHILES!!!”
But they can’t just go and say “let’s enforce gun safety in schools,” because a conservative voter merely reading “gun safety” will already go badly for them.
They know they are sacrificing the well-being of children by not acting on the school shootings, but for them it’s just the price of a few lives to stay in power.
“We’re gonna ban Internet stuff” is something said by people who have no idea how the Internet works.
This is the best summary I could come up with:
Caroline Mullet, a ninth grader at Issaquah High School near Seattle, went to her first homecoming dance last fall, a James Bond-themed bash with blackjack tables attended by hundreds of girls dressed up in party frocks.
Since early last year, at least two dozen states have introduced bills to combat A.I.-generated sexually explicit images — known as deepfakes — of people under 18, according to data compiled by the National Center for Missing & Exploited Children, a nonprofit organization.
Nudification apps are enabling the mass production and distribution of false, graphic images that can potentially circulate online for a lifetime, threatening girls’ mental health, reputations and physical safety.
A lawyer defending a male high school student in a deepfake lawsuit in New Jersey recently argued that the court should not temporarily restrain his client, who had created nude A.I. images.
Under the new Louisiana law, any person who knowingly creates, distributes, promotes or sells sexually explicit deepfakes of minors can face a minimum prison sentence of five to 10 years.
After learning of the incident at Issaquah High from his daughter, Senator Mullet reached out to Representative Orwall, an advocate for sexual assault survivors and a former social worker.
The original article contains 1,288 words, the summary contains 198 words. Saved 85%. I’m a bot and I’m open source!