This is tough. If it was just a sicko who generated the images for himself locally… that is the definition of a victimless crime, no? And it might actually dissuade him from seeking out real CSAM…
BUT, iirc he was actually distributing the material, and even contacted minors, so… yeah he definitely needed to be arrested.
But, I’m still torn on the first scenario…
To me it comes down to a single question:
“Does exposure to and availability of CSAM for pedophiles correlate with an increased or decreased likelihood of harming a child?”
If there’s a reduction effect by providing an outlet for arousal that isn’t actually harming anyone - that sounds like a pretty big win.
If there’s a force multiplier effect where exposure and availability means it’s even more of an obsession and focus such that there’s increased likelihood to harm children, then society should make the AI generated version illegal too.
Hoooooly hell, good luck getting that study going. No ethical concerns there!
How they’ve done it in the past is by tracking the criminal history of people caught with CSAM, people arrested for abuse, or some combination thereof, or by tracking the outcomes of people seeking therapy for pedophilia.
It’s not perfect due to the sample biases, and the results are also quite inconsistent, even amongst similar populations.
What is the AI trained on?
Image-generating AI is capable of generating images that are not like anything that was in its training set.
AI can compose novel-looking things from components it has been trained on - it can’t imagine new concepts. If CSAM is being generated, it’s because CSAM was included in its training set, which is strongly suspected, since we know the common corpus had CSAM in it: https://cyber.fsi.stanford.edu/news/investigation-finds-ai-image-generation-models-trained-child-abuse
If it has images of construction equipment and houses, it can make images of houses that look like construction equipment. Swap out vocabulary as needed.
Cool, how would it know what a naked young person looks like? Naked adults look significantly different.
It understands young and old.
Is a kid just a 60% reduction by volume of an adult? And these are generative algorithms… nobody really understands how these models internally represent the world and the relations between words.
It understands young and old. That means it knows a kid is not just a 60% reduction by volume of an adult.
We know it understands these sorts of things because of the very things this whole kerfuffle is about - it’s able to generate images of things that weren’t explicitly in its training set.