A new report warns that the proliferation of child sexual abuse images on the internet could become much worse if something is not done to put controls on artificial intelligence tools that generate deepfake photos.
I believe that images are important to investigations because they help identify the children being abused. When that's mixed in with a bunch of AI-generated pedophile material, it obfuscates that avenue of investigation and hampers those efforts, which are 100% more important than anyone's need to get off to pedophilic AI imagery.
Online investigation in general has been a successful avenue in the recent past.
If there were a chance of saving even one child, but it meant that no one could see AI images of sexualized children, those would be completely acceptable terms to me.
I would hold there’s zero downside to outlawing the production of AI CSAM. There’s no indication that letting pedophiles indulge in “safe” forms of pedophilic activity stops them from abusing children. It’s not a form of speech or expression with any value. If we as a society are going to say we’re against abuse of children then that needs to include being against the cultivation and networking of abusive culture and people. I see no real slippery slope in this regard.
It already is outlawed in the US. The US bans all such depictions precisely because of this. The courts anticipated that there would come a time when people could create images indistinguishable from reality, so allowing any such content to be produced wasn't permissible.
Okay… So correct me if I’m wrong, but being abused as a child is like… one of the biggest predictors of becoming a pedophile. So like… Should we preemptively go after these people? You know… To protect the kids?
How about single parents who expose their kids to strangers while dating? That's a massive vector for kids to be exposed to child abuse.
What on earth? Just don't sexualize children or normalize sexualizing children. Denying pedophiles access to pedophilic imagery is not some complex moral quandary.
Why on earth am I getting so much pushback on this point, on Beehaw of all places…
I appreciate you posting the link in reply to my question, but that's an article written from the perspective of law enforcement. They're an authority, so they're incentivized to manipulate facts and deceive to gain more authority. Sorry if I don't trust law enforcement, but they've proven themselves untrustworthy at this point.