Current Events > "Could AI-Generate Porn Help Protect Children?"

McSame_as_Bush
09/26/23 2:35:23 PM
#1:


If the quotes don't make it clear, this is the article title, not my own.

NOW THAT GENERATIVE AI models can produce photorealistic, fake images of child sexual abuse, regulators and child safety advocates are worried that an already-abhorrent practice will spiral further out of control. But lost in this fear is an uncomfortable possibility: that AI-generated child sexual material could actually benefit society in the long run by providing a less harmful alternative to the already-massive market for images of child sexual abuse.

The growing consensus among scientists is that pedophilia is biological in nature, and that keeping pedophilic urges at bay can be incredibly difficult. ''What turns us on sexually, we don't decide that; we discover that,'' said psychiatrist Dr. Fred Berlin, director of the Johns Hopkins Sex and Gender Clinic and an expert on paraphilic disorders. ''It's not because [pedophiles have] chosen to have these kinds of urges or attractions. They've discovered through no fault of their own that this is the nature of what they're afflicted with in terms of their own sexual makeup ... We're talking about not giving into a craving, a craving that is rooted in biology, not unlike somebody who's having a craving for heroin.''

Ideally, psychiatrists would develop a method to cure viewers of child pornography of their inclination to view it. But short of that, replacing the market for child pornography with simulated imagery may be a useful stopgap.

THERE IS GOOD reason to see AI-generated imagery as the latest negative development in the fight against child sexual abuse. Regulators and law enforcement already comb through an enormous number of images every day attempting to identify victims, according to a recent paper by the Stanford Internet Observatory and Thorn. As AI-generated images enter the sphere, it becomes harder to discern which images include real victims in need of help. Plus, AI-generated images rely on the likenesses of real people or real children as a starting point, which, if the images retain those likenesses, is abuse of a different nature. (That said, AI does not inherently need to train on actual child porn to develop a simulated version of it, but can instead combine training on adult pornography with its training on the likenesses of children.)

Finding a practical method of discerning which images are real, which images are of real people put into fake circumstances, and which images are fake altogether is easier said than done. The Thorn report claims that within a year it will become significantly easier for AI to generate images that are essentially indistinguishable from real images. But this could also be an area where AI might play a role in solving a problem it has created. AI can be used to distinguish between different forms of content, thereby aiding law enforcement, according to Rebecca Portnoff, head of data science at Thorn. For example, regulators could require AI companies to embed watermarks in open-source generated image files, or law enforcement could use existing passive detection mechanisms to track the origin of image files.

When it comes to the generated images themselves, not everyone agrees that satisfying pedophilic urges in the first place can stem them in the long run.

''Child porn pours gas on a fire,'' said Anna Salter, a psychologist who specializes in the profiles of high-risk offenders. In Salter's and other specialists' view, continued exposure can reinforce existing attractions by legitimizing them, essentially whetting viewers' appetites, which some offenders have indicated is the case. And even without that outcome, many believe that viewing simulated immoral acts harms the actor's own moral character, and thus perhaps the moral fabric of society as well. From that perspective, any inappropriate viewing of children is an inherent evil, regardless of whether a specific child is harmed. On top of that, the potential normalization of those viewings can be considered a harm to all children.

There is also the practical concern that, while viewers of AI-generated child pornography are not contributing to the re-victimization of an abused child, generated images will not stem the abuse itself. That's because the makers of child pornography are typically child abusers, who will not refrain from abusing children because of changing demand for the images they collect of the abuse they inflict.

Still, satisfying pedophilic urges without involving a real child is obviously an improvement over satisfying them based on a real child's image. While the research is inconclusive, some pedophiles have revealed that they rely on pornography to redirect their urges and find an outlet that does not involve physically harming a child, suggesting that, for those individuals, AI-generated child pornography actually could stem behavior that would hurt a real child.

As a result, some clinicia
McSame_as_Bush
09/26/23 2:35:48 PM
#2:


Ultimately, AI-generated child pornography could act as a form of harm reduction, a philosophy that underlies many public health policies. It is similar to the logic, for example, behind needle exchange programs for individuals suffering from drug addiction: Because we cannot stop drug use wholesale, it serves society to find ways to ensure that consumption occurs safely. This is not a perfect analogy, and it is understandably a tough sell to transfer this framework to a subject as horrific as images depicting the abuse of children, especially without studies demonstrating the effects of controlled viewings of AI-generated child pornography. But our rightful contempt for child pornography should not prevent us from considering the possibility that fake forms of such images could stand as an improvement over abuse.

For pedophiles who do not wish to harm children and therefore find satisfaction in ways that avoid doing so, ''we consider this a good outcome for them, for children that might otherwise have been victimized, and for society at large,'' wrote an anonymous representative from Virtuous Pedophiles, an online support group for non-offending pedophiles, to me over email.

OF COURSE, USING AI-generated images as a form of rehabilitation, alongside existing forms of therapy and treatment, is not the same as allowing its unbridled proliferation on the web.

''There's a world of difference between the potential use of this content in controlled psychiatric settings versus what we're describing here, which is just, anybody can access these tools to create anything that they want in any setting,'' said Portnoff, from Thorn.

And even using AI-generated child porn in a controlled environment would not be a one-size-fits-all solution. Clinicians evaluate patients on an individual basis and would have to determine whether exposure to explicit images could diminish urges for a given patient, according to Berlin, or whether they might enhance them in that particular case.

Ultimately, though, incorporating AI-generated images into existing forms of therapy could be one way of diminishing risk. ''We're all for the same thing, which is the safety of children and others in the community,'' said Berlin. ''We have to do both sides of the coin. We not only have to assist the victims, but we have to look at those who might be a risk to victims, and help them not to remain a risk.''

https://www.wired.com/story/artificial-intelligence-csam-pedophilia/

I was wondering when we would start to see this argument being made. While I don't think pedos should be burned at the stake for something they obviously didn't choose to be, I'm going with option one because of options 2-5.

---
not a lot, just forever
Starks
09/26/23 2:36:23 PM
#3:


How do you have that kind of AI model without training it on real images?

---
Paid for by StarksPAC, a registered 501(c)(4)
R_Jackal
09/26/23 2:37:19 PM
#4:


Anything involving any form of generating child porn should be illegal.
Questionmarktarius
09/26/23 2:37:44 PM
#5:


If it's not obviously fake, "it's AI, bro!" is the rough equivalent of painting the end of an M16 orange.
Error1355
09/26/23 2:38:41 PM
#6:


No.

---
I'm a long, long way from giving up
Call me old-fashioned, call me a fool