A third-grade teacher at a Christian school in Florida was arrested on Tuesday for alleged possession of child pornography, some of which he said was AI-generated using the yearbook photos of past students, according to WFLA. The Pasco County Sheriff's Office said it began investigating 67-year-old Steven Houser, a science teacher at Beacon Christian Academy, after receiving a tip about him.
According to a press release from the sheriff's office, investigators discovered that he possessed two photos and three videos of child porn, in addition to some AI-generated erotica depicting children. Houser told police that he'd made the erotica using the yearbook photos of three of his past students. Beacon Christian Academy is a part of Beacon Community Church in New Port Richey, Florida.
As the technology becomes more accessible, incidents involving AI-generated erotica of minors are on the rise. Earlier this month, five eighth-grade students at a Beverly Hills middle school were expelled for spreading AI-generated nude images of their classmates.
Didn't we have "serious" users here claim that using AI to generate porn didn't make it child porn?
But Republicans won't do shit about these people, will they?
arrested on Tuesday for alleged possession of child pornography, some of which he said was AI-generated
Yes. Their argument was that if it's not a real picture, even if it's based on a real person, it can't be "child porn" and thus you should not call the person who made it a pedophile.

Which user said that?
It was obviously very ridiculous.
Yes. Their argument was that if it's not a real picture, even if it's based on a real person, it can't be "child porn" and thus you should not call the person who made it a pedophile.

I can maybe see the "it's not real" argument from a legal standpoint, but how would they not be a pedophile?
And because CE is CE, I want to make it clear that even if AI images like this may not be illegal, I think they definitely should be.

I believe if they're real looking enough that it can interfere with investigations then it's illegal. Doesn't need to be AI generated. You can't Photoshop something like this either.
I believe if they're real looking enough that it can interfere with investigations then it's illegal. Doesn't need to be AI generated. You can't Photoshop something like this either.

That makes sense. I'm fuzzy on the exact legality of this shit.
From a legal standpoint, I'm curious as to what the tip was.
Sounds like everything hinges on the fact that he had the real thing, but can he actually be arrested for the AI stuff? If the stuff he was arrested for just ends up being a 22-year-old midget, can he still be arrested for everything else?
Crazy to think that if he were a public school employee, the teachers' union would give him much more protection than he gets as a private school guy. Tax dollars would go toward his protection. Him having a small amount of stuff is convenient for the people who conducted the search, I will say. There may not have been a case otherwise. Those boys weren't arrested for what they did either, just expelled.
Both parties would lose their shirts in a civil case, but a criminal one? I'm not sure.
Am I misreading or did he have "real" child porn in addition to generated images of his students?

Yes. I imagine the prosecution will focus on that because it's a much simpler case to make than the AI-generated stuff.
AI needs better regulations. If your algorithm is used to generate that crap then you should be liable for the creation of child porn.

This is not really how generative AI works, and you cannot regulate the technology in this way short of banning it entirely, which is also very hard to do without banning the import or manufacture of powerful GPUs, and that has consequences for non-AI uses as well.
This will not sit well with God. This guy needs to repent his sins and beg the Lord for forgiveness.