There have been a ton of CSAM and CP arrests in the US lately, especially among cops and teachers, along with at least one female teacher seducing boys as young as 12. I cannot understand the attraction to kids, or even teens. Do these people think they are having a relationship, or that it is somehow okay to take away another human being’s innocence? Is it about sex, power, or WTH is it? Should AI-generated CSAM and CP be treated the same as material involving a real person, since it promotes the same issues? I am a grandfather, and I am worried about how far this will go now that the new AI can put anyone’s face into a porn movie too.

It seems to me that a whole new set of worldwide guidelines and laws needs to be put into effect ASAP.

How difficult would it be for AI photo apps to filter out certain words, so that someone cannot make anyone appear naked?
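For what it’s worth, the word-filter half of that is trivial to build; the hard part is that it’s just as trivial to dodge. A hypothetical sketch in Python, with a made-up blocklist:

```python
import re

# Hypothetical blocklist; a real one would need thousands of terms,
# misspellings, and translations, and would still miss paraphrases.
BLOCKLIST = {"nude", "naked", "undressed"}

def prompt_is_allowed(prompt: str) -> bool:
    """Reject a generation prompt that contains any blocklisted word."""
    words = set(re.findall(r"[a-z]+", prompt.lower()))
    return words.isdisjoint(BLOCKLIST)

print(prompt_is_allowed("a naked person on a beach"))    # False: blocked
print(prompt_is_allowed("a person wearing no clothes"))  # True: slips through
```

That last line is the whole problem: a text filter catches words, not meaning, which is why the reply below is about classifying the generated image instead.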

  • That’s putting a lot of faith in CLIP, though. The thing is, to get CLIP to detect things like child porn reliably, you do need to train it to make that distinction. In my experience, CLIP tends to make up keywords, or at least misunderstand the situation, surprisingly often (see the zero-shot sketch at the end of this comment).

    If it weren’t super illegal and super unethical, AI could easily distinguish normal porn from illegal porn if you fed it enough tagged data of both. Categorisation is something these models are very good at, after all. That is never, ever going to happen (imagine the poor schmuck hired to tag child rape for a dollar a day, horrific), but it’s the only way I’d trust AI with something like this. Mechanically, that supervised approach is just a small classifier trained on labelled data, as sketched below.

    I think we need more research in this field. I’m also at least a little mad that AI companies release these models into the wild before the science exists to keep them from becoming child-rape image generators for the mentally ill. Companies just seem to throw their hands in the air, go “well, we didn’t program it to do that, not our fault!”, and deny any responsibility for what they’ve created.
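    To make the “faith in CLIP” point concrete, here is roughly what zero-shot filtering with an off-the-shelf CLIP looks like. This is a minimal sketch using Hugging Face’s transformers; the file name and the label wordings are made up, and the label wording is exactly where these filters get brittle:

    ```python
    import torch
    from PIL import Image
    from transformers import CLIPModel, CLIPProcessor

    # Off-the-shelf CLIP with no safety-specific training -- the
    # "putting a lot of faith into CLIP" scenario.
    model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
    processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

    labels = ["a safe-for-work photo", "a photo containing nudity"]

    # Hypothetical output of an image generator.
    image = Image.open("generated_output.png")
    inputs = processor(text=labels, images=image,
                       return_tensors="pt", padding=True)

    with torch.no_grad():
        # Similarity of the image to each text label.
        logits = model(**inputs).logits_per_image
    probs = logits.softmax(dim=1).squeeze()

    for label, p in zip(labels, probs.tolist()):
        print(f"{label}: {p:.2f}")
    ```

    The verdict is only as good as those two text strings; slightly different phrasings can flip the result, which is the “makes up keywords” failure mode in practice.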
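    For contrast, the supervised route described above (the one that cannot ethically be built for this particular problem) is mechanically just a small classifier head trained on labelled embeddings. A generic sketch with stand-in data; the dimension matches clip-vit-base-patch32, everything else is hypothetical:

    ```python
    import torch
    import torch.nn as nn

    EMBED_DIM = 512  # image-embedding size of clip-vit-base-patch32

    # Linear probe on top of frozen CLIP image features.
    probe = nn.Linear(EMBED_DIM, 2)  # two classes: allowed / blocked
    optimizer = torch.optim.Adam(probe.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    # Stand-ins for real CLIP features and human-applied tags; in a lawful
    # setting these would come from, e.g., SFW vs. ordinary adult content.
    embeddings = torch.randn(256, EMBED_DIM)
    labels = torch.randint(0, 2, (256,))

    for epoch in range(20):
        optimizer.zero_grad()
        loss = loss_fn(probe(embeddings), labels)
        loss.backward()
        optimizer.step()
    ```

    As far as I know, this is essentially how the open CLIP-based NSFW detectors work: the model is the easy part, and the labelled data is the part that is (rightly) impossible to produce here.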