There have been a ton of CSAM and CP arrests in the US lately, especially of cops and teachers, along with at least one female teacher seducing boys as young as 12. I cannot understand the attraction to kids. Even teens. Do these people think they are having a relationship, or is it somehow okay to take away another human being’s innocence? Is it about sex, power, or WTH is it? Should AI-generated CSAM and CP be treated the same as material involving a real person, since it promotes the same issues? I am a grandfather, and I am worried about how far this will go now that the new AI can put anyone’s face into a porno movie too.

It seems to me that a whole new set of worldwide guidelines and laws need to be put into effect asap.

How difficult would it be for AI photo apps to filter out certain words so someone cannot make anyone naked?

  • Adalast@lemmy.world · 1 year ago
    Technically, the diffusers all have the ability to filter material from the actual outputs using a secondary CLIP analysis, checking whether it kicks out any keywords indicating that a topic is present in the image. From what I have seen, most AI generation sites use this method, as it is more reliable for catching naughty outputs than prompt analysis. AIs are horny; I play with this a lot. All you have to do is generate a woman on the beach and about 20% of the results will be at least topless. Now, “woman on the beach” should not be flagged as inappropriate, and I don’t believe the outputs should be either, because our demonization of the female nipple is an asinine holdover from a bunch of religious outcasts from Europe who were chased out for being TOO restrictive and prudish, but alas, we are stuck with it.
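    For anyone curious, here is a minimal sketch of what that kind of secondary CLIP check can look like, using the open CLIP checkpoint from Hugging Face transformers. The model name, concept labels, and threshold below are illustrative assumptions on my part, not what any particular site actually runs:

    ```python
    # Minimal sketch of a secondary CLIP-based output filter (illustrative only).
    # Assumes the openai/clip-vit-base-patch32 checkpoint; the concept labels and
    # threshold are placeholders, not a production blocklist.
    import torch
    from PIL import Image
    from transformers import CLIPModel, CLIPProcessor

    model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
    processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

    # First label describes benign content; the rest are concepts to block.
    labels = ["a safe, non-explicit photo", "nudity", "explicit sexual content"]

    def output_is_flagged(image: Image.Image, threshold: float = 0.5) -> bool:
        """Return True if CLIP assigns high probability to any blocked concept."""
        inputs = processor(text=labels, images=image, return_tensors="pt", padding=True)
        with torch.no_grad():
            logits = model(**inputs).logits_per_image  # similarity of the image to each label
        probs = logits.softmax(dim=-1)[0]
        return bool((probs[1:] > threshold).any())  # any non-benign concept over threshold
    ```

    The generation site would run something like this on every image before serving it, and either discard or regenerate anything that gets flagged.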

    • That’s putting a lot of faith in CLIP, though. The thing is, to get CLIP to detect things like child porn reliably, you need to train it to make that distinction. In my experience, CLIP tends to make up keywords, or at least misunderstand the situation, surprisingly often.

      If it weren’t super illegal and super unethical, AI could easily distinguish normal porn from illegal porn if you fed it enough tagged data of both. Categorisation is something these models are very good at, after all. That’s never, ever going to happen (imagine the poor schmuck hired to tag child rape for a dollar a day, horrific), but it’s the only way I’d trust AI with something like this.

      I think we need more research in this field. I’m also at least a little mad that AI companies release these models into the wild before the science is ready to prevent them from becoming child rape image generators for the mentally ill. Companies just seem to throw their hands in the air and go, “well, we didn’t program it to do that, not our fault!” and deny any responsibility for what they’ve created.