Pedos ruin everything…

  • BrianTheeBiscuiteer@lemmy.world
    1 year ago

    Legally this is going to be a mess. In theory I agree that CSAM that is photorealistic should be illegal (mainly because it won’t be long before AI generations are completely indistinguishable from photos, and we can’t just ignore real child abuse), but how do you define photorealism or CSAM when the subjects literally don’t exist? I figure if this kind of thing ever hits the courtroom there will be wildly different verdicts and sentences.

    • Sethayy@sh.itjust.works
      1 year ago

      It’s always been illegal to draw CSAM though, no? I imagine it’ll all follow a similar system (with pedos always trying to slip through the cracks with shit like lolis, and actual people getting hurt by it because they’re above age but don’t look it). Ugh, it’s gonna be a mess.

      • BrianTheeBiscuiteer@lemmy.world
        1 year ago

        Lolis are usually drawings. You can go even further than that, though. It’s incredibly easy to find drawings or hi-def renderings of characters that are absolutely not adults, and no one’s prosecuting their creation or distribution. This likely hinges on the content being obviously not real, but “real” is subjective, and with AI gen the grey area got 10x bigger.