• papertowels@lemmy.one

    So this does bring up an interesting point that I haven’t thought about - is it the depiction that matters, or is it the actual potential for victims that matters?

    Consider the Catholic schoolgirl trope - if someone of legal age is depicted as being much younger, should that be treated in the same way as this case? This case is arguing that the depiction is what matters, instead of who is actually harmed.

    • Honytawk@lemmy.zip

      How I see it: creating fake child porn makes it harder for authorities to find the real thing.

      • papertowels@lemmy.one

        That’s a good point. On the flip side, I remember there was a big deal about trying to flood the rhino horn market with fakes a few years ago. I can’t find anything on how that went, but I wonder if it could have that effect as well.

    • ilmagico@lemmy.world

      Every country has different rules, according to Wikipedia.

      Personally, I feel that if making completely fictitious depictions of child porn, where no one is harmed (think AI-generated, or made by consenting adults depicting minors), were legal, it might actually prevent the real, harmful ones from being made, thus preventing harm.

      • papertowels@lemmy.one

        Thanks for sharing that link. I hated reading through it, but it answered the question haha…

        I don’t really have strong feelings about it but I do think I lean towards agreeing with you.

      • theangryseal@lemmy.world

        At the same time, an argument could be made that increasing the availability of such a thing could put it in front of people who otherwise would never have seen it in the first place, and problems could develop from there.

        It could normalize something absurd and create more risks.

        I’m no expert and I’d rather leave it to people who thoroughly understand such behaviors to determine what is and isn’t ultimately more or less detrimental to the health of society.

        I just know how (anecdotally) pornography desensitizes a person until it makes more extreme things less bizarre and unnatural. I can’t help but imagine a teenager who would have otherwise developed a more healthy sexuality stumbling on images like that and becoming desensitized.

        It’s definitely something that needs some serious thought.

        • BreakDecks@lemmy.ml

          Most of what you’re repeating about porn “normalizing” things and “desensitizing” viewers is straight out of the puritan handbook. There is evidence that men who overconsume porn and don’t have a healthy sex life can fall into self-destructive patterns, but porn consumption doesn’t work like a drug. It’s not the case that the more you consume, the more hardcore the content you crave, or that being exposed to certain types of porn creates new preferences you wouldn’t otherwise have had. This is just long-standing anti-sex-work propaganda that tries to liken pornography to narcotics.

          People who consume CSAM are already into that kind of thing. Seeing CSAM isn’t going to turn anyone into a pedophile just as playing GTA isn’t going to turn anyone into a hardened street criminal. The goal should be to protect children, not to censor any content that sexualizes youth, because that really is a slippery slope. More on that here: https://nypost.com/2010/04/24/a-trial-star-is-porn/

    • BreakDecks@lemmy.ml

      In America at least, people often confuse child pornography laws with obscenity laws, and they do end up missing the point. Obscenity laws are a violation of free speech, but that’s not what a CSAM ban is about. It’s about criminalizing the abuse of children as thoroughly as possible. Being in porn requires consent, and children can’t consent, so even the distribution or basic retention of this content violates a child’s rights.

      Which is why the courts have thrown out lolicon bans on First Amendment grounds every time they’ve been attempted. Simulated CSAM lacks a child whose rights could be violated, and generally meets all the definitions of art, which would be protected expression no matter how offensive.

      It’s a sensitive subject that most people don’t see nuance in. It’s hard to admit that pedophilia isn’t a criminal act by itself, but only when an actual child is made a victim, or a conspiracy to victimize children is uncovered.

      With that said, we don’t have much of a description of the South Korean man’s offenses, and iirc South Korea has laws similar to the US’s on this matter. It is very possible that he was modifying real pictures of children with inpainting, or training models on pictures of a real child to generate fake porn of that child. That would make a real child a victim, so that’s my theory on what this guy was doing. Probably on a public image-generation service that flagged his uploads.

    • eatthecake@lemmy.world

      The intent is to get off on fucking children; how you make that happen shouldn’t matter.

      • BreakDecks@lemmy.ml

        If we decide that nothing else matters but protecting children, then protecting children will be the only thing that matters anymore. That’s not a reasonable outcome.