A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10 if users send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nude images of celebrities, including images of minors in swimsuits, but is particularly notable because their service plainly and openly shows one of the most severe harms of generative AI tools: easily creating nonconsensual pornography of ordinary people.

  • guyrocket@kbin.social · 8 months ago · +129 / -42

    This is not new. People have been Photoshopping this kind of thing since before there was Photoshop. Why “AI” being involved matters is beyond me. The result is the same: fake porn/nudes.

    And all the hand-wringing in the world about it being non-consensual will not stop it. The cat has been out of the bag for a long time.

    I think we all need to shift to not believing what we see. It is counterintuitive, but also the new normal.

    • kent_eh@lemmy.ca · 8 months ago · +113 / -9

      People have been Photoshopping this kind of thing since before there was Photoshop. Why “AI” being involved matters is beyond me

      Because now it’s faster, can be generated in bulk and requires no skill from the person doing it.

      • Vespair@lemm.ee · 8 months ago · +1 / -3

        no skill from the person doing it.

        This feels entirely like a non sequitur, to the point of damaging whatever point you’re trying to make. Whether I paint a nude or the modern Leonardo da Vinci paints a nude, our rights (and/or the rights of the model, depending on your perspective on this issue) should be no different, despite the enormous chasm between our artistic skill.

      • Bob Robertson IX @discuss.tchncs.de · 8 months ago · +31 / -40

        A kid at my high school in the early 90s would use a photocopier to literally cut and paste yearbook headshots onto porn photos. This could also be done in bulk, and it doesn’t require any skills a 1st grader doesn’t have.

        • ChexMax@lemmy.world · 8 months ago · +30 / -9

          Those are easily disproven. There’s no way you think that’s the same thing. If you can pull up the source photo and it’s a clear match for the fake, it’s easy to disprove. AI can alter the angle, position, and expression of your face in a believable manner, making it a lot harder to link the fake back to the source material.

          • Bob Robertson IX @discuss.tchncs.de · 8 months ago · +14 / -5

            This was before Google was a thing, much less reverse image lookup with Google Images. The point I was making is that this kind of thing happened even before Photoshop. Photoshop made it look more realistic, and AI is the next step; even current AI’s abilities are nothing compared to what they will be even six months from now. Yes, this is a problem, but it has been a problem for a long time, and anyone who has wanted to create fake nudes of someone has been able to do so easily for at least a generation. We may be at the point where, if you want to make sure no fake nudes are ever created of you, you can’t have any images of yourself published. But now that everyone has a high-quality camera in their pocket, even that won’t protect you 100%.

      • Dkarma@lemmy.world · 8 months ago · +8 / -24

        Not relevant. Using someone’s picture never ever required consent.

    • echo64@lemmy.world · 8 months ago · +89 / -31

      I hate this “just accept it, women of the world, the abuse is the new normal” techbro logic so much. It’s absolutely hateful towards women.

      We have legal and justice systems to deal with this. It is not the new normal for me to be able to make porn of your sister, or mother, or daughter. Absolutely fucking abhorrent.

      • AquaTofana@lemmy.world · 8 months ago · +56 / -6

        I don’t know why you’re being downvoted. Sure, it’s unfortunately been happening for a while, but are we just supposed to keep quiet about it and let it go?

        I’m sorry, putting my face on a naked body that’s not mine is one thing, but I really do fear for the people whose likeness gets used in some degrading/depraved porn that’s actually believable because it’s AI-generated. That is SO much worse and more psychologically damaging if they find out about it.

      • brbposting@sh.itjust.works · 8 months ago · +17 / -2

        It’s unacceptable.

        We have legal and justice systems to deal with this.

        For reference, here’s how we’re doing with child porn. Platforms with problems include the following (copying from my comment from two months ago):

        Ill adults and poor kids generate and sell CSAM. It’s common to advertise on IG and sell on TG. Huge problem, as that Stanford report shows.

        Telegram got right on it (not). Fuckers.

      • cley_faye@lemmy.world · 8 months ago · +7 / -3

        How do you propose to deal with someone doing this on their own computer, never posting the images online, purely for their own “enjoyment”? Mass global surveillance of all existing devices?

        It’s not a matter of willingly accepting it; it’s a matter of looking at what can be done and what cannot. Publishing fake porn, defaming people, and other similar actions are already illegal (I hope; I am not a lawyer). Asking a technology that exists, is widely available, will keep improving, and can be used in a private setting with no witnesses to somehow “stop” because of a law is at best wishful thinking.

        • Ookami38@sh.itjust.works · 8 months ago · +4

          There’s nothing to be done, nor should anything be done, about something someone creates individually, for their own private use, never to see the light of day. Anything else is about one step removed from thought policing; after all, what’s the difference between a privately created image and the thoughts in your head?

          The other side of that is that we have to have protections for the people this has been or will be used against: strict laws on posting or sharing this material, easy and fast removal of abusive material, and actual enforcement. I know we have these things in place already, but they need to be stronger and more robust. The one absolute truth of generative AI versus Photoshop etc. is that it’s significantly faster and easier, so there will likely be an uptick in this kind of material, and thus a need to re-examine current laws.

        • Jrockwar@feddit.uk · 8 months ago (edited) · +13 / -5

          And so is straight male-focused porn. We men seemingly are not attractive, other than for perfume ads. It’s unbelievable that gender roles are still so strongly coded in 2024. Women must be pretty; men must buy products where women look pretty in the ads. Men don’t look pretty, and women don’t buy products; they clean the house and care for the kids.

          I’m aware of how much I’m extrapolating, but a lot of this is the subtext under “they’ll make porn of your sisters and daughters” while leaving your good-looking brother or son out of the thought train, when that would be just as hurtful for them and for you.

          • lud@lemm.ee · 8 months ago · +5

            Or your bad-looking brother, or bad-looking me.

            Imo people making AI fakes for themselves isn’t the end of the world; the real problem is distribution and blackmail.

            You can get blackmailed no matter your gender, and it will happen to both genders.

        • echo64@lemmy.world · 8 months ago · +9 / -16

          Sorry I didn’t position this around men. They are the most important thing to discuss and will be the most impacted here, obviously. We must center men in this subject too.

          • Thorny_Insight@lemm.ee · 8 months ago · +15 / -6

            Pointing out your sexism isn’t saying we should be talking about just men. It’s you who’s here acting all holy while ignoring half of the population.

            • echo64@lemmy.world · 8 months ago · +5 / -17

              Yes yes, #AllLivesMatter, amirite? We’ll just ignore that 99.999% of the victims will be women, just so we can grandstand about men.

      • SharkAttak@kbin.social · 8 months ago · +9 / -9

        It’s not normal, but neither is it new: you could already cut and glue your cousin’s photo onto a Playboy girl, or Photoshop the hot neighbour onto Stallone’s muscled body. Today it’s just easier.

        • echo64@lemmy.world · 8 months ago · +9 / -6

          I don’t care if it’s not new; no one cares about how new it is.

    • EatATaco@lemm.ee · 8 months ago · +22 / -1

      I suck at Photoshop, and I’ve tried many times over the years to get good at it. Yet I was able to train a local Stable Diffusion model on my face and my family’s faces, and create numerous images of us in all kinds of situations, in two nights of work. You can get a snap of someone and have nudes of them by tomorrow for super cheap.

      I agree there is nothing to be done, but it’s painfully obvious to me that it’s the scale and ease of it that makes it much more concerning.

      • T156@lemmy.world · 8 months ago · +4

        Also the potential for automation/mass production. Photoshop work still requires a person to sit down and do the actual editing. You can try to script things out, but it’s hardly an easy affair.

        By comparison, generative models are much more hands-off. Once you get the basics set up, you can just let it run and churn things out at rates well surpassing what a single human could reasonably do (if you have the computing power for it).

    • AstralPath@lemmy.ca · 8 months ago · +21 / -13

      This kind of attitude toward non-consensual actions is what perpetuates them. Fuck that shit.

    • dysprosium@lemmy.dbzer0.com · 8 months ago · +10 / -3

      Exactly this. And instead believe cryptographically signed images, comparing hashes against the one supplied by the owner. Then it’s a question of trusting a specific source for a specific kind of content. A news photo of the war in Ukraine from the BBC? Check the hash on their site. Their reputation is finished if a faked image is ever found.
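
      What’s being proposed here is easy to sketch. Below is a minimal illustration of the hash-comparison step in Python; the image URL and published digest are hypothetical placeholders. A real scheme would also have the outlet sign the digest with its private key, since a bare hash only proves the file matches the publisher’s copy, not who published it.

      ```python
      import hashlib
      import urllib.request

      # Hypothetical values for illustration only: a news photo and the
      # SHA-256 digest its publisher is assumed to post alongside it.
      IMAGE_URL = "https://example.org/photos/report.jpg"
      PUBLISHED_SHA256 = "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"

      def sha256_of(url: str) -> str:
          """Download the file at `url` and return its SHA-256 hex digest."""
          with urllib.request.urlopen(url) as resp:
              return hashlib.sha256(resp.read()).hexdigest()

      if sha256_of(IMAGE_URL) == PUBLISHED_SHA256:
          print("Digest matches the publisher's copy.")
      else:
          print("Digest mismatch: not the image the publisher vouched for.")
      ```

      As the replies below point out, almost nobody would run a check like this by hand; it only helps if browsers or apps do it automatically.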

      • T156@lemmy.world · 8 months ago · +1

        At the same time, that does introduce an additional layer of work. Most people aren’t going to bother, in much the same way that people today won’t track an image back to its original source; they usually just go by the copy they saw.

        Especially people who aren’t so cryptographically or technologically inclined that they know what a hash is, where to find one, and how to compare it (without just opening both images and eyeballing them).

        • dysprosium@lemmy.dbzer0.com · 8 months ago · +1

          Sure, but that’s no problem if software did it automatically for users of the big (news) sites: browsers on desktop, apps on phones.
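
          To make “automatically” concrete, here is a sketch of what such software might do, assuming (hypothetically) that a site publishes a JSON manifest mapping image URLs to their SHA-256 digests. The manifest URL and format are invented for illustration.

          ```python
          import hashlib
          import json
          import urllib.request

          # Hypothetical manifest format, invented for illustration:
          # {"https://example.org/a.jpg": "<sha256 hex digest>", ...}
          MANIFEST_URL = "https://example.org/image-manifest.json"

          def fetch(url: str) -> bytes:
              """Download raw bytes from a URL."""
              with urllib.request.urlopen(url) as resp:
                  return resp.read()

          def verify_all(manifest_url: str) -> dict:
              """Check every listed image against its published digest."""
              manifest = json.loads(fetch(manifest_url))
              return {
                  url: hashlib.sha256(fetch(url)).hexdigest() == digest
                  for url, digest in manifest.items()
              }

          for url, ok in verify_all(MANIFEST_URL).items():
              print("OK " if ok else "FAIL", url)
          ```

          A browser extension or a news app could run a check like this on page load and flag any image whose digest doesn’t match.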

    • HubertManne@kbin.social · 8 months ago · +2 / -11

      This is something I can’t quite get through to my wife. She does not like that I dismiss things to some degree when they don’t make sense. We get into these convos where I’m like, “I have serious doubts about this,” and she’s like, “Are you saying it did not happen?” And I’m like, “No, it may have happened, but not quite in the way they say, or it’s being portrayed in a certain manner.” For now I’m still going to take video and photos as likely true, but I generally want to see it from independent sources: different folks with their phones, along with CCTV of some kind and such.