A bipartisan group of US senators introduced a bill Tuesday that would criminalize the spread of nonconsensual, sexualized images generated by artificial intelligence. The measure comes in direct response to the proliferation of pornographic AI-made images of Taylor Swift on X, formerly Twitter, in recent days.

The measure would allow victims depicted in nude or sexually explicit “digital forgeries” to seek a civil penalty against “individuals who produced or possessed the forgery with intent to distribute it” or anyone who received the material knowing it was not made with consent. Dick Durbin, the US Senate majority whip, and senators Lindsey Graham, Amy Klobuchar and Josh Hawley are behind the bill, known as the Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024, or the “Defiance Act.”


  • Merlin404@lemmy.world · 7 months ago

    Tragic that it took a celebrity going through this for them to do something. But when children or others have it happen to them, they just shrug…

      • Viking_Hippie@lemmy.world · 7 months ago

        Probably helps that she’s super white too.

        This has been happening to AOC constantly since before she was first sworn in and it’s been crickets.

        When it happens once to the media’s favourite white billionaire, though? THAT’S when they start to take it seriously.

        • Ledivin@lemmy.world · 7 months ago

          To be clear, this has been happening to Swift for years. She’s been very public about the problem, and pays, IIRC, up to a million dollars per year for a firm to get fakes taken down.

  • Serinus@lemmy.world · 7 months ago

    I don’t get it. Why care? It’s not her.

    Maybe if they’re making money off of her likeness. But without a money trail it just seems like chasing ghosts for not much reason.

    • shiroininja@lemmy.world · 7 months ago

      Because it’s gross, and they do it to minors now. And all they need are pictures of your kids from your social media profile. They even use AI to undress them.

      • fishos@lemmy.world · 7 months ago

        And here we have the real answer: prudishness. “It’s gross.” And of course “think of the children.” You don’t have a real answer, you have fear-mongering.