• LemmyLefty@lemmy.world · ↑94 · 1 year ago

    While TikTok has removed multiple videos depicting James narrating his abduction and death in Kirkby, England, many remain available to view on YouTube.

    This is fucking ghoulish.

    Does this constitute psychological torture? I’m serious. This is so much more evil than what Westboro does.

    Christ. I already hate these people.

  • harmonea@kbin.social · ↑90 ↓3 · 1 year ago

    Stuart Fergus, the husband of James Bulger’s mother, said that after he reached out to one creator asking them to take down their video, he received a reply saying: “We do not intend to offend anyone. We only do these videos to make sure incidents will never happen again to anyone. Please continue to support and share my page to spread ­awareness.”

    He really tried to take down his wife’s dead kid’s deepfake and got the creator responding “no offense, so like share and subscribe lel”

    Using the likeness of another person without that person’s express permission should be a jailable offense.

    • FigMcLargeHuge@sh.itjust.works · ↑29 · 1 year ago
      They seem to be getting a pass on using copyrighted materials to feed these programs, so I doubt we’d get legislation protecting our own likenesses, or those of our loved ones. I bet you couldn’t even get lawmakers to understand what they would need to write into law. They (American lawmakers) all seem to be so up to speed on technology. /s

      • Otter@lemmy.ca · ↑32 · 1 year ago
        Not until someone starts making clips of lawmakers narrating their own crimes and unethical behavior, then they’ll get it done immediately.

        Not that I’m suggesting anyone do that…

          • DessertStorms@kbin.social · ↑17 ↓3 · 1 year ago

            How the fuck does someone go from “I want to punish this person” to “let’s make porn of a woman in his life who has nothing to do with it, that’ll show 'em!”??

            Surely there are a million ways you can come up with to include whichever lawmaker in a deepfake that don’t involve violating an unrelated woman?

            • FigMcLargeHuge@sh.itjust.works · ↑5 ↓1 · 1 year ago

              Because that’s a news story I have already read about. This has happened to people. No need to be so sanctimonious.

  • pulaskiwasright@lemmy.ml · ↑53 · 1 year ago

    This feels so much like a cyberpunk story. It’s so dehumanizing and has such disregard for humanity that it feels like a perfect match for the genre.

    I’m really starting to understand how old people get to a point where they no longer want to keep up with the times. This is gross.

  • duncesplayed@lemmy.one · ↑46 · 1 year ago

    Anne Frank advertising baby clothes before discussing the horrors of the Holocaust

    Wow, that is amazingly inhumane.

    My first thought is that they’re necessarily making characters who aren’t people. A person who has lived through the Holocaust just cannot cheerfully peddle baby clothes. I don’t mean that it’s physically not possible because she’s dead: I mean in terms of the human psyche, a person just flat-out psychologically could not do that. A young boy who succumbed to torture and murder psychologically cannot just calmly narrate it.

    So obviously, yeah, it’s quite a ghoulish and evil thing to take what used to be a person, a figure who has been studied and mourned because of their personhood, because we can relate to them as a person, and completely strip them of that personhood and turn them into an inhuman object.

    But then that leads me to the question of: who’s watching these things, and why? The article says they got quite a lot of views. Is it just for shock value? I don’t quite understand.

    • UlyssesT [he/him]@hexbear.net · ↑2 · 1 year ago

      But then that leads me to the question of: who’s watching these things, and why? The article says they got quite a lot of views. Is it just for shock value? I don’t quite understand.

      Once again the entertainment can claim to be “educational” or “informative” and piously claim it doesn’t “condone” what it’s presenting (for entertainment purposes) but “true crime” hogs will gobble it up and oink all the while anyway.

  • BareHandedPoopScoop@waveform.social · ↑37 ↓2 · 1 year ago

    This is inarguably horrible but the use of AI seems irrelevant. You could make this same thing with any animation tool. It’s the idea that’s disgusting.

    Do you think AI is mentioned because it makes the article seem more up to the minute and in keeping with current tech trends?

    “A man drew a disgusting picture of a horrible event using pencils and paper this week.”

    “Pencils and paper are so awful.”

          • Serdan@lemm.ee · ↑3 ↓2 · 1 year ago

            I don’t see the difference between doing this with AI or doing it with Photoshop. It’s horrible independently of the tools used.

            • This is fine🔥🐶☕🔥@lemmy.world · ↑3 ↓5 · 1 year ago

              Cool, so you see no problem with these ghouls being even more efficient with AI because governments have zero fucking clue about new technology?

              Are you too stupid to understand why that is a horrible thing?

              • Serdan@lemm.ee · ↑1 · 1 year ago

                I think there should be legal avenues to shut down people who do that shit.

                It doesn’t matter what tools they’re using. The solution is the same regardless.

                • These people are using the likeness of actual people without their permission (or their heirs’/parents’ permission, in the case of dead people) to make money off of tragedies. I don’t know about you, but I think we as a society should have laws preventing this from happening.

              • src@lemmy.ml · ↑1 ↓1 · 1 year ago

                So you would support government limiting an individual’s right to run software on their computer because you don’t agree with what the software outputs?

                That’s absolute nonsense. The entire premise of these machine learning models is that they accept any arbitrary input, you would want to neuter that?

    • UlyssesT [he/him]@hexbear.net · ↑4 ↓2 · 1 year ago

      If pencils and paper were actively causing a spike in worker precarity, a sudden increase in fraud and identity theft, and the misuse of people’s personas against their will and without their consent, I wouldn’t blame people for being upset at those fancy new pencils and paper instead of smugly telling them that, actually, the technology in a vacuum with no one using it is harmless.

  • OsrsNeedsF2P@lemmy.ml · ↑30 · 1 year ago

    “Hello, my name is James Bulger,” says the image in one TikTok video, made in the likeness of the British 2-year-old who was abducted in 1993 as his mother paid for groceries.

    “If my mom turned right, I could have been alive today. Unfortunately, she turned left,” the childlike voice says, citing what James’s mother once said was one of her biggest regrets: If she had turned right, she would have seen her son being led away by the two 10-year-olds who later tortured and killed him.

    Yup

  • Kara@kbin.social · ↑28 ↓1 · 1 year ago

    The two biggest groups of psychopaths join forces: true crime people and tech bros.

    • azuth@lemmy.world · ↑0 · 1 year ago

      True crime shows with real (paid) actors have existed in mainstream media for decades. They certainly made more money than ‘content creators’ do.

      • CmdrShepard@lemmy.one · ↑1 · 1 year ago

        There’s a stark difference between reporting on or dramatizing a crime (often with family involvement) and creating deepfakes of dead kids to boost your social media presence.

  • eee@lemm.ee · ↑25 · 1 year ago

    This is eerily similar to that episode of Black Mirror…

    • 667@kbin.social · ↑6 · 1 year ago

      This is exactly that episode of Black Mirror. The use case is different, but the concept is identical.

  • fitz@linkopath.com · ↑14 · 1 year ago

    Reminds me of that Battlestar Galactica spinoff Caprica, where the dad recreates his daughter’s life in a machine from her social media presence…

  • AutoTL;DR@lemmings.world (bot) · ↑11 · 1 year ago

    This is the best summary I could come up with:


    Now, some content creators are using artificial intelligence to recreate the likeness of these deceased or missing children, giving them “voices” to narrate the disturbing details of what happened to them.

    TikTok’s guidelines say the advancement of AI “can make it more difficult to distinguish between fact and fiction, carrying both societal and individual risks,” and ask that users who share synthetic or manipulated media showing realistic scenes include a disclaimer noting that the content is not real.

    Felix M. Simon, a communication researcher at the Oxford Internet Institute, said he was confident that the videos mentioned in this piece were produced using “one or several AI tools,” but could not say which software exactly.

    “They appear to be created with some form of AI tools and bear some of the typical hallmarks of cheaper AI-generated videos,” such as an anime or comic-like aesthetic and polished skin, he said in an email.

    Simon cautioned that the videos — which are often accompanied by dramatic or sorrowful music, or show children with scars and bloodied faces — “have the potential to re-traumatize the bereaved.”

    Cory Bradford, a TikToker who has gained almost 1 million followers producing history videos, said that while he generally avoids using AI in his own posts, those who do are likely trying to boost engagement, especially on a platform where the audience skews younger.


    I’m a bot and I’m open source!