When a machine moderates content, it evaluates text and images as data using an algorithm that has been trained on existing data sets. The process for selecting training data has come under fire as it’s been shown to have racial, gender and other biases.
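As a minimal sketch of the idea (not Meta's actual system), here is a toy bag-of-words moderator: it is "trained" on a handful of hypothetical labeled posts, and whatever patterns — or skews — the training data contains are faithfully reproduced at decision time. All example texts and the threshold are made up for illustration.

```python
# Toy moderation sketch: the model learns word weights from labeled
# examples, so any bias in those labels carries over to its decisions.
from collections import Counter

def train(examples):
    """examples: list of (text, is_violation) pairs -> per-word scores."""
    flagged, clean = Counter(), Counter()
    for text, is_violation in examples:
        (flagged if is_violation else clean).update(text.lower().split())
    vocab = set(flagged) | set(clean)
    # Words seen more often in flagged examples get positive weight.
    return {w: flagged[w] - clean[w] for w in vocab}

def moderate(model, text, threshold=1):
    """Sum the learned word scores; remove the post if they cross the threshold."""
    score = sum(model.get(w, 0) for w in text.lower().split())
    return "remove" if score >= threshold else "keep"

# Hypothetical training set -- the model has no notion of fairness,
# only of the statistical patterns in this data.
model = train([
    ("buy cheap pills now", True),
    ("limited offer buy now", True),
    ("photos from our protest", False),
    ("family dinner photos", False),
])
print(moderate(model, "buy pills"))       # matches flagged patterns -> "remove"
print(moderate(model, "protest photos"))  # matches clean patterns -> "keep"
```

The point of the sketch: the classifier never sees "content", only counts derived from whoever labeled the training set, which is exactly where the criticized biases enter.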

  • aard@kyu.de · 1 year ago

    It’s been shown over and over again for the last two decades that you can’t build a reliable archive on platforms outside of your control. Stuff like Instagram can be useful for drawing traffic to your own platform — but you should always treat those platforms as throwaway content.

  • topinambour_rex@lemmy.world · 1 year ago

    An important point, buried deep in the article:

    She said she has observed differential treatment across the social media platform, depending on the race of the subject in the image in question or who posted it.

    “We’ve seen this time and again, Meta taking down content by and about people of color,” she said. “While similar content by and about white people remains up.”

  • Bruno Finger@lemm.ee · 1 year ago

    Unfortunately, those are not platforms meant for archiving: it’s in their own TOS that they can take down any content, or entire accounts, at will and for undisclosed reasons. Those platforms are also subject to shitty practices like report abuse — yet another way your content is put in danger.

    Also unfortunately, they are very attractive because of the influence a person can build with a large network of followers on those platforms, and how they can leverage that.

    I don’t see a plausible solution or alternative. Instagram and Facebook are the platforms of the masses, and where your content gets the highest exposure. People today aren’t browsing “the web” like we used to back then. Instagram/Facebook are the internet.