You’ve gone home with a Tinder date, and things are escalating. You don’t really know or trust this guy, and you don’t want to contract an STI, so… what now?

A company called Calmara wants you to snap a photo of the guy’s penis, then use its AI to tell you if your partner is “clear” or not.

Let’s get something out of the way right off the bat: You should not take a picture of anyone’s genitals and scan it with an AI tool to decide whether or not you should have sex.

  • catloaf@lemm.ee · ↑97 · 8 months ago

    I’m pretty sure that gonorrhea, chlamydia, and HIV don’t generally have visible symptoms. Just use a condom.

    • circuscritic@lemmy.ca · ↑51 ↓2 · edited · 8 months ago

      What part of AI don’t you understand?

      If you can’t trust AI medical startups operating out of Silicon Valley with pictures of your genitals, well…THEN WHO CAN YOU TRUST?

      I mean, to be fair, it also looks like they might be partially financially backed by a foreign authoritarian regime, and they usually have pretty good AI models…so…

      • RatBin@lemmy.world · ↑26 ↓1 · 8 months ago

        We’re reaching the phase where AI is de facto a magic spell to be cast on reality, and AI startups are hyping this up. That, and taking pics of strangers’ genitals is a dick move.

        • circuscritic@lemmy.ca · ↑13 · 8 months ago

          Yes, I agree. AI is magic and everyone should submit pictures of their genitals.

          Hell, I’ve started converting my dick pics into ASCII art and having ChatGPT diagnose me for STIs.

          AI BABY WOOOOOO HOOOOO

        • DudeDudenson@lemmings.world · ↑2 · 8 months ago

          I’m just happy we moved to an AI bubble to raise stock prices instead of continuing to lay off essential personnel to do it

          • Echo Dot@feddit.uk · ↑1 · edited · 8 months ago

            AI isn’t even at the point yet where you can lay off workers and just have the AI do the job reliably and safely.

    • wizardbeard@lemmy.dbzer0.com · ↑17 ↓1 · 8 months ago

      Man, what about false positives? A ruined date night at minimum, possibly a ruined reputation and relationships.

      Sorry, we put a picture of your junk into this box. We don’t know what’s in it, or what it does with the picture, but it says you have chlamydia, and I think the box looks trustworthy. Here’s your divorce papers.

    • Imgonnatrythis@sh.itjust.works · ↑6 · 8 months ago

      Or if still concerned after the fact, a doctor. Despite what your GOP neighbor might tell you, they’re not all evil quacks and don’t typically take pictures of your stuff either.

  • smileyhead@discuss.tchncs.de · ↑37 · 8 months ago

    A single reason this is suspicious from the start:

    It’s advertised not for checking yourself, but your one-night partner. If it were advertised for self-checks, it would be bombarded with lawsuits over fake medical advice.

    • cm0002@lemmy.world · ↑15 · 8 months ago

      Nah, they’d just throw up a disclaimer “Not true medical advice, consult a doctor for actual confirmation” and they’d probably be in the clear

  • Blaster M@lemmy.world · ↑34 · 8 months ago

    This definitely won’t be misused in any way that would completely destroy the good name of the person taking, or in the frame of, the image. It’s just one “probable cause” search away from a very bad day.

    • answersplease77@lemmy.world · ↑2 · 8 months ago

      What are the chances they build a database to blackmail any individual they want in the future, and just say it was leaked?

  • systemglitch@lemmy.world · ↑20 ↓3 · 8 months ago

    I couldn’t care less who sees my junk. I also would not let someone take pictures of it just so I could fuck them. I’m galaxies away from being that desperate.

  • jet@hackertalks.com · ↑14 · 8 months ago

    Some STIs, in some situations, have a visible presentation that could be detected.

    A false positive is tolerable here; a false negative is a disaster. And there’s no way this app won’t have a huge false-negative rate.
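
    The asymmetry above can be made concrete with a quick Bayes calculation. A minimal sketch, with all numbers hypothetical (the app publishes no measured sensitivity or specificity): because many STIs are often asymptomatic, a photo-only screen has low sensitivity, so a “clear” verdict barely lowers the chance your partner is infected.

    ```python
    # Hypothetical illustration of why a "clear" verdict from a visual-only
    # screen means little. None of these numbers are measured app performance.

    def negative_predictive_value(sensitivity: float, specificity: float,
                                  prevalence: float) -> float:
        """P(truly infection-free | screen says 'clear')."""
        true_neg = specificity * (1 - prevalence)        # healthy, correctly cleared
        false_neg = (1 - sensitivity) * prevalence       # infected, wrongly cleared
        return true_neg / (true_neg + false_neg)

    # Assume (hypothetically) the screen only catches visibly symptomatic
    # cases: sensitivity 0.30, specificity 0.95, 5% prevalence among partners.
    npv = negative_predictive_value(sensitivity=0.30, specificity=0.95,
                                    prevalence=0.05)
    miss_rate = 1 - npv  # chance a "clear" verdict is wrong
    print(f"P(infected despite 'clear' verdict) = {miss_rate:.3f}")
    ```

    Under these assumed numbers, a “clear” result only drops the infection risk from 5% to about 3.7%, which is why a false-negative-prone screen is worse than useless: it replaces caution with false confidence.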

  • werefreeatlast@lemmy.world · ↑9 · 8 months ago

    Maybe they’ll use the photo to match it against doctor’s notes and medical photos of the same penis or vagina, then illegally link those to illegally obtained health records. Probably not, though.

  • bbuez@lemmy.world · ↑8 ↓1 · 8 months ago

    Finally, using this we’ll be able to train AI models so we can know what super-gonaherpes looks like