As a medical doctor I extensively use digital voice recorders to document my work. My secretary does the transcription. As a cost-saving measure the process is soon to be replaced by AI-powered transcription, trained on each doctor’s voice. As I understand it, the model created is not stored locally and I have no control over it whatsoever.

I see many dangers, as the model is trained on biometric data and could possibly be used to recreate my voice. Of course I understand that there are probably other recordings of me on the Internet, enough to recreate my voice, but that’s beside the point. Also, the question is about educating them; it is not a legal one.

How do I present my case? I’m not willing to use a non-local AI to transcribe my voice, but I don’t want to be perceived as a paranoid nutcase. Preferably I want my bosses and colleagues to understand the privacy concerns and dangers of using a “cloud solution”. Unfortunately they are totally ignorant of the field of technology, so the explanations and examples need to translate to the lay person.

  • Spyder@lemmy.ml · 5 months ago

    Do your patients know that their information is being transcribed in the cloud, which means it could potentially be hacked, leaked, tracked, and sold? Could that foster distrust and harm the patients’ progress?

    Could you leverage that, plus the possibility of being sued if information leaks, with the bureaucrats?

  • BurningRiver@beehaw.org · 5 months ago

    I would suggest that the first action item is to ask, in writing, for 1) the data protection policy and 2) the privacy policy. I would then either pick them apart, or find someone who works in cybersecurity (or the right lawyer) to do it. I’ve done it a few times and talked my employer out of a few dodgy products, because the policies clearly try to absolve the vendor of any potential liability. Now, whether the policies truly limit liability would have to be tested in court.

    You could also talk about how data protection, encryption, identity and access management, and governance are actually really expensive, but I’d start by poking holes in the actual policies to create doubt.

  • 7heo@lemmy.ml · 5 months ago

    I would have work sign a legal waiver stating that, from the moment I use the technology, none of the recordings or transcriptions of me can be used to incriminate me in a case of alleged malpractice.

    In fact, since the transcriptions are generated (or can be generated) in a way that sounds very assertive while containing incredibly wild mistakes, and this is a potentially life-and-death setting, I would have them legally recognize that such mistakes can nullify my work, and have them take the entire legal responsibility for it.

    As the recent example involving Air Canada shows, a chatbot can invent a policy out of thin air, and that policy is now costing the company money. In the case of a doctor, if the wrong sedative or the wrong medication were administered, or if the wrong diagnosis were communicated to the patient, the consequences could be serious.

    And all of it sounding like you (using your phrasings, etc.) and extremely assertive.

    A human doing that job knows not to deviate from the recording. An AI? “Antihistaminic” and “anti-asthmatic” aren’t too far apart, and that is just one example off the top of my head.
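
    To put a number on it, here is a minimal sketch (plain Python, no dependencies; the two terms are just my example, and real speech-to-text confusions depend on acoustics rather than spelling):

    ```python
    # Rough illustration: how few single-character edits separate two
    # medication classes that an assertive transcript could silently swap.
    def levenshtein(a: str, b: str) -> int:
        """Minimum number of single-character edits turning a into b."""
        prev = list(range(len(b) + 1))
        for i, ca in enumerate(a, 1):
            curr = [i]
            for j, cb in enumerate(b, 1):
                curr.append(min(prev[j] + 1,                # deletion
                                curr[j - 1] + 1,            # insertion
                                prev[j - 1] + (ca != cb)))  # substitution
            prev = curr
        return prev[-1]

    print(levenshtein("antihistaminic", "antiasthmatic"))  # a handful of edits apart
    ```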

  • privsecfoss@feddit.dk · 5 months ago

    I don’t know where you live, but almost all US big-tech cloud is problematic (read: illegal to use) for storing or processing personal information under the GDPR if you’re based in the EU. I don’t know about HIPAA and other non-EU legislation, but almost all cloud services use US big tech as a subprocessor under the hood, which means the use of AI and the cloud is most likely not GDPR-compliant. You could mention this to the right people and hope they listen.

    Edit: It’s illegal to use for processing the patients’ PII, because of transfers to insecure third countries and because big tech uses the data for its own purposes without any legal basis.

    Edit 2: The same applies to your, and your colleagues’, PII.

    In my opinion, privacy and the GDPR are the same thing in this case. I think most public authorities are required to have a DPO, e.g. hospitals or the relevant health authority. The DPO can help answer your and your bosses’ questions on these points.

    Hope you figure it out.

    • pearsaltchocolatebar@discuss.online · 5 months ago

      You don’t have to use a cloud service to do AI transcription. You don’t even need to use AI. Speech to text has been a thing for like 30+ years.
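
      For example, a minimal fully local sketch, assuming the open-source openai-whisper package and a hypothetical recording dictation.mp3 (neither of which the hospital necessarily uses):

      ```python
      # Everything here runs on the local machine; no audio or text leaves it.
      # Assumes: pip install openai-whisper  (plus ffmpeg on the PATH)
      import whisper

      model = whisper.load_model("base")          # weights download once, then cached locally
      result = model.transcribe("dictation.mp3")  # hypothetical local dictation file
      print(result["text"])
      ```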

      Also, AWS has a FedRAMP-authorized GovCloud that’s almost certainly compliant with HIPAA (and its non-US counterparts).

      Also also, there are plenty of cloud-based services that are HIPAA-compliant.

  • macniel@feddit.de · 5 months ago

    Shouldn’t that be a HIPAA violation? Like, you can’t in good conscience guarantee that the patient data isn’t being used for anything but healthcare.

    • Szymon@lemmy.ca · 5 months ago

      It is until they prove it isn’t, which they might not be able to do. Many people trusted 23andMe, only to see their private data stolen. Make the company prove the security in place and the methods ensuring privacy, because you’ll essentially be liable for any failure of the system that stems from a lack of due diligence.

      • lewdian69@lemmy.world · 5 months ago

        Voice-recognition dictation has been used in the medical field for over a decade, probably even longer. My regional health system of multiple hospitals and clinics has been using an electronic, Dragon-style dictation solution since at least 2012. Unfortunately, in this case the OP is being overly paranoid and behind the times. I’m all for privacy, but the HIPAA implications have already been well sorted out. They need to either learn to type faster or use the system provided, which will increase their productivity and save the health system an FTE that used to be spent on a transcriptionist and can now be used more directly to care for patients.

        • BearOfaTime@lemm.ee · 5 months ago

          “Overly paranoid”, with the practically-daily breaches of cloud-based systems today?

  • Ironically, GPT can kinda get you started here…

    To present your case effectively to your bosses and colleagues, focus on simplifying the technical aspects and emphasizing the potential risks associated with using a cloud-based AI transcription service:

    1. Privacy Concerns: Explain that using a cloud-based solution means entrusting sensitive biometric data (your voice) to a third-party provider. Emphasize that this data could potentially be accessed or misused without your consent.

    2. Security Risks: Highlight the risks of data breaches and unauthorized access to your voice recordings stored in the cloud. Mention recent high-profile cases of data breaches to illustrate the potential consequences.

    3. Voice Cloning: Explain the concept of voice cloning and how AI algorithms can be trained to mimic your voice using the data stored in the cloud. Use simple examples or analogies to illustrate how this could be used for malicious purposes, such as impersonation or fraud (see the sketch after this list).

    4. Lack of Control: Stress that you have no control over how your voice data is used or stored once it’s uploaded to the cloud. Unlike a local solution where you have more oversight and control, a cloud-based service leaves you vulnerable to the policies and practices of the provider.

    5. Legal and Ethical Implications: While you acknowledge that there may be existing recordings of your voice online, emphasize that knowingly contributing to the creation of a database that could potentially be used for unethical or illegal purposes raises serious concerns about professional ethics and personal privacy.

    6. Alternative Solutions: Suggest alternative solutions that prioritize privacy and security, such as using local AI transcription software that does not upload data to the cloud or implementing stricter data protection policies within your organization.
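
    On point 3, it can help to show just how little code voice cloning takes today. Below is a sketch, assuming the open-source Coqui TTS library and a hypothetical reference recording my_voice.wav; a few seconds of someone’s voice is enough for this class of model:

    ```python
    # Illustrative only: zero-shot voice cloning with an open-source model.
    # Assumes: pip install TTS  (Coqui TTS; model weights download on first run)
    from TTS.api import TTS

    tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")
    tts.tts_to_file(
        text="Please send the patient records to the address below.",  # attacker-chosen script
        speaker_wav="my_voice.wav",  # hypothetical: a short sample of the target's voice
        language="en",
        file_path="cloned.wav",
    )
    ```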

    By framing your concerns in terms of privacy, security, and ethical considerations, you can help your bosses and colleagues understand the potential risks associated with using a cloud-based AI transcription service without coming across as paranoid. Highlighting the importance of protecting sensitive data and maintaining control over personal information should resonate with individuals regardless of their level of technical expertise.

    • FlappyBubble@lemmy.ml (OP) · 5 months ago

      My biometric data, in this case my voice: an AI tailored to my voice, trained outside my control, hosted as a cloud solution.

      Of course there is an aspect of patient confidentiality too, but that battle is already lost. The data in the medical records is already hosted outside my hospital.

      • SheeEttin@programming.dev · 5 months ago

        Sounds like a weak argument. They’re not going to be inclined to operate a local ML system just for one or two people.

        I would see if you can get a quote for locally-hosted transcription software you can run on your own, like Dragon Medical. Maybe reach out to your IT department to see if they already have a working relationship with Nuance for that software. If they’re willing to get you started, you can probably just use that for dictation and nobody will notice or care.