As a medical doctor I extensively use digital voice recorders to document my work. My secretary does the transcription. As a cost-saving measure, the process is soon to be replaced by AI-powered transcription, trained on each doctor's voice. As I understand it, the model created is not stored locally and I have no control over it whatsoever.

I see many dangers, as the model is trained on biometric data and could possibly be used to recreate my voice. Of course I understand that there are probably other recordings of me on the Internet, enough to recreate my voice, but that's beside the point. Also, this question is about educating them, not a legal one.

How do I present my case? I'm not willing to use a non-local AI to transcribe my voice, but I don't want to be perceived as a paranoid nutcase. Preferably I want my bosses and colleagues to understand the privacy concerns and dangers of using a "cloud solution". Unfortunately they are totally ignorant of the field of technology, and the explanation/examples need to translate to the layperson.

  • InEnduringGrowStrong@sh.itjust.works · 9 months ago

    Ironically, GPT can kinda get you started here…

    To present your case effectively to your bosses and colleagues, focus on simplifying the technical aspects and emphasizing the potential risks associated with using a cloud-based AI transcription service:

    1. Privacy Concerns: Explain that using a cloud-based solution means entrusting sensitive biometric data (your voice) to a third-party provider, and that this data could be accessed or misused without your consent.

    2. Security Risks: Highlight the risks of data breaches and unauthorized access to your voice recordings stored in the cloud. Mention recent high-profile cases of data breaches to illustrate the potential consequences.

    3. Voice Cloning: Explain the concept of voice cloning and how AI algorithms can be trained to mimic your voice using the data stored in the cloud. Use simple examples or analogies to illustrate how this could be used for malicious purposes, such as impersonation or fraud.

    4. Lack of Control: Stress that you have no control over how your voice data is used or stored once it’s uploaded to the cloud. Unlike a local solution where you have more oversight and control, a cloud-based service leaves you vulnerable to the policies and practices of the provider.

    5. Legal and Ethical Implications: While you acknowledge that recordings of your voice may already exist online, emphasize that knowingly contributing to a database that could be used for unethical or illegal purposes raises serious concerns about professional ethics and personal privacy.

    6. Alternative Solutions: Suggest alternative solutions that prioritize privacy and security, such as using local AI transcription software that does not upload data to the cloud or implementing stricter data protection policies within your organization.

    By framing your concerns in terms of privacy, security, and ethics, you can help your bosses and colleagues understand the risks of a cloud-based AI transcription service without coming across as paranoid. The importance of protecting sensitive data and keeping control over personal information should resonate with people regardless of their level of technical expertise.