I understand the fight will be hard, and I'm not getting into it if I can't present something they will understand. I'm definitely in a minority, both among the administrative staff and among my peers, the doctors. Most are totally ignorant of the privacy issue.
FlappyBubble
I work in Sweden, so this falls under the GDPR. There probably are GDPR implications, but as I wrote, the question is not legal. I want my bosses to be aware of the general issue, and this is but the first of many similar problems.
The training data is collected per person, resulting in a model tailored to each individual doctor.
Thanks for the advice, but I'm not against AI models transcribing me, just not a cloud model specifically trained on my voice without any control by me. A local model, or preferably a general local model, would be fine. What makes me sad is that the people behind this are totally ignorant of the problem.
It will not be possible to use my own software. The computer environment is tightly controlled. If this is implemented, my only input device for the medical records will be the AI transcriber (stupidity).
I'm a psychiatrist working in substance abuse and withdrawal. Sure, there's a shortage of us too, but I want the hospital to understand the problem, not just me getting to use an old-school secretary by threatening to go to another hospital.
As a medical doctor I strongly object to this. Generics are tightly regulated. The active substance is the same; what can vary is the binders and similar excipients. In very, very rare cases a patient can be allergic to an ingredient that is specific to a certain brand (and not part of the active substance). This has happened to me only twice. In some countries anticonvulsants are an exception where generics aren't used, but that is not practiced everywhere.
I'm not sure what exact service will be used. I won't be able to type, as the IT environment is tightly controlled and they will even remove the keyboard as an input device for the medical records.