FlappyBubble

joined 2 years ago
[–] FlappyBubble@lemmy.ml 1 points 1 year ago (1 children)

I'm not sure what exact service will be used. I won't be able to type as the IT environment is tightly controlled and they will even remove the keyboard as an input device for the medical records.

[–] FlappyBubble@lemmy.ml 11 points 1 year ago

I understand the fight will be hard and I'm not getting into it if I can't present something they will understand. I'm definitely in a minority, both among the admin staff and my peers, the doctors. Most are totally ignorant of the privacy issue.

[–] FlappyBubble@lemmy.ml 13 points 1 year ago* (last edited 1 year ago) (1 children)

I work in Sweden and it falls under the GDPR. There probably are GDPR implications, but as I wrote, the question is not a legal one. I want my bosses to be aware of the general issue, as this is but the first of many similar problems.

The training data is to be per person, resulting in a tailored model to every single doctor.

[–] FlappyBubble@lemmy.ml 4 points 1 year ago (5 children)

Thanks for the advice, but I'm not against AI models transcribing me, just not a cloud model specifically trained on my voice without any control by me. A local model, or preferably a general local model, would be fine. What makes me sad is that the people behind this are totally ignorant of the problem.

[–] FlappyBubble@lemmy.ml 1 points 1 year ago

That's correct! I'm not against using technology to cut costs or provide better healthcare. My question is entirely about the privacy implications.

[–] FlappyBubble@lemmy.ml 5 points 1 year ago (3 children)

Sure but what about my peers? I want to get the point across and the understanding of privacy implications. I'm certain that this is just the first of many reforms without proper analysis of privacy implications.

[–] FlappyBubble@lemmy.ml 11 points 1 year ago* (last edited 1 year ago) (9 children)

It will not be possible to use my own software. The computer environment is tightly controlled. If this is implemented my only input device to the medical records will be the AI transcriber (stupidity).

I'm a psychiatrist in the field of substance abuse and withdrawal. Sure, there's a shortage of us too, but I want the hospital to understand the problem, not just me getting to use an old-school secretary by threatening to go to another hospital.

[–] FlappyBubble@lemmy.ml 16 points 1 year ago* (last edited 1 year ago) (1 children)

My question is not a legal one. There probably are legal obstacles for my hospital in this case but HIPAA is not applicable in my country.

I'd primarily like your opinions on how to effectively present my case to my bosses against using a non-local model for this.

[–] FlappyBubble@lemmy.ml 2 points 2 years ago (3 children)

Bad, but expected given that they are based in China. I use several of their cameras, but only after keeping them isolated from the Internet and segmented from the rest of my network. I only access the streams from my NAS, which in turn accesses the camera streams via a dedicated NIC.
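A minimal sketch of that kind of isolation, written as iptables rules on the router. The interface name, subnets, and the NAS address are all illustrative assumptions, not the actual setup described above:

```shell
# Illustrative firewall rules: cameras on an assumed 192.168.50.0/24 VLAN,
# main LAN on 192.168.1.0/24, NAS at 192.168.1.10, WAN interface "wan0".

# Cameras get no Internet access at all
iptables -A FORWARD -s 192.168.50.0/24 -o wan0 -j DROP

# Only the NAS may pull the camera streams (RTSP, TCP port 554)
iptables -A FORWARD -s 192.168.1.10 -d 192.168.50.0/24 -p tcp --dport 554 -j ACCEPT

# Everything else between the camera VLAN and the main LAN is blocked
iptables -A FORWARD -s 192.168.50.0/24 -d 192.168.1.0/24 -j DROP
iptables -A FORWARD -s 192.168.1.0/24 -d 192.168.50.0/24 -j DROP
```

With rules like these the cameras can't phone home, and a compromised camera can't reach anything on the LAN except by answering the NAS's stream requests.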

[–] FlappyBubble@lemmy.ml -1 points 2 years ago (1 children)

Sorry I really don't follow what you wrote in that comment. Can you write something coherent and with references to sources?

[–] FlappyBubble@lemmy.ml 10 points 2 years ago (3 children)

What is the problem with GrapheneOS?

[–] FlappyBubble@lemmy.ml 26 points 2 years ago (3 children)

As a medical doctor I strongly object to this. Generics are tightly regulated. The active substance is the same. What can vary are the binders and the like. In very, very rare cases a patient can be allergic to a substance that is specific to a certain brand (and not part of the active substance). This has happened to me only twice. In some countries anticonvulsants are the exception where generics aren't used, but that is not practiced everywhere.
