I have many patients who google their symptoms in order to come up with a diagnosis. I also have many who google tests and treatments. Right now I have very few who use ChatGPT, but one in particular seems to have an ongoing dialogue with it, exchanging medical history, prior treatments, and new treatment ideas, then bringing back tests for me, the primary care physician, to order.
What I put in a patient's chart is protected by HIPAA, the privacy law in this country. From what I understand, that law doesn't apply to ChatGPT, and that doesn't seem right to me.
ChatGPT now knows my patient's identity, medical history, test results, and medications. I have only used the app for writing, just to see what it can do, but I see no consent forms or privacy warnings when I do. I suspect ChatGPT could "learn" from the ideas and concepts I ask it to flesh out and later share them.
I think the use of AI for medical advice, when health information is entered into its vast database, needs to be regulated, or at least should require informed consent explaining what could happen to a patient's otherwise protected health information.










