Privacy Matters

HOW IS MY PRIVACY PROTECTED?

Here are some of the ways Dr. Fox protects your privacy to a higher standard than is common in many medical practices:

1. Appointments are not scheduled “back to back” or double-booked. Your time is reserved solely for you. Have you ever sat in a crowded waiting room and seen someone you’d rather not see? Dr. Fox leaves 15 minutes between appointments so that in-person patients can come and go without seeing one another. For many clinics, packing in more patients per day is far more profitable. So why does Dr. Fox schedule this way? Because it’s better for patients.

2. Having more time between patients also gives Dr. Fox time to review your records before seeing you again. This not only improves your quality of care, it protects your confidentiality. Have you ever been mixed up with another patient because your doctor was rushed? Dr. Fox had this happen when she was a patient. She didn’t like it, so she resolved never to do it to her own patients.

3. Your personal health information will not be transmitted to an insurance company. Dr. Fox can provide you with a superbill receipt that you can choose to send to your own insurer if you wish. Be aware that insurers store your medical information and diagnoses for decades. Of course, that’s not all they track these days. Insurers are joining forces with data brokers to collect personal details about your race, education, TV habits, marital status, the safety of the neighborhood you live in, whether you’re behind on your bills, what you order online, whether you buy plus-size clothing, and more. Patient advocates argue that the insurance industry’s data gathering runs counter to its obligation to keep your medical information confidential, but unfortunately, HIPAA rules only protect medical information, not where you live or the size of clothing you buy.

4. Your personal health information will have no online EHR (Electronic Health Record) presence. EHRs can be convenient for doctors. They have built-in forms for communicating with patients, prescribing functions, payment functions, video functions for telepsychiatry, and more. Pulling all of these together on your own is a lot of work (and money) for your doctor. So why not take the convenient route, as many doctors do?

One reason is that EHR breaches are a common way private medical information gets exposed.

Another reason is that records might not stay in the EHR where they started. Many EHRs are owned by private equity firms and can be sold, and if one EHR company is purchased by another, records may need to be “migrated” to the new system. The risk of a data breach increases while your information is being relocated.

5. Dr. Fox does not use AI (Artificial Intelligence) to listen in during sessions and type up the notes. Why not let AI do the listening and note-writing? It does save on paperwork time, so doctors tend to like it. Administrators like it, too. Why? It’s a way for doctors to see more patients per day, and the clinic makes more money. It’s administrator heaven. (Dr. Fox doesn’t believe rushing patients is good for their care.)

But the bigger problem involves confidentiality. The party line among AI developers is that they use patient data that has been “de-identified.” That means they strip out the identifying data (such as your name or date of birth) that legally must be protected under HIPAA. (HIPAA lists specific items that can identify you and must be protected.) Many medical practices have adopted AI on the strength of this reassurance.

But should this reassurance be reassuring? Unfortunately, your private health information might not stay private when your doctor uses AI. One example: a doctor uses an AI chatbot to fill out something tedious like an insurance form. Your private health information is then no longer stored only in the clinic, hospital, or office; it has been sent out to massive servers (such as OpenAI’s) that are not HIPAA-compliant. That is technically a data breach, but most doctors don’t understand AI well enough to realize it.

Even AI that is made for healthcare can be a problem. Of course, these services promise to protect private health information by de-identifying data. So what’s the problem if the AI developers take out every bit of your HIPAA-protected data before your information gets funneled into the massive data sets that feed AI? It’s all anonymous then, right?

Um, not really. AI can re-identify patients even after the HIPAA-protected data has been removed from their charts. Oh snap. HIPAA specifies a list of patient identifiers that by law must be protected, but it was never designed to go up against something as powerful as AI. By cross-referencing the details that remain in your chart (visit patterns, medications, locations, habits) with other data sets, AI can figure out who you are and what is going on with you, even without those identifiers.

Even more thrilling, AI can take your de-identified data and make fairly accurate predictions about your future behavior. So a private equity company’s insistence that any HIPAA-protected data is “de-identified” before they use it is not comforting, even one little bit. Of course, if you didn’t know about AI’s ability to re-identify patients, you’d probably think it was no big deal that AI used your “anonymized” data. Now you know.

6. Dr. Fox has a solo practice. That means no one else in the organization will be accessing your records. In larger organizations, even though records are protected, a nosy person can sometimes access them, look over the shoulder of someone who is accessing them, or simply be within earshot.

Dr. Fox once worked with a non-physician supervisor who enjoyed snooping in patient records. Someone who is not involved in your treatment shouldn’t be poking through your record to see what they can see. (And yes, Dr. Fox reported that snooper.)

Unfortunately, snooping is all too common in hospitals, clinics, and group practices. EHRs log who accesses each record, so violators can be identified, but the organization has to want to do something about it.

In Dr. Fox’s practice, nosy individuals cannot get at your private information, and Dr. Fox won’t send your private information out to huge AI servers just to save herself time on paperwork. After all, a treatment relationship needs trust.