The doctor has publicly identified himself as the person who released information to a conservative activist about the transgender care program at Texas Children’s. Citing “whistleblower documents,” the activist published a story in May 2023 saying Texas Children’s provided transgender care, which was legal at the time, “in secret.”
Texas Children’s on Monday declined to comment on the charges against Haim. In previous statements, hospital officials said the hospital’s doctors have always provided care within the law.
Transgender care has become a popular talking point in Texas and other Republican-dominated states where lawmakers claim such treatment is harmful to children. It describes a range of different social, psychological, behavioral or medical interventions that support people whose assigned sex at birth does not align with their gender identity. This can include mental health counseling, hormone therapy or surgery, which is rare for people under 18.
Such treatment, which is supported by every major medical association in the U.S., was offered at Texas Children’s and other pediatric hospitals in Texas. Lawmakers have since implemented a statewide ban, and Texas Children’s said it would discontinue its program.
Meanwhile, Haim has publicly decried the investigation against him as “political.”
In the arraignment hearing, Ho said the indictment identified three different patients whose health information was compromised. Addressing reporters, Patrick declined to speak about the facts of the case but described the charges against his client as a “huge contradiction.”
This is a textbook HIPAA violation.
Didn’t Facebook and insurance companies get caught sharing data…
Whatever happened there… Probably nothing.
Facebook put tracking pixels on all sorts of medical websites, including websites that provided abortion services. Many of the companies hosting those pixels didn’t even have anyone on staff who understood how tracking pixel technology works.
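For anyone unfamiliar with the mechanism, here is a rough sketch of how a generic tracking pixel typically works. The endpoint, field names, and function below are made up for illustration; this is not Meta’s actual pixel code, just the general pattern of a snippet pasted into every page that quietly ships page context to a third party.

```typescript
// Minimal sketch of a generic third-party tracking pixel (hypothetical
// endpoint and parameters, not any vendor's real API).
function fireTrackingPixel(trackerId: string): void {
  const payload = new URLSearchParams({
    id: trackerId,
    // The page URL alone can reveal sensitive context, e.g. a path like
    // "/services/abortion-care" on a clinic's website.
    url: window.location.href,
    referrer: document.referrer,
    title: document.title,
  });

  // The classic "pixel" is a 1x1 image request: the browser sends the
  // payload (plus any cookies) to the tracker's server as a side effect
  // of loading the image.
  const img = new Image(1, 1);
  img.src = `https://tracker.example.com/collect?${payload.toString()}`;
}

// A site operator typically pastes a single snippet into every page and
// never sees what data actually leaves the visitor's browser.
fireTrackingPixel("PIXEL-12345");
```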
Data brokers regularly obtain medical information about people. Even if that information can’t legally be used to discriminate against someone, it all gets in through the back door via personality profiles that measure things like “resiliency” and “tenacity.”
Did you need to take a break from work due to severe depression? Well, that can’t legally be used against you when you apply for a new job. However, since you couldn’t tough it out at work, you now have a “less resilient” personality, and that will be factored into a personality profile used to exclude you from jobs. Companies are doing this with AI now to pull the information in through the back door and make it look legal, and since many of the pieces of data from data brokers have no clear source of origination or documented consent, these companies claim they are not relying on privileged medical information.
It’s appalling and discriminatory, and these companies should all be destroyed, their offices bulldozed, and the earth salted afterwards. The executives running these companies should be castigated for what they do. They are profiting from a data-brokerage hack that turns medical information into discrimination and then trying to whitewash it.
Eventually, large greedy law firms will see they can make a lot of money with class-action suits on behalf of disabled people collectively harmed by having medical problems whitewashed into personality profiles, and these companies and their practices will become huge liabilities because they can’t determine the data sources or whether consumers actually opted in. (And almost none of the consumers exploited by these nefarious, sleazeball-monetized practices opted in.)
This is a great write-up that expanded my understanding of the issue. So thank you.
I doubt anything will change; the state is actively supporting this behavior.
The best people can do is try to protect themselves, which is mostly a futile exercise.
Everyone who works in these industries is slime, and worst of all, they make posts about adhering to regulations with #compliance hashtags to trick people into thinking that a large part of what they are doing isn’t whitewashing backdoor data that includes confidential information.
They are tricking the stupid corporations and masses to extract money, but they will eventually become a liability because the data lacks consent. All the scum working at these terrible companies, using AI and feigning compliance, will depart once the legal vultures start to circle, leaving shareholders as the ones losing money.
Most of the value of these companies’ treachery is that the victims never know they have been exploited. Someone who occasionally takes time off work for severe depression may in fact be a worse worker, so companies profit from exploiting this trick, and they never notify job candidates who have been eliminated based on personality profiles that incorporate medical information, traits that are proxies for medical information, or data from medical apps and health websites.
Virtually all health sites sell information to data brokers. Where do people think that goes? It goes into personality profiles to eliminate less healthy job candidates.
Some of these companies and data brokers are using AI to circumvent the design of the laws, pretending that complex programs are somehow compliant while achieving the same intent, but all the compliance hashtags don’t exculpate them from their sleaze. Many of the companies that try to whitewash this data show how their practices aren’t discriminatory toward women or minorities, which pacifies governments and seems palatable, but they make their money by screening out other, less obvious protected classes of people.
It’s just pure slimebag grifting, the whole thing, and eventually people will wise up because there is profit in suing these assholes for their avaricious racket.
Amen!
They also buy your tax data to cross-reference income…
Good thing people got nothing to hide tho lol
It was H&R Block and all those other online tax services who did that. It was a terrible violation, and I wish those companies would all be broken up, sued into oblivion, and their CEOs held personally liable on both a civil and criminal level. It was one of the worst data breaches ever, without technically being a breach, and just vomit-inducing treachery.
Here is a hot take: most breaches ain’t breaches!!!
It’s inside jobs… People exiting and securing their comp, IMHO.
The following post may be completely wrong based on new updates to HIPAA and on proposed changes that were never actually added in revisions. There is one reply below saying it’s wrong, and they are probably right. This whole post is therefore probably mostly wrong. I’m leaving it here for now, but it’s incorrect.
It’s not a textbook HIPAA violation.
HIPAA has a good-faith exception allowing medical professionals to disclose private medical information when it’s in the best interest of the patient.
What is in the best interest of the patient?
Well, following all the rules of the government, which are all there for people’s safety, of course!
For example, Norma gets pregnant while abortion is legal, and she has an abortion. She is relying on HIPAA to keep her medical privacy.
Abortion then becomes illegal after she has had her abortion. A hospital worker, knowing that abortion is now illegal, provides this information to the police so they can monitor Norma and make sure she doesn’t get more abortions. This would be a good-faith exception to HIPAA: the medical worker is breaching Norma’s privacy in her best interest because he is worried she could break the law by having more abortions, and following the law is always in the interest of safety, no matter what. (Have doubts? Just ask ChatGPT if it’s ever safer not to follow public rules and regulations because of a differing personal belief system.) Norma then sues the medical worker, claiming the good-faith exception violated HIPAA, and a court is left to decide whether the worker was acting in Norma’s best interests, by helping make sure she follows the law, or doing something bad. If the court finds against the worker, it’s at best a slap on the wrist and a small fine, and if the worker is in a conservative court, he is going to win anyway.
Worst of all, as a patient, Norma cannot opt out of the good-faith exception. There is no mechanism in the HIPAA rules that allows her to say, “You know that good-faith exception? I am explicitly requiring you to close that loophole for me, because I’m a private person, my family and I have different values, and it’s just easier for me this way. I don’t want to have to worry about you deciding something that would make me uncomfortable. If I want you to talk to someone, I’ll give explicit consent beforehand, and even emergencies or unusual exceptions don’t change this.” There is no way to opt out of this awful, ambiguous rule. In the medical industry, you either accept their rules and regulations or you walk away and don’t get medical care.
So sadly, you’re actually totally wrong. I hope this doctor who breached patient privacy claims HIPAA wasn’t violated in just this way, so that legislators realize how much they fucked up and so that patients no longer have to hope and pray their doctor doesn’t decide to break privacy in a patient’s supposed best interest. There are so many exceptions, and the rules change so much, that it’s no wonder women will no longer talk with doctors about periods, and in certain states women are even afraid to tell therapists about having been raped.
It’s honestly better for patients, especially women, to start seeing the medical establishment for what it is: a highly regulated arm of the government that does exactly what it’s told in order to keep getting high salaries and wages. Don’t adhere to the government rules? Goodbye high salaries! They don’t dare bite the hand that feeds them, and women are luckily wising up to it.
If this doctor gets convicted, it will be because of the false pretenses he allegedly used. He is also only being charged by the federal government, which is more liberal; if it were up to the state government of Texas, this person never would have been charged. The situation is far more dire than this feel-good idea that there’s real enforcement over this sort of thing, when the reality is that there are explicit loopholes written into the laws to allow it.
You seem very confident in your answer, but the actual text doesn’t seem to match your assertions?
https://www.hhs.gov/hipaa/for-professionals/breach-notification/index.html
But they cited ChatGPT. Surely that should lend them some authority, right?
Also, to clarify: under the rules, certain actions may not constitute a breach to begin with, in which case neither the breach rules nor the exceptions may apply.
The big difference is that all those exceptions only apply to an authorized party, i.e. a health care provider authorized to care for the patient. In this case, the doctor in question was never authorized - none of the patients were in his care.
good point
You’re at least partly right, or I’m at least not up to date on things. It looks like there are recent additions to the rules based on the Dobbs abortion case, and some of the proposed changes I read about in an article may never have been added. Many people were complaining about HIPAA preventing them from finding out about family members who were hospitalized, and there were discussions about changing things, but you may be right that none of those changes were incorporated into the actual HIPAA rules.
When I read about proposed changes to HIPAA, I figured they would be passed, because the trend seems to be erosion of individual privacy in the interest of whatever the government says, and I didn’t verify everything prior to my reply.
Good catch. It appears at least initially I’m wrong and you’re right. I’m going to research it more later, but it likely won’t change things.
It looks like there are updates to HIPPA based on concerns about Dobbs, so I am probably wrong overall.
But:
https://www.hhs.gov/hipaa/for-professionals/faq/488/does-hipaa-permit-a-doctor-to-discuss-a-patients-health-status-with-the-patients-family-and-friends/index.html
“Even when the patient is not present or it is impracticable because of emergency circumstances or the patient’s incapacity for the covered entity to ask the patient about discussing her care or payment with a family member or other person, a covered entity may share this information with the person when, in exercising professional judgment, it determines that doing so would be in the best interest of the patient. See 45 CFR 164.510(b).”
I may not be wrong after all?
https://www.law.cornell.edu/cfr/text/45/164.510
“(3) Limited uses and disclosures when the individual is not present. If the individual is not present, or the opportunity to agree or object to the use or disclosure cannot practicably be provided because of the individual’s incapacity or an emergency circumstance, the covered entity may, in the exercise of professional judgment, determine whether the disclosure is in the best interests of the individual and, if so, disclose only the protected health information that is directly relevant to the person’s involvement with the individual’s care or payment related to the individual’s health care or needed for notification purposes. A covered entity may use professional judgment and its experience with common practice to make reasonable inferences of the individual’s best interest in allowing a person to act on behalf of the individual to pick up filled prescriptions, medical supplies, X-rays, or other similar forms of protected health information.”
It doesn’t seem like this exception can be waived. What are emergency circumstances or incapacity? What if I don’t want anything disclosed based on someone else’s professional judgment?
I just still think there is way too much leeway to allow things to be shared based on the ambiguous language of the text.
*HIPAA