We’ve all signed those vague privacy statements when visiting our local hospital for medical care. But how many of us have actually read the fine print and understand that the most sensitive details of our medical lives may be shared with drug companies for research purposes? And that we, as patients, may be asked to take part in a clinical trial that brings financial benefits not only for the company running the trial but also for the hospital that shared our data in the first place?
A recently announced partnership between drug companies and 13 New York state hospitals to mine patient data for clinical trials brings such questions into sharp focus, raising troubling ethical and privacy concerns, according to several consumer and privacy advocates. As Businessweek reports, this project is just one of many new initiatives being created “as medical providers, software vendors, and health data businesses seek ways to profit from the flood of clinical data now being gathered electronically.”
This particular partnership, dubbed PACeR and spearheaded by such pharmaceutical giants as Pfizer, Merck, Roche, and Johnson & Johnson, would be brokered by Quintiles, a contract research organization (CRO) that runs clinical trials. It would work like this:
The participating hospitals would provide patient data scrubbed of key identifiers (names, addresses, Social Security numbers, etc.) to Quintiles, and drug companies could then mine that data for potential clinical trial candidates for a hefty fee (between $50,000 and $250,000 per query), a sizable portion of which would accrue back to the hospitals. Quintiles would then tell the hospitals which patients the companies would like to recruit, and the hospitals would ask those patients’ doctors to find out whether they were interested in participating in the trial. According to Businessweek, only the hospitals would have access to patients’ identifying information, and patients would have the final say over whether they wanted to be recruited for a study.
All well and good? Not really, privacy and consumer advocates say. First, there is the issue of transparency. Most patients have no idea that their medical data is being shared with commercial entities interested in doing research. Nor do they know that the hospitals involved in this partnership stand to benefit handsomely from sharing that data.
“In order for privacy to work, you’ve got to have complete transparency,” says Lillie Coney, associate director of the Electronic Privacy Information Center, a nonpartisan research center in Washington, D.C. “You have to tell patients why you’re collecting their information and what you are going to do with it.” Just getting patients to sign a vague privacy statement when they enter the hospital doesn’t cut it, says Coney, especially when the hospital stands to benefit financially from sharing your data.
“These patients are being contacted by hospitals or doctors who have a [financial] incentive here,” Coney says. “How free is the doctor going to be to say they don’t want to do this? After all, they have to practice at the hospital.”
Not only does this kind of arrangement put pressure on doctors to sign up patients for clinical trials; it also puts pressure on the patients themselves, who may not want to antagonize their physicians by saying no.
This kind of ethical dilemma, of course, predates the existence of electronic medical records. Some ethicists believe that doctors should never be paid or pressured to recruit patients for research because that blurs their roles as caregivers and puts the doctors in a position of doing something that may not be in the best interests of their patients. (For example, patients being recruited for these trials may be taking drugs that the researchers know nothing about, leading to potentially troublesome drug interactions.) And PACeR ups the ante on such troubling conflicts because it introduces big profit incentives for hospitals as well as doctors.
There is another big problem with the project, and it lies in the nature of the all-seeing Web. The founders of PACeR make a big deal about how the data will be scrubbed of identifiers in compliance with HIPAA, so the drug companies won’t know who the individual patients are. However, new technologies make it very easy for computers to link the available patient data with other public information that’s out there on the Web, allowing companies to identify individual patients.
“I think that’s a valid concern given today’s technology,” says Dr. Michael Carome, deputy director of Public Citizen’s Health Research Group. “Under HIPAA, there are 18 identifiers that have to be removed before the data is shared for research purposes, such as name, address, Social Security number, date of admission, etc. But with today’s technology, that may not be enough. Maybe nothing is truly private no matter how stripped it is.”
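To see why stripped-down data may not be enough, consider a so-called linkage attack, in which de-identified records are matched against publicly available data on the coarse demographics HIPAA still allows to remain (such as year of birth, sex, and the first three digits of a ZIP code). The sketch below is purely illustrative; the column names and records are invented and are not drawn from PACeR or any real dataset.

```python
# Hypothetical illustration of a linkage attack on de-identified records.
# All data and column names are invented for this example.
import pandas as pd

# De-identified clinical extract: no names or addresses, but coarse
# demographics (birth year, sex, 3-digit ZIP) remain.
deidentified = pd.DataFrame({
    "patient_ref": ["A17", "B42", "C03"],
    "birth_year":  [1954, 1987, 1954],
    "sex":         ["F", "M", "F"],
    "zip3":        ["100", "112", "104"],
    "diagnosis":   ["type 2 diabetes", "asthma", "breast cancer"],
})

# Publicly available records: voter rolls, social profiles, and the like.
public_records = pd.DataFrame({
    "name":       ["Jane Roe", "John Doe"],
    "birth_year": [1954, 1987],
    "sex":        ["F", "M"],
    "zip3":       ["104", "112"],
})

# Join on the shared quasi-identifiers; a unique match re-attaches a name
# to a supposedly anonymous medical record.
linked = deidentified.merge(public_records, on=["birth_year", "sex", "zip3"])
print(linked[["name", "diagnosis"]])
```

In this toy example, two of the three "anonymous" patients are re-identified by name simply because their combination of birth year, sex, and partial ZIP code is unique in the public data, which is exactly the kind of matching that worries privacy advocates.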
In sum, both Carome and Coney agree, the hospitals participating in this ambitious project should think long and hard about whether they really want to be involved in this kind of endeavor, which could jeopardize their core mission: confidential patient care.
This process used to be called the dehumanization of persons. Now it is better called the commodification of patients. For hospitals and physicians to buy into it is appalling. Even worse, the so-called research that Quintiles might generate will be next to worthless. Insult upon injury.