There are (at least) two medical cultures in our world: treatment and research. The doctors you typically interact with were trained to treat patients, using the findings of medical researchers. Their primary professional concern is the welfare of their patients. This has some negative ramifications, only one of which I'll talk about today.
Medical researchers conduct studies. Sometimes people in them get helped, and sometimes they don't, but the progress of medical science depends on these studies. There are two basic ways to do empirical research: run an experiment yourself, or run analyses on data that have already been collected.
Running a study yourself is extraordinarily expensive in both time and money. Medical research would advance much faster if researchers had access to already-existing data, in which they could look for trends.
There are many, many more treatment doctors than medical researchers. Why would a treatment doctor (hereafter simply "doctor") want to share patients' data? In general, they wouldn't, and the logic is simple: sharing patients' data is at least a minor breach, or a potential breach, of their privacy, and it does nothing to help the health of those particular patients. Doctors' mandate is to help their own patients, so they have no incentive to share data.
The result is that medical researchers (hereafter "researchers") do not have access to the treatments and outcomes of typical patients. This lack of coordination, combined with privacy fears, has held back medical research, and so more people suffer.
Medical privacy is important for two reasons:
1) Standard privacy concerns: “I just don’t want other people to know.”
2) Real, practical concerns: “Nobody would date me”; “my insurance rates would go up.”
A government with socialized medicine is in a unique position to fix this. It can mandate a centralized database of anonymized medical data on all patients, treatments, and follow-ups, which medical researchers can then use to fight disease.
In countries with socialized medicine, sharing your data should be thought of as part of what you pay as a citizen, much like a tax: your data contribute to future citizens’ health. You pay taxes; you share data (or at least make sharing an opt-out policy; see below).
But as I heard Tenenbaum say during the question period of a talk (see the reference section), privacy is something everyone but the very sick cares about.
The technical details are important. Every available step should be taken to ensure privacy. However, just because it won’t be perfect does not mean we should not do it. The potential loss of privacy for some individuals is an acceptable price to pay for the benefits of such a database.
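As one illustration of such a step (my example, not something the post specifies), a release pipeline might check that published records satisfy k-anonymity: every combination of quasi-identifying fields, such as zip code and birth year, must appear in at least k records, so no one is singled out by those fields alone. A minimal sketch:

```python
from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k):
    """Return True if every combination of quasi-identifier values
    appears in at least k records. This is one privacy check among
    many, not a complete anonymization scheme."""
    combos = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return all(count >= k for count in combos.values())

# Hypothetical patient rows; zip code and birth year are the quasi-identifiers.
rows = [
    {"zip": "94305", "birth_year": 1970, "diagnosis": "melanoma"},
    {"zip": "94305", "birth_year": 1970, "diagnosis": "CML"},
    {"zip": "10001", "birth_year": 1955, "diagnosis": "melanoma"},
]
# The lone "10001" row is unique, so this release would fail the check.
print(is_k_anonymous(rows, ["zip", "birth_year"], k=2))
```

A real pipeline would generalize or suppress the offending values (e.g., truncate zip codes, bucket birth years) until the check passes, which is exactly the kind of imperfect-but-worthwhile protection argued for above.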
Let’s take cancer as an example, as argued in Tenenbaum and Shrager (2011). It kills millions of people and is one of the primary killers in industrialized societies. At an abstract level, cancer is out-of-control cell reproduction, but it arises from thousands of different kinds of mutations. As a result, it is a prime candidate for individualized medicine.
Oncologists (cancer doctors) treat patients with several medicines at once (“cocktails”). Doctors’ primary motivation is to make their patients well, so they have little incentive to contribute what was learned with each of their patients to a wider community. But it is exactly this information that is needed to scientifically explore individualized treatment.
Gleevec, for example, is a drug that helps only 3% of melanoma patients. Cancer drugs are typically tested on heterogeneous groups of cancer patients, and when a medicine can help only 3% of them, it becomes prohibitively costly, if not impossible, to reach significance in a trial: millions of patients would be required. Simply put, if we keep testing the way we have been, there are not enough cancer patients in the world to find drugs that help such a small percentage of people. And because many potential drugs will help only a small percentage, we will never get them to market this way.
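The sample-size blow-up can be sketched with the standard two-proportion approximation. The cure rates and responder gain below are made-up illustrative numbers, not figures from the paper; the point is how fast the required trial size grows as the responder fraction shrinks:

```python
from math import ceil

def n_per_arm(p1, p2, z_alpha=1.96, z_beta=0.84):
    """Approximate per-arm sample size for a two-proportion z-test
    (alpha = 0.05 two-sided, power = 0.80 by default)."""
    var = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_alpha + z_beta) ** 2 * var / (p1 - p2) ** 2)

# Illustrative assumption: baseline cure rate 20%; the drug raises a
# responder's cure probability to 70%, but only a fraction f of
# patients respond, so the treatment arm's population-level rate is
# diluted to 0.20 + 0.50 * f.
baseline, gain = 0.20, 0.50
for f in (1.0, 0.25, 0.03):
    n = n_per_arm(baseline, baseline + gain * f)
    print(f"responder fraction {f}: ~{n:,} patients per arm")
```

Because the required n scales with the inverse square of the diluted effect, shrinking the responder fraction from 100% to 3% inflates the trial by roughly a factor of a thousand, and with rarer mutations or smaller per-responder gains the numbers climb into the millions the text describes.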
“Cancer Commons” is a web-based learning community that is a step in the right direction, as is Patients Like Me, a voluntary sharing site. But we should go much further than hoping people will volunteer. Policies like this should be opt-out, not opt-in, because many more people will participate. As Wikipedia notes regarding organ donation, "For example, Germany, which uses an opt-in system, has an organ donation consent rate of 12% among its population, while Austria, a country with a very similar culture and economic development, but which uses an opt-out system, has a consent rate of 99.98%."
Let's pool our data and beat this thing, finally, shall we?
Pictured: Image of a cell by MRK.
Tenenbaum, J. M., & Shrager, J. (2011). Cancer: A computational disease that AI can cure. AI Magazine, 32(2), 14–26.
(or watch the talk at http://videolectures.net/aaai2010_tenenbaum_cac/)