Curing the Health Care Debate

With Republicans controlling both chambers of the 114th Congress, health care is destined to stay at the top of the national agenda. A quarter of midterm voters say health care is their top concern, with half of these voters saying Obamacare is too far-reaching. Speaker Boehner says the House will vote to repeal Obamacare’s most unpopular elements. That may satisfy some Republicans, but it will only increase pressure outside the party for a “Republican alternative.” 

This is actually a great opportunity for Republicans — and Democrats — who want to move the health care debate beyond familiar arguments over insurance to the overarching goal of improving health. Issues of insurance access and affordability must be considered. But they are secondary to the main goals of a health system: curing people who get sick, and keeping people from getting sick in the first place. 

Reducing health costs is a function of curing disease, not the other way round. Washington has been working the problem backwards, focusing on finances first. This perplexes voters. A quarter century of insurance reforms has brought health maintenance organizations, “managed competition,” and now Obamacare mandates. Through it all, voters have noticed an alarming trend: they aren’t getting healthier, just more put upon. Voters ask Washington to fix health care, wanting better treatments and lower costs. Washington hears the message on costs and gets to work. Lost in translation is voters’ greater desire to hasten efforts to cure disease. New laws complicate an already convoluted health insurance system, frustrating voters’ efforts to access newer or better treatments. Washington thinks it has solved the problem, only to be taken aback when voters rebel.

Americans know from personal experience that cures and cost are closely linked. There is no way to reduce costs for a disease that is chronic and incurable. Blood sugar drugs for type 2 diabetics are fine while they work, but inevitably things worsen and the scale and cost of medical crises skyrocket. If diabetes could be stopped earlier, even if the treatment cost more, wouldn’t that be better, and cheaper, than trips to the hospital down the road? 

Americans see our health system for what it is, good and bad. They are living longer, which is good, and there are remarkable doctors and hospitals to save them if they are badly hurt. However, Americans also realize they are much more likely to develop chronic diseases and that, uncured, these conditions will worsen as they age. Who in Washington, they wonder, is working to solve that problem?

The demand for better medicine is especially strong when it comes to diseases like Alzheimer’s. A Breitbart News poll released the week after the midterm elections asked voters whether they would like a national effort to cure Alzheimer’s equivalent to the 1950s effort to cure polio. By a nearly 7 to 1 margin, 82% to 12%, poll respondents said yes, they would support such an initiative. As the article says, “in the world of public-opinion polling, not too many issues enjoy such massive support.” So, here is the great political opportunity in health care reform for a new coalition of Republicans, Democrats, and independents: tapping the vast political desire to solve disease.


A new approach to health starts with this question: how do we cure the life-threatening and chronic illnesses that pose the greatest threat to Americans’ lives and livelihoods? Framing the issue this way is critical. It restores the primacy of science and medicine over finance in curing disease. It recognizes that a few diseases pose disproportionate risk to the American way of life. It restores the traditional relationship between medical innovation and cost, recognizing that the best way to lower cost is to cure disease. Once a disease can be cured, or at least treated effectively, it can be insured. Until then, the prospect of care without end is destabilizing to insurers, and creates friction among consumers who want more, doctors who are pressured to do less, and payers worried about the system’s long-term solvency (as patients with advanced disease seek costly, last-ditch measures).

Why do most in Washington fail to see that science is more important than finance? Because health policy over the last quarter century has been driven by the fusion of two ideologies that elevate finance and social science over medicine and technology. The first ideology is “Financialism,” the belief, spurred by Wall Street, that health decisions are best controlled using financial incentives and penalties. The second way of thinking, identified by former White House policy aide James P. Pinkerton, is “Scarcitarianism.” Scarcitarians believe the outlook for humanity is inherently limited, that industry and technology do more harm than good, and medical technology, by encouraging health consumption, is especially pernicious. The fusion of these ideas within both political parties, and their broad acceptance by elected and appointed officials and Washington policy shops, has crowded out those who actually created the 20th century’s greatest medical breakthroughs — scientists, physicians, and technologists.

Shaped by financialist and scarcitarian ideas, Democrats and Republicans have fought bitterly over how to reform health insurance, but they have done so within a shared belief: changing how people buy care will change health care itself. Conservatives and libertarians put their faith in market mechanisms, believing these will get people to consume health care in just the right amounts. Liberals distrust the market, preferring government to make purchasing decisions on the public’s behalf, and to ensure care is available to all. Either way, both groups seek to squelch demand by limiting supply. This, despite ample evidence that demand for cures is limitless, at least for those who get seriously ill. Steve Jobs, for example, spared no expense trying to stop his tumor once it spread, as did John D. Rockefeller, Sr., a century earlier on behalf of his grandson, stricken with scarlet fever.

After his grandson’s death, Rockefeller poured a significant portion of his wealth into one of the most determined philanthropic efforts to cure disease. The Rockefeller Institute for Medical Research opened in 1901, delivering research and treatments for infectious disease. Its scope grew over the years. Today Rockefeller University encompasses every major area of medical research. The institute has hosted twenty-four Nobel Prize winners. Its scientists have made breakthroughs in understanding blood types, sleeping sickness, meningitis, heroin addiction, HIV, cancer, viruses, genetics, and human immunity. It is easier today to sum up the size of Rockefeller’s gifts than to account for their impact in hundreds of thousands, if not millions, of lives saved. 

Rockefeller, FDR, and their contemporaries were driven by a powerful sense of medical possibility. Yet the last forty years of health insurance debates have been dominated by an ideology, Scarcitarianism, that exults in limits and constraint. As described by Pinkerton, scarcitarians trace their intellectual roots to Thomas Malthus, the late 18th century English writer. Malthus believed Britain’s fast-growing population would outstrip its food supply, causing economic and social mayhem. He was wrong. Britain’s agricultural and manufacturing inventions through the 19th century improved productivity much faster than population grew. Undaunted, scarcitarians dusted off Malthusian notions of limits and applied them to health care, creating a deeply skeptical view of longer lifespans and medical technology. In the scarcitarian view, drug and device makers, hospitals, even doctors encourage people to spend far more on care than they “should.” Scarcitarians believe people long past any reasonable hope of recovering from an illness continue to get pointless, though costly, care. So neither doctors nor the public can be trusted to keep the technological genie in the bottle. The only way to restore rationality to health care, they conclude, is to create a cadre of dispassionate, enlightened intermediaries, usually government agencies or insurers, to allocate ever dearer health resources among ever growing numbers of older Americans.

Scarcitarians recoil at the suggestion that these ideas amount to death panels or anything so Orwellian. Far from it, they say, adding in the same breath that of course there should be some limit to what we spend on health care, especially for the very old and the very ill. Yet it becomes startlingly clear where some scarcitarians would set these limits if liberated from voters’ scrutiny. Dr. Ezekiel Emanuel, a leading architect of both Clinton and Obama health policies, wrote in The Atlantic this summer that he will refuse intensive medical treatment if he gets a life-threatening illness after he turns 75. Emanuel described how his octogenarian father, after undergoing major heart surgery, had slowed physically and cognitively. That’s not for me, Emanuel concludes — implying not for thee, either. Absent from Emanuel’s essay was any consideration that medicine fifteen years from now might be dramatically better. In the scarcitarian worldview, medical technology is a self-deceptive extravagance, and abnegation would do Americans a world of good.

The political dilemma for scarcitarians is how to construct agencies or processes to draw lines between “worthwhile” and “wasteful” medicine without suffering a political backlash. This is where Financialism lends a hand, providing opaque metrics like “quality-adjusted life years” to assess the value of new treatments. Scarcitarians’ goal is to move the window for valuing a therapy’s efficacy as close to death as possible. That shortens the period over which new treatments can show benefit, reducing their apparent value.
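To see how moving the valuation window drives the math, here is a minimal sketch of the standard QALY calculation. All of the costs and quality weights below are hypothetical, chosen only to illustrate the mechanism:

```python
# Illustrative only: hypothetical numbers, standard QALY arithmetic.
# QALYs gained = years of life gained x quality-of-life weight (0 to 1).

def cost_per_qaly(cost, years_gained, quality_weight):
    """Cost-effectiveness ratio: dollars per quality-adjusted life year."""
    return cost / (years_gained * quality_weight)

treatment_cost = 100_000  # hypothetical therapy cost

# Valued over a five-year horizon at good quality of life...
long_view = cost_per_qaly(treatment_cost, years_gained=5, quality_weight=0.8)

# ...versus valued over a single year near end of life, at lower quality.
short_view = cost_per_qaly(treatment_cost, years_gained=1, quality_weight=0.5)

print(f"${long_view:,.0f} vs ${short_view:,.0f} per QALY")  # $25,000 vs $200,000
```

The same therapy appears eight times less cost-effective simply because the valuation window was moved closer to death, which is the lever described above.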

The crux of the scarcitarians’ argument is that a treatment adding months to the life of someone severely ill in their seventies or eighties is much less valuable to society than treatments that add decades to the lives of people in their prime working years. Well, of course this makes sense, at first blush. Except that diseases do not conform to accounting rules; they conform to biological ones, and biological dysfunction can be fatal at any age.

Indeed, it is impossible, or at least grossly unscientific, to say we will cure a disease but minimize research or treatment of a particular set of patients. We will not cure lung cancer, for example, by studying tumors and treatments only in middle-aged people. We need to know how a tumor in a seventy-year-old differs from that of a forty-year-old, and whether these differences are caused by genetics, smoking, gender, or any of hundreds of other factors. In fact, cancer research has been transformed over the past decade by efforts to find patterns across different tumors. We now know tumors that occur in the same organ can vary widely, genetically speaking. Conversely, tumors in different parts of the body can share genetic drivers. Researchers made these discoveries because they looked at all kinds of patients, young and old, people who lived years after treatment and those who died quickly. If we decide that only people with disease severity below some threshold should get the latest treatments, everyone — young and old — will lose.

The last century of medical progress demonstrates that the path from idea to cure can be long and arduous, expensive at first, but eventually cheap, effective, and widely available. New cures often begin with a surgeon or drug researcher fighting a cruel or incurable illness. Early patients are “high risk.” They consent to research after widely used options, if they exist, fail. Patients undergoing experimental surgery are more likely to die or suffer complications. Drugmakers try to weed out toxic substances in the lab. Even so, the first patients to get a new drug face long odds. Gradually, the odds improve. Surgical techniques and drugs are refined to work better with fewer side effects. Clinical trials prove whether a new idea is equal to or better than current approaches. Knowledge spreads within the medical community. Soon, the “cutting edge” idea becomes part of the medical mainstream. What once cost millions and seldom worked now costs orders of magnitude less and works reliably. Over and over this pattern repeated itself during the twentieth century with infectious disease, battlefield care, heart disease, organ transplants, joint replacements, HIV, and cancer.

The formidable medical challenges we now face are the unpleasant side effects of the wildly successful past century of abundance. We have a safe and robust food supply, which is good, but it is filled with high-calorie, addictive products, which is bad. Few of us work grinding days in the fields or on factory floors, which is good for our longevity but bad for our metabolism. We have powerful antibiotics but we overuse them, hastening the day when they no longer work. We live longer, making age-related disease more common, and new ways of life and production probably increase the incidence of chronic disease. Our health care system’s goal should be to tackle these challenges, to stop or slow diseases that result from 21st century society, not to try to freeze the clock or turn it back.

Members of Congress understand on some level that curing disease is popular. That’s why they publicize their support for medical research and respond quickly to terrible situations of rare, fatal illnesses. Many politicians point to votes to increase funding for the National Institutes of Health as proof that they believe in cures. It’s true, the NIH’s budget doubled from the 1990s to 2010, although it has stabilized since then at about $30 billion a year. That sounds like a lot until you compare it to the $300 billion that Medicare and Medicaid spend each year treating chronic and age-related diseases.[1] When it comes to illnesses that pose the greatest threat to Americans’ lives and livelihoods, the ratio of federal spending on care versus cure is about 10:1. 

The ratio worsens when total system-wide spending on chronic care is taken into account, including all public and private expenditures. The CDC estimates annual public and private spending on chronic care totals about $2 trillion, against total public and private investments in medical research and development of $130 billion. That’s a roughly 15:1 ratio in favor of care over cure.[2] Our total annual spending on medical R&D equals about six and a half weeks of bond purchases by the Federal Reserve at the height of its quantitative easing program, and pales in comparison to Americans’ approximately $40 trillion of aggregate net worth. So the means to invest more in medicine are there to be tapped.
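For readers who want to check the arithmetic, the ratios follow directly from the round figures cited; the quantitative easing comparison assumes the Fed's peak purchase pace of roughly $85 billion per month in 2013, which is an assumption not stated in the text:

```python
# Back-of-envelope check of the care-versus-cure ratios, using the
# article's round figures. The QE pace ($85B/month) is an assumption
# based on the Fed's peak purchase rate in 2013.

federal_care = 300   # $B/yr: Medicare + Medicaid chronic and age-related care
federal_cure = 30    # $B/yr: NIH budget
print(f"Federal care:cure ratio ~{federal_care / federal_cure:.0f}:1")

total_care = 2_000   # $B/yr: CDC estimate, public + private chronic care
total_rd = 130       # $B/yr: public + private medical R&D
print(f"System-wide care:cure ratio ~{total_care / total_rd:.0f}:1")

qe_per_month = 85    # $B/month, assumed peak QE pace
weeks_of_qe = total_rd / qe_per_month * (52 / 12)
print(f"Annual medical R&D is roughly {weeks_of_qe:.1f} weeks of peak QE purchases")
```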

How does the system’s strong bias for care over cure manifest in practice? Consider our approach to type 2 diabetes, among the most widespread chronic diseases in America. Over the course of their illness, many diabetics cycle through treatments for blood sugar control, circulatory issues, asthma, arthritic pain, depression, and high cholesterol. Years of escalating symptoms and complex drug regimens culminate in a debilitating cascade of medical crises: kidney failure, glaucoma, peripheral nerve pain, and circulatory collapse. Advanced diabetics undergo dialysis, hospitalizations, even amputations. The cost in dollars, lost productivity, and lives is very high. Generally, the costs of diabetic care in the disease’s early stages are relatively low, since many blood sugar medications are cheap. Later on, however, as the disease progresses, costs quickly get out of hand as treatments prove less and less effective. 

It would be better to stop diabetes much earlier. Changing our food supply and how people eat is one way, but that requires immense shifts in a number of industries. A faster, though more intense, medical solution is gastric bypass surgery. This entails shrinking a person’s stomach and routing food farther down the digestive tract, reducing appetite and food absorption. Patients who follow a disciplined diet restore their metabolism, bringing blood sugar and cholesterol levels closer to normal. But the surgery costs four to five times the average annual cost to treat a diabetic with drugs. Private health insurers, who cover most diabetics until they move to Medicare, prefer the incrementally cheaper alternative — blood sugar medications — even though it results in much greater expenses and suffering over time. A system that favored cures over care would encourage approaches like gastric bypass surgery, and in doing so, also drive development of less extreme or less risky measures to reduce appetite.
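The cost logic can be sketched with hypothetical numbers. None of the figures below come from the text; they only illustrate the shape of the tradeoff, in which a one-time intervention costing several times the annual drug bill still comes out ahead once drug costs escalate and late-stage crises arrive:

```python
# Hypothetical illustration of cure-vs-care economics for a chronic disease.
# All dollar figures and rates are invented for the sketch.

ANNUAL_DRUG_COST = 3_000             # early-stage drug regimen, $/yr
SURGERY_COST = 5 * ANNUAL_DRUG_COST  # one-time "cure-like" intervention

def drug_pathway_cost(years, escalation=1.15, crisis_year=10, crisis_cost=80_000):
    """Cumulative cost of drugs escalating 15%/yr, plus one late-stage crisis
    (dialysis, hospitalization) once the disease has progressed."""
    total = sum(ANNUAL_DRUG_COST * escalation**t for t in range(years))
    if years > crisis_year:
        total += crisis_cost
    return total

for years in (5, 10, 15):
    print(f"{years:2d} yrs on drugs: ~${drug_pathway_cost(years):,.0f} "
          f"vs one-time surgery ${SURGERY_COST:,}")
```

In the early years the drug pathway looks cheaper, which is exactly the comparison an insurer facing annual renewals sees; the intervention only wins over a horizon longer than the insurer's typical relationship with the patient.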

Alzheimer’s Disease also illustrates how we overpay for care and underinvest in cures. Alzheimer’s has killed a president, leading politicians, authors, musicians, and millions of others. Sadly, it now afflicts two of America’s winningest men’s and women’s college basketball coaches, Dean Smith and Pat Summitt. Which of us is immune? We do not know, yet. There is much still to learn. Alzheimer’s appears to start years before a person is aware of memory loss or impaired thinking. Some versions seem to be driven by genetic breakdowns, while other forms are caused by a snowballing of genetic and non-genetic factors. It is clear we need to detect the disease in its earliest stages and learn how it spreads. A year and a half ago, pharma company Eli Lilly asked Medicare to reimburse use of a new brain imaging agent that helps visualize the spread of brain proteins that lead to Alzheimer’s. Medicare said no, we will not pay for it, except in very limited circumstances. A policy that erred on the side of cures would have made the opposite decision. Not only should the Lilly agent have been reimbursed; Lilly and other companies should have been encouraged to develop better and cheaper early detection approaches. Lilly may try again in the next couple of years, this time with an imaging agent that detects the protein tangles in Alzheimer’s that actually kill neurons.

All is not lost in Washington. One bright star in the congressional firmament is the House Energy and Commerce Committee, led by Fred Upton (R-MI) and its ranking Democratic member, Rep. Diana DeGette (D-CO). Under their leadership, the committee has held hearings over the last six months examining barriers to medical innovation. The committee is asking: Why are so few new drugs being approved by the FDA, outside of those for rare conditions? Why is it so expensive to develop new drugs and devices? Why is the FDA creating uncertainty over the use of mobile health apps? Why is venture capital shying away from life science companies even as it flocks to companies writing software for insurers and hospitals? What could Washington do, or stop doing, to hasten innovation?


Upton’s 21st Century Cures Initiative provides a starting point for work to build a congressional health agenda that is pro-science and pro-consumer. Six initiatives, outlined below, would reinforce the best aspects of the current system while accelerating efforts to cure disease. Bringing scientists, technologists, and consumers closer together is important; this effort will require a durable political coalition that can withstand pushback from winners in the status quo.

First, Congress needs to reassure consumers and build trust. The Hippocratic Oath famously requires doctors to do no harm. So too, Congress should preserve elements of the current insurance and health payments system that earn consumers’ trust. Status quo forces will resist the shift from care to cure. It threatens to upend their business models. They will appeal to consumers’ understandable anxiety about change. Making clear that key elements of the system on which consumers rely will not change is critical to earning voters’ trust.

It is most important to be clear, unequivocally, that Medicare benefits will not change. Yes, Medicare is at long-term risk of insolvency. However, it will be bankrupted, if it is, by the cumulative burden of incurable chronic disease, not by benefit formulas. Put in a positive light, Medicare can be saved in a politically acceptable way only by curing or slowing the onset of diabetes, dementia, and cancer. All other options are politically unpopular and practically ineffective. 

In terms of the insurance markets, protections for people with preexisting conditions must remain. The government and insurers should continue to fund high-value preventive health measures like prenatal care and counseling for young mothers, universal vaccinations and flu shots, early detection of neurodevelopmental issues and access to therapies, regular dental and foot care, and tests like colonoscopies and mammograms that aid in early cancer detection. Any rules requiring minimum levels of insurance coverage should focus on catastrophic situations and preventive health. Minimum coverage rules should give states flexibility to link requirements to gender and age. A sixty-year-old woman shouldn’t have to buy prenatal coverage but should be covered for mammograms, cognitive screening, and bone density tests. The converse is true for a twenty-year-old.

Building consumer trust also means giving people confidence that parents and loved ones receiving institutional or home care are treated with dignity. Caring for someone with advanced disease or dementia can be difficult. Multiple medications — a function of not being able to cure chronic disease — complicate matters. Patients may become combative or violent. Regardless, we cannot countenance abuse or neglect. Congress should create a more public system for consumers to report abuses, and to publish the outcomes of investigations. Personnel dismissed for abuse violations in one place should not be able to work as a caregiver in another. Care providers should be required to publish data relating to patient population, staff expertise, safety incidents, and unplanned hospitalizations for events such as falls or malnutrition.

Second, as the Ebola outbreak has shown, we are woefully unprepared for a major infectious disease outbreak. Bill Gates put it this way at a medical conference this month:

“The world as a whole doesn’t have the preparedness for epidemics, and we’ve had a few flu scares that got us to do some minor things, but not enough. If [Ebola] had been twice as transmissive, we’d be in a lot of trouble, and there are agents that have a real chance of coming on in the next several decades that are far more transmissive than this is. What’s to stop some form of SARS showing up?” [3]

Preparing for epidemics like Ebola or highly virulent flus means developing diagnostics to identify infections quickly, designing drugs, and building manufacturing capacity. Entities like the Gates Foundation and private sector entrepreneurs are making great progress on the rapid diagnostics front. Likewise, there are a number of efforts under way on Ebola and other infectious diseases to develop therapies. 

However, as Gates indicates, there are a couple of areas where government leadership is required. The first is to increase research into the growing incidence of antibiotic-resistant bacteria and the emergence of dangerous respiratory viruses like SARS and pandemic flu strains. Antibiotic resistance is an immediate challenge, and more virulent flus are a potentially existential threat. More Americans die each year from drug-resistant infections than from HIV. Twice as many people died from the 1918–1919 Spanish flu as died in the four years of world war that preceded it. 

The practical challenge of developing antibiotics and antivirals is that they have limited commercial value unless there is an outbreak, at which point demand overwhelms supply. We lack national infrastructure to produce diagnostics and treatments quickly and reliably. Yet, as we have seen with Ebola, if we wait for an outbreak to start producing, it will be too late. So we need a strategic reserve and a strategic manufacturing capacity, just as we have for military equipment and oil. Factories could be privately owned by US businesses, as is the case in defense industries, but they will need minimum purchase commitments by the government to keep personnel trained and machinery operating. In exchange for a government-backed marketplace, companies would provide discounted prices on treatments in the event of an outbreak, or if the products are not needed at scale, equivalent discounts on other products sold to public health programs. In addition, the US government could help fund antiviral production capacity at home by encouraging sales overseas, as we do with military hardware.

Third, we need to change the paradigm of age-related and chronic disease research. Consumers need to be involved long before they are diagnosed with a particular disease to help identify how diseases start and which treatments work best. Our current approach to treating disease is reactive. Someone is diagnosed when their symptoms compel them to go to a doctor or hospital. Unfortunately, many chronic conditions start before outward symptoms appear. By the time treatment starts, the disease process is established, making the situation harder, even impossible, to cure. A better approach is to detect disease risk or onset as early as possible and understand how the disease may progress in similarly situated people. In the same way, we will learn over time which treatments work best for which people.

Designing cures takes lots of time and data. Even the wealthiest and most knowledgeable people cannot afford to wait until they are seriously ill to work on a cure. Steve Jobs was one of the first people to get his tumor genome sequenced in search of a specially targeted drug combination to halt the tumor’s spread. Likewise, Dr. Ralph Steinman, a Nobel Prize-winning immunologist, mobilized colleagues to help fashion a treatment for his pancreatic cancer using his own discoveries. Sadly, neither Jobs nor Steinman survived. Since few of us know what will kill us, or when, we share an interest in getting as many others to participate in efforts to understand chronic disease. We all stand to gain from the hunt for cures. 

Moreover, we all have something to give. Each of us has unique genetic mutations and chemical markers that affect how our DNA works. We live in different environments affecting how we live and age. We acquire unique combinations of helpful bacteria that live inside us, called our microbiome. Some genomic anomalies can be deathly serious, like cystic fibrosis, and efforts are underway to develop therapies to cure these kinds of disease. However, most chronic diseases are the result of many small genetic or environmental impacts. In isolation, each of these changes is inconsequential. In aggregate, over time, they affect the risk of getting a disease, how fast it develops, and how well a drug works. Stopping chronic disease is like steering an aircraft carrier: you have to plan ahead. We need to approach efforts to cure chronic and age-related disease as though forty is the new sixty. That is, we need to understand what’s happening in our bodies in our forties and fifties if we want the years after sixty to be as disease-free as possible. Waiting until we are sixty or symptomatic is too late. We will have lost twenty-plus years of useful data, and by then, disease processes may be too entrenched to cure.

Our health system is ill-prepared for this kind of paradigm shift. But history, and political leadership, shows how it might be done. Franklin Roosevelt created the March of Dimes to cure polio while in the White House. Within a month of its launch in January 1938, the White House received 2.7 million dimes in the mail, notwithstanding the lingering Great Depression. When the Salk vaccine was ready for field testing in 1954, 1.8 million schoolchildren became “Polio Pioneers,” receiving the vaccine. Ethical standards for clinical trials have evolved a lot since then, but the communitarian ethos of the March of Dimes, inspired by FDR, points the way towards a similar national mobilization to defeat chronic disease.

However, instead of money, we need Americans to participate in building the largest and most valuable collection of biological samples in history. Participating Americans would give a little more at each annual checkup — blood or saliva — or a few hours every couple of years to take non-invasive tests of movement and memory. Data may even be gathered remotely as cellphones and wearable sensors advance. In aggregate, and over time, the United States will build the deepest knowledge about how chronic disease emerges and can be treated. 

There are a number of for-profit firms and academic consortia that have begun to build data collections, including for Alzheimer’s, autism, cancers, personal genomes, and fitness. These are valuable services, and should also be integrated into a larger data commons if they are willing. But the answers are not in genetic information or exercise schedules alone. We need a more systematic way to amass and associate data, and more open systems to share and integrate analyses. Studies that produce insights need to be retested and replicated so we do not chase phantoms. And, we must recognize that no one wants constant surveillance, even if it is voluntary. Americans will want to know insurers, employers, or credit agencies cannot access their information or use it against them. So privacy is critical. Finally, it will be important to provide a long-term commitment that information provided today will be repaid, if not in cures — since no one can guarantee that — in access to targeted and cutting edge therapies. 

Ironic though it is in the age of NSA snooping, only the government or a government-sanctioned entity can provide the enforcement mechanisms to protect how health information is collected or used. Beginning the data commons as a publicly driven effort makes sense since the ultimate beneficiaries are taxpayers, who will benefit through lower outlays by public health programs. Medicare, now funded on a pay-as-you-go basis, could be saved by a national effort to pay it forward.

The best model for a new kind of data marketplace is the web, an effort which began as a government initiative but was privatized over time. It grew rapidly thanks to standard interoperability protocols and approaches to transmitting and presenting information. A similar approach could work here. Congress could authorize creation of a marketplace for health data, including a new type of health data company authorized to receive and market data. These companies would need to meet standards around data privacy and storage, and rules to ensure granular data (even anonymized) is only available to legitimate scientific endeavors or product developers. The data marketplace could be overseen by a private, not-for-profit entity similar to ICANN, the body that coordinates the internet’s domain name system, which would grant licenses to operate. A board of data experts, technologists, privacy experts, and medical researchers could oversee work to create standards on data interoperability and privacy. 

Data collectors would be able to license information to researchers and product makers, provided that individuals who contributed data are notified which researchers have accessed datasets that include their information. The contributors’ identities would be unknown to the researchers, but the researchers’ identities would be known to contributors. Over time, mechanisms could be developed to enable contributors to receive small royalty payments from product makers who used data sets to develop or test their products. In exchange for participating in the data commons, Americans would be eligible to participate in trials of new drugs or diagnostics relevant to their health situations, with treatment costs backstopped by Medicare if not covered by private health insurance.
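The asymmetric-transparency rule described here, with researchers visible to contributors but contributors anonymous to researchers, can be pictured in a small sketch. Every name, class, and field below is invented for illustration:

```python
# Hypothetical sketch of the data commons' access-notification rule.
from dataclasses import dataclass, field
from uuid import uuid4

@dataclass
class AccessEvent:
    researcher: str   # researchers' identities are disclosed to contributors
    dataset_id: str
    purpose: str

@dataclass
class Contributor:
    datasets: set
    # contributors are known only by a pseudonym, never by name
    pseudonym: str = field(default_factory=lambda: uuid4().hex)
    notifications: list = field(default_factory=list)

def record_access(event: AccessEvent, contributors: list) -> None:
    """Log a dataset access and notify every contributor whose data it holds."""
    for person in contributors:
        if event.dataset_id in person.datasets:
            person.notifications.append(event)

# A contributor whose samples sit in a (hypothetical) Alzheimer's dataset:
alice = Contributor(datasets={"alz-imaging"})
record_access(AccessEvent("University Lab A", "alz-imaging", "tau study"), [alice])
print(alice.notifications[0].researcher)  # University Lab A
```

The design choice worth noting is that notification flows one way: the access event carries the researcher's identity to the contributor, while the researcher never sees anything but pseudonymous records.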

A natural partner to data gathering firms would be life insurers, or health insurers providing supplemental Medicare plans, since they have the strongest interest in Americans living longer and reducing health costs as they age. However, to develop new insurance products linked to participation in the data commons, Congress may need to alter insurance laws to enable an explicit link between coverage and participation in a data collective, and to ensure a person’s status as a contributor to the data commons is excluded from insurers’ underwriting decisions.

A key part of improving the health system is to improve access to the most effective treatments. So the fourth measure Congress should take is to end the geographic divide in treatment of life-threatening illnesses. If you get cancer and live in a big city like New York, Philadelphia, Los Angeles, Miami, or Boston, there's a good chance you will be treated by a top doctor or at a top medical center. Unfortunately, the odds are much longer if you live in a town like Lincoln, Nebraska or Hattiesburg, Mississippi. Inhabitants of these cities, like the 100 million Americans who live in cities with fewer than 350,000 residents, are more than two hours by car or plane from top medical centers.

This distance can be the difference between life and death. Data published by the National Cancer Institute illustrate the geographic divide in treatment of ovarian cancer. A number of states with rural populations located farther from top cancer centers have disproportionately higher ovarian cancer death rates, including Alabama, Arkansas, Iowa, Mississippi, Montana, Oregon, South Carolina, Tennessee, Vermont, Virginia, and West Virginia. Conversely, many states that have high ovarian cancer rates but top cancer hospitals in larger cities have disproportionately lower death rates, including Colorado, Connecticut, Florida, Maryland, Michigan, New York, Pennsylvania, Utah, and Washington.

This is not to say that patients from rural areas have no chance to go to a premier cancer center, but their first visit to one for treatment is likely to occur later than it does for patients who live nearby. Getting care at a top center early in cancer treatment is important. Most cancer patients do not die from their first tumor but rather from a tumor recurrence or spread, and a patient's chance of living longer increases if doctors deal with the original tumor more effectively. Experience with new diagnostics that pinpoint tumor vulnerabilities, and access to clinical trials of new anti-cancer therapies, is concentrated in larger cancer centers. Many of these hospitals also perform more tumor surgeries, which has been shown to correspond with better outcomes in some cancer types.[4]

Congress can start to fix the geographic divide by making it easier for people to travel to top medical centers and for top centers' doctors to treat rural patients using video conferencing. Health insurance usually covers the cost for a patient to travel for a second opinion. However, this can be of limited value for a patient who needs to travel with a partner or children, or stay for weeks while undergoing treatment. Congress should extend coverage to these travel and lodging costs; the incremental expense is a small price to pay if it means more patients get better treatments sooner.

To encourage telemedicine, Congress should treat consultations by physicians affiliated with National Cancer Institute Designated Centers of Excellence on an equal footing with in-person office visits. In addition, Congress should underwrite deployment of video consultation and high-speed internet connections, grouping each NCI center with at least fifteen rural hospitals with meaningful cancer caseloads. There are 68 NCI-designated cancer centers. At a 15:1 ratio, investment in video streaming could put experts in large cities within reach of patients in over a thousand rural hospitals. Once built, of course, video conferencing capability could be used for non-cancer consultations, or for physicians to interact with patients and caregivers at home. Getting rural patients to top doctors earlier will also accelerate new drug testing by reducing the time it takes to enroll patients in clinical trials. Many top cancer trials are run at NCI-designated facilities, or the offices of affiliated oncologists.
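The reach implied by that ratio is straightforward arithmetic, sketched here using only the figures given above:

```python
# Scale of the proposed telemedicine build-out.
NCI_CENTERS = 68           # NCI-designated cancer centers
HOSPITALS_PER_CENTER = 15  # rural hospitals grouped with each center

rural_hospitals_reached = NCI_CENTERS * HOSPITALS_PER_CENTER
print(rural_hospitals_reached)  # 1020
```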

Fifth, Congress should authorize states and localities to create medical enterprise zones to support life science startups. One model for doing this is the crash program of the 1990s to treat HIV. The fight against AIDS was a tacit, if unofficial, enterprise zone. Many rules governing drug development were effectively discarded. Government and private sector scientists, regulators, physicians, and patients and families built formal and informal alliances to beat AIDS. New drug ideas were tested as quickly as possible. Insights into how the virus and treatments worked were widely shared. The best products got to market in record time. Drug pricing was a source of friction between patient advocates and drugmakers, but the strong public interest in getting drugs widely used led to a durable pricing regime including private insurers and public assistance for impoverished HIV patients. Activists engaged affected communities to save lives, organizing patients to sign up for clinical trials and keeping pressure on the government to err on the side of getting new therapies to market. This political coalition had another benefit: few product liability lawsuits. Early HIV drugs had serious side effects. Yet, in the more open environment in which they were developed, and in the face of an incurable disease, there were fewer surprises for patients and less opportunity for trial lawyers.

Medical enterprise zones can build on lessons from the HIV fight. They would give device and drug startups access to labs, equipment, shared back-office services, and purchasing collaboratives. Companies would qualify for current or future tax breaks provided they build labs or manufacturing centers there. Costs to start an enterprise zone could be covered by federally backed bonds, repaid from royalties on product sales. Companies that relocate outside an enterprise zone would remain subject to royalty obligations for a time. In addition, product liability exposure for participating companies would be limited. Subject to state legislative action, punitive and pain-and-suffering awards would be capped unless a company admits civil or criminal liability or a senior officer is convicted of the same. In exchange for these benefits, enterprise zone members would agree to auction intellectual property on any products or research they abandon, to provide most-favored-nation drug pricing to Medicaid plans of states in the enterprise zone, and to publish results and underlying data from all clinical trials of their products whether or not the trial result is favorable.

Finally, Congress should enable new ways to pay for drugs and surgeries that have long-term benefits. Right now, treatments are paid for when they are given, regardless of whether the benefit lasts a day, a week, or a decade. As a result, new drugs that cure or defer a serious health problem for a long time are very expensive. A recent example is the debate over new Hepatitis C drugs. Hep-C is a chronic liver infection that, unchecked, causes liver failure or cancer. Either condition kills without a liver transplant, and even then the risks are high. Current treatments are only partially effective in stopping or slowing the virus and require patients to take drugs over many months. New Hep-C drugs seem to eradicate the virus permanently, but are priced at about $100,000 for a one-time, three-month course of treatment. Unfortunately, many of the 3.0 to 3.5 million Americans with Hep-C are on public health plans. Treating a million people on public health plans would cost $50 to $100 billion, depending on how much of a price cut drugmakers give the public programs.
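The budget arithmetic behind that range can be checked directly. A minimal sketch, using only the figures above (the $100,000 list price, a million patients, and a public-program discount of zero to half):

```python
# Rough outlay for treating Hep-C patients on public health plans.
LIST_PRICE = 100_000  # approximate price of a one-time three-month course

def total_cost(patients, discount):
    """Total outlay if public programs pay list price less a fractional discount."""
    return patients * LIST_PRICE * (1 - discount)

patients = 1_000_000
for discount in (0.0, 0.5):
    billions = total_cost(patients, discount) / 1e9
    print(f"{discount:.0%} discount: ${billions:.0f} billion")
```

With no discount the bill is $100 billion; a 50% discount brings it to $50 billion, matching the essay's range.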

This is the dilemma of paying for drugs up-front. Clearly we want Americans to be cured, but equally clearly, public health plans cannot afford most or all of the drug's cost. One way to solve this problem is to pay drug companies some amount now to cover the drug, with the remainder paid out annually over time provided the drug continues to work. Spreading out payments could benefit everyone involved. In the case of Hep-C, drugmakers might get paid 25% of their desired price up-front, with the rest earned out over time in annual payments based on the patient's age. Drug companies might even earn more under this annuitized price structure than under current pricing schemes if patients live longer disease-free. Public health plans could treat more people faster. Drugmakers and doctors would have a strong incentive to maintain contact with patients to assess drug efficacy. The same annuitized pricing approach might also be used for some surgeries, like gastric bypass surgery for type 2 diabetics. Like Hep-C drugs, the surgery costs more upfront than existing drugs, but it is effective long-term in slowing or stopping diabetes.
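A simple sketch of the annuitized structure described above. The 25% upfront share comes from the essay; the ten-year payout period is a hypothetical parameter chosen for illustration, not part of any actual payment program:

```python
# Hypothetical annuitized drug payment: a fraction of list price upfront,
# the remainder spread over equal annual payments while the drug keeps working.
def payment_schedule(list_price, upfront_share=0.25, years=10):
    """Return the upfront payment and a list of annual payments.

    Payments after the first are contingent on the drug continuing
    to work; if it fails, the remaining payments stop.
    """
    upfront = list_price * upfront_share
    annual = (list_price - upfront) / years
    return upfront, [annual] * years

upfront, annuals = payment_schedule(100_000)
print(upfront)     # 25000.0 paid at treatment
print(annuals[0])  # 7500.0 per year for ten years, while the cure holds
```

Under this sketch the payer's exposure in any single year is small, while the drugmaker collects full price only if the cure endures, which is the alignment of incentives the essay argues for.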


Electoral volatility over the past few election cycles underscores voters’ disaffection with Washington. Both parties have let voters down. Both parties have paid the price. Voters want leadership and policies that will make a real difference in their lives. Health insurance reform has been bitterly contested, but a quarter century later there has been little improvement in Americans’ health. If anything, medical technology has advanced despite Washington.

Now is the moment to change that dynamic, to enact a set of health policies that are actually about health. This starts with agreement on a few core ideas. Cures are better and cheaper than care. Science enables finance, not vice versa. Curing the worst diseases will require the trust and involvement of as many Americans as possible. We are all in this together. 

The health policies outlined above would advance these goals — building voter and consumer trust, improving our preparedness for infectious disease, creating a new way for Americans to contribute to and share the benefits of medical research, ending the geographic divide in accessing top doctors and hospitals, cultivating medical startups using enterprise zones, and developing new ways to price drugs based on their long-term value.

Health care is complicated. However, our goals today are the same as they were at the dawn of modern medicine a century and a half ago. We need to understand how our bodies work, how to restore function after injury, how to kill pathogenic microbes, and how to slow or stop breakdowns as we age. The twentieth century brought tremendous progress against existential threats like smallpox and heart disease. Now we must overcome the unintended consequences of that progress, chronic disease that hastens aging and decay. Scientists and technologists have made remarkable progress in understanding genetics and key biological processes.

Now they — and we — need a political and policy framework that does everything possible to accelerate research and product development. We know drugs will need to be more targeted, both in who uses them and in how they work within our bodies. We know our social safety net will be bankrupted unless we can solve chronic disease. Solving these issues, creating a health system for the 21st century, begins with an unremitting commitment to advancing cures, and with building a coalition of political leaders, scientists, and consumers committed to bringing great medical discoveries to fruition.

[1] Centers for Medicare and Medicaid Services, “Chronic Conditions Among Medicare Beneficiaries, Chartbook: 2012 edition”, p. 22

[2] 75% of health costs are for chronic disease: Total health costs of $2.7 trillion in 2011: US Medical R&D Costs:

[4] “Hospital Volume and Late Survival After Cancer Surgery,” Annals of Surgery, vol. 245, no. 5, May 2007