In the past hundred years, the U.S. has experienced dramatic improvements in medical therapies, largely due to the field’s adoption of science. We’ve acquired technologies like antibiotics, randomized clinical trials, The Pill, pacemakers, organ transplants, and high-resolution medical imaging: the list goes on and on. These developments have led to a higher quality of life, less disability, and lower rates of death. Before science-based medicine, you could die of something as simple as an infection from a flesh wound. Today, because of antibiotics, that’s nearly unheard of.
Our arsenal of treatments keeps expanding, promising to extend people’s lives even further. For example, the past two decades have seen a drop in deaths from heart attacks thanks to small scaffolds, called stents, that are wedged in the coronary artery and restore blood flow to the heart. Even more revolutionary advances like gene therapy, nanotechnology and the cultivation of human stem cells are likely to stave off the specter of death even longer, while making our lives productive into very old age.
However, despite these advances, medicine in the U.S. has not kept pace in other ways. For example, the U.S. now spends the most on health care among developed nations, two to four times more per person. This spending represents a staggering 20 percent of our economy. And yet we continue to achieve poorer results for our money and lag behind other industrialized nations in vital statistics such as infant mortality.
In addition, many Americans are not happy with how they are treated by doctors and by the healthcare system at large. This discontent can be seen most clearly in end-of-life care. A 1995 study to understand and improve care for terminally ill patients, the largest of its kind to date, found that more than 40 percent of families were unhappy with the way their loved ones were cared for as they died.
Trust in doctors has eroded as well. A 2014 study by the Robert Wood Johnson Foundation found that only 34 percent of Americans have great confidence in the leaders of the medical profession. Back in 1966, nearly three-fourths of Americans felt the same way. In addition, only 58 percent of Americans agreed with the statement, “All things considered, doctors in your country can be trusted,” giving the U.S. a ranking of 24th out of the 29 countries where the question was asked.
How is this possible? How can physicians in the U.S. have access to the most technologically advanced arsenal of treatments the world has ever known and still fall behind in terms of cost, outcomes and patient satisfaction? There are a number of interrelated issues:
1) Our healthcare system focuses on treating diseases, not people.
The medical knowledge we gained in the 20th century had a very narrow goal: stop people from dying. It was focused on treating short bouts of illness caused by a specific disease, often localized to a particular organ or organ system. However, the CDC estimates that over half of adults in the U.S. suffer from one or more chronic diseases that cannot be cured, only managed. The costs of treating these diseases now represent 75 percent of the $2 trillion in U.S. annual healthcare spending. While we will always need acute care, managing chronic illness requires a different mindset. Physicians must consider not only the physical disease, but the psychological, cultural, and socioeconomic factors that contribute to the illness. It is no longer enough to simply treat the most pressing symptom and wait for the patient to return when the condition gets worse.
2) Our payment structure reinforces the focus on disease by rewarding procedures, not cheaper interventions like prevention or care coordination.
In the 1950s, 60s and into the 70s, primary care physicians were well respected members of the community and helped patients navigate and coordinate more specialized care. Children of this era remember having a family doctor who would attend to all of their family’s medical needs. However, as Forbes columnist Todd Hixon beautifully summarizes:
In the 1980s and 1990s, as the cost of healthcare became burdensome for corporate and government payers, the dynamic changed. The federal government and the insurance companies created a structure of procedures and payment rates for each. Procedures based on higher levels of training and technology received higher fees. The Feds and insurers tried to push down prices of procedures, but at the same time they rewarded advances in medical knowledge and technology, and the result was highly trained specialists were well paid for performing sophisticated procedures, and family doctors were squeezed.
With money flowing to specialists, primary care doctors were forced to see more and more patients and had less time to spend with any one patient. Unpaid services such as preventive care and care coordination quickly went out the window. Lured by greater prestige and earning potential, medical students funneled into specialties, creating a shortage of primary care doctors. Today, it isn’t unusual for patients to be shuttled from one specialist to another with no one looking at the bigger picture of the patient’s well-being.
3) Treatment decisions are influenced by money, not necessarily what is best for the patient.
The procedure-based payment structure rewards doctors for doing more, even when it might be better to do nothing. As Sanjaya Kumar and David Nash write in their groundbreaking 2011 book Demand Better: Revive Our Broken Healthcare System:
Our healthcare delivery system spends more than $700 billion of its $2.3 trillion in annual health spending on medical care that does nothing to improve a patient’s health…seven hundred billion dollars every year. And, most alarmingly, all that ineffective treatment and harmful care represents one-third of tests, treatments and procedures that physicians perform.
It’s not that physicians are looking to waste resources or get rich, but as Kumar and Nash note:
[Our current reimbursement] system and our cultural values serve up a ready answer to physician uncertainty as to what tests and treatments to order for their patients: more is better. When evidence is incomplete or conflicting about when to use a particular procedure, surgery or diagnostic test…some physicians will treat more aggressively, especially if piecework reimbursement rewards that.
Unfortunately, only about 20 percent of clinical procedures have solid scientific evidence to back them up. This means in many cases physicians are flying blind and under great economic pressure to do more, even when it doesn’t necessarily serve the patient’s needs. With an arsenal of government-approved treatments available, there is great temptation to do “something,” even when it might be better to simply watch and wait.
4) Patients’ preferences, goals and values are marginalized.
When the patient is reduced to a vehicle for disease, the doctor becomes the most important person in the healthcare process. This may work fine when medical decisions are straightforward. But when there is ambiguity, a patient’s preferences, goals and values are essential in choosing the right course of action. The current culture of medicine (in addition to the economic incentives mentioned above) doesn’t encourage this kind of two-way communication. As Dutch social scientist Jozien Bensing notes:
…the discussion about norms and values inherent in every clinical judgment and decision seems to shift from the doctor’s consultation room to the conference room of the doctor’s professional association. If intentionally or unconsciously physicians do not want to negotiate with their patient about the usefulness of certain interventions, they can refer to the opinion of their professional association that is codified in guidelines and protocols instead, thereby shifting the responsibility for clinical decisions from a personal decision to a professional group decision.
All too often, the patient buys into the mindset that decisions about their health are best left to doctors. They become passive recipients, rather than active participants in their own care. This may have been acceptable when the aim of medicine was simply to keep people alive, but chronic conditions, in particular, require the patient to play a larger role in managing their own health.
Adding fuel to the fire is a rising tide of chronic illness. A glut of cheap calories in the American diet and a lack of daily activity have led to a dramatic increase in obesity and its associated conditions, such as diabetes and heart disease, particularly in the last 30 years. Meanwhile, the Baby Boom generation, which represents a quarter of the U.S. population, is beginning to hit retirement age. This is a period of life when we become more vulnerable to illness and chronic conditions tend to accumulate.
As we’ve seen, the disease-based and doctor-centered medicine that brought us so far in the 20th century isn’t well equipped to mitigate and manage this growing tide of chronic disease. A new paradigm is needed that treats the whole patient and establishes a more balanced relationship between doctors and their patients.
Demand Better!: Revive Our Broken Healthcare System. Second River Healthcare Press, 2011. http://www.demandbetter.com
The Flexner Report – 100 years later. Yale Journal of Biology and Medicine.
Public Trust in Physicians — U.S. Medicine in International Perspective. The New England Journal of Medicine, 2014.
A controlled trial to improve care for seriously ill hospitalized patients. The study to understand prognoses and preferences for outcomes and risks of treatments (SUPPORT). New England Journal of Medicine, 1996.
The Doctor Patient Relationship is at a Cross Roads. Forbes website.
Bridging the gap: The separate worlds of evidence-based medicine and patient-centered medicine. Patient Education and Counseling, 2000.
How the Current System Fails People With Chronic Illnesses. Society of Actuaries website. Accessed March 17, 2015.
New Study Finds Less Than 25 Percent of New Doctors Work in Primary Care. George Washington University website. Accessed March 17, 2015.
An Unhealthy America: The Economic Burden of Chronic Disease. The Milken Institute. Accessed March 17, 2015.
The Power of Prevention. CDC National Center for Chronic Disease Prevention and Health Promotion. Accessed March 17, 2015.
Chronic Diseases and Health Promotion. CDC website. Accessed March 17, 2015.