A Search for Answers to Continuing Health and Mortality Disparities in the United States

By Alberto Palloni and James Yonker

Over the past twenty years or so, patterns of mortality and health in the U.S. population have developed two strong, disturbing traits. First, compared to other high-income countries, U.S. mortality and health status rank among the worst. Second, disparities in mortality and health status by race, income, and education, far from weakening over time, have remained invariant or have widened. In this article, we describe the nature and possible causes of the U.S. mortality disadvantage in later life (older than age 50) and review the most recent findings regarding health and mortality disparities by race and education. We argue that these disparities are puzzling because they are remarkably persistent in all modern societies, and particularly acute in the United States, despite advances in medical technology and healthcare.

The Growing Gap in Mortality in High-Income Countries

Two recent reports issued by the National Academy of Sciences argue convincingly that Americans experience higher mortality and worse health than their peers in other developed, high-income countries (National Research Council, 2011a, 2011b, 2013). As of 2012, U.S. life expectancy at birth among females ranked last or next to last in a set of seventeen high-income countries, trailing the top-performing countries by more than four years (National Research Council, 2013). Conditions among males are slightly better: they rank slightly below the middle of the group of seventeen countries and trail the top performers by 3.6 years. Empirical evidence reveals that the U.S. disadvantage is visible both in mortality between ages 0 and 50 and at ages older than 50. But Americans may fare a bit better if they survive to older ages: at least in the most recent life tables, life expectancies at ages 65 to 70 are closer to the average of peer countries, and at very old ages (75 years and older), the United States experiences an advantage (Palloni and Yonker, 2013). Aside from this regularity at very old ages, Americans simply do not live as long as their peers in other wealthy nations.

The U.S. disadvantage also is visible in a broad array of health status measures, including pediatric outcomes (birth weight and length of gestation), adolescent and adult mental health, child and adult obesity, diabetes, teenage pregnancy, and accidents and injuries (National Research Council, 2013). The lower ranking of the United States is sustained over a long arc of the life span, from birth to late adulthood. This suggests that the U.S. mortality and health disadvantage is a symptom of a generalized malaise that affects individuals at all stages of the life cycle, not the product of particularly bad conditions at any one of them.

Intriguingly, the United States did not always experience the life expectancy (or health) disadvantage we see today. In the 1950s and 1960s, U.S. life expectancy at birth, and at ages older than forty-five, was above the average value of the seventeen countries referred to above (the “pooled benchmark”), infant mortality ranked among the five lowest, and the very old already enjoyed the same advantage they experience today, albeit starting at somewhat younger ages (around age 70 rather than 75). However, sometime during the late 1970s and early 1980s, the pace of U.S. gains in life expectancy slowed abruptly, infant mortality became an egregious outlier among the seventeen countries, and female mortality began falling behind the peer countries even more rapidly than male mortality. From that point on, peer countries pulled away from the United States and have continued to do so uninterruptedly for nearly thirty years.

What caused this reversal? Is the U.S. disadvantage in mortality the same at all ages? And if not, at which ages does the United States exhibit an advantage and which ages drag down the pace of progress in survival? Is the poor performance of the United States associated with the presence of large groups of minorities that experience poor health and mortality, or is it a generalized phenomenon?

Historical trends in life expectancies

Figure 1 (see below) shows the difference in life expectancy at birth (males and females) between the United States and the average of a set of seventeen high-income (peer) countries (Palloni and Yonker, 2014). Figure 2 (below) shows differences in life expectancy at age 50 and between ages 0 and 50. In both figures, a positive value indicates that the United States had the higher life expectancy. In 1955, both American females and males enjoyed relative life expectancy advantages and were comfortably situated in the upper half of the range spanned by the best- and worst-performing countries. Females exceeded life expectancy in the pooled benchmark by 2.3 years and males by 1 year.

Unfortunately, these advantages did not last. By 1960, average life expectancy in the comparison countries had caught up with that of U.S. males and, by 1965, surpassed it by one year. With the exception of some years around 1980, from 1965 forward, U.S. males lagged behind the benchmark average by between 1 and 2.6 years. By 2010, the U.S. disadvantage was 2.3 years, and U.S. male life expectancy was closer to that of the worst-performing country in the pooled benchmark. Put another way, 2010 U.S. male life expectancy was roughly the same as the benchmark’s average male life expectancy ten years earlier.

Female life expectancy followed a somewhat different and more dramatic pattern. The 2.4-year U.S. female life expectancy advantage observed in 1955 was gone by 1965. From then until 1980, U.S. female life expectancy tracked the benchmark average; beginning around 1980, yearly gains slowed and the female disadvantage grew substantially and continued to increase unabated. By 2010, the earlier advantage had become a gaping disadvantage of about 3.1 years. Female life expectancy in the United States was very close to the worst among the benchmark countries, taking on values that these countries had experienced fifteen to twenty years earlier.

Figure 2 (see below) displays differences in life expectancy at age 50 and between ages 0 and 50, relative to the pooled benchmark. Note that the U.S. male disadvantage at age 50 was already in place by the late 1950s, whereas among females it emerged only in the late 1970s, but then grew much more quickly than that of males. As with life expectancy at birth, there has been a wholesale, steady, uninterrupted deterioration lasting more than thirty years. The current gap is on the order of 1.5 years for males and 1.9 years for females.

Explaining the U.S. disadvantage

Why does the United States underperform the peer countries in the benchmark? Two panels of experts (National Research Council, 2011a, 2011b, 2013) reviewed the most important facts and concluded that there is no single smoking gun, but multiple explanations. First, an important fraction of the U.S. disadvantage relative to peer countries is due to higher infant and child mortality. Although this figure is inflated by high levels of infant and child mortality in the African American population, the United States would continue to underperform peer countries even if everyone experienced the same levels of infant and child mortality as the non-Hispanic white population.

Second, the United States also underperforms peer countries at ages 15 to 49, where the bulk of mortality is due not to infectious or chronic diseases but to external causes such as accidents (especially involving motor vehicles) and violence (particularly homicides and suicides). The bulk of violent deaths probably involves firearms, whereas the role played by suicide is surely underestimated, as deaths by suicide often are classified under another cause of death. In the United States and elsewhere, an important fraction of deaths between ages 15 and 30 is associated with drug consumption, but it is difficult to support this contention with empirical evidence because the necessary details are seldom retrievable from publicly available data.

Finally, the U.S. population experiences relatively poorer survival between ages 50 and 69, approximately. The most likely explanations here are exposures to conditions and behaviors that increase mortality risks from cancers, chronic obstructive pulmonary disease (COPD), and cardiovascular diseases in general. Smoking, diet, and a sedentary lifestyle loom large as the real culprits. The pervasiveness of these habits in the U.S. population unquestionably has increased over the last twenty years or so.

This is all bad news. But there is a bright spot: the U.S. population appears to do well, better than most countries in the benchmark we use here, at ages above 70 or 75, approximately. We turn to this in the next section.

The puzzle of U.S. mortality and health at older ages

Mortality rates in the United States compare favorably to other high-income countries only at very old ages. In particular, age-specific mortality rates in the United States (male and female) are higher than peer countries for most of the life course, but at older ages (above ages 70 to 75) they are lower. While life expectancy at birth, at age 5, or at age 65 is lower in the United States than in peer countries, life expectancies at ages older than about 70 to 75 are, on average, more favorable in the United States. Americans who attain very old ages experience higher survival probabilities and higher residual life expectancy than their peers. This is referred to as the “U.S. mortality crossover” and the age at which the U.S. mortality advantage begins as the “crossover age.”

There is no clear consensus in the literature regarding the age beyond which the older population experiences lower mortality (Palloni and Yonker, 2013). In some accounts, this age is estimated to be between 80 and 85, in others between 70 and 75, and in others it is believed to be as low as 65, at least for females. The following two factors may explain the U.S. advantage at very old ages. First, the establishment of Medicare in the 1960s provided the majority of the U.S. population older than age 65 with relatively inexpensive access to key healthcare services. Second, and most importantly, the quality of U.S. medical care—in prevention, detection, and treatment—for chronic conditions most commonly experienced at older ages is superior to that in many peer countries (Ho and Preston, 2010).

Mortality Disparities in the United States and Elsewhere: Race, Wealth, and Education

While the worldwide standing of the United States in health status and mortality has deteriorated over the years, the United States and its peer high-income countries share an enduring and remarkable trait: individuals who occupy the lower ranks of the social and economic hierarchies experience worse health status and higher levels of mortality. In addition, in the United States (as in some peer countries), members of ethnic or national minority groups experience worse health and mortality regardless of their position in the social hierarchy. These health and mortality disparities by race or socioeconomic position are known as the socioeconomic (SES) gradient in mortality and health, or the “SES gradient” for short.

One would think that with the last century’s remarkable advances in medicine, health and mortality disparities across social groups within modern societies would have disappeared. But they have not—and the United States is no exception.

Nature of education gradients in health and mortality

With the exception of some pre-industrial societies in the 1500s, individuals at the top of a social hierarchy have always enjoyed better health and longer lives than those at the bottom. As a consequence of the discovery of germ theory and the massive revolution in public health and medical technology that followed, these disparities became less pronounced over time, though they never disappeared. During the past twenty years, many of the same high-income countries that benefitted from consistent and uninterrupted gains in life expectancy have experienced an increase in these gradients. The gradients are not the same everywhere but, startlingly, they are as salient in countries with socialized medical care and health services as in societies dominated by market-driven healthcare systems, and they occur irrespective of the magnitude of national health expenditures.

Several regularities stand out. First, it does not matter how we define an individual’s position in the social hierarchy: we can use income, wealth, occupation, or education, and the gradient appears all the same. Second, the gradient is observed whether we use mortality indicators or any measure of health status we care to define. Different indicators target different dimensions of health, from self-reported health to diagnosed illnesses, disabilities, and limitations. Yet, with some exceptions (e.g., cancer), the gradient stubbornly persists and manifests itself with more or less the same strength irrespective of the health outcome measured.

Third, the gradient is expressed at all ages, though its magnitude varies over the life span: it increases from birth to about age 50, and then generally decreases at very old ages, perhaps as a result of the disproportionate selection of the most frail among those in the lower strata. Gender contrasts also can be quite powerful and vary significantly across indicators of socioeconomic standing or health; with few exceptions, however, what applies to males typically applies to females.

Finally, although health and mortality gradients vary across racial and ethnic groups, the variability of within-group gradients is not large, and where exceptions exist (for example, among some recent immigrant groups), they may be the result of selective migration.

How large are mortality gradients in the United States?

First, what are the gradients by race? According to very recent estimates, the difference in life expectancy between African Americans and non-Hispanic whites in the United States is about 3.7 years for females and 5.3 years for males. These differences are only slightly smaller in absolute terms (by less than 1.5 years in each case) than those that prevailed in 1970. While differentials in infant and child mortality explain an important part of these contrasts, the disparities occur at all but the very oldest ages.

Second, estimates of mortality disparities by income are hard to come by and imperfect: income can change frequently, suddenly, or unexpectedly, so it is not a good measure of socioeconomic position. Yet mortality disparities by income are observed all the same and are well-graduated (i.e., contrasts appear not only between the very poor and the very rich, but across the board). In recent years (between 1995 and 2005), U.S. adults in the lowest quartile of the income distribution experienced mortality rates 1.5 to 2.5 times higher than those in the higher quartiles. Disparities by wealth, a better and more stable measure of position in the social hierarchy, confirm similar contrasts between individuals located at various points of the wealth distribution.

Perhaps the most shocking gradient is the one involving educational attainment. Empirical estimates of the education gradient in mortality are the sharpest and least controversial, and they have a long history in high-income countries. They were first assessed in the United States in the early 1970s and have since become the target of many studies and controversies about their nature. Empirical studies are unanimous on this score, repeatedly unearthing similar contrasts between education groups. These contrasts are graduated, not just between the extremes; they have persisted for a long time; and they may have become larger in the recent past.

When translated into a life expectancy metric, U.S. mortality data from the late 1970s and early 1980s imply that an average individual with less than a high school education could expect to live about 43.5 years after age 25, whereas an average individual with at least some college education could expect to live 46.5 years, a difference of about three years. Twenty or so years later, differences in mortality above age 25 between those with a high school education or less and those with at least some college are equivalent to a disparity in life expectancy at age 25 on the order of 3 to 3.5 years of life (Palloni, 2006). These estimates are broadly consistent with a recent study using the National Longitudinal Mortality Study, which showed that the difference in life expectancy at age 25 between those with a high school education or less and those with some college or more was about 2.8 years in 1981 to 1988 and grew to about 3.7 years in 1991 to 1998 (Meara, Richards, and Cutler, 2008). Education gradients are not the same by gender and ethnicity, but the within-group education estimates confirm rather than disprove the existence and time trends of the education gradients (Meara, Richards, and Cutler, 2008). The magnitude of these empirical estimates is rather large, amounting to about 7 percent of expected years of life above age 25.

The increase in the education gradient over time does not fit easily with common intuitions about massive improvements in medical technology and better health for all. Continuous medical and technological progress has not been accompanied by an equalizing shift in the distribution of health benefits; instead, it has taken place alongside persistent or widening health inequalities. Interestingly, the persistence of the education gradient has coincided with a period of expanding income inequality and diminished social mobility, both in the United States and in other high-income countries.

The above state of affairs characterizes not only the United States, but other high-income countries as well. This has recently been corroborated in a massive study using a multi-country, multi-continent database of published estimates. In it, Baker et al. (2011) estimate that the relative mortality risks between those with low education (less than high school) and those with more education (high school and above) hover in the range of 1.22 to 1.35 in North America, and between 1.34 and 1.42 in Europe. Assuming a Western (female) model mortality pattern and a life expectancy of around 75 years, these relative risks translate into disparities in life expectancy at age 25 on the order of 3 to 3.5 years. These findings suggest that high-income countries, far from relegating health and mortality disparities by education to the dustbin of history, continue to experience them and, more baffling still, show unequivocal signs that the disparities are becoming sharper over time.
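The translation from a relative mortality risk into a life expectancy gap can be illustrated with a toy calculation. The sketch below is only a rough illustration under assumed inputs: it uses a simple Gompertz hazard rather than the model life tables referenced above, with hypothetical parameter values chosen so that overall life expectancy falls in the high-income range, and it applies a uniform relative risk of 1.3 (within the North American range reported by Baker et al.) to the lower-education group.

```python
import math

def remaining_life_expectancy(a, b, start_age=25, max_age=110, step=0.1):
    """Remaining life expectancy at start_age under a Gompertz hazard
    mu(x) = a * exp(b * x), approximated by numerical integration of
    the survival curve in small age steps."""
    total = 0.0
    s = 1.0  # survival probability, conditional on being alive at start_age
    age = start_age
    while age < max_age:
        total += s * step            # accumulate person-years lived
        mu = a * math.exp(b * age)   # hazard at this age
        s *= math.exp(-mu * step)    # survive one small step
        age += step
    return total

# Hypothetical baseline Gompertz parameters (illustrative, not fitted to U.S. data)
a, b = 2e-5, 0.1
rr = 1.3  # assumed uniform relative mortality risk for the low-education group

e25_high_ed = remaining_life_expectancy(a, b)        # more-educated group
e25_low_ed = remaining_life_expectancy(rr * a, b)    # less-educated group

print(f"e25, more education: {e25_high_ed:.1f} years")
print(f"e25, less education: {e25_low_ed:.1f} years")
print(f"gap: {e25_high_ed - e25_low_ed:.1f} years")
```

With these particular assumed parameters the computed gap comes out in the neighborhood of 2.5 years; different baseline hazards or a proper model life table would move it toward the 3 to 3.5 years reported in the text. The point of the exercise is that a modest proportional difference in death rates, compounded over the entire adult life span, produces a multi-year difference in expected length of life.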

And why are education disparities so intriguing? Because they defy preconceived notions about our ability to attenuate inequalities: after nearly 150 years of uninterrupted medical advancement and progress in delivering healthcare, they persist. Education disparities also are puzzling because, even after massive amounts of research, we cannot find convincing explanations. Part of the explanation is that there are important income and wealth differentials by education; another part is that individuals with different educational attainment are exposed to different amounts of health information and knowledge of risk factors to avoid. But these accounts add up to only a partial and unsatisfactory explanation.


Perhaps the answer ought to be sought elsewhere. An important new avenue of research suggests that the key to adult differentials in education, health, and mortality must be sought in exposures that individuals with different educational attainment experienced during childhood. These exposures contribute to the formation of traits conducive to healthy lifestyles, including individual discipline, confidence in the future, and industriousness. Were this research to produce satisfactory explanations, it would have one major implication: that health and mortality disparities in one generation literally are inherited from the previous one and that, consequently, to break the cycle, interventions must target the child, not the adult. This is a tall order, and one that becomes ever more difficult to accomplish if the current trend toward underfunding public schools continues.

Alberto Palloni is the Samuel H. Preston Professor of Population at the Center for Health and Aging, University of Wisconsin-Madison in Madison, Wisconsin. James Yonker is a doctoral candidate in the Department of Sociology at the University of Wisconsin-Madison.

Editor’s Note: This article is taken from the winter 2014/15 issue of ASA’s quarterly journal, Generations, an issue devoted to the topic “Social and Health Disparities in America's Aging Population.” ASA members receive Generations as a membership benefit; non-members may purchase subscriptions or single copies of issues at our online store. Full digital access to current and back issues of Generations is also available to ASA members and Generations subscribers at Ingenta Connect.



References

Baker, D. P., et al. 2011. “The Education Effect on Population Health: A Reassessment.” Population and Development Review 37: 307–22.

Ho, J. Y., and Preston, S. H. 2010. “U.S. Mortality in an International Context: Age Variations.” Population and Development Review 36(4): 749–73.

Meara, E. R., Richards, S., and Cutler, D. M. 2008. “The Gap Gets Bigger: Changes in Mortality and Life Expectancy, by Education, 1981–2000.” Health Affairs 27(2): 350–60.

National Research Council. 2011a. International Differences in Mortality at Older Ages: Dimensions and Sources. Washington, DC: The National Academies Press.

National Research Council. 2011b. Explaining Divergent Levels of Longevity in High-Income Countries. Washington, DC: The National Academies Press.

National Research Council. 2013. U.S. Health in International Perspective: Shorter Lives, Poorer Health. Washington, DC: The National Academies Press.

Palloni, A. 2006. “Reproducing Inequalities: Luck, Wallets, and the Enduring Effects of Childhood Health.” Demography 43(4): 587–615.

Palloni, A., and Yonker, J. 2013. “Why Are We Getting Worse?” Paper presented at the Population Association of America Meeting, San Francisco, May 2012.