Age-adjusted mortality rates and public health discourse
A very enlightening statistic that’s practically unknown among the general public is age-adjusted mortality. Age-adjusted mortality works like this:
(1) The crude mortality rate in a population is simply the number of deaths in a year divided by the total population.
(2) The age-adjusted mortality rate is the number of deaths in a year adjusted to a standard population. A standard population is a statistical construct: a fixed age distribution that is held constant over time for purposes of analysis, since age is by far the biggest risk factor for annual mortality. In other words, if you looked only at crude mortality rates, you would conclude that a developing country with a very young population had much better health than a developed country with a very old one, which is the opposite of the truth once you control for the differences in the ages of the respective populations. The same is true across time within individual countries: for example, the median age of the US population was 28 in 1970, but today it's 39.
Age-adjusted mortality is usually expressed as the number of deaths in a year per 100,000 members of a standard population.
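For anyone who wants to see the mechanics, here's a minimal sketch of direct age standardization in Python. Every number in it (the age bands, the death rates, the population shares) is invented purely for illustration; none of it comes from actual CDC data:

```python
# Minimal sketch of direct age standardization. All figures are made up.

# Age-specific death rates per 100,000 in two hypothetical countries.
# The "old" country has lower death rates within every age band.
rates_young_country = {"0-24": 80, "25-64": 300, "65+": 4000}
rates_old_country   = {"0-24": 40, "25-64": 200, "65+": 3500}

# Each country's own age distribution (shares of its own population).
shares_young_country = {"0-24": 0.50, "25-64": 0.45, "65+": 0.05}
shares_old_country   = {"0-24": 0.25, "25-64": 0.50, "65+": 0.25}

# A standard population: one fixed age distribution applied to both countries.
standard_shares = {"0-24": 0.35, "25-64": 0.50, "65+": 0.15}

def weighted_rate(age_specific_rates, shares):
    """Deaths per 100,000, weighting each age band's rate by the given shares."""
    return round(sum(age_specific_rates[band] * shares[band] for band in shares), 1)

# Crude rates use each country's own age mix; the old country looks far worse
# simply because more of its people are in the high-mortality 65+ band.
print(weighted_rate(rates_young_country, shares_young_country))  # 375.0
print(weighted_rate(rates_old_country, shares_old_country))      # 985.0

# Age-adjusted rates apply the same standard age mix to both, revealing that
# the old country actually has better mortality at every age.
print(weighted_rate(rates_young_country, standard_shares))  # 778.0
print(weighted_rate(rates_old_country, standard_shares))    # 639.0
```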
Here’s a chart that shows the change in the rate between 1900 and 2018. Updating the chart through 2023:
Age-adjusted mortality rate per 100K standard population:
2019: 715.2 (All-time low)
2020: 835.4 (First year of pandemic)
2021: 879.7
2022: 798.8
2023: 750.4
As you can see if you click on the link above, another way of putting this is that the pandemic pushed the mortality rate back to what it had been 25 years earlier. By 2023 things were back to where they were in 2009. With the decline in both Covid and overdose deaths in 2024, the number for last year will probably be pretty close to the 2019 pre-pandemic historical low.
A couple of general points:
The AAMR helps drive home how absurd the vague nostalgic belief is that our grandparents and great-grandparents were healthier than we are because they had healthier lifestyles. Jimmy Carter was born in 1924, at a time when an American's risk of dying over the course of a year was nearly three times higher than it is today.
The difference in mortality risk between men and women is larger, by orders of magnitude, than a lot of health risks that get vastly more attention. For example, I blogged a few days ago about the debate regarding whether moderate alcohol consumption increases cancer risk. The argument that it does assumes that the 10% to 20% increase in risk over a tiny baseline (the risk that someone who doesn't drink will die from an alcohol-associated cancer) is 100% a product of moderate alcohol consumption rather than of other factors. But a 10% to 20% increase in risk over a tiny baseline risk is still a tiny risk.
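To make the relative-versus-absolute point concrete, here's a quick bit of arithmetic. The baseline risk below is a made-up placeholder, not an actual estimate of alcohol-associated cancer mortality:

```python
# Illustrative arithmetic only: the baseline is a placeholder, not a real estimate.
baseline_risk = 0.004  # hypothetical risk of 0.4%, i.e., 400 per 100,000

for relative_increase in (0.10, 0.20):
    elevated_risk = baseline_risk * (1 + relative_increase)
    absolute_increase = elevated_risk - baseline_risk
    print(f"+{relative_increase:.0%} relative -> {elevated_risk:.2%} total, "
          f"an absolute increase of {absolute_increase * 100_000:.0f} per 100,000")

# +10% relative -> 0.44% total, an absolute increase of 40 per 100,000
# +20% relative -> 0.48% total, an absolute increase of 80 per 100,000
```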
Meanwhile, my risk of dying in 2025 compared to a woman who is demographically identical to me in all respects other than biological sex is 40% higher! (Age-adjusted mortality risk for men in 2023: 884.2/100K. For women: 632.8/100K.) Which is to say that the difference in mortality risk between men and women is the same as the difference in mortality risk for the general population between 1935 and 1968, i.e., an entire generation of public health advances, or nearly double the difference in mortality risk between pre-pandemic America and the height of the pandemic in 2021. And that's the all-cause mortality risk, not the tiny risk of dying from the small group of cancers where a very small increase in risk is associated with drinking one or two drinks per day.
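For anyone who wants to check my arithmetic, here's the comparison worked out using only the rates already quoted in this post:

```python
# Back-of-the-envelope check using only rates quoted above (per 100,000).
male_2023, female_2023 = 884.2, 632.8
prepandemic_2019, pandemic_peak_2021 = 715.2, 879.7

# Relative gap between the sexes in 2023.
sex_gap = male_2023 / female_2023 - 1                      # ~0.40, i.e., 40% higher for men
# Relative jump from the 2019 low to the 2021 pandemic peak.
pandemic_gap = pandemic_peak_2021 / prepandemic_2019 - 1   # ~0.23, i.e., 23%

print(f"Men vs women, 2023:    +{sex_gap:.0%}")       # +40%
print(f"2021 peak vs 2019 low: +{pandemic_gap:.0%}")  # +23%
```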
Given the prevalence of maleness in the general population, the absence of attention to the masculinity epidemic is rather striking.*
*Kidding/not kidding