When it comes down to it, a huge swath of biological research has the end goal of expanding the length and quality of human lives. Even the most basic research can have an eventual role in curing disease, preventing deficits associated with aging, or providing a healthier environment to live in. While this is a valiant and noble goal, it may be, to some extent, for naught. Back in October 2016, researchers declared that the natural limit to the human lifespan is around 115 years old.1 "Natural limit" means the maximum average age that humans would reach without any disease cutting life short or any artificial way of prolonging it. Basically, if everyone died of "old age", what would our lifespan be?
The paper, published in Nature, one of the highest-profile scientific journals, looked at mortality data from developed countries over the past century or so. All of these countries have shown a steady increase in life expectancy from 1900 to the present, with exceptions for people born in the late 1910s and early 1940s.2 Important lesson to take from these data: try not to be born during a world war. This, however, just represents average life expectancy, not maximum life expectancy. There are a lot of data to suggest that this increase in life expectancy over the past century is due to decreases in infant and childhood mortality.3 In the more recent past, there has also been an increase in the number of people reaching late life, which pulls the average life expectancy up even further.4 None of this, however, is remotely helpful if we want to think about maximum life expectancy. To really understand this, let's think about how averages work.
Let's say you have five people.
The average life expectancy increases across the three time points, but that only tells us so much about the data. It says nothing about the range of ages people die at, which, in this case, is the same at all three time points. It says nothing about how much variability there is in the data, which decreases across the three time points. Most importantly, for our purposes, it says nothing about the maximum age. While the average age is increasing, the maximum stays constant.
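To make this concrete, here is a toy sketch in Python. The numbers are entirely made up (they are not from the paper or any dataset); they just illustrate how a mean can climb while the maximum stays pinned in place:

```python
from statistics import mean

# Made-up ages at death for five people in three hypothetical cohorts.
# As infant/childhood mortality falls, the mean rises, but the oldest
# person in every cohort still dies at the same maximum age.
cohorts = {
    "early": [1, 10, 30, 60, 100],
    "middle": [1, 40, 60, 80, 100],
    "late": [1, 70, 80, 90, 100],
}

for name, ages in cohorts.items():
    print(f"{name:>6}: mean={mean(ages):.1f}, "
          f"range={max(ages) - min(ages)}, max={max(ages)}")
# The mean goes 40.2 -> 56.2 -> 68.2 while the max stays 100.
```

The design choice is deliberate: only the middle of each distribution shifts, which is exactly the pattern you get when early-life deaths become rarer.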
So how did these researchers determine that 115 specifically was the natural maximum of human life? The most basic analysis looked at the age of death for supercentenarians (people over 110 years of age) over a period of time.5 This number increased rapidly from the 1970s to the 1990s, but has mostly held steady since 1995. Among people who live to late life, the average age of death hasn't increased in the past 20 years, suggesting that we have hit, or come close to, that maximum lifespan. The same pattern held for the maximum age at death per year as well as the average, which indicates that there aren't really any outliers. The conclusion was reinforced by looking at rates of change over time. The age showing the largest gains in survivorship rose from about 85 in the 1920s to about 100 in the late 1970s, and plateaued around that point. Rates of change over the last 100 years increased dramatically for ages between about 45 and 100, then dropped off, indicating that there has not been a major increase in the number of people reaching 100 in the last century.6 The authors observed two different trends in the data, one trending upward and the other trending downward, and split the data in order to determine if these were, in fact, different. The results of this analysis indicated that a positively sloped model of maximum age fits the early time points, while a slightly negatively sloped model fits the later time points, indicating that after the mid-1990s a peak had been reached in the maximum age. Describing all these data is a little tough, so I've included the figures from the paper so you can visualize them.
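The split-and-fit step above can be sketched in a few lines of Python. To be clear, the data here are synthetic, the 1995 breakpoint simply stands in for the one the authors chose, and the slopes are invented for illustration; this is just the shape of the analysis, not the paper's actual numbers:

```python
import numpy as np

# Synthetic yearly maximum age at death: rising before 1995,
# roughly flat to slightly falling after (assumed shape, not real data).
rng = np.random.default_rng(0)
years = np.arange(1970, 2007)
trend = np.where(years < 1995,
                 105 + 0.3 * (years - 1970),
                 112.5 - 0.05 * (years - 1995))
observed = trend + rng.normal(0, 0.5, size=years.size)

# Fit a separate straight line to each side of the breakpoint.
early = years < 1995
slope_early, _ = np.polyfit(years[early], observed[early], 1)
slope_late, _ = np.polyfit(years[~early], observed[~early], 1)
print(f"slope before 1995: {slope_early:+.2f} years/year")
print(f"slope after 1995:  {slope_late:+.2f} years/year")
```

Run on data built this way, the early segment recovers a clearly positive slope and the late segment a slope near zero, which is the paper's "rise then plateau" argument in miniature. It also previews the later criticism: the answer depends on where you put that breakpoint.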
From Dong et al., 2016
All of the conclusions seemed fairly sound from the data, the paper was published, and it became very popular in the news. You very probably read about it. And then all hell broke loose.
The June 29th issue of Nature took some issue with the study. Four rebuttals and a reply from the original authors were published that week, and none of this was covered in the news as follow-up. Essentially, multiple groups took strong issue with the statistical approaches Dong et al. had taken, accusing them of violating multiple good research and statistical practices, criticisms that virtually nullify all their findings. One issue was that the authors fit separate, independent models to each age, ignoring the fact that surviving to one age isn't independent of surviving to another: you are necessarily more likely to survive to 80 than to 90, simply because there is less time for something fatal to happen, but this isn't accounted for when fitting the lines for a single age. The statistics were also heavily biased by the number of supercentenarians dying in a given year, to the point that there was an extraordinarily strong, significant positive correlation between the number of deaths and the average age at death, ranging from 111.39 in years where only one supercentenarian died to 116.28 in years where upwards of 30 died.7
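That death-count bias is a classic extreme-value effect, and you can demonstrate it with a tiny simulation. The distribution below is an assumption chosen purely for illustration (ages past 110 drawn from a fixed exponential tail); the point is that even with no change in underlying longevity, busier years reach further into the tail:

```python
import random

random.seed(42)

def avg_yearly_max(n_deaths, trials=10_000):
    """Average of the yearly maximum age when n_deaths
    supercentenarians die that year, all drawn from the SAME
    fixed distribution (110 + exponential with mean 1.5 years)."""
    total = 0.0
    for _ in range(trials):
        ages = [110 + random.expovariate(1 / 1.5) for _ in range(n_deaths)]
        total += max(ages)
    return total / trials

for n in (1, 5, 30):
    print(f"{n:>2} deaths/year -> average maximum age = {avg_yearly_max(n):.1f}")
```

With these assumed parameters the simulated maxima climb from roughly 111.5 with one death per year to roughly 116 with thirty, a spread strikingly similar to the 111.39-to-116.28 range the critics reported, even though nothing about the "true" limit changes between years.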
Dong et al. separated the data into two groups of years, reporting an increase in age at the earlier time points, and no change, with a trend toward a decrease (though I would argue you could safely call it no change), at the later time points. This lack of change in the second group helped make their case for a plateau in maximum age. However, they never explain why they chose to split the data where they did, and indeed, if you look at the data as a whole, or partition them at a different point, you find a different result.8 The statistical analyses performed on these data used only the oldest person who died in each year, meaning that even though the total sample of supercentenarians was 534, the actual sample sizes in the split analysis were 21 for one group and 12 for the other, which are pretty small for the type of analysis they're doing.9 The data also appear to be skewed by the outlier of Jeanne Calment, who died in 1997 at the age of 122.10
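With samples that small, one extreme point can swing a fitted trend dramatically. Here is a sketch with twelve made-up yearly maxima (the same n as the later group in the critique, not the actual values) and a single Calment-like 122 at the end:

```python
import numpy as np

# Hypothetical yearly maximum ages for a 12-year window; the
# final value is a lone Calment-like outlier. All numbers invented.
years = np.arange(1986, 1998)
maxima = np.array([112, 113, 112, 114, 113, 112,
                   114, 113, 115, 114, 113, 122])

slope_with, _ = np.polyfit(years, maxima, 1)
slope_without, _ = np.polyfit(years[:-1], maxima[:-1], 1)
print(f"slope with the outlier:    {slope_with:+.3f} years/year")
print(f"slope without the outlier: {slope_without:+.3f} years/year")
# One point roughly triples the estimated slope.
```

In this toy version, dropping the single 122 changes the fitted slope by about a factor of three, which is exactly why critics worried about building a trend argument on a dozen points that include one record-shattering individual.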
There was also issue with the data themselves. The mortality database from which the data were drawn warns caution when looking at records for people over the age of 90, because those data are essentially preprocessed and rounded, with no raw numbers provided. The statistical methods promoted in the original paper are not necessarily the correct way to unpack these rounded numbers, which raises questions about validity.11
Confusingly, individual authors at times argue that the results would have been different if Dong et al. had limited the time span they were looking at, or if they had expanded it. For example, how do you determine that a plateau is really a plateau, and not just a much slower increase, without continuing to watch the data over a longer period?12 As evidence of this point, there was also an apparent plateau between 1968 and 1980, but it was followed by another increase. There is no reason to believe that the current plateau is the final one.13
From a basic experimental design perspective, Dong et al. never compare their model fits to other possibilities. As an illustration of this point, another model using the same data suggests that the maximum lifespan might reach 125 by the year 2070.14 Dong and colleagues provide no data on how they determined that their way of modeling the data is the most accurate, and a hypothesis that cannot be disproven breaks the cardinal rule of science.15
Dong and colleagues then fired back in a brief letter that addressed a very select few of these arguments; I'll let you determine how compelling you find this rebuttal to be.16 Either way, there is now a war in longevity research, and these scientists are definitely rolling their eyes at each other in the privacy of their own lab meetings.
I, on the other hand, absolutely love this situation. It's such a good example of so many things. The most basic is to always reserve judgment and be skeptical of articles being reported in the media. No one really cares about the methodology and statistics when they're reading something in Discover Magazine (in fact, you're probably not interested in them in this blog), but there is so much valuable information in those parts that get cut out to tell the story. It's also a beautiful example of the way science should work. These scientists are volleying back and forth, fighting to disprove each other (it's a very negative and aggressive field), improving, explaining, and questioning. It beats scientists themselves over the head with the importance of strong statistical knowledge and application, even though we all fell asleep in our first-year stats course. It also demonstrates that when an interesting, sexy piece of science news comes out, it's up to us to stick with it. See where discoveries go, what develops out of them, how they inspire other science. Maybe the thing that sounded so interesting is overdramatized or overgeneralized. Maybe something amazing comes out of it. Maybe everyone else in the field thinks it's a crock. Popular science media, unfortunately, doesn't often do this for us, but it should, because it shows how messy the process is and how skeptical and questioning we should be.