NEWS

Demystifying Statistics: Experts Discuss Common Misunderstandings

Nancy J. Nelson

It has been said that statistics can be used to prove anything. Cancer statistics are no exception: there are more correct and less correct ways to use them.

One of the most common and complex misuses of statistics is related to 5-year survival rates—the percentage of people, on average, who survive their cancer for 5 years after diagnosis.

Cancer centers and public health agencies frequently cite increases in 5-year survival rates to measure the success of the war against cancer. In fact, for many cancers, including the most common ones—prostate, breast, lung, and colon—5-year survival rates have increased in the last few decades.

By this standard, the $45 billion spent by the National Cancer Institute since its inception has paid off. Not necessarily, argue three physicians from the Dartmouth Medical School in Hanover, N.H., in an article published in the June 2000 issue of the Journal of the American Medical Association.

They concluded that the most likely explanation for the increased survival rates is not that patients are living longer, but that cancer is simply diagnosed earlier. Using data from NCI’s Surveillance, Epidemiology, and End Results Program, the authors, H. Gilbert Welch, M.D., Lisa M. Schwartz, M.D., and Steven Woloshin, M.D., showed that the increases in survival correlate more with increases in incidence than with decreases in mortality for 20 cancers, including the most common ones.

Survival rates can increase in several ways, but such increases do not necessarily correlate with decreases in mortality. Finding cancer earlier, for example, can increase survival, but if early treatment is not effective, mortality will not change.

In general, if any factor—treatment, reduced exposure to carcinogens, or early detection—does not prolong life, mortality will not change. This means, assert the Dartmouth doctors, that increased survival rates themselves may not correlate with increased life expectancy or quality of life.
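To see why, consider a small numerical sketch in which the ages are invented for illustration: if a cancer that will prove fatal at age 70 is found by screening at 63 rather than from symptoms at 67, measured survival after diagnosis stretches from 3 years to 7 years, and the patient is counted as a 5-year survivor even though the outcome never changed.

```python
# Hypothetical illustration of how earlier diagnosis lengthens measured
# survival even when the date of death does not move (all ages invented).

def is_five_year_survivor(age_at_diagnosis, age_at_death):
    """True if the patient is still alive 5 years after diagnosis."""
    return (age_at_death - age_at_diagnosis) >= 5

AGE_AT_DEATH = 70  # assumed to be the same in both scenarios

scenarios = [
    ("diagnosed from symptoms", 67),
    ("diagnosed earlier by screening", 63),
]

for label, age_at_diagnosis in scenarios:
    survived = is_five_year_survivor(age_at_diagnosis, AGE_AT_DEATH)
    print(f"{label} at age {age_at_diagnosis}: dies at {AGE_AT_DEATH}, "
          f"counted as a 5-year survivor: {survived}")

# The screened patient is counted as a 5-year survivor and the other is not,
# yet both die at 70: survival statistics improve while mortality is unchanged.
```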

"In general, if there is a true increase in survival, then you will see an impact in mortality," said Lynn Ries, a statistician from NCI’s SEER Program. Many agree that the best indication of progress is a decrease in mortality rates. And certainly for several cancers, the mortality rates have been decreasing.

"But," Ries warned, "even if mortality rates do go down, you’re probably not going to see it in the population in a single year because a decrease in mortality will impact on the population gradually over time."

One in Eight Breast Cancer Statistic

Another number that gets misused and abused is the oft-cited statistic that a woman's overall risk of developing breast cancer over the course of her lifetime is, on average, one in eight, or about 13%.

"People think that this statistic refers to a woman’s risk now," said Ries, "but we’re talking about a lifetime risk. It means if you follow eight women for their entire lives, then one of them, on average, is likely to develop breast cancer."

Actually, any woman’s risk of developing breast cancer is strongly linked to her age, and the one in eight statistic refers to an American woman’s cumulative risk by age 85. At 45, a woman has a one in 100, or 1%, chance of developing breast cancer. (See "Chances of Developing Breast Cancer," below.)
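The gap between lifetime risk and a woman's risk right now can be made concrete with a rough sketch. The interval risks below are invented for illustration (they are not SEER figures); the point is simply that "one in eight" is what many small age-interval risks add up to over a whole lifetime.

```python
# Illustrative only: hypothetical risks of a breast cancer diagnosis during
# successive age intervals (these are NOT actual SEER figures).

interval_risk = [
    ("birth-44", 0.010),
    ("45-54",    0.025),
    ("55-64",    0.035),
    ("65-74",    0.040),
    ("75-85",    0.030),
]

prob_never_diagnosed = 1.0
for interval, risk in interval_risk:
    prob_never_diagnosed *= (1.0 - risk)
    cumulative_risk = 1.0 - prob_never_diagnosed
    print(f"cumulative risk through {interval}: {cumulative_risk:.1%}")

# Only at the end of a full lifetime does the accumulated risk approach
# "one in eight" (about 13%); at younger ages the risk to date is far smaller.
```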


[Table: Chances of Developing Breast Cancer Among U.S. Women]
 
Number Versus Rate

According to several statisticians, another common statistical misunderstanding is the difference between cancer rates and the actual number of new cases of cancer or cancer deaths in a particular year.

The number of cancer deaths in the United States is increasing, but the cancer death rate is decreasing. The cancer death rate is the number of cancer deaths for every 100,000 people in the population at risk in a given year. (See "U.S. Cancer Deaths and Mortality Rates," below.) Overall cancer mortality rates for the United States have been dropping since about 1991.


[Table: U.S. Cancer Deaths and Mortality Rates]
 
"People confuse a rate with a number," said Ries. "When we talk about the death rates going down, people think that the number of deaths due to cancer is going down, and it’s not. The number of deaths has leveled off a little bit, but it’s going up. The population is increasing and getting older, so you would expect more deaths."

Age Adjustment

The notion that deaths are dependent on the age of the population brings up another difficult statistical concept—age adjustment.

The reason for age-adjusting is to make rates comparable at different times and different places.

Suppose, for example, someone wanted to compare the cancer death rates in Florida with those in Alaska for 1992 through 1996. If the number of deaths in each state is simply divided by its total population, Florida appears to have a cancer mortality rate nearly three times as high as Alaska’s (262 deaths per 100,000 versus 96 deaths per 100,000). However, Florida’s population is much older than Alaska’s. (See "Age Distribution in Florida and Alaska," below.)



[Figure: Age Distribution in Florida and Alaska]

 
So, to eliminate the effect of age on these rates, statisticians age-adjust the populations of the two states to the 1970 U.S. population: in effect, they pretend that Alaska and Florida have the same proportion of people in each age group as the 1970 U.S. population did. After age adjustment, the cancer death rates are nearly identical: 167 per 100,000 for Alaska and 166 per 100,000 for Florida. In fact, the two states have nearly identical mortality rates within each age group.
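Direct age adjustment amounts to re-weighting each place's age-specific rates by a common standard population. The short sketch below uses made-up age groups, rates, and standard weights (not the actual 1970 standard) to show how two places with identical age-specific death rates can have very different crude rates but the same age-adjusted rate.

```python
# A minimal sketch of direct age adjustment using made-up numbers (the age
# groups, rates, and standard weights below are NOT the actual 1970 standard).

rate_per_100k = [20.0, 300.0, 1500.0]   # age groups 0-39, 40-64, 65+; same in both places

# Share of each place's population falling in each age group.
pop_share = {
    "Older state (Florida-like)":  [0.40, 0.35, 0.25],
    "Younger state (Alaska-like)": [0.65, 0.30, 0.05],
}

standard_weights = [0.60, 0.30, 0.10]   # hypothetical standard population

for place, shares in pop_share.items():
    crude    = sum(r * s for r, s in zip(rate_per_100k, shares))
    adjusted = sum(r * w for r, w in zip(rate_per_100k, standard_weights))
    print(f"{place}: crude rate {crude:.0f}, age-adjusted rate {adjusted:.0f} per 100,000")

# The crude rates differ sharply because one population is older, but the
# age-adjusted rates are identical because the age-specific rates are identical.
```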

Age-Adjusting to Year 2000

Understanding age adjustment is particularly important now because, next year, all statistics generated by agencies in the U.S. Department of Health and Human Services will be adjusted to the age distribution of the 2000 population. (NCI’s SEER uses the age distribution of the 1970 U.S. population, and CDC’s National Center for Health Statistics uses the age distribution of the 1940 U.S. population.)

Because the population in the year 2000 is older than the 1970 population, some cancer rates will appear to increase when, in fact, only the standard used for age adjustment has changed.

To try to head off the inevitable statistical confusion next year, Ries and others have developed educational modules to help scientists and researchers understand and explain the effects of the 2000 age adjustment.

"People are going to see rates increase by about 30%, but it’s not really an increase," Ries said. "We’ve been looking at the population as though it had the age distribution of 1970, which is young."



             