Medical statistics that almost nobody understands, probably including you, even if you're very smart

Jun 05, 2009 13:58

A couple of weeks ago I went to APS, where there was a talk by Gerd Gigerenzer, lead author on this amazing paper about medical statistics and how terribly they mislead doctors, academics, lawmakers, the media, and the people actually making decisions about their health. I'm going to invest whatever social capital I have and say that everyone should really read my notes from his talk.*

I'll serialize them so the post doesn't take forever to read.

* exceptions include figent_figary, ponderduck, nfnitperplexity, and anyone else who already has a really good grasp of epidemiological statistics.

1) Reporting changes in risk

In England, a while ago, they reported that taking a certain contraceptive increased your risk of embolism by 100%. A lot of women thought it meant they had a 100% risk of dying! They went off the pill immediately. Epidemiologists calculated that this incident led to an additional 14,000 abortions that year. [I do not know what incident he's referring to or how accurate this is - MC]

What did the study actually find? Normally, one out of every 7,000 women had embolisms, and among the ones who were taking the pill, it was two out of every 7,000.

It's true that's a 100% increase (one additional embolism per 7,000 women, divided by the base rate of one = 100%). This is called RELATIVE RISK, and it is a standard, totally legitimate statistic. But outside of statistical analysis, almost no one understands it, including the vast majority of doctors.
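If it helps, here's the arithmetic as a tiny Python sketch, using the numbers from the example above:

```python
# Relative vs. absolute risk for the pill example above.
base_rate = 1 / 7000   # embolisms per woman, without the pill
pill_rate = 2 / 7000   # embolisms per woman, on the pill

relative_increase = (pill_rate - base_rate) / base_rate  # the scary headline number
absolute_increase = pill_rate - base_rate                # what actually changed

print(f"Relative increase: {relative_increase:.0%}")            # 100%
print(f"Absolute increase: 1 extra in {1 / absolute_increase:.0f}")  # 1 extra in 7000
```

Same data, two very different-sounding numbers.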

Relative risk tells you how good a treatment is (or how bad a risk factor is) compared to normal. But it doesn't tell you how much it matters. For example:

Imagine that drug A has a side effect that causes a rare form of spleen cancer, and this year 6 people taking the drug died from spleen cancer, when without the drug only one would have. There will be headlines, "Drug A causes a 500% increase in spleen cancer deaths!"

Now imagine that drug B has a side effect that causes cardiac arrest, and 25,500 people died when normally 25,000 would have. PR people can brush that off as "a mere 2% increase."

The relative risk tells you something about what the drug does biologically, but it obscures the most important information: Drug A kills five people a year, and drug B kills five hundred!

(of course, in both cases you have to ask how many people the drug is helping).

Solution: Communicate using absolute statistics that relate to the population, not ratios with some other statistic: "Out of every 10,000 people who take drug A, the drug will cause 1 additional person to get spleen cancer."
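To make the drug A / drug B contrast concrete, here's a quick sketch (the death counts are the hypothetical ones from above):

```python
# Same relative-risk headline logic, very different body counts.
def report(name, deaths_with_drug, deaths_without_drug):
    relative = (deaths_with_drug - deaths_without_drug) / deaths_without_drug
    extra = deaths_with_drug - deaths_without_drug
    print(f"{name}: {relative:.0%} relative increase, {extra} extra deaths")

report("Drug A", 6, 1)            # 500% relative increase, 5 extra deaths
report("Drug B", 25_500, 25_000)  # 2% relative increase, 500 extra deaths
```

The headline number and the number of actual bodies point in opposite directions.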

2) Distortions caused by changes in diagnosis.

Should men over 50 get PSA (prostate cancer) screening?

In America it is pushed very heavily. But the latest evidence is that routine screening doesn't decrease your risk of dying from prostate cancer at all. If it's not picked up by screening you'll just detect it later, when you get sick, and the operation works just as well then as if you'd gotten it earlier. [again, I can't independently vouch for this information. - MC]

Also, the test has a high false positive rate (it detects benign tumors that wouldn't have become dangerous). And the operation can be pretty bad -- somewhere between 1/3 and 2/3 of patients end up with permanent impotence or incontinence.

Because routine screening increases the # of people operated on, it artificially inflates the cure rate:

10,000 cases detected, 5,000 die: a 50% survival rate
100,000 cases detected, 5,000 die: a 95% survival rate

But you didn't actually prevent any deaths! You just detected people who wouldn't have died anyway.

This is especially common with cancer: you often see older patients with a small tumor who are likely to die of something else long before the tumor becomes dangerous.

SURVIVAL RATE: how many ppl who are diagnosed live? (goes up if you loosen criteria and diagnose lots of people who aren't very sick)
MORTALITY RATE: how many ppl out of the whole population die? (doesn't care how many you diagnose)

SOLUTION: Use mortality rate; it's harder to mess with just by changing diagnostic criteria.

If you want a percentage, look at the % who die out of the POPULATION, not out of the number diagnosed.
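Here's the screening example as a sketch. The diagnosis and death counts are from the example above; the population of one million is my own made-up number, just so the mortality rate has a denominator:

```python
# Looser diagnostic criteria inflate survival rate; population mortality doesn't budge.
population = 1_000_000  # assumed for illustration -- not from the original example
deaths = 5_000          # same deaths in both scenarios

for diagnosed in (10_000, 100_000):
    survival = (diagnosed - deaths) / diagnosed   # looks better as you diagnose more
    mortality = deaths / population               # unchanged
    print(f"{diagnosed:>7} diagnosed: {survival:.0%} survival, "
          f"{mortality:.1%} population mortality")
```

Survival jumps from 50% to 95% with zero deaths prevented, which is exactly the distortion described above.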

statistics, psych, notes, health
