
Breaking: 100% Chance We're All Going to Die Some Day


People love to cite statistics.  This is especially true in the context of medical research.  You know, so we all can be reminded of our inevitable mortality, which will be brought on by some unpleasant disease because we ate too many rice cakes.

I would go so far as to say we have a love-hate relationship with statistics in that people are natural information seekers and simple numbers make it easy to understand complex things. But sometimes the simple numbers are associated with things we don't want to consider, causing a whole lot of cognitive dissonance, so we reject them as spurious tricks some nerd in an ivory tower came up with.

(Don't tell anyone but I'm a nerd in an ivory tower)

Research Mark. Possibly the best meme on the internet.
I've had this topic on my "what should I write about" list for a while, and a tweet by a member of my IBD tribe (Are people over the age of 40 allowed to use the word "tribe"?) prompted me to sit down and write a (hopefully) clear how-to-read-common-statistics guide.  What are my qualifications?  I'm writing a blog, that's sufficient, right?  I also spend half my time doing clinical research at Northwestern University, and as part of that I run a whole lot of statistical analyses.  I've also taught advanced statistics courses to doctoral students.  I am, however, by no means a true expert on stats.  I do know enough to be dangerous.

Here are 3 statistical terms/keywords/phrases you often see in medical reporting and what the hell they actually mean:

1. Correlations.  You've probably heard the phrase "correlation does not mean causation."  Which is 100% true.  A correlation states 2 events or things are related to each other.  It does not mean that A causes B or B causes A.

One example is a large correlation between eating delicious chocolate ice cream and weight gain.  Let's say this correlation, reported as a small letter r, is 0.70.  Correlations range from -1 to 0 to +1.  If a correlation is positive (0 to 1), as A increases so does B (example: heat on a stove and the temperature of a pot of water).  If a correlation is negative (0 to -1), as A increases, B decreases (example: amount of water consumed and thirst).  So a correlation will never be 1.5 or 7.9 or -1200.  The closer the value is to 1 or -1, the stronger the relationship, and if it is actually 1, we call this a "true relationship" in that when A happens, B will always happen.  Or if it is -1, when A happens, B will never happen.  True correlations almost never happen, especially in medical research.

Ok, nerd, what does that r = 0.70 for weight and ice cream actually mean?  A correlation of 0.70 is pretty big, by statistics standards.  When we see correlations this big in our results we generally get excited because it means we've found a strong relationship.  But in terms of how 0.70 explains the relationship between weight and ice cream, we have to think of a pie chart.  Mmmm, pie.
Literally my house on Sunday
There's a trick you can apply to correlations to find out how much of the pie chart is shaded in by the relationship of the 2 things the correlation is measuring.  You square it (not the pie, the correlation value), then multiply that by 100 to get a percentage.  The squared value of the correlation tells you, roughly, how much (or what percent) A explains B.  Or like this:

r = 0.70 (Ice cream sales and weight).  0.70 x 0.70 = 0.49; 0.49 x 100 = 49%

So, this tells me that, roughly speaking, about 49% of the variation in weight gain can be chalked up to eating chocolate ice cream.  That means the other 51% is explained by other things.  What are these other things?  Could be genetics.  Could be exercise habits.  Could be metabolism rate.  Who knows?  Unless those things were also measured in the research study, we cannot answer the question.  There's a term for these things:  confounds.
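If you like seeing the arithmetic spelled out, here's a tiny sketch of the r-squared trick.  The ice cream and weight numbers below are completely made up for illustration, not from any real study:

```python
# Computing a Pearson correlation from scratch, then squaring it
# to get "percent of variance explained" (the pie-chart trick).

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# pints of ice cream per week vs. pounds gained (invented numbers)
ice_cream = [1, 2, 3, 4, 5, 6]
weight_gain = [0.5, 1.8, 1.5, 3.2, 3.0, 4.1]

r = pearson_r(ice_cream, weight_gain)
print(f"r = {r:.2f}")
print(f"variance explained = {r**2 * 100:.0f}%")
```

With these invented numbers the correlation comes out strong (well above 0.9), so squaring it shaves off less of the pie, but the principle is the same: r = 0.70 only ever explains 49% of the pie, never 70%.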

2.  Odds Ratios.  These are a favorite to be picked up by the media.  Drinking wine every day makes you twice as likely to be awesome!  These types of headlines typically come from research studies reporting odds ratios, which is a more complicated statistical analysis than correlations.  If correlations are the equivalent complexity of the plot of a Curious George cartoon, odds ratios are more like the plot of Planet of the Apes.  So I won't get into the weeds of how they're calculated, but rather stick to how to understand them.  The important number to remember with odds ratios is 1, because as an odds ratio gets closer to 1 its relevance decreases.  Odds ratios are always reported as a number followed by a range (usually a confidence interval).  If the range includes 1, even if the number itself is bigger than 1, the odds ratio is not statistically meaningful.

Let's use the odds of a head injury in a car accident and not wearing my seatbelt.  I do a study and my statistics say the odds ratio of someone getting a head injury is 12.5 (Range: 8.2 - 17.9).  That means I'm 12.5 times more likely to bust my head in an accident if I don't wear my seatbelt, on average, but it ranges from 8.2 times to almost 18 times more likely.  This is an example of a good odds ratio (nowhere near 1, the range doesn't get close to it either) and the odds are significantly higher.

Now let's use the odds of drinking wine making people awesome.  I do another study and my statistics say the odds ratio is 1.9 (Range 0.37 - 3.5).  Notice the odds ratio is bigger than 1, but not by a lot, and my range passes under 1 (the 0.37).  So while, technically, people are 1.9 times more likely to be awesome after drinking wine, it would be wrong for me to report this as a significant increase in odds of awesomeness.  Even though that 3.5, or 3.5 times more likely to be awesome, looks pretty good, I have to take into account the whole picture.
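The "does the range include 1?" check is simple enough to write down.  Here's a sketch using the post's two invented examples (the seatbelt study and the wine study are hypothetical, remember):

```python
# Rule of thumb from above: if the reported range around an odds
# ratio includes 1, don't treat the result as meaningful.

def odds_ratio_meaningful(low, high):
    """True if the range excludes 1 (entirely above or entirely below it)."""
    return low > 1 or high < 1

# seatbelt/head-injury example: OR 12.5 (8.2 - 17.9)
print(odds_ratio_meaningful(8.2, 17.9))   # True: the whole range sits above 1

# wine/awesomeness example: OR 1.9 (0.37 - 3.5)
print(odds_ratio_meaningful(0.37, 3.5))   # False: the range crosses 1
```

Notice the headline number (12.5 or 1.9) never enters the check; it's the range that tells you whether to care.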

3.  Percentages.  Another favorite of the media. My aforementioned tribe member tweeted an article on how using hair dyes or straightening products increases the odds of getting cancer in women by somewhere between 50% and 75%, depending on the march-of-death chemicals used.  She was rightfully concerned but skeptical about what the numbers mean.  Big percentages can look pretty scary, especially when we throw them out with the word "risk."  In statistics, the calculation is something called Relative Risk Ratio.  These are also more complicated stats, like odds ratios.

Any time you see percentages with risk increase, or decrease, you have to know the original risk value of the issue being discussed.  Without this context, the percentage is meaningless but can also look really fucking scary.

For example, if my risk of developing any type of cancer is 5%, and a study says using chemical hair dye increases my risk by 50%, then my new risk of getting cancer is 7.5%.  (5% x 0.50 = 2.5%; 5%+2.5% = 7.5%).  If a study says eating rice cakes increases my risk of cancer by 200% (note: there is no such study) then my new risk of cancer is 15%.
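That arithmetic can be sketched in a couple of lines, using the same made-up 5% baseline risk from above:

```python
# Applying a relative risk increase to a baseline risk.
# A "50% increase" multiplies the baseline, it doesn't add 50 points.

def new_risk(baseline_pct, increase_pct):
    """Baseline risk (in %) after a relative increase (e.g. 50 for '50% higher')."""
    return baseline_pct * (1 + increase_pct / 100)

print(new_risk(5, 50))    # 7.5  -> a "50% increase" on a 5% risk
print(new_risk(5, 200))   # 15.0 -> a "200% increase" on a 5% risk
```

The headline "50% increase" and the absolute jump from 5% to 7.5% describe the same finding; one just sounds a lot scarier than the other.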

The thing about risk and odds is both of these statistical analyses rely on correlations at their foundation, so the same problems in over-interpreting them exist.  We use more complicated statistics to try to control for those confounds I mentioned, but we never can control for all of them.  It's literally impossible, but we try our best.  We also always report our statistics with cautious language.  Unfortunately this caution is lost in the hyperbole of today's media landscape and can leave people unnecessarily fearful.

So whenever you're reading the latest breaking medical news, drill down into the numbers a little.  Go to PubMed or Google Scholar and read the abstract.  You might even find the whole article available, depending on the publisher.  You'll probably see at least one of the terms in this blog entry and the cautionary way the stats are reported, because life, and people, are complicated.

Enjoy your chocolate ice cream, rice cakes, and wine.  And wear your seat belt.

