People love to cite them. This is especially true in the context of medical research. You know, so we all can be reminded of our inevitable mortality, which will be brought on by some unpleasant disease because we ate too many rice cakes.

I would go so far as to say we have a love-hate relationship with statistics: people are natural information seekers, and simple numbers make it easy to understand complex things. But sometimes those simple numbers are attached to things we don't want to consider, causing a whole lot of cognitive dissonance, so we reject them as spurious tricks some nerd in an ivory tower came up with.

(Don't tell anyone, but I'm a nerd in an ivory tower.)

[Image: Research Mark. Possibly the best meme on the internet.]

So here's my *how-to-read-common-statistics* guide. What are my qualifications? I'm writing a blog; that's sufficient, right? I also spend half my time doing clinical research at Northwestern University, and as part of that I run a whole lot of statistical analyses. I've also taught advanced statistics courses to doctoral students. I am, however, by no means a true expert on stats. I do know enough to be dangerous.

Here are 3 statistical terms/keywords/phrases you often see in medical reporting and what the hell they actually mean:

1. **Correlations.** You've probably heard the phrase "correlation does not mean causation," which is 100% true. A correlation states that 2 events or things are related to each other. It does not mean that A causes B or B causes A.

One example is a large correlation between eating delicious chocolate ice cream and weight gain. Let's say this correlation, reported as a small letter r, is 0.70. Correlations range from -1 through 0 to +1. If a correlation is positive (0 to 1), as A increases so does B (example: heat on a stove and the temperature of a pot of water). If a correlation is negative (0 to -1), as A increases, B decreases (example: amount of water consumed and thirst). So a correlation will never be 1.5 or 7.9 or -1200. The closer the value is to 1 or -1, the stronger the relationship, and if it is exactly 1, we call this a "true relationship," in that when A happens, B will always happen. Or if it is -1, when A happens, B will never happen.

*True correlations almost never happen, especially in medical research.*

Ok, nerd, what does that r = 0.70 for weight and ice cream actually mean? A correlation of 0.70 is pretty big by statistics standards. When we see correlations this big in our results, we generally get excited because it means we've found a strong relationship. But in terms of how 0.70 explains the relationship between weight and ice cream, we have to think of a pie chart. Mmmm, pie.

[Image: Literally my house on Sunday]

r = 0.70 (ice cream sales and weight). 0.70 × 0.70 = 0.49; 0.49 × 100 = 49%

So, this tells me that if I eat chocolate ice cream and gain weight, I can attribute 49% of the variation in my weight gain to eating chocolate ice cream. That means the other 51% is explained *by other things*. What are these other things? Could be genetics. Could be exercise habits. Could be metabolism rate. Who knows? Unless those things were also measured in the research study, we cannot answer the question. There's a term for these things: **confounds.**
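The r-to-pie-slice arithmetic is simple enough to sketch in a few lines (using the made-up r = 0.70 from this post, not real data):

```python
# Sketch of the r-squared ("shared variance") arithmetic.
# r = 0.70 is the invented ice-cream/weight correlation from this post.
r = 0.70

shared = round(r ** 2 * 100, 1)   # percent of the variation the two share
unexplained = 100 - shared        # the leftover slice: confounds

print(shared)       # 49.0
print(unexplained)  # 51.0
```

Note the squaring: even a "big" correlation of 0.70 leaves more than half the pie for everything else.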

2. **Odds Ratios.** These are a favorite to be picked up by the media. Drinking wine every day makes you twice as likely to be awesome! These types of headlines typically come from research studies reporting odds ratios, which are a more complicated statistical analysis than correlations. If correlations are the equivalent complexity of the plot of a Curious George cartoon, odds ratios are more like the plot of Planet of the Apes. So I won't get into the weeds of how they're calculated, but rather stick to how to understand them. The important number to remember with odds ratios is 1, because **as an odds ratio gets closer to 1, its relevance decreases**. Odds ratios are always reported as a number followed by a range (a confidence interval). If the range includes 1, even if the number itself is bigger than 1, the odds ratio is not statistically significant.

Let's use the odds of a head injury in a car accident and not wearing my seatbelt. I do a study and my statistics say the odds ratio of someone getting a head injury is 12.5 (Range: 8.2 - 17.9). That means I'm 12.5 times more likely to bust my head in an accident if I don't wear my seatbelt, on average, but it ranges from 8.2 times to almost 18 times more likely. This is an example of a good odds ratio (nowhere near 1, the range doesn't get close to it either) and the odds are significantly higher.

Now let's use the odds of drinking wine making people awesome. I do another study and my statistics say the odds ratio is 1.9 (Range: 0.37 - 3.5). Notice the odds ratio is bigger than 1, but not by a lot, and my range passes under 1 (the 0.37). So while, technically, people are 1.9 times more likely to be awesome after drinking wine, it would be wrong for me to report this as a significant increase in odds of awesomeness. Even though that 3.5, or 3.5 times more likely to be awesome, looks pretty good, I have to take the whole picture into account.
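The "does the range include 1?" check can be written as a one-line rule. Here's a sketch using the two made-up studies above (the function name is mine, not a standard one):

```python
# A quick sanity check for odds ratios: the whole reported range
# (confidence interval) has to sit on one side of 1 to mean much.
def odds_ratio_meaningful(odds_ratio, range_low, range_high):
    """Return True if the range does NOT include 1 (a hypothetical helper)."""
    return not (range_low <= 1.0 <= range_high)

# Seatbelt example from this post: 12.5 (Range: 8.2 - 17.9)
print(odds_ratio_meaningful(12.5, 8.2, 17.9))  # True: nowhere near 1

# Wine example from this post: 1.9 (Range: 0.37 - 3.5)
print(odds_ratio_meaningful(1.9, 0.37, 3.5))   # False: range passes under 1
```

Notice the function never even looks at the headline number; the range does all the work.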

3. **Percentages.** Another favorite of the media. My aforementioned tribe member tweeted an article on how using hair dyes or straightening products increases the odds of getting cancer in women by somewhere between 50% and 75%, depending on the march-of-death chemicals used. She was rightfully concerned but skeptical about what the numbers mean. Big percentages can look pretty scary, especially when we throw them out with the word "risk." In statistics, the calculation is something called a **relative risk ratio**. These are also more complicated stats, like odds ratios.

Any time you see a percentage increase, or decrease, in risk, you have to know the original risk value of the issue being discussed. Without this context, the percentage is meaningless but can also look really fucking scary.

For example, if my risk of developing any type of cancer is 5%, and a study says using chemical hair dye *increases* my risk by 50%, then my new risk of getting cancer is 7.5% (5% × 0.50 = 2.5%; 5% + 2.5% = 7.5%). If a study says eating rice cakes increases my risk of cancer by 200% (note: there is no such study), then my new risk of cancer is 15%.
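That baseline-risk arithmetic fits in a tiny helper (hypothetical function name, numbers from the examples above):

```python
# Sketch of why a "% increase in risk" only means something when applied
# to the original (baseline) risk.
def new_risk(baseline_pct, increase_pct):
    """Hypothetical helper: apply a relative % increase to a baseline risk."""
    return baseline_pct + baseline_pct * increase_pct / 100

print(new_risk(5, 50))   # hair dye example: 5% baseline + 50% -> 7.5
print(new_risk(5, 200))  # rice cake example: 5% baseline + 200% -> 15.0
```

A scary-sounding "200% increase" on a small baseline still lands at 15%, which is a very different headline.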

The thing about risk and odds is that both of these statistical analyses rely on correlations at their foundation, so the same over-interpretation problems exist. We use more complicated statistics to try to control for those confounds I mentioned, but we can never control for all of them. It's literally impossible, but we try our best. We also always report our statistics with cautious language. Unfortunately, this caution is lost in the hyperbole of today's media landscape and can leave people unnecessarily fearful.

So whenever you're reading the latest breaking medical news, drill down into the numbers a little. Go to PubMed or Google Scholar and read the abstract. You might even find the whole article available, depending on the publisher. You'll probably see at least one of the terms in this blog entry and the cautionary way the stats are reported, because life, and people, are complicated.

Enjoy your chocolate ice cream, rice cakes, and wine. And wear your seat belt.

*--T2*