
Breaking: 100% Chance We're All Going to Die Some Day


Statistics.  People love to cite them.  This is especially true in the context of medical research.  You know, so we can all be reminded of our inevitable mortality, which will be brought on by some unpleasant disease because we ate too many rice cakes.

I would go so far as to say we have a love-hate relationship with statistics: people are natural information seekers, and simple numbers make it easy to understand complex things.  But sometimes those simple numbers are attached to things we don't want to consider, causing a whole lot of cognitive dissonance, so we reject them as spurious tricks some nerd in an ivory tower came up with.

(Don't tell anyone but I'm a nerd in an ivory tower)

Research Mark. Possibly the best meme on the internet.
I've had this topic on my "what should I write about" list for a while, and a tweet by a member of my IBD tribe (are people over the age of 40 allowed to use the word "tribe"?) prompted me to sit down and write a, hopefully, clear how-to-read-common-statistics guide.  What are my qualifications?  I'm writing a blog; that's sufficient, right?  I also spend half my time doing clinical research at Northwestern University, and as part of that I run a whole lot of statistical analyses.  I've also taught advanced statistics courses to doctoral students.  I am, however, by no means a true expert on stats.  I do know enough to be dangerous.

Here are 3 statistical terms/keywords/phrases you often see in medical reporting and what the hell they actually mean:

1. Correlations.  You've probably heard the phrase "correlation does not mean causation."  Which is 100% true.  A correlation states 2 events or things are related to each other.  It does not mean that A causes B or B causes A.

One example is a large correlation between eating delicious chocolate ice cream and weight gain.  Let's say this correlation, reported as a small letter r, is 0.70.  Correlations range from -1 to 0 to +1.  If a correlation is positive (0 to 1), as A increases so does B (example: heat on a stove and the temperature of a pot of water).  If a correlation is negative (0 to -1), as A increases, B decreases (example: amount of water consumed and thirst).  So a correlation will never be 1.5 or 7.9 or -1200.  The closer the value is to 1 or -1, the stronger the relationship, and if it is actually 1, we call this a perfect correlation: when A happens, B will always happen.  Or if it is -1, when A happens, B will never happen.  Perfect correlations almost never happen, especially in medical research.

Ok, nerd, what does that r = 0.70 for weight and ice cream actually mean?  A correlation of 0.70 is pretty big, by statistics standards.  When we see correlations this big in our results we generally get excited because it means we've found a strong relationship.  But in terms of how 0.70 explains the relationship between weight and ice cream, we have to think of a pie chart.  Mmmm, pie.
Literally my house on Sunday
There's a trick you can apply to correlations to find out how much of the pie chart is shaded in by the relationship between the 2 things the correlation is measuring.  You square it (not the pie, the correlation value), then multiply that by 100 to get a percentage.  The squared value of the correlation tells you, roughly, what percent of the variation in B is explained by A.  Or like this:

r = 0.70 (ice cream sales and weight).  0.70 x 0.70 = 0.49; 0.49 x 100 = 49%

So, this tells me that eating chocolate ice cream explains about 49% of the variation in weight.  That means the other 51% is explained by other things.  What are those other things?  Could be genetics.  Could be exercise habits.  Could be metabolic rate.  Who knows?  Unless those things were also measured in the research study, we cannot answer the question.  There's a term for these things: confounds.
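If you want to play with the square-it trick yourself, here's a minimal Python sketch.  The function names are mine, and any numbers you feed it are purely for illustration:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient for two equal-length lists."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

def variance_explained(r):
    """Square the correlation, then multiply by 100 to get a percentage."""
    return r ** 2 * 100

# The ice cream example: r = 0.70 explains about 49% of the pie.
print(round(variance_explained(0.70), 1))  # 49.0
```

Note that squaring shrinks small correlations fast: an r of 0.30, which gets reported in plenty of studies, explains only 9% of the pie.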

2.  Odds Ratios.  These are a favorite to be picked up by the media.  Drinking wine every day makes you twice as likely to be awesome!  These types of headlines typically come from research studies reporting odds ratios, which are a more complicated statistical analysis than correlations.  If correlations are the equivalent complexity of the plot of a Curious George cartoon, odds ratios are more like the plot of Planet of the Apes.  So I won't get into the weeds of how they're calculated, but rather stick to how to understand them.  The important number to remember with odds ratios is 1, because as an odds ratio gets closer to 1 its relevance decreases.  Odds ratios are always reported as a number followed by a range (a confidence interval).  If the range includes 1, even if the number itself is bigger than 1, the odds ratio is not statistically significant.

Let's use the odds of a head injury in a car accident when I'm not wearing my seatbelt.  I do a study and my statistics say the odds ratio of someone getting a head injury is 12.5 (range: 8.2 - 17.9).  That means my odds of busting my head in an accident are 12.5 times higher if I don't wear my seatbelt, on average, though the range runs from 8.2 times to almost 18 times higher.  This is an example of a good odds ratio (nowhere near 1, and the range doesn't get close to it either), and the odds are significantly higher.

Now let's use the odds of drinking wine making people awesome.  I do another study and my statistics say the odds ratio is 1.9 (range: 0.37 - 3.5).  Notice the odds ratio is bigger than 1, but not by a lot, and my range passes under 1 (the 0.37).  So while, technically, the odds of awesomeness came out 1.9 times higher after drinking wine, it would be wrong for me to report this as a significant increase.  Even though that 3.5, or 3.5 times higher odds of being awesome, looks pretty good, I have to take the whole picture into account.
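The "does the range include 1?" check from these two examples is simple enough to write as code.  Here's a tiny Python sketch; the function name and the made-up numbers are mine, not something from a stats library:

```python
def odds_ratio_is_significant(ci_low, ci_high):
    """An odds ratio is only meaningful if its reported range
    (confidence interval) sits entirely on one side of 1."""
    return not (ci_low <= 1 <= ci_high)

# Seatbelts and head injuries: OR 12.5, range nowhere near 1.
print(odds_ratio_is_significant(8.2, 17.9))   # True

# Wine and awesomeness: OR 1.9, but the range passes under 1.
print(odds_ratio_is_significant(0.37, 3.5))   # False
```

Notice the odds ratio itself never enters the check; the range does all the work, which is exactly why a headline quoting only the big number can mislead you.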

3.  Percentages.  Another favorite of the media.  My aforementioned tribe member tweeted an article on how using hair dyes or straightening products increases the odds of getting cancer in women by somewhere between 50% and 75%, depending on the march-of-death chemicals used.  She was rightfully concerned but skeptical about what the numbers mean.  Big percentages can look pretty scary, especially when we throw them out with the word "risk."  In statistics, the calculation behind these headlines is something called a relative risk ratio.  These are also more complicated stats, like odds ratios.

Any time you see percentages with risk increase, or decrease, you have to know the original risk value of the issue being discussed.  Without this context, the percentage is meaningless but can also look really fucking scary.

For example, if my risk of developing any type of cancer is 5%, and a study says using chemical hair dye increases my risk by 50%, then my new risk of getting cancer is 7.5% (5% x 0.50 = 2.5%; 5% + 2.5% = 7.5%).  If a study says eating rice cakes increases my risk of cancer by 200% (note: there is no such study), then my new risk of cancer is 15% (5% x 2.00 = 10%; 5% + 10% = 15%).
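That arithmetic can be wrapped up in a few lines of Python, in case you want to sanity-check the next scary headline yourself.  The function name is mine, just for illustration:

```python
def new_absolute_risk(baseline_risk_pct, relative_increase_pct):
    """Turn 'increases your risk by X%' into an actual risk number.
    Both arguments and the result are percentages."""
    return baseline_risk_pct * (1 + relative_increase_pct / 100)

# Hair dye example: 5% baseline risk, 50% relative increase.
print(new_absolute_risk(5, 50))    # 7.5

# Hypothetical rice cake example: 200% relative increase.
print(new_absolute_risk(5, 200))   # 15.0
```

The takeaway is that the baseline matters: a 50% increase on a 5% risk moves you 2.5 points, while the same 50% on a 0.1% risk moves you a twentieth of a point.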

The thing about risk and odds is both of these statistical analyses rely on correlations at their foundation, so the same problems in over-interpreting them exist.  We use more complicated statistics to try to control for those confounds I mentioned, but we never can control for all of them.  It's literally impossible, but we try our best.  We also always report our statistics with cautious language.  Unfortunately this caution is lost in the hyperbole of today's media landscape and can leave people unnecessarily fearful.

So whenever you're reading the latest breaking medical news, drill down into the numbers a little.  Go to PubMed or Google Scholar and read the abstract.  You might even find the whole article available, depending on the publisher.  You'll probably see at least one of the terms in this blog entry and the cautionary way the stats are reported, because life, and people, are complicated.

Enjoy your chocolate ice cream, rice cakes, and wine.  And wear your seat belt.

