Message from the Director
Telling the Story of the Results: Interpretation of Quantitative Data
In recent columns, I have discussed several of the principles we are using to build collaborations between IHR researchers and clinical and operational leaders in the learning health system of Kaiser Permanente Colorado and the Colorado Permanente Medical Group. These six principles are:
▪ Ask the right question
In this column, I’ll address the final principle – interpreting the findings of an operational data analysis or evaluating an intervention.
When thinking about this, one of our IHR programmers recently proposed a helpful definition – “interpreting the data is telling the story of the results”. In other words, interpretation is the process of converting numbers to a narrative. This can be a perilous process. Social psychologists and economists have shown that when we interpret quantitative data, we are susceptible to a host of cognitive biases that operate unconsciously and often lead us to mistaken conclusions and actions. It is in our nature to interpret data in ways that reinforce the story we want to hear, rather than the story the data are trying to tell us. Here are a few examples:
We fail to appreciate the expected variability in our measurements. When we weigh ourselves, we rejoice over a pound lost and agonize over a pound gained, even though we know deep down that the next day's measurement is likely to negate the change. In our professional lives, we forget that member satisfaction with clinic visits or employee ratings of their work environment depend on a host of influences other than the quality of our service or the skill of our leadership. If those ratings are unfavorable, we can be tempted to launch an intervention and repeat the survey, focusing on areas where we received unfavorable ratings. This approach fails to recognize that high or low values of any measure tend to drift back toward the norm (statisticians call this "regression to the mean"). Unless we identify a comparison group, we may congratulate ourselves on improvements that are really explained by random variation in our measures, contextual changes in the environment, or simply the passage of time.
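The pattern described above can be seen in a simple simulation. The sketch below uses invented numbers purely for illustration: each person has a stable underlying score, every survey measurement adds random noise, and we "intervene" on the worst-rated group. With no intervention at all, the group's repeat measurement drifts back toward the population average.

```python
import random

random.seed(42)

# Each person has a stable "true" score; any single measurement adds
# random noise. (All values here are invented for illustration.)
N = 10_000
POP_MEAN, TRUE_SD, NOISE_SD = 50, 10, 10
true_scores = [random.gauss(POP_MEAN, TRUE_SD) for _ in range(N)]

def measure(true_score):
    # One noisy survey measurement of an underlying true score.
    return true_score + random.gauss(0, NOISE_SD)

first = [measure(t) for t in true_scores]
second = [measure(t) for t in true_scores]  # repeat survey, no intervention

# Select the "worst-rated" 10% on the first survey -- the group an
# organization would be most tempted to target for improvement.
cutoff = sorted(first)[N // 10]
selected = [i for i in range(N) if first[i] <= cutoff]

mean_first = sum(first[i] for i in selected) / len(selected)
mean_second = sum(second[i] for i in selected) / len(selected)

print(f"Selected group, first survey:  {mean_first:.1f}")
print(f"Selected group, repeat survey: {mean_second:.1f}")
```

Because the worst first-survey scores are partly bad luck in the noise, the same group scores noticeably better on the repeat survey even though nothing changed, which is exactly why an untreated comparison group is needed before crediting an intervention.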
Our closely-held belief that our circumstances are unique may lead us to unwarranted skepticism about the effectiveness of an intervention. We may under-interpret data when we consider adopting an intervention that has been successful in another site (for us, even another KP region). This tendency is so widespread that it has its own acronym, N-I-H, which stands not for the National Institutes of Health, but rather for “Not Invented Here”. Overcoming this problem requires innovators to assess contextual factors that might limit dissemination and adopters to endorse opportunities for replication.
How do we get it right? Even experienced researchers misinterpret data. Here are a few suggestions for counteracting our biases and interpreting data more accurately:
Encourage disagreement. Other people inevitably see different stories in data than we do. It’s important to surround ourselves with colleagues whose biases differ from ours, and to create professional environments where it is safe for others to challenge our narratives.
Less is more. In research, we regularly need to condense the findings of a five-year, multi-million dollar study into four tables and a figure for publication in a scientific journal. Too often, operational decision-makers are confronted with stacks of multi-colored charts and complex tables that obscure the story of the findings. Data analysis is inevitably interpretive, and our goal in presenting our findings should not be to prove that we have done a lot of work, but rather to get to the essence of the data, the heart of the story.
John F. Steiner, MD, MPH
HCSRN 2017 is Around the Corner!
The 2017 Health Care Systems Research Network (HCSRN) Conference will be held in San Diego, CA from March 21-23. Poster and panel submissions will be accepted until Friday, October 7. Register by December 16 to save on registration fees.
Studying Access to Naloxone
IHR Investigators Ingrid Binswanger, MD, MPH, and Jason Glanz, PhD, recently received an R01 grant from the National Institute on Drug Abuse (NIDA) to study the safety and impact of expanded access to naloxone, a medication that reverses the effects of an opioid overdose, in primary care practices. The study is a partnership with researchers at Denver Health. In 2015, Colorado passed legislation to approve standing orders for naloxone, which will allow pharmacists to provide naloxone without provider-generated prescriptions. In this new environment, teams at the IHR and Denver Health will have the opportunity to study the wide scale adoption of this new practice and its implications.
IHR Evaluation Project Manager Marisa Allen, PhD, recently co-presented a training for the Colorado Evaluation Network. The presentation focused on educating evaluators and researchers about a structured approach called Emergent Learning, which can be used to improve evaluation design, debrief evaluation results, and integrate multiple sources of evidence as a means to strengthen the quality of evaluation projects. Learn more about upcoming events.
Check out the most recent IHR Publications here.