Translating Evidence

What is the best approach to convey evidence to a diverse audience of consumers?

This is the fundamental question of evidence translation. Evidence should be useful not only for researchers but also for policymakers and practitioners, and the methods and tools appropriate for one audience may not be optimal for another. The STEPP Center facilitates research to determine best practices for reporting effects and conveying uncertainty visually, in documents, web tools, and software.

Current Research Projects

The Center is beginning to engage in research on how users understand statistical summaries in research reviews. This work examines how users interpret effect sizes representing program impacts, how they understand different representations of uncertainty, how they interpret heterogeneity of findings, and what standards should govern the reporting of research.

What We Have Learned in 20 Years of IES Randomized Trials

The US Institute of Education Sciences (IES) has supported more than 300 randomized trials of education interventions, products, and services. This project will explore what has been learned from the trials supported by IES over the past 20 years, with particular emphasis on effect sizes, statistical significance tests, innovations in research design, and the development of the scientific workforce.

Involves: Larry Hedges, Beth Tipton, Chris Klager, Maddy Mullaney, Natalie Simbolon

Clearinghouse Collaborations

Evidence-based policy and practice has led to the creation of a new institutional form: the research clearinghouse. Dozens of research clearinghouses have emerged and, although they face many of the same technical problems, they proceed independently. This project aims to foster collaboration and coordination among clearinghouses across the world.

Involves: Larry Hedges, Beth Tipton

Research Focus Areas

Interpretation of Effect Sizes

Hedges, L. V. & Olkin, I. (2016). Overlap between treatment and control group distributions of an experiment as an effect size measure. Psychological Methods, 21, 61-68. DOI:10.1037/met0000042.

Hedges, L. V. (2008). What are effect sizes and why do we need them? Child Development Perspectives, 2, 167-171. DOI:10.1111/j.1750-8606.2008.00060.x.

Konstantopoulos, S. & Hedges, L. V. (2008). How large an effect can we expect from school reforms? Teachers College Record, 110, 1613-1640. TCID:15151.
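
The overlap measure in Hedges and Olkin (2016) has a simple closed form under common assumptions. The sketch below is a minimal illustration, not code from the papers above, and its numbers are hypothetical: it computes a standardized mean difference and, assuming two equal-variance normal distributions, the corresponding overlap coefficient OVL = 2Φ(−|d|/2).

```python
from statistics import NormalDist

def cohens_d(mean_t, mean_c, sd_pooled):
    """Standardized mean difference between treatment and control groups."""
    return (mean_t - mean_c) / sd_pooled

def overlap(d):
    """Overlap coefficient of two equal-variance normal distributions
    whose means differ by d standard deviations: OVL = 2 * Phi(-|d| / 2)."""
    return 2 * NormalDist().cdf(-abs(d) / 2)

# Hypothetical treatment/control means on a scale with SD = 15.
d = cohens_d(mean_t=105.0, mean_c=100.0, sd_pooled=15.0)
print(f"d = {d:.2f}, overlap = {overlap(d):.0%}")  # d = 0.33, overlap = 87%
```

The sketch makes concrete why such measures matter for translation: an effect that sounds substantial as a mean difference can still correspond to heavily overlapping treatment and control distributions.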

Representing Uncertainty

Kim, Y. S., Wallis, L. A., Krafft, P., & Hullman, J. (2019). A Bayesian cognition approach to improve data visualization. ACM Human Factors in Computing Systems (CHI) 2019. DOI:10.1145/3290605.3300912.

Phelan, C., Hullman, J., Kay, M., & Resnick, P. (2019). Some prior(s) experience necessary. ACM Human Factors in Computing Systems (CHI) 2019. DOI:10.1145/3290605.3300709.

Hullman, J., Resnick, P., & Adar, E. (2015). Hypothetical outcome plots outperform error bars and violin plots for inferences about reliability of variable ordering. PLOS ONE, 10, e0142444. DOI:10.1371/journal.pone.0142444.
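
Hypothetical outcome plots (HOPs) convey uncertainty as a sequence of animated random draws from a distribution rather than as a static interval. The sketch below is a minimal illustration of the underlying idea, not the authors' code; the group means, standard errors, and frame count are made up.

```python
import numpy as np

rng = np.random.default_rng(0)
groups = {"A": (0.30, 0.10), "B": (0.45, 0.12)}  # hypothetical (mean, SE)

n_frames = 50  # each frame is one animated draw shown to the viewer
frames = {name: rng.normal(mu, se, n_frames) for name, (mu, se) in groups.items()}

for i in range(5):  # print a few frames in place of an animation
    print(f"frame {i}: A={frames['A'][i]:.2f}  B={frames['B'][i]:.2f}")

# A viewer can estimate P(B > A) by counting frames in which B's draw is higher.
print("share of frames with B > A:", (frames["B"] > frames["A"]).mean())
```

In an actual HOP, each draw would be rendered as one animation frame (for example, as bar heights), so viewers experience uncertainty as the frequency with which they see one outcome above another.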

Representing Heterogeneity of Findings

Hedges, L. V. & Schauer, J. M. (2019). Statistical analyses for studying replication: Meta-analytic perspectives. Psychological Methods. DOI:10.1037/met0000189.

Borenstein, M., Higgins, J. P. T., Rothstein, H. R., & Hedges, L. V. (2017). I² is not an absolute measure of heterogeneity in a meta-analysis. Research Synthesis Methods, 8, 5-18. DOI:10.1002/jrsm.1230.
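
I² expresses the percentage of observed variation in effect estimates beyond what sampling error alone would produce; because it depends on the within-study variances, it is relative rather than absolute, which is the point of Borenstein et al. (2017). The sketch below shows the standard computation from Cochran's Q, using made-up study estimates.

```python
import numpy as np

# Made-up study effect estimates and their within-study (sampling) variances.
y = np.array([0.12, 0.30, 0.25, 0.05, 0.40])
v = np.array([0.010, 0.020, 0.015, 0.010, 0.025])

w = 1 / v                              # inverse-variance weights
y_bar = (w * y).sum() / w.sum()        # fixed-effect weighted mean
Q = (w * (y - y_bar) ** 2).sum()       # Cochran's Q statistic
df = len(y) - 1
I2 = max(0.0, (Q - df) / Q) * 100      # percent of variation beyond sampling error

print(f"Q = {Q:.2f} on {df} df, I^2 = {I2:.1f}%")
```

Because the same between-study variance yields a different I² when the within-study variances change, two meta-analyses with identical heterogeneity can report very different I² values.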

Standards for Reporting Research

Grant, S., Mayo-Wilson, E., Montgomery, P., MacDonald, G., Michie, S., Hopewell, S., & Moher, D. (2018). CONSORT-SPI 2018 explanation and elaboration: Guidance for reporting social and psychological intervention trials. Trials, 19(1), 406. DOI:10.1186/s13063-018-2735-z.

Education & Training

To make these methods more accessible to researchers, the Center provides tutorial papers; online tools and resources, including working papers; seminars and short courses that train students and practitioners; and professional development institutes for established researchers.