How do we make sense of competing results from multiple sources about an intervention or policy?
This is the fundamental question of evidence synthesis. The answer depends on the scope and size of the comparison. Meta-analysis provides tools for summarizing the degree of similarity of results across different evidence sources and for exploring why differences arise. The STEPP Center works to extend meta-analytic methods to address pressing problems facing research today.
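The core idea can be sketched in a few lines: weight each study's effect estimate by its precision, pool, and quantify how much the studies disagree. The following is a minimal illustration (not the Center's software) of inverse-variance pooling and Cochran's Q heterogeneity statistic; the effect sizes and variances are made-up numbers.

```python
def pool_fixed(effects, variances):
    """Inverse-variance weighted average and Cochran's Q heterogeneity statistic."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * t for w, t in zip(weights, effects)) / sum(weights)
    # Q sums the precision-weighted squared deviations from the pooled estimate;
    # large Q relative to (number of studies - 1) signals heterogeneity.
    q = sum(w * (t - pooled) ** 2 for w, t in zip(weights, effects))
    return pooled, q

# Illustrative standardized mean differences from four hypothetical studies.
effects = [0.30, 0.45, 0.12, 0.50]
variances = [0.04, 0.02, 0.05, 0.03]
pooled, q = pool_fixed(effects, variances)
```

Here Q would be compared to a chi-square distribution with 3 degrees of freedom to test whether the four results are consistent with a single underlying effect.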
Current Research Projects
The Center develops meta-analysis methods for all types of syntheses. Its research includes methods for converting outcomes to comparable effect sizes, comparing small numbers of experiments on the same intervention in replication studies, adjusting for publication bias, and implementing these methods in a variety of applications. This research is made accessible through software development and in-person training workshops.
The Statistics of Replication
Replication is a central concept in the logic and rhetoric of science, yet the methodology of replication is surprisingly underdeveloped. This project explores fundamental questions: how to define replication, which statistical analyses are appropriate, and how to design empirical studies to evaluate replication.
Involves: Larry Hedges, Jake Schauer
Meta-Analysis: Modern Methods from Effect Sizes to Meta-Regression
The book by Hedges and Olkin was a major reference work on statistical methods in meta-analysis for many years. This project updates that reference, including over 40 years of innovations.
Involves: Hedges, Tipton, Katie Coburn, Rrita Zejnullahi
Meta-Analysis with Dependent Effect Sizes
Primary studies often report more than one outcome relevant to a meta-analysis. For example, a study might report the effect of an intervention on the same participants at 1 month and 6 months post-intervention, or on multiple outcomes measured at the same time. The reported effects are therefore statistically dependent, and this dependence must be accounted for when pooling them in a meta-analysis. This project focuses on the development, testing, and implementation of robust methods for handling this dependence.
Involves: Beth Tipton, James Pustejovsky
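As a rough illustration of the idea (a sketch, not the project's actual methods), the following computes an intercept-only robust variance estimate in the spirit of Hedges, Tipton & Johnson (2010): effects are clustered by study, each study's effects share an approximate weight, and the variance of the pooled mean is estimated from squared within-study residual sums. The data are made up.

```python
def rve_mean(studies):
    """studies: list of (effects, variances) tuples, one tuple per study.
    Returns the pooled mean and its robust (cluster-based) variance."""
    pooled_num = pooled_den = 0.0
    weights = []
    for effects, variances in studies:
        k = len(effects)
        # Approximate weight: 1 / (k * mean variance), so a study contributing
        # several correlated effects is not over-weighted relative to others.
        w = 1.0 / (k * (sum(variances) / k))
        weights.append(w)
        pooled_num += w * sum(effects)
        pooled_den += w * k
    beta = pooled_num / pooled_den
    # Robust variance: sum over studies of the squared weighted sum of
    # within-study residuals, divided by the squared total weight.
    vr = sum(
        (w * sum(t - beta for t in effs)) ** 2
        for w, (effs, _vars) in zip(weights, studies)
    ) / pooled_den ** 2
    return beta, vr

# Illustrative data: three studies, two of which report multiple dependent effects.
studies = [
    ([0.2, 0.3], [0.05, 0.05]),
    ([0.5], [0.04]),
    ([0.1, 0.4, 0.3], [0.06, 0.06, 0.06]),
]
beta, vr = rve_mean(studies)
```

Because the variance estimate works only with study-level residual sums, it remains valid whatever the unknown correlation structure among effects within a study, which is the key appeal of the robust approach.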
Research Focus Areas
Effect Sizes
Hedges, L. V. & Olkin, I. (2016). Overlap between treatment and control group distributions of an experiment as an effect size measure. Psychological Methods, 21(1), 61-68. DOI:10.1037/met0000042.
Pustejovsky, J. E., Hedges, L. V., & Shadish, W. L. (2014). Design-comparable effect sizes in multiple baseline designs: A general modeling framework. Journal of Educational and Behavioral Statistics, 39, 368-393. DOI:10.3102/1076998614547577.
Shadish, W. R., Hedges, L. V., Pustejovsky, J. E., Boyajian, J. G., Sullivan, K. J., Andrade, A., & Barrientos, J. L. (2014). A d-statistic for single-case designs that is equivalent to the usual between-groups d-statistic. Neuropsychological Rehabilitation, 24, 528-553. DOI:10.1080/09602011.2013.819021.
Shadish, W. R., Hedges, L. V., Pustejovsky, J., Rindskopf, D. M., Boyajian, J. G., & Sullivan, K. J. (2014). Analyzing single-case designs: d, g, hierarchical models, Bayesian estimators, generalized additive models, and the hopes and fears of researchers about analyses. In T. Kratochwill & J. Levin (Eds.). Single-case intervention research: Statistical and methodological advances (pp. 247-281). Washington, DC: American Psychological Association. DOI:10.1037/14376-009.
Hedges, L. V., Pustejovsky, J. E., & Shadish, W. R. (2013). A standardized mean difference effect size for multiple baseline designs. Research Synthesis Methods, 4, 324-341. DOI:10.1002/jrsm.1086.
Hedges, L. V., Pustejovsky, J. E., & Shadish, W. R. (2012). A standardized mean difference effect size for single case designs. Research Synthesis Methods, 3, 224-239. DOI:10.1002/jrsm.1052.
Hedges, L. V. (2011). Effect sizes in three level designs. Journal of Educational and Behavioral Statistics, 36, 346-380. DOI:10.3102/1076998610376617.
Hedges, L. V. (2009). Adjusting a significance test for clustering in designs with two levels of nesting. Journal of Educational and Behavioral Statistics, 34, 464-490. DOI:10.3102/1076998609337251.
Hedges, L. V. (2009). Effect sizes in studies with nested designs. In H. Cooper, L. V. Hedges, & J. Valentine (Eds.). The handbook of research synthesis and meta-analysis (2nd ed.) (pp. 337-356). New York: The Russell Sage Foundation.
Hedges, L. V. (2007). Correcting a significance test for clustering. Journal of Educational and Behavioral Statistics, 32, 151-179. DOI:10.3102/1076998606298040.
Hedges, L. V. (2007). Effect sizes in cluster randomized designs. Journal of Educational and Behavioral Statistics, 32, 341-370. DOI:10.3102/1076998606298043.
Replication
Hedges, L. V. & Schauer, J. (2019). Statistical methods for studying replication: Meta-analytic perspectives. Psychological Methods, 24, 557-570.
Hedges, L. V. & Schauer, J. (2019). More than one replication study is needed for unambiguous tests of replication. Journal of Educational and Behavioral Statistics, 44, 543-570.
Hedges, L. V. & Schauer, J. (2019). Consistency of effects is important in replication. Psychological Methods, 24, 576-577.
Hedges, L. V. (2019). The statistics of replication. Methodology, 15(Supplement), 3-14.
Schauer, J. & Hedges, L. V. (in press). Assessing heterogeneity and power in replications of psychological experiments. Psychological Bulletin.
Schauer, J. & Hedges, L. V. (in press). Reconsidering methods for assessing replication. Psychological Methods.
Meta-Analysis
McShane, B. B. & Böckenholt, U. (2018). Multilevel multivariate meta-analysis with application to choice overload. Psychometrika, 83(1), 255-271. DOI:10.1007/s11336-017-9571-z
Borenstein, M., Higgins, J. P. T., Rothstein, H. R., & Hedges, L. V. (2017). I² is not an absolute measure of heterogeneity in a meta-analysis. Research Synthesis Methods, 8, 5-18. DOI:10.1002/jrsm.1230.
Tipton, E. & Shuster, J. J. (2017). A framework for the meta-analysis of Bland-Altman studies based on a limits of agreement approach. Statistics in Medicine, 36(23), 3621-3635. DOI:10.1002/sim.7352.
McShane, B. B. & Böckenholt, U. (2017). Single-paper meta-analysis: Benefits for study summary, theory testing, and replicability. Journal of Consumer Research, 43(6), 1048-1063. DOI:10.1093/jcr/ucw085.
Hedges, L. V. (2016). Comment on “Misunderstandings about Q and ‘Cochran’s Q test’ in meta-analysis.” Statistics in Medicine, 35, 496-497. DOI:10.1002/sim.6763.
Hedges, L. V. (2016). Applying meta-analysis to structural equation modeling. Research Synthesis Methods, 7, 209-214. DOI:10.1002/jrsm.1214.
Tipton, E. & Pustejovsky, J. E. (2015). Small-sample adjustments to multivariate hypothesis tests in robust variance estimation in meta-regression. Journal of Educational and Behavioral Statistics, 40(6), 604-634. DOI:10.3102/1076998615606099.
Tipton, E. (2015). Small-sample adjustments for robust variance estimation with meta-regression. Psychological Methods, 20(3), 375-393. DOI:10.1037/met0000011.
Tipton, E. (2013). Robust variance estimation in meta-regression for binary dependent outcomes. Research Synthesis Methods, 4(2), 169-187. DOI:10.1002/jrsm.1070.
Hedges, L. V., Tipton, E., & Johnson, M. (2010). Robust variance estimation in meta-regression with dependent effect size estimates. Research Synthesis Methods, 1, 39-65. DOI:10.1002/jrsm.5.
Hedges, L. V. & Pigott, T. D. (2004). The power of statistical tests for moderators in meta-analysis. Psychological Methods, 9, 426-445. DOI:10.1037/1082-989X.9.4.426.
Hedges, L. V. (2002). How can survey research contribute to evidence-based social policy? In C. T. Fitz-Gibbon (Ed.). Contributions to evidence-based social policy. Durham, UK: University of Durham.
Hedges, L. V., Johnson, W., Semaan, S., & Sogolow, E. (2002). Theoretical issues in the synthesis of HIV prevention research. Journal of Acquired Immune Deficiency Syndromes, 30, S8-14. PMID:12107356.
Hedges, L. V. & Pigott, T. D. (2001). The power of statistical tests in meta-analysis. Psychological Methods, 6, 203-217. DOI:10.1037/1082-989X.6.3.203.
Gurevitch, J. & Hedges, L. V. (1999). Statistical issues in ecological meta-analysis. Ecology, 80, 1142-1149. DOI:10.2307/177061.
Hedges, L. V., Gurevitch, J., & Curtis, P. (1999). The meta-analysis of response ratios in experimental ecology. Ecology, 80, 1150-1156. DOI:10.2307/177062.
Publication Bias
Amrhein, V., Gelman, A., Greenland, S., & McShane, B. B. (2019). Abandoning statistical significance is both sensible and practical (No. e27657v1). PeerJ Preprints. DOI: 10.7287/peerj.preprints.27657v1.
Hedges, L. V. (2017). Plausibility and influence in selection models: A comment on Citkowicz and Vevea. Psychological Methods, 22, 42-46. DOI:10.1037/met0000108.
McShane, B. B., & Gal, D. (2017). Statistical significance and the dichotomization of evidence. Journal of the American Statistical Association, 112(519), 885-895. DOI: 10.1080/01621459.2017.1289846.
McShane, B. B., Böckenholt, U., & Hansen, K. T. (2016). Adjusting for publication bias in meta-analysis: An evaluation of selection methods and some cautionary notes. Perspectives on Psychological Science, 11(5), 730-749. DOI: 10.1177/1745691616662243.
Hedges, L. V. & Vevea, J. (2005). Selection method approaches to publication bias. In H. Rothstein, A. Sutton, & M. Borenstein (Eds.). Publication bias in meta-analysis (pp. 145-174). New York, NY: John Wiley. DOI: 10.1002/0470870168.ch9.
Hedges, L. V. & Vevea, J. L. (1996). Estimating effect size under publication bias: Small sample properties and robustness of a random effects selection model. Journal of Educational and Behavioral Statistics, 21, 299-332. DOI: 10.3102/10769986021004299.
Vevea, J. L. & Hedges, L. V. (1995). A general linear model for estimating effect size in the presence of publication bias. Psychometrika, 60, 419-435. DOI: 10.1007/BF02294384.
Hedges, L. V. (1992). Modeling publication selection effects in meta-analysis. Statistical Science, 7, 246-255. DOI: 10.1214/ss/1177011364.
Hedges, L. V. (1984). Estimation of effect size under nonrandom sampling: The effects of censoring studies yielding statistically insignificant mean differences. Journal of Educational Statistics, 9, 61-85. DOI: 10.3102/10769986009001061.
Applications
Miller, D. I., Nolla, K. M., Eagly, A. H., & Uttal, D. H. (2018). The development of children's gender-science stereotypes: A meta-analysis of 5 decades of U.S. draw-a-scientist studies. Child Development, 89(6), 1943-1955. DOI:10.1111/cdev.13039.
Bediou, B., Adams, D. M., Mayer, R. E., Tipton, E., Green, C. S., & Bavelier, D. (2018). Meta-analysis of action video game impact on perceptual, attentional, and cognitive skills. Psychological Bulletin, 144(1), 77-110. DOI:10.1037/bul0000130.
Uttal, D. H., Meadow, N. G., Tipton, E., Hand, L. L., Alden, A. R., Warren, C., & Newcombe, N. S. (2013). The malleability of spatial skills: A meta-analysis of training studies. Psychological Bulletin, 139(2), 352-402. DOI:10.1037/a0028446.
Johnson, W., Hedges, L. V., Ramírez, G., Seeman, S., Norman, L., Sogolow, E., Sweat, M., & Diaz, M. (2002). HIV prevention research for men who have sex with men: A systematic review and meta-analysis. Journal of Acquired Immune Deficiency Syndromes, 30, S118-129. PMID:12107365.
Mullen, P. D., Ramírez, G., Strouse, D., Hedges, L. V., & Sogolow, E. (2002). Meta-analysis of the effects of behavioral HIV prevention interventions on the sexual risk behavior of sexually experienced adolescents in US controlled studies. Journal of Acquired Immune Deficiency Syndromes, 30, S94-105. PMID:12107363.
Neuman, M. S., Johnson, W. D., Semaan, S., Flores, S. A., Peersman, G., Hedges, L. V., & Sogolow, E. D. (2002). Review and meta-analysis of HIV prevention intervention research for heterosexual adult populations in the United States. Journal of Acquired Immune Deficiency Syndromes, 30, S106-117. PMID:12107364.
Semaan, S., DesJarlais, D., Sogolow, E., Johnson, W., Hedges, L., Ramírez, G., Flores, S., Norman, L., Sweat, M., & Needle, R. (2002). A meta-analysis of the effect of HIV prevention programs on the sex behaviors of drug users in the United States. Journal of Acquired Immune Deficiency Syndromes, 30, S73-93. PMID:12107362.
Hedges, L. V. (2000). Using converging evidence in policy formation: The case of research on class size. Evaluation and Research in Education, 14, 193-205. DOI:10.1080/09500790008666972.
Gurevitch, J., Morrison, J. A., & Hedges, L. V. (2000). The interaction between competition and predation: A meta-analysis of field experiments. American Naturalist, 155, 435-453. DOI:10.1086/303337.
Books
Cooper, H. M., Hedges, L. V., & Valentine, J. (Eds.) (2019). The handbook of research synthesis and meta-analysis (3rd ed.). New York: The Russell Sage Foundation.
Borenstein, M., Hedges, L. V., Higgins, J. P. T., & Rothstein, H. R. (2009). Introduction to meta-analysis. London: John Wiley. [Chinese edition, 2012]
Cooper, H. M., Hedges, L. V., & Valentine, J. (Eds.) (2009). The handbook of research synthesis and meta-analysis (2nd ed.). New York: The Russell Sage Foundation.
Cooper, H. M. & Hedges, L. V. (Eds.) (1994). The handbook of research synthesis. New York: The Russell Sage Foundation.
Draper, D., Gaver, D. P., Goel, P. K., Greenhouse, J. B., Hedges, L. V., Morris, C. N., Tucker, J. R., & Waternaux, C. (1993). Combining information: Statistical issues and opportunities for research. Washington, D.C.: American Statistical Association.
Cook, T., Cooper, H. M., Cordray, D., Hedges, L. V., Light, R. J., Louis, T., & Mosteller, F. (1991). Meta-Analysis for Explanation. New York: The Russell Sage Foundation.
Hedges, L. V., Shymansky, J. A., & Woodworth, G. (1989). A practical guide to modern methods of meta-analysis. Washington, D.C.: National Science Teachers Association.
Hedges, L. V. & Olkin, I. (1985). Statistical methods for meta-analysis. New York: Academic Press.
Education & Training
To make these methods more accessible to researchers, the Center provides tutorial papers, online tools and resources (including working papers), seminars and short courses that train students and practitioners, and professional development institutes for established researchers.