Article
Synthesizing Evidence in Public Policy Contexts: The Challenge of Synthesis When There Are Only a Few Studies
Evaluation Review (2016)
  • Jeffrey C. Valentine
  • Sandra Jo Wilson
  • David Rindskopf
  • Timothy S. Lau
  • Martha Yeide
  • Robin LaSota
  • Emily E. Tanner-Smith
  • Lisa Foster, Liberty University
Abstract
For a variety of reasons, researchers and evidence-based clearinghouses synthesizing the results of multiple studies often have very few studies that are eligible for any given research question. This situation is less than optimal for meta-analysis as it is usually practiced, that is, by employing inverse variance weights, which allow more informative studies to contribute relatively more to the analysis. This article outlines the choices available for synthesis when there are few studies to synthesize. As background, we review the synthesis practices used in several projects done at the behest of governmental agencies and private foundations. We then discuss the strengths and limitations of different approaches to meta-analysis in a limited-information environment. Using examples from the U.S. Department of Education's What Works Clearinghouse as case studies, we conclude with a discussion of Bayesian meta-analysis as a potential solution to the challenges encountered when attempting to draw inferences about the effectiveness of interventions from a small number of studies.
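
To make the contrast in the abstract concrete, the following minimal Python sketch shows (a) conventional inverse-variance (fixed-effect) pooling, in which more precise studies receive larger weights, and (b) a simple conjugate normal-normal Bayesian pooling of the same estimates with an informative prior. The effect sizes, variances, and prior values are hypothetical and chosen only for illustration; this is not the authors' analysis or a What Works Clearinghouse procedure.

    # Illustrative sketch only: hypothetical effect sizes, not data from the article.
    import numpy as np

    # Hypothetical standardized mean differences and their sampling variances
    # from three small studies.
    effects = np.array([0.30, 0.10, 0.45])
    variances = np.array([0.04, 0.09, 0.16])

    # (a) Inverse-variance (fixed-effect) pooling: weights are the reciprocals
    # of the sampling variances, so more informative studies count for more.
    weights = 1.0 / variances
    iv_mean = np.sum(weights * effects) / np.sum(weights)
    iv_se = np.sqrt(1.0 / np.sum(weights))
    print(f"Inverse-variance pooled effect: {iv_mean:.3f} (SE = {iv_se:.3f})")

    # (b) A simple conjugate normal-normal Bayesian alternative: combine the
    # same study estimates with an informative prior (mean 0.20, SD 0.15,
    # values assumed for illustration). With only a few studies, the prior
    # stabilizes the pooled estimate rather than relying solely on sparse data.
    prior_mean, prior_var = 0.20, 0.15 ** 2
    post_precision = 1.0 / prior_var + np.sum(weights)
    post_mean = (prior_mean / prior_var + np.sum(weights * effects)) / post_precision
    post_sd = np.sqrt(1.0 / post_precision)
    print(f"Posterior mean effect: {post_mean:.3f} (posterior SD = {post_sd:.3f})")

A fuller Bayesian meta-analysis of the kind discussed in the article would typically also model between-study heterogeneity (e.g., a random-effects variance component) rather than treating the prior and likelihood this simply.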

Keywords
  • methodological development,
  • content area,
  • education,
  • research synthesis,
  • systematic review,
  • meta-analysis,
  • Bayesian statistics
Publication Date
October 2016
DOI
10.1177/0193841X16674421
Citation Information
Jeffrey C. Valentine, Sandra Jo Wilson, David Rindskopf, Timothy S. Lau, et al. "Synthesizing Evidence in Public Policy Contexts: The Challenge of Synthesis When There Are Only a Few Studies." Evaluation Review (2016).
Available at: http://works.bepress.com/lisa-foster/3/