NLM Gateway
A service of the U.S. National Institutes of Health: your entrance to resources from the National Library of Medicine

Detecting and adjusting for publication bias in meta-analysis: a systematic review.

Pham B, Moher D, Platt R, McAuley L, Jones A, Klassen TP; Cochrane Colloquium (7th : 1999 : Università S. Tommaso d'Aquino).

Best Evid Health Care Cochrane Colloq 7th 1999 Univ S Tommaso Daquino. 1999; 7: 16.

Thomas C. Chalmers Centre for Systematic Reviews, Children's Hospital of Eastern Ontario Research Institute, Ottawa, Canada.

INTRODUCTION: If publication bias is present in a meta-analysis, the estimate of intervention effectiveness may be biased. Meta-analysts need to know how to detect publication bias and, if it is present, adjust for it in the analysis.
OBJECTIVES: We conducted a systematic review to identify methods to detect, assess the impact of, and adjust for publication bias. We also compared these methods using an unbiased sampling frame.
METHODS: We searched Medline and Science Index (1966-99) and MathSciNet (1940-99) for relevant articles. After an initial screening [n=332], the remaining articles [n=67] were reviewed independently by two reviewers (BP, RP) using the following criteria: basic supporting theory, assumptions, method outcomes, estimation, limitations, simplicity, and generality. To evaluate the performance of the included methods, we used 26 meta-analyses comprising 400 randomized trials, 73 of which were unpublished.
RESULTS: Thirty-one methods were identified and classified into four groups according to their initial concepts: file-drawer (7 methods), funnel plot (9 methods), selection model (11 methods), and selection model with data augmentation (4 methods). Building on Rosenthal's fail-safe number, more recent file-drawer methods estimate the number of unpublished studies. Graphical inspection of a funnel plot can be supplemented with a rank-correlation test, linear or logistic regression analyses, and a simple rank-based data augmentation technique. "Trim and fill" methods estimate the treatment effect while adjusting for the number and outcomes of missing studies. Selection models estimate the treatment effect while allowing for modeling of non-randomly selected studies. Parameter estimation in these models used maximum likelihood, the expectation-maximization (EM) algorithm, and Markov chain Monte Carlo simulation. We discuss Bayesian hierarchical selection models, the data augmentation technique, and their application to modeling both the selection process and sensitivity to unobserved studies. Thirteen of the 26 meta-analyses of published trials had statistically significant results. Of these, none became non-significant with the inclusion of unpublished trials; four became non-significant after adjusting for publication bias with the "trim and fill" method, three with the "simple, graphical" method, and two with a selection model. On average, estimates from published studies overestimated the treatment effect by 6% (inter-quartile range -3% to 43%). The "trim and fill" method overcompensated, underestimating the treatment effect by 6% (-39% to 18%) on average; the "simple, graphical" method overestimated it by 21% (-19% to 76%), and a selection model overestimated it by 47% (-22% to 104%).
DISCUSSION: We identified a large number of methods developed to detect and adjust for publication bias. The methods are diverse and, when compared with one another, can give estimates of publication bias that differ in both direction and magnitude.
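
For orientation only, and not part of the original record: the sketch below illustrates textbook versions of two of the method families the abstract surveys, Rosenthal's fail-safe number (file-drawer group) and an Egger-style regression test for funnel-plot asymmetry (funnel-plot group). The study effects, standard errors, and function names are hypothetical, and these generic formulas may differ from the specific variants compared in the review.

    # Illustrative sketch with hypothetical data; assumes numpy and scipy are available.
    import numpy as np
    from scipy import stats

    def fail_safe_n(z_scores, alpha=0.05):
        # Rosenthal's fail-safe N: how many unpublished null (z = 0) studies would be
        # needed to make the Stouffer combined z non-significant at the one-tailed alpha.
        z = np.asarray(z_scores, dtype=float)
        z_alpha = stats.norm.ppf(1 - alpha)          # about 1.645 for alpha = 0.05
        return (z.sum() ** 2) / (z_alpha ** 2) - z.size

    def egger_test(effects, std_errors):
        # Egger-style regression: regress the standardized effect (effect / SE) on
        # precision (1 / SE); an intercept far from zero suggests funnel-plot asymmetry.
        y = np.asarray(effects) / np.asarray(std_errors)
        x = 1.0 / np.asarray(std_errors)
        fit = stats.linregress(x, y)
        t = fit.intercept / fit.intercept_stderr      # t-test on the intercept
        p = 2 * stats.t.sf(abs(t), df=len(y) - 2)
        return fit.intercept, p

    # Hypothetical log odds ratios and standard errors for eight trials.
    effects = np.array([-0.6, -0.4, -0.5, -0.3, -0.7, -0.2, -0.8, -0.1])
    ses = np.array([0.15, 0.20, 0.25, 0.30, 0.18, 0.35, 0.22, 0.40])
    print("fail-safe N:", round(fail_safe_n(effects / ses), 1))
    print("Egger intercept and p-value:", egger_test(effects, ses))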

Publication Types:
  • Meeting Abstracts
Keywords:
  • Bias (Epidemiology)
  • Biomedical Research
  • Clinical Trials as Topic
  • Meta-Analysis
  • Publication Bias
  • Publishing
  • Regression Analysis
  • Research
  • United States
  • methods
  • statistics & numerical data
Other ID:
  • HTX/20603239
UI: 102194928

From Meeting Abstracts



