Quality and clarity in systematic review abstracts: an empirical study

Res Synth Methods. 2016 Dec;7(4):447-458. doi: 10.1002/jrsm.1221. Epub 2016 Oct 20.

Abstract

Background: Systematic review (SR) abstracts are important for disseminating evidence syntheses to inform medical decision making. We assessed the reporting quality of SR abstracts against PRISMA for Abstracts (PRISMA-A), the Cochrane Handbook, and Agency for Healthcare Research and Quality (AHRQ) guidance.

Methods: We evaluated a random sample of 200 SR abstracts, published in 2014 in the general medical literature, that compared interventions. We assessed adherence to PRISMA-A criteria, problematic wording in conclusions, and whether abstracts reporting "positive" findings described their clinical significance.

Results: On average, abstracts reported 60% of PRISMA-A checklist items (mean 8.9 ± 1.7 items, range 4 to 12). Eighty percent of meta-analyses reported quantitative measures with a confidence interval. Only 49% described effects in terms meaningful to patients and clinicians (e.g., absolute measures), and only 43% mentioned strengths or limitations of the evidence base. Mean abstract length was 274 words (SD 89), and word count explained only 13% of the variability in PRISMA-A scores. PRISMA-A scores did not differ between Cochrane and non-Cochrane abstracts (mean difference 0.08, 95% confidence interval -1.16 to 1.00). Of 275 primary outcomes, 48% were statistically significant, 32% were not statistically significant, and 19% did not report significance or results. Only one abstract described the clinical significance of positive findings. For "negative" outcomes, we identified problematic wording: simple restatements of the result (20%), vague "no evidence of effect" phrasing (9%), and wishful wording (8%).

Conclusions: Improved SR abstract reporting is needed, particularly the reporting of quantitative measures (for meta-analyses), easily interpretable units, strengths and limitations of the evidence, and clinical significance, along with clarification of whether negative results reflect true equivalence between treatments. Copyright © 2016 John Wiley & Sons, Ltd.

Keywords: PRISMA; clinical significance; minimally important difference; reporting quality; systematic review abstracts.

MeSH terms

  • Abstracting and Indexing* / standards
  • Databases, Bibliographic
  • Decision Making
  • Empirical Research
  • Periodicals as Topic / standards
  • Publishing / standards
  • Quality Control
  • Research Design*
  • Systematic Reviews as Topic*
  • United States
  • United States Agency for Healthcare Research and Quality