Friday, 27 May 2016

New study scrutinises the quality of systematic reviews indexed in Medline

NHMRC Early Career Fellow Matthew Page, from the Australian Cochrane Centre and currently based at the University of Bristol, UK, has expertise in the area of systematic review methodology and bias in biomedical and public health research. He recently led a cross-sectional study with Dr David Moher and colleagues from the University of Ottawa on the publishing and reporting quality of systematic reviews of biomedical research.

According to the study, recently published in PLOS Medicine, systematic reviews continue to grow increasingly popular, but also continue to vary widely in quality of conduct and reporting.

Systematic reviews, which identify, select, critically appraise, and synthesise the results of all existing studies on a given question, are considered the highest level of evidence for decision-makers.

To determine the number currently published, and assess the quality of conduct and reporting, Page and colleagues assessed a cross section of all systematic reviews indexed in Medline in February 2014.

“To our knowledge, since 2004 there has been no cross-sectional study of the prevalence, focus, and completeness of reporting of systematic reviews across different specialties, so we thought it was timely to gather more recent data,” said Page.

“What we found was that an increasing number of systematic reviews are being published, but that many are poorly conducted and reported. More strategies are needed to help reduce this avoidable waste in research,” said Page.

They found 682 systematic reviews indexed in February 2014, a three-fold increase over the last decade, mainly addressing therapeutic questions and largely originating from the UK, US, and China.

In a random sample of 300 of these systematic reviews, Page and colleagues found that at least a third did not report how the reviewers searched for studies or how they assessed the quality of the included studies. Unpublished data were rarely sought, and at least a third of the reviews used statistical methods discouraged by leading organisations that have developed guidance for systematic reviews (for example, Cochrane and the Institute of Medicine).

Based on their findings, Page and colleagues recommend, “developing software to facilitate better reporting, certified training for journal editors in how to implement the use of reporting guidelines such as PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses), and formal training of biomedical researchers in research design and analysis.”
