Complete and transparent reporting of randomized controlled trials is integral to replication, critical appraisal, and understanding of context. Published July 31 in Trials, a new extension of the CONSORT Statement aims to improve the reporting of randomized controlled trials of social and psychological interventions. Here, co-authors Paul Montgomery, Evan Mayo-Wilson and Sean Grant discuss what these guidelines could mean for the field.
Randomized controlled trials (RCTs) are a prominent method for evaluating whether interventions work.
Of course, RCTs might be conducted in many ways, so they need to be carefully appraised, individually and in systematic reviews, to judge their quality and their applicability in different settings. This is why good reporting of trials is so important and why the CONSORT (Consolidated Standards of Reporting Trials) Statement was created as a guideline for reporting RCTs.
CONSORT originally aimed to address inadequate reporting of RCTs in medicine by identifying minimum standards for describing how RCTs were conducted and what they found. It has had a positive impact in medicine, and has since been extended to address a wide range of interventions and trial designs.
Unfortunately, compared with the medical sciences, implementation of CONSORT and other reporting guidelines has been lacking in the social and behavioral sciences.
As a result, important details are routinely missing from publications of trials evaluating the effects of social and psychological interventions. Most reports of RCTs of social and psychological interventions are not sufficiently comprehensive to replicate the interventions provided, appraise study quality, or understand for whom and under which circumstances results might apply. On top of this, the complex nature of social and psychological interventions often demands greater attention to intervention design, delivery, uptake, and context.
Recently, the conduct and reporting of social and psychological intervention research have come under scrutiny because of the “replication crisis” in the social sciences. A recent NIH policy requiring registration and results reporting for RCTs that weren't previously considered “clinical trials” has led to great controversy.
Against this backdrop, social and behavioral scientists have taken steps to increase research transparency. For instance, there's great enthusiasm for data sharing, and there are emerging standards for promoting an open research culture.
However, even as data, code, and study materials become widely available, journal articles are likely to remain the cornerstone of scientific discourse. Most of us don't have the time, skills, or interest to reanalyze the studies we read. Similarly, registrations are of little value if we don't have clear reports summarizing the results.
While researchers' claims are far more believable when they're based on pre-specified methods and backed by open data, the usefulness of each step depends on the others. That is, new data sharing and registration initiatives will have greatest value if we also have clear and comprehensive reports describing study methods and results.
Increasing Trust in the Social and Behavioral Sciences
In a paper published July 31 in Trials, we present a CONSORT extension for Social and Psychological Interventions, CONSORT-SPI 2018, developed to improve reporting of these trials and thereby increase trust in the social and behavioral sciences.
CONSORT-SPI 2018 identifies the minimum information needed to understand and apply the results of RCTs that evaluate interventions thought to work through social and psychological mechanisms of action. To facilitate adherence to the CONSORT-SPI 2018 checklist, we also created an Explanation and Elaboration document that provides guidance tailored to concepts, theories, and taxonomies used in the social and behavioral sciences.
Ultimately, we hope to help answer the “big question” of how and why interventions work, for whom, and under what conditions. To do so, we need to implement strategies that increase transparency and trust in the social and behavioral sciences, not just pay them lip service.
The July 31 publication of CONSORT-SPI 2018 aims to move these efforts forward. With support from the research community, this guideline could help boost confidence in and use of our research among policymakers and, perhaps most importantly, the public at large.
We look forward to working with authors, editors, and other stakeholders to translate this new reporting guideline into policy and practice. We are also eager to hear from the research community about how CONSORT-SPI 2018 works in practice and how we might improve it in the future. We included the year “2018” in the title to reflect our commitment to update this guideline regularly. Please send us your feedback by email to email@example.com.
Paul Montgomery is professor of social intervention at the University of Birmingham. His work is based around complexity in social interventions and how it may be considered across a range of research designs. He is also a Cross Whitehall Trials Panel member for the UK Government Cabinet Office.
Evan Mayo-Wilson is a core faculty member in the Center for Clinical Trials and Evidence Synthesis, Johns Hopkins Bloomberg School of Public Health. His work focuses on methods for conducting, reporting, and synthesizing studies of interventions for health and social problems. He is particularly interested in ways to increase research transparency and reproducibility, including trial registration and data sharing.
Sean Grant is a behavioral and social scientist at the RAND Corporation and professor at the Pardee RAND Graduate School. He is interested in advancing the transparency, openness, and rigor of intervention research for evidence-based policy and practice. He principally researches interventions for behavioral health, though he is interested in intervention research methodology across the behavioral, social, and health sciences.
This commentary originally appeared on On Medicine (BioMed Central) on July 31, 2018. Commentary gives RAND researchers a platform to convey insights based on their professional expertise and often on their peer-reviewed research and analysis.