Waste occurs during five stages of research production: question selection, study design, research conduct, publication, and reporting. Much of this waste appears to be avoidable or remediable, but few solutions have been proposed. To stimulate and promote research in this area, Cochrane is now calling for nominations for the 2018 prize. Nominations should be submitted by 15 May 2018.
This one-day symposium will bring together key stakeholders in science, including experts from schools of public health and medicine, and will be an ideal platform for discussing ways to promote reproducibility and transparency. It will be held at the Yale School of Public Health, New Haven, Connecticut, on April 16, 2018.
Our postdoctoral fellowship for 2018-19 is now open. Fellowships will focus on the METRICS research areas: methods, reporting, evaluation, reproducibility, and incentives. Apply today!
Earlier this year METRICS brought scientific thought leaders together in Washington, D.C., to discuss how scientists are rewarded for their work. Here we report on some key themes from the workshop.
A call for the adoption of measures to improve key elements of the scientific process: methods, reporting and dissemination, reproducibility, evaluation and incentives.
P values in display items are ubiquitous and almost invariably significant: A survey of top science journals
Metrics News
P values represent a widely used, but pervasively misunderstood and fiercely contested, method of scientific inference. Display items, such as figures and tables, often contain the main results and are an important source of P values. We conducted a survey comparing the overall use of P values and the occurrence of significant P values in display items of a sample of articles in three top multidisciplinary journals (Nature, Science, PNAS) in 2017 versus 1997. We also examined the reporting of multiplicity corrections and its potential influence on the proportion of statistically significant P values. Our findings demonstrate substantial and growing reliance on P values in display items, with increases of 2.5 to 14.5 times in 2017 compared with 1997. The overwhelming majority of P values (94%, 95% confidence interval [CI] 92% to 96%) were statistically significant. Methods to adjust for multiplicity were almost non-existent in 1997, but were reported in many articles relying on P values in 2017 (Nature 68%, Science 48%, PNAS 38%). In their absence, almost all reported P values were statistically significant (98%, 95% CI 96% to 99%). Conversely, when any multiplicity corrections were described, 88% (95% CI 82% to 93%) of reported P values were statistically significant. Use of Bayesian methods was scant (2.5% of articles), and articles rarely (0.7%) relied exclusively on Bayesian statistics. Overall, wider appreciation of the need for multiplicity corrections is a welcome evolution, but the rapid growth of reliance on P values and the implausibly high rates of reported statistical significance are worrisome.
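As a concrete illustration of what a multiplicity correction does (a minimal sketch for readers, not the survey's own analysis code, and with made-up example P values), the two most common procedures, Bonferroni and Holm, can be written in a few lines of plain Python:

```python
# Illustrative sketch of two standard multiplicity corrections.
# The input P values below are hypothetical, not data from the survey.

def bonferroni(p_values, alpha=0.05):
    """Bonferroni: reject a hypothesis only if its P value <= alpha / m."""
    m = len(p_values)
    return [p <= alpha / m for p in p_values]

def holm(p_values, alpha=0.05):
    """Holm step-down: sort P values ascending; reject while the i-th
    smallest P value is <= alpha / (m - i), then stop."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    reject = [False] * m
    for rank, i in enumerate(order):
        if p_values[i] <= alpha / (m - rank):
            reject[i] = True
        else:
            break  # step-down: stop at the first non-rejection
    return reject

p = [0.001, 0.008, 0.039, 0.041, 0.60]
print(bonferroni(p))  # only P values below alpha/5 = 0.01 survive
print(holm(p))        # Holm is uniformly at least as powerful as Bonferroni
```

Without any such correction, every raw P value below 0.05 in this example would be declared significant; with either correction, only the smallest survive, which is why the near-universal significance reported in uncorrected articles is suspect.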
Not all scientific information is created equal. Large differences exist across topics in how much is known, and with what degree of certainty. Some questions are more difficult to answer, and some research tools are more reliable than others. Not all methods can be applied to answer every question. Credibility depends on how large and rigorous studies are, how well researchers have contained conflicts of interest (financial or other), and how successfully the study design and analysis have limited bias, properly accounting for the complexity inherent in each scientific question. Coordinated efforts among scientists, instead of furtive competition, help improve the odds of success. Transparency with full sharing of data, protocols, and computer code improves trust in research findings. Re-analysis of data by independent teams adds to that trust, and replication in new studies further enhances it.
Latest Blog
Meta-Research is coming of age. This is the energizing insight that I brought home from Washington, DC, where I had joined the recent Sackler Colloquium held at the National Academy of Sciences. Organized by David B. Allison, Richard Shiffrin, and Victoria Stodden, and generously supported by the Laura and John Arnold Foundation and others, the colloquium brought together experts from all over the academic and geographic world to discuss "Reproducibility of Research: Issues and Proposed Remedies".
Latest Publication
An-Wen Chan, Annukka Pello, Jessica Kitchen, Anna Axentiev, Jorma I. Virtanen, Annie Liu, Elina Hemminki
- Topic: Reporting
- Journal Title: JAMA
- Publication Date: 2017