Ioannidis 2014 Lancet

From Bioblast
Ioannidis JP, Greenland S, Hlatky MA, Khoury MJ, Macleod MR, Moher D, Schulz KF, Tibshirani R (2014) Increasing value and reducing waste in research design, conduct, and analysis. Lancet 383:166-75.

» PMID: 25552691 Open Access


Abstract: Correctable weaknesses in the design, conduct, and analysis of biomedical and public health research studies can produce misleading results and waste valuable resources. Small effects can be difficult to distinguish from bias introduced by study design and analyses. An absence of detailed written protocols and poor documentation of research is common. Information obtained might not be useful or important, and statistical precision or power is often too low or used in a misleading way. Insufficient consideration might be given to both previous and continuing studies. Arbitrary choice of analyses and an overemphasis on random extremes might affect the reported findings. Several problems relate to the research workforce, including failure to involve experienced statisticians and methodologists, failure to train clinical researchers and laboratory scientists in research methods and design, and the involvement of stakeholders with conflicts of interest. Inadequate emphasis is placed on recording of research decisions and on reproducibility of research. Finally, reward systems incentivise quantity more than quality, and novelty more than reliability. We propose potential solutions for these problems, including improvements in protocols and documentation, consideration of evidence from studies in progress, standardisation of research efforts, optimisation and training of an experienced and non-conflicted scientific workforce, and reconsideration of scientific reward systems.


Labels: MiParea: Instruments;methods 






Gentle Science

Selected text quotes

Recommendations

  • Make publicly available the full protocols, analysis plans or sequence of analytical choices, and raw data for all designed and undertaken biomedical research.
  • Maximise the effect-to-bias ratio in research through defensible design and conduct standards, a well trained methodological research workforce, continuing professional development, and involvement of non-conflicted stakeholders.
  • Reward (with funding, and academic or other recognition) reproducibility practices and reproducible research, and enable an efficient culture for replication of research.
  • Journals should consider setting some design prerequisites for particular types of studies before they accept reports for publication. This requirement goes beyond simply asking for transparency from investigators in reporting of what was done. Examples include the MIAME (Minimum Information About a Microarray Experiment) guidelines for microarray experiments.

Statistics and reproducibility

  • A study of reports published in 2001 showed that p values did not correspond to the given test statistics in 38% of articles published in Nature and 25% in the British Medical Journal.
  • Researchers at Bayer could not replicate 43 of 67 oncological and cardiovascular findings reported in academic publications. Researchers at Amgen could not reproduce 47 of 53 landmark oncological findings for potential drug targets.
  • The scientific reward system places insufficient emphasis on investigators doing rigorous studies and obtaining reproducible results.
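The mismatch between reported p values and test statistics noted above can be checked mechanically. A minimal sketch, assuming a two-sided z test; the tolerance and the example values are illustrative, not taken from the paper:

```python
import math

def p_from_z(z):
    """Two-sided p value for a z statistic, via the complementary error function."""
    return math.erfc(abs(z) / math.sqrt(2))

def p_matches(reported_p, z, tol=0.005):
    """Check whether a reported p value is consistent with its test statistic."""
    return abs(p_from_z(z) - reported_p) < tol

# z = 1.96 corresponds to p ≈ 0.05 for a two-sided test
print(p_matches(0.05, 1.96))   # consistent
print(p_matches(0.001, 1.96))  # inconsistent: reported p does not match z
```

Running such a check over the statistics reported in a batch of articles is essentially how the 38 %/25 % discrepancy rates quoted above were obtained.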

Consideration of other evidence

  • Typically, every study is designed, done, and discussed in isolation.
  • Most research designs do not take account of similar studies being done at the same time.
  • Systematic reviews of previous data ... are done retrospectively, and investigators might have some knowledge of the data, even as they design the review.

Protocol documentation

  • Use of a strict preconceived protocol might not be feasible for some exploratory research, but nonetheless investigators should rigorously document the sequence of decisions and findings made in the course of the study, and reasons for those decisions.
  • Systematic reviews with written protocols detailing prespecified steps can now be registered prospectively. Protocol registration will not avoid the need for unanticipated deviations from the protocol, but would make deviations more visible and open to public judgment.
  • Registration of clinical trials became widespread only when it became a prerequisite for publication in most major journals. Similarly, protocol or dataset registration and deposition is likely to become widely adopted only with similar incentives—eg, if a prerequisite for funding and publication of research reports.
  • Another option is to encourage or require full external peer review and publication of protocols in journals.
  • For preclinical laboratory and animal studies, prespecified protocols that are publicly deposited might also be desirable. Researchers who do this research have little experience of using protocols, and feasibility needs to be probed.
  • Date-stamped study protocols—including a statement of purpose or the hypotheses to be tested, power calculations, methods for data collection, and a statistical analysis plan—could be made available to journal reviewers on request.
  • Protocols should be written prospectively for studies on the basis of such data repositories.
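A date-stamped protocol of the kind described above can be frozen with a cryptographic fingerprint, so that any later deviation is detectable. A minimal sketch; the field names are illustrative assumptions, not a standard registration schema:

```python
import hashlib
import json
from datetime import datetime, timezone

# Illustrative protocol record; the fields mirror the elements listed above
# (purpose, hypotheses, power calculation, analysis plan)
protocol = {
    "purpose": "test whether treatment X lowers marker Y",
    "hypotheses": ["Y is lower in treated vs control at day 14"],
    "power_calculation": {"alpha": 0.05, "power": 0.8, "effect_size": 0.5},
    "analysis_plan": "two-sample t test on day-14 Y, intention to treat",
}

# Canonical serialisation, then a SHA-256 fingerprint of the frozen protocol
serialised = json.dumps(protocol, sort_keys=True).encode()
record = {
    "sha256": hashlib.sha256(serialised).hexdigest(),
    "registered_utc": datetime.now(timezone.utc).isoformat(),
}
print(record["sha256"][:12], record["registered_utc"])
```

Publishing the hash and timestamp (e.g. in a registry) commits investigators to the protocol without necessarily disclosing its content before the study is done.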

Consortia

  • Large and inclusive consortia can have a more comprehensive view of what is already available or underway in the specialty through the enhancement of communication between investigators. New and interesting ideas can be proposed by individual investigators and then tested efficiently at the consortium level. Consortia have been particularly successful in situations in which there is consensus that maximisation of sample size is paramount, such as in genome-wide association studies.

Standardisation and development

  • Full standardisation of definitions and analytical procedures could be feasible for new research efforts.
  • For existing datasets and studies, harmonisation attempts to achieve some, but not necessarily perfect, homogeneity of definitions might need substantial effort and coordination.
  • Large consortia and collaborations can allow the use of a common language among investigators for clinical definitions, laboratory measurements, and statistical analyses.
  • The way that many laboratory studies are reported suggests that scientists are unaware that their methodological approach is without rigour.
  • Research is often done by stakeholders with conflicts of interest that favour specific results.
  • Clinical and laboratory researchers might also benefit from an opportunity to update their skills in view of newer methodological developments, perhaps through short courses and novel approaches to continued methodological education.

Publication

  • Statistically significant results, prestigious authors or journals, and well connected research groups attract more citations than do studies without these factors, creating citation bias.
  • Many appointment and promotion committees function under a misplaced emphasis on number of publications. Although publication of research is essential, use of number of publications as an indicator of scholarly accomplishment stresses quantity rather than quality.
  • Almost 20 years ago, Altman noted that there was a need for less research, better research, and research done for the right reasons.
  • Researchers are tempted to promise and publish exaggerated results to continue their funding for what they think of as innovative work.
  • PLOS One has pledged to publish reproducibility checks done by contracted independent laboratories as part of the reproducibility initiative.
  • Public availability of raw data and complete scripts of statistical analyses could be required by journals and funding agencies sponsoring new research.
  • Scientific productivity cannot be judged simply by number of publications. Publication of many low-quality articles is worse than is production of none.
  • Post-publication peer review might provide further insights about study quality and reproducibility, but few data exist for the effectiveness of this approach.

PubMed Commons

  • The development of electronic publishing could allow for post-publication ratings and comments on scientific work.
  • One author (RT) has helped to create such a system at PubMed, called PubMed Commons. It is a new feature built into PubMed: researchers can add a comment to any publication and read the comments of others. PubMed Commons is a forum for open and constructive criticism and discussion of scientific issues. At present, comments are not anonymous, to maintain the quality of the interchange.


Ten options to improve the quality of animal research

Protocols and optimum design

1 Creation of a publicly accessible date-stamped protocol preceding data collection and analysis, or clear documentation that research was entirely exploratory.
2 Use of realistic sample size calculations.
3 Focus on relevance, not only statistical efficiency.
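Point 2's "realistic sample size calculations" can be sketched with the standard normal-approximation formula for a two-group comparison. This is a sketch under stated assumptions (Cohen's d as the effect measure; a t-based correction would add roughly one or two subjects per group); the example effect sizes are illustrative:

```python
import math
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.8):
    """Normal-approximation sample size per group for a two-sample comparison.

    Uses n = 2 * (z_{1-alpha/2} + z_{power})^2 / d^2, where d is the
    standardised difference between groups (Cohen's d).
    """
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    return math.ceil(2 * (z_alpha + z_beta) ** 2 / effect_size ** 2)

# Even a "large" effect (d = 0.8) needs 25 animals per group at 80 % power;
# a moderate effect (d = 0.5) needs 63 per group
print(n_per_group(0.8))  # → 25
print(n_per_group(0.5))  # → 63
```

Working through such numbers before data collection is exactly what exposes the unrealistically small group sizes common in animal research.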

Effect-to-bias ratio

4 Random assignment of groups.
5 Incorporation of blind observers.
6 Incorporation of heterogeneity into the design, whenever appropriate, to enhance generalisability.
7 Increase in multicentre studies.
8 Publishers should adopt and implement the ARRIVE (Animal Research: Reporting In Vivo Experiments) guidelines.
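Points 4 and 5 above (random assignment and blinding) can be made auditable. A minimal sketch, assuming the random seed is recorded in the date-stamped protocol so the allocation list can be regenerated and verified later; the function and group names are illustrative:

```python
import random

def randomise(subject_ids, groups, seed):
    """Randomly assign subjects to groups.

    Recording the seed in the protocol makes the allocation reproducible,
    so a reviewer can regenerate and audit the assignment list.
    """
    rng = random.Random(seed)
    shuffled = subject_ids[:]
    rng.shuffle(shuffled)
    # round-robin over the shuffled order gives near-equal group sizes
    return {sid: groups[i % len(groups)] for i, sid in enumerate(shuffled)}

allocation = randomise(list(range(1, 13)), ["control", "treated"], seed=20140111)
print(allocation)
```

Keeping the mapping from subject ID to group with a third party, while observers score outcomes by ID only, implements the blinding in point 5.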

Workforce and stakeholders

9 Programmes for continuing professional development for researchers.

Reproducibility and reward systems

10 Funders should increase attention towards quality and enforce public availability of raw data and analyses.