The ARRIVE guidelines 2.0: Updated guidelines for reporting animal research

Abstract

Reproducible science requires transparent reporting. The ARRIVE guidelines (Animal Research: Reporting of In Vivo Experiments) were originally developed in 2010 to improve the reporting of animal research. They consist of a checklist of information to include in publications describing in vivo experiments to enable others to scrutinise the work adequately, evaluate its methodological rigour, and reproduce the methods and results. Despite considerable levels of endorsement by funders and journals over the years, adherence to the guidelines has been inconsistent, and the anticipated improvements in the quality of reporting in animal research publications have not been achieved. Here, we introduce ARRIVE 2.0. The guidelines have been updated and information reorganised to facilitate their use in practice. We used a Delphi exercise to prioritise and divide the items of the guidelines into 2 sets, the “ARRIVE Essential 10,” which constitutes the minimum requirement, and the “Recommended Set,” which describes the research context. This division facilitates improved reporting of animal research by supporting a stepwise approach to implementation. This helps journal editors and reviewers verify that the most important items are being reported in manuscripts. We have also developed the accompanying Explanation and Elaboration document, which serves (1) to explain the rationale behind each item in the guidelines, (2) to clarify key concepts, and (3) to provide illustrative examples. We aim, through these changes, to help ensure that researchers, reviewers, and journal editors are better equipped to improve the rigour and transparency of the scientific process and thus reproducibility.

Why good reporting is important

In recent years, concerns about the reproducibility of research findings have been raised by scientists, funders, research users, and policy makers [1, 2]. Factors that contribute to poor reproducibility include flawed study design and analysis, variability and inadequate validation of reagents and other biological materials, insufficient reporting of methodology and results, and barriers to accessing data [3]. The bioscience community has introduced a range of initiatives to address the problem, from open access and open practices to enable the scrutiny of all aspects of the research [4, 5] through to study preregistration to shift the focus towards robust methods rather than the novelty of the results [6, 7], as well as resources to improve experimental design and statistical analysis [8,9,10].

Transparent reporting of research methods and findings is an essential component of reproducibility. Without this, the methodological rigour of the studies cannot be adequately scrutinised, the reliability of the findings cannot be assessed, and the work cannot be repeated or built upon by others. Despite the development of specific reporting guidelines for preclinical and clinical research, evidence suggests that scientific publications often lack key information and that there continues to be considerable scope for improvement [11,12,13,14,15,16,17,18]. Animal research is a case in point: poor reporting hampers the development of therapeutics, and irreproducible findings can spawn an entire field of research or trigger clinical studies, subjecting patients to interventions that are unlikely to be effective [2, 19, 20].

In an attempt to improve the reporting of animal research, the ARRIVE guidelines (Animal Research: Reporting of In Vivo Experiments) were published in 2010. The guidelines consist of a checklist of the items that should be included in any manuscript that reports in vivo experiments, to ensure a comprehensive and transparent description [21,22,23,24,25,26,27,28,29,30]. They apply to any area of research using live animal species and are especially pertinent for describing comparative research in the laboratory or other formal test settings. The guidelines are also relevant in a wider context, for example, for observational research, studies conducted in the field, and work using animal tissues. In the 10 years since publication, the ARRIVE guidelines have been endorsed by more than a thousand journals from across the life sciences. Endorsement typically includes advocating their use in guidance to authors and reviewers. However, despite this level of support, recent studies have shown that important information set out in the ARRIVE guidelines is still missing from most of the publications sampled. This includes details on randomisation (reported in only 30–40% of publications), blinding (reported in only approximately 20% of publications), sample size justification (reported in less than 10% of publications), and animal characteristics (all basic characteristics reported in less than 10% of publications) [11, 31, 32].

Evidence suggests that 2 main factors limit the impact of the ARRIVE guidelines. The first is the extent to which editorial and journal staff are actively involved in enforcing reporting standards. This is illustrated by a randomised controlled trial at PLOS ONE, designed to test the effect of requesting a completed ARRIVE checklist in the manuscript submission process. This single editorial intervention, which did not include further verification from journal staff, failed to improve the disclosure of information in published papers [33]. In contrast, other studies using shorter checklists (primarily focused on experimental design) with more editorial follow-up have shown a marked improvement in the nature and detail of the information included in publications [34,35,36]. It is likely that the level of resource required from journals and editors currently prohibits the implementation of all the items of the ARRIVE guidelines.

The second issue is that researchers and other individuals and organisations responsible for the integrity of the research process are not sufficiently aware of the consequences of incomplete reporting. There is some evidence that awareness of ARRIVE is linked to the use of more rigorous experimental design standards [37]; however, researchers are often unaware of the broader systemic biases in the publication of research, which undermine the reliability of individual findings and even of entire fields [33, 38,39,40]. This lack of understanding affects how experiments are designed and grant proposals prepared, how animals are used and data recorded in the laboratory, and how manuscripts are written by authors or assessed by journal staff, editors, and reviewers.

Approval for experiments involving animals is generally based on a harm–benefit analysis, weighing the harms to the animals involved against the benefits of the research to society. If the research is not reported in enough detail, even when conducted rigorously, the benefits may not be realised, and the harm–benefit analysis and public trust in the research are undermined [41]. As a community, we must do better to ensure that, where animals are used, the research is well designed and analysed, and transparently reported. Here, we introduce the revised ARRIVE guidelines, referred to as ARRIVE 2.0. The information included has been updated, extended, and reorganised to facilitate the use of the guidelines, helping to ensure that researchers, editors, reviewers, and other relevant journal staff are better equipped to improve the rigour and reproducibility of animal research.

Introducing ARRIVE 2.0

In ARRIVE 2.0, we have improved the clarity of the guidelines, prioritised the items, added new information, and generated the accompanying Explanation and Elaboration (E&E) document to provide context and rationale for each item [42] (also available at https://www.arriveguidelines.org). New additions comprise inclusion and exclusion criteria, which are a key aspect of data handling and prevent the ad hoc exclusion of data [43]; protocol registration, a recently emerged approach that promotes scientific rigour and encourages researchers to carefully consider the experimental design and analysis plan before any data are collected [44]; and data access, in line with the FAIR Data Principles (Findable, Accessible, Interoperable, Reusable) [45]. Additional file 1 summarises the changes.

The most significant departure from the original guidelines is the classification of items into 2 prioritised groups, as shown in Tables 1 and 2. There is no ranking of the items within each group. The first group is the “ARRIVE Essential 10,” which describes information that is the basic minimum to include in a manuscript, as without this information, reviewers and readers cannot confidently assess the reliability of the findings presented. It includes details on the study design, the sample size, measures to reduce subjective bias, outcome measures, statistical methods, the animals, experimental procedures, and results. The second group, referred to as the “Recommended Set,” adds context to the study described. This includes the ethical statement, declaration of interest, protocol registration, and data access, as well as more detailed information on the methodology such as animal housing, husbandry, care, and monitoring. Items on the abstract, background, objectives, interpretation, and generalisability also describe what to include in the more narrative parts of a manuscript.

Table 1 ARRIVE Essential 10
Table 2 ARRIVE Recommended Set

Revising the guidelines has been an extensive and collaborative effort, with input from the scientific community carefully built into the process. The revision was undertaken by an international working group (the authors of this publication) with expertise from across the life sciences community, including funders, journal editors, statisticians, methodologists, and researchers from academia and industry. We used a Delphi exercise [46] with external stakeholders to maximise diversity in fields of expertise and geographical location, with experts from 19 countries providing feedback on each item, suggesting new items, and ranking items according to their relative importance for assessing the reliability of research findings. This ranking resulted in the prioritisation of the items of the guidelines into the 2 sets. Demographics of the Delphi panel and full methods and results are presented in Additional files 2 and 3. Following their publication on bioRxiv, the revised guidelines and the E&E were also road tested with researchers preparing manuscripts describing in vivo studies, to ensure that these documents were well understood and useful to the intended users. This study is presented in Additional files 4 and 5.
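
As a purely illustrative aside (the actual Delphi procedure and scoring rules are reported in Additional file 2, not here), the sketch below shows how panel ratings for individual items could, under assumed data and a simple median-based rule, be aggregated and split into two prioritised sets. All item names, ratings, and the cut-off in the code are hypothetical.

```python
from statistics import median

# Hypothetical aggregation sketch: each item receives importance ratings from
# panel members; items are ranked by their median rating and the top-ranked
# items form the illustrative "essential" set. The data below are invented.
panel_scores = {
    "sample size": [9, 8, 9, 9],
    "statistical methods": [8, 9, 8, 9],
    "housing and husbandry": [6, 7, 5, 6],
    "ethical statement": [7, 6, 7, 7],
}


def prioritise(scores: dict[str, list[int]], n_essential: int) -> tuple[list[str], list[str]]:
    """Rank items by median panel rating and split them into two prioritised sets."""
    ranked = sorted(scores, key=lambda item: median(scores[item]), reverse=True)
    return ranked[:n_essential], ranked[n_essential:]


essential, recommended = prioritise(panel_scores, n_essential=2)
print(essential)     # ['sample size', 'statistical methods']
print(recommended)   # ['ethical statement', 'housing and husbandry']
```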

While reporting animal research in adherence to all 21 items of ARRIVE 2.0 represents best practice, the classification of the items into 2 groups is intended to facilitate improved reporting of animal research by allowing an initial focus on the most critical issues. This makes it easier for journal staff, editors, and reviewers to verify that these items have been adequately reported in manuscripts. The first step should be to ensure compliance with the ARRIVE Essential 10 as a minimum requirement. Items from the Recommended Set can then be added over time and in line with specific editorial policies until all the items are routinely reported in all manuscripts. ARRIVE 2.0 is fully compatible with and complementary to other guidelines published in recent years. By providing a comprehensive set of recommendations specifically tailored to the description of in vivo research, the guidelines help authors reporting animal experiments adhere to the National Institutes of Health (NIH) standards [43] and the minimum standards framework and checklist (Materials, Design, Analysis and Reporting [MDAR]) [47]. The revised guidelines are also in line with many journals’ policies and will assist authors in complying with information requirements on the ethical review of the research [48, 49], data presentation and access [50,51,52], statistical methods [51, 52], and conflicts of interest [53, 54].
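
To make the stepwise approach concrete, the sketch below shows how an editorial workflow might represent the two item sets and check the Essential 10 before the Recommended Set. It is a minimal illustration, not a tool endorsed by the guidelines; the item labels are paraphrased topic groupings based on the summary above rather than the exact checklist wording.

```python
# Illustrative two-tier compliance check. Item labels are simplified topic
# groupings paraphrased from this article, not the official ARRIVE 2.0 wording.

ESSENTIAL_10 = [
    "study design", "sample size", "inclusion and exclusion criteria",
    "randomisation", "blinding", "outcome measures", "statistical methods",
    "experimental animals", "experimental procedures", "results",
]

RECOMMENDED_SET = [
    "abstract", "background", "objectives", "ethical statement",
    "housing and husbandry", "animal care and monitoring",
    "interpretation", "generalisability", "protocol registration",
    "data access", "declaration of interests",
]


def staged_compliance(reported_items: set[str]) -> dict:
    """Check the Essential 10 first (minimum requirement), then the Recommended Set."""
    missing_essential = [item for item in ESSENTIAL_10 if item not in reported_items]
    missing_recommended = [item for item in RECOMMENDED_SET if item not in reported_items]
    return {
        "meets_minimum": not missing_essential,      # step 1: Essential 10
        "missing_essential": missing_essential,
        "missing_recommended": missing_recommended,  # step 2: added per editorial policy
    }


report = staged_compliance({"study design", "sample size", "statistical methods", "results"})
print(report["meets_minimum"])       # False
print(report["missing_essential"])   # the six Essential topics still to report
```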

Although the guidelines are written with researchers and journal editorial policies in mind, it is important to stress that researchers alone should not have to carry the responsibility for transparent reporting. Endorsement of ARRIVE by funders, institutions, and publishers has been instrumental in raising awareness to date; these organisations now have a key role to play in building capacity and championing the behavioural changes required to improve reporting practices. This includes embedding ARRIVE 2.0 in appropriate training, workflows, and processes to support researchers in their different roles. While the primary focus of the guidelines has been on the reporting of animal studies, ARRIVE also has other applications earlier in the research process, including in the planning and design of in vivo experiments. For example, requesting a description of the study design in line with the guidelines in funding or ethical review applications ensures that steps to minimise experimental bias are considered at the beginning of the research cycle [55].

Conclusion

Transparent reporting is clearly essential if animal studies are to add to the knowledge base and inform future research, policy, and clinical practice. ARRIVE 2.0 prioritises the reporting of information related to study reliability. This enables research users to assess how much weight to ascribe to the findings and, in parallel, promotes the use of rigorous methodology in the planning and conduct of in vivo experiments [37], thus increasing the likelihood that the findings are reliable and, ultimately, reproducible.

The intention of ARRIVE 2.0 is not to supersede individual journal requirements but to promote a harmonised approach across journals, ensuring that all manuscripts contain the essential information needed to appraise the research. Journals usually share a common objective of improving the methodological rigour and reproducibility of the research they publish, but different journals emphasise different pieces of information [56,57,58]. Here, we propose an expert consensus on the information to prioritise. This will provide clarity for authors, facilitate the transfer of manuscripts between journals, and accelerate the improvement of reporting standards.

Concentrating the efforts of the research and publishing communities on the ARRIVE Essential 10 provides a manageable approach to evaluating reporting quality efficiently and assessing the effect of interventions and policies designed to improve the reporting of animal experiments. It also provides a starting point for the development of operationalised checklists to assess reporting, ultimately leading to the development of automated or semi-automated artificial intelligence tools that can rapidly detect missing information [59].
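
As a toy illustration of what such a semi-automated screen might look for (it does not reproduce any tool cited above), the sketch below flags a few reporting signals that are absent from a methods text. The keyword patterns are assumptions chosen for illustration; real tools would require far more sophisticated text analysis.

```python
import re

# Illustrative keyword screen only; the patterns below are assumptions for the
# sake of the example, not the rules of any published tool.
SIGNALS = {
    "randomisation": r"\brandomi[sz](?:ed|ation)\b",
    "blinding": r"\bblind(?:ed|ing)\b|\bmasked\b",
    "sample size justification": r"\bsample size\b|\bpower (?:analysis|calculation)\b",
    "inclusion/exclusion criteria": r"\b(?:inclusion|exclusion) criteri(?:a|on)\b",
}


def flag_missing(methods_text: str) -> list[str]:
    """Return the reporting signals with no match in the supplied methods text."""
    text = methods_text.lower()
    return [label for label, pattern in SIGNALS.items() if not re.search(pattern, text)]


example = "Animals were randomised to groups; assessors were blinded to treatment."
print(flag_missing(example))  # ['sample size justification', 'inclusion/exclusion criteria']
```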

Improving reporting is a collaborative endeavour, and concerted effort from the biomedical research community is required to ensure maximum impact. We welcome collaboration with other groups operating in this area, as well as feedback on ARRIVE 2.0 and our implementation strategy.

Availability of data and materials

All data and supporting information are available at https://osf.io/unc4j/.

Abbreviations

ARRIVE: Animal Research: Reporting of In Vivo Experiments

FAIR: Findable, Accessible, Interoperable, Reusable

E&E: Explanation and Elaboration

MDAR: Materials, Design, Analysis and Reporting

NIH: National Institutes of Health

References

  1. Goodman SN, Fanelli D, Ioannidis JPA. What does research reproducibility mean? Sci Transl Med. 2016;8(341):341ps12. https://doi.org/10.1126/scitranslmed.aaf5027.

  2. Begley CG, Ioannidis JP. Reproducibility in science: improving the standard for basic and preclinical research. Circ Res. 2015;116(1):116–26. https://doi.org/10.1161/CIRCRESAHA.114.303819 Epub 2015/01/02. PubMed PMID: 25552691.

  3. Freedman LP, Venugopalan G, Wisman R. Reproducibility2020: Progress and priorities. F1000Res. 2017;6:604. https://doi.org/10.12688/f1000research.11334.1 Epub 2017/06/18. PubMed PMID: 28620458; PubMed Central PMCID: PMCPMC5461896.

  4. Kidwell MC, Lazarevic LB, Baranski E, Hardwicke TE, Piechowski S, Falkenberg LS, et al. Badges to acknowledge open practices: a simple, low-cost, effective method for increasing transparency. PLoS Biol. 2016;14(5):e1002456. https://doi.org/10.1371/journal.pbio.1002456 Epub 2016/05/14. PubMed PMID: 27171007; PubMed Central PMCID: PMCPMC4865119.

  5. Else H. Radical open-access plan could spell end to journal subscriptions. Nature. 2018;561(7721):17–8. https://doi.org/10.1038/d41586-018-06178-7 Epub 2018/09/06. PubMed PMID: 30181639.

  6. Nosek BA, Ebersole CR, DeHaven AC, Mellor DT. The preregistration revolution. Proc Natl Acad Sci U S A. 2018;115(11):2600–6. https://doi.org/10.1073/pnas.1708274114 Epub 2018/03/14. PubMed PMID: 29531091; PubMed Central PMCID: PMCPMC5856500.

  7. Chambers CD, Forstmann B, Pruszynski JA. Registered reports at the European journal of neuroscience: consolidating and extending peer-reviewed study pre-registration. Eur J Neurosci. 2017;45(5):627–8. https://doi.org/10.1111/ejn.13519 Epub 2016/12/28. PubMed PMID: 28027598.

  8. Bate ST, Clark RA. The design and statistical analysis of animal experiments. Cambridge: Cambridge University Press; 2014. p. 310.

  9. Percie du Sert N, Bamsey I, Bate ST, Berdoy M, Clark RA, Cuthill I, et al. The experimental design assistant. PLoS Biol. 2017;15(9):e2003779. https://doi.org/10.1371/journal.pbio.2003779 Epub 2017/09/29. PubMed PMID: 28957312; PubMed Central PMCID: PMCPMC5634641.

  10. Lazic SE. Experimental design for laboratory biologists: maximising information and improving reproducibility. Cambridge: Cambridge University Press; 2016.

  11. Macleod MR, Lawson McLean A, Kyriakopoulou A, Serghiou S, de Wilde A, Sherratt N, et al. Risk of bias in reports of in vivo research: a focus for improvement. PLoS Biol. 2015;13(10):e1002273. https://doi.org/10.1371/journal.pbio.1002273 Epub 2015/10/16. PubMed PMID: 26460723; PubMed Central PMCID: PMCPMC4603955.

  12. Macleod MR, Fisher M, O'Collins V, Sena ES, Dirnagl U, Bath PM, et al. Good laboratory practice: preventing introduction of bias at the bench. Stroke. 2009;40(3):e50–2. https://doi.org/10.1161/STROKEAHA.108.525386 Epub 2008/08/16. PubMed PMID: 18703798.

  13. Rice AS, Cimino-Brown D, Eisenach JC, Kontinen VK, Lacroix-Fralish ML, Machin I, et al. Animal models and the prediction of efficacy in clinical trials of analgesic drugs: a critical appraisal and call for uniform reporting standards. Pain. 2009;139(2):243–7. https://doi.org/10.1016/j.pain.2008.08.017 Epub 2008/09/26. PubMed PMID: 18814968.

  14. McCance I. Assessment of statistical procedures used in papers in the Australian veterinary journal. Aust Vet J. 1995;72(9):322–8 Epub 1995/09/01. PubMed PMID: 8585846.

  15. Hackam DG, Redelmeier DA. Translation of research evidence from animals to humans. JAMA. 2006;296(14):1731–2. https://doi.org/10.1001/jama.296.14.1731 Epub 2006/10/13. PubMed PMID: 17032985.

  16. Kilkenny C, Parsons N, Kadyszewski E, Festing MF, Cuthill IC, Fry D, et al. Survey of the quality of experimental design, statistical analysis and reporting of research using animals. PLoS One. 2009;4(11):e7824. https://doi.org/10.1371/journal.pone.0007824 Epub 2009/12/04. PubMed PMID: 19956596.

  17. van der Worp HB, Howells DW, Sena ES, Porritt MJ, Rewell S, O'Collins V, et al. Can animal models of disease reliably inform human studies? PLoS Med. 2010;7(3):e1000245. https://doi.org/10.1371/journal.pmed.1000245 Epub 2010/04/03. PubMed PMID: 20361020.

  18. Glasziou P, Altman DG, Bossuyt P, Boutron I, Clarke M, Julious S, et al. Reducing waste from incomplete or unusable reports of biomedical research. Lancet. 2014;383(9913):267–76. https://doi.org/10.1016/S0140-6736(13)62228-X Epub 2014/01/15. PubMed PMID: 24411647.

  19. Begley CG, Ellis LM. Drug development: raise standards for preclinical cancer research. Nature. 2012;483(7391):531–3. https://doi.org/10.1038/483531a Epub 2012/03/31. PubMed PMID: 22460880.

  20. Scott S, Kranz JE, Cole J, Lincecum JM, Thompson K, Kelly N, et al. Design, power, and interpretation of studies in the standard murine model of ALS. Amyotroph Lateral Scler. 2008;9(1):4–15. https://doi.org/10.1080/17482960701856300 Epub 2008/02/15. PubMed PMID: 18273714.

  21. Kilkenny C, Altman DG. Improving bioscience research reporting: ARRIVE-ing at a solution. Lab Anim. 2010;44(4):377–8. https://doi.org/10.1258/la.2010.0010021 Epub 2010/07/28. PubMed PMID: 20660161.

  22. Kilkenny C, Browne W, Cuthill IC, Emerson M, Altman DG. Animal research: reporting in vivo experiments: the ARRIVE guidelines. J Gene Med. 2010;12(7):561–3. https://doi.org/10.1002/jgm.1473 Epub 2010/07/08. PubMed PMID: 20607692.

  23. Kilkenny C, Browne WJ, Cuthill IC, Emerson M, Altman DG. Improving bioscience research reporting: the ARRIVE guidelines for reporting animal research. PLoS Biol. 2010;8(6):e1000412. https://doi.org/10.1371/journal.pbio.1000412 Epub 2010/07/09. PubMed PMID: 20613859; PubMed Central PMCID: PMC2893951.

  24. Kilkenny C, Browne WJ, Cuthill IC, Emerson M, Altman DG. Improving bioscience research reporting: the ARRIVE guidelines for reporting animal research. J Pharmacol Pharmacother. 2010;1(2):94–9. https://doi.org/10.4103/0976-500X.72351.

  25. Kilkenny C, Browne WJ, Cuthill IC, Emerson M, Altman DG. Animal Research: Reporting In Vivo Experiments: the ARRIVE guidelines. J Physiol. 2010;588(Pt 14):2519–21. https://doi.org/10.1113/jphysiol.2010.192278 Epub 2010/07/17. PubMed PMID: 20634180; PubMed Central PMCID: PMC2916981.

  26. Kilkenny C, Browne WJ, Cuthill IC, Emerson M, Altman DG. Animal Research: Reporting In Vivo Experiments: the ARRIVE guidelines. Exp Physiol. 2010;95(8):842–4. https://doi.org/10.1113/expphysiol.2010.053793 Epub 2010/07/09. PubMed PMID: 20610776.

  27. McGrath JC, Drummond GB, McLachlan EM, Kilkenny C, Wainwright CL. Guidelines for reporting experiments involving animals: the ARRIVE guidelines. Br J Pharmacol. 2010;160(7):1573–6. https://doi.org/10.1111/j.1476-5381.2010.00873.x Epub 2010/07/24. PubMed PMID: 20649560; PubMed Central PMCID: PMC2936829.

  28. Kilkenny C, Browne W, Cuthill IC, Emerson M, Altman DG. Animal Research: reporting in vivo experiments-the ARRIVE guidelines. J Cereb Blood Flow Metab. 2011. https://doi.org/10.1038/jcbfm.2010.220 Epub 2011/01/06. PubMed PMID: 21206507.

  29. Kilkenny C, Browne WJ, Cuthill IC, Emerson M, Altman DG. Improving bioscience research reporting: the ARRIVE guidelines for reporting animal research. Vet Clin Pathol. 2012;41(1):27–31. https://doi.org/10.1111/j.1939-165X.2012.00418.x Epub 2012/03/07. PubMed PMID: 22390425.

  30. Kilkenny C, Browne WJ, Cuthill IC, Emerson M, Altman DG. Improving bioscience research reporting: the ARRIVE guidelines for reporting animal research. Osteoarthr Cartil. 2012;20(4):256–60. https://doi.org/10.1111/j.1939-165X.2012.00418.x.

  31. Avey MT, Moher D, Sullivan KJ, Fergusson D, Griffin G, Grimshaw JM, et al. The Devil Is in the Details: Incomplete Reporting in Preclinical Animal Research. PLoS ONE. 2016;11(11):e0166733. https://doi.org/10.1371/journal.pone.0166733 Epub 2016/11/18. PubMed PMID: 27855228; PubMed Central PMCID: PMCPMC5113978.

  32. Leung V, Rousseau-Blass F, Beauchamp G, Pang DSJ. ARRIVE has not ARRIVEd: Support for the ARRIVE (Animal Research: Reporting of in vivo Experiments) guidelines does not improve the reporting quality of papers in animal welfare, analgesia or anesthesia. PLoS ONE. 2018;13(5):e0197882. https://doi.org/10.1371/journal.pone.0197882 PubMed PMID: 29795636; PubMed Central PMCID: PMCPMC5967836. Epub 2018/05/26.

  33. Hair K, Macleod MR, Sena ES, Sena ES, Hair K, Macleod MR, et al. A randomised controlled trial of an intervention to improve compliance with the ARRIVE guidelines (IICARus). Res Integr Peer Rev. 2019;4(1):12. https://doi.org/10.1186/s41073-019-0069-3.

  34. The NPQIP Collaborative group. Did a change in Nature journals’ editorial policy for life sciences research improve reporting? BMJ Open Sci. 2019;3(1):e000035. https://doi.org/10.1136/bmjos-2017-000035.

  35. Han S, Olonisakin TF, Pribis JP, Zupetic J, Yoon JH, Holleran KM, et al. A checklist is associated with increased quality of reporting preclinical biomedical research: a systematic review. PLoS ONE. 2017;12(9):e0183591. https://doi.org/10.1371/journal.pone.0183591 Epub 2017/09/14. PubMed PMID: 28902887; PubMed Central PMCID: PMCPMC5597130.

  36. Ramirez FD, Motazedian P, Jung RG, Di Santo P, MacDonald ZD, Moreland R, et al. Methodological rigor in preclinical cardiovascular studies: targets to enhance reproducibility and promote research translation. Circ Res. 2017;120(12):1916–26. https://doi.org/10.1161/CIRCRESAHA.117.310628 Epub 2017/04/05. PubMed PMID: 28373349; PubMed Central PMCID: PMCPMC5466021.

  37. Reichlin TS, Vogt L, Wurbel H. The Researchers’ View of Scientific Rigor-Survey on the Conduct and Reporting of In Vivo Research. PLoS ONE. 2016;11(12):e0165999. https://doi.org/10.1371/journal.pone.0165999 Epub 2016/12/03. PubMed PMID: 27911901; PubMed Central PMCID: PMCPMC5135049.

  38. Hurst V, Percie du Sert N. The ARRIVE guidelines survey. Open Science Framework; 2017. https://doi.org/10.17605/OSF.IO/G8T5Q.

  39. Fraser H, Parker T, Nakagawa S, Barnett A, Fidler F. Questionable research practices in ecology and evolution. PLoS ONE. 2018;13(7):e0200303. https://doi.org/10.1371/journal.pone.0200303 Epub 2018/07/17. PubMed PMID: 30011289; PubMed Central PMCID: PMCPMC6047784.

  40. The Academy of Medical Sciences. Reproducibility and reliability of biomedical research: improving research practice 2015. Available from: https://acmedsci.ac.uk/policy/policy-projects/reproducibility-and-reliability-of-biomedical-research. Cited 16 June 2020.

  41. Sena ES, Currie GL. How our approaches to assessing benefits and harms can be improved. Animal Welfare. 2019;28(1):107–15. https://doi.org/10.7120/09627286.28.1.107 PubMed PMID: WOS:000455904200011.

  42. Percie du Sert N, Ahluwalia A, Alam S, Avey MT, Baker M, Browne WJ, et al. Reporting animal research: Explanation and Elaboration for the ARRIVE guidelines 2.0. PLoS Biol. 2020;18(7):e3000411. https://doi.org/10.1371/journal.pbio.3000411.

  43. Landis SC, Amara SG, Asadullah K, Austin CP, Blumenstein R, Bradley EW, et al. A call for transparent reporting to optimize the predictive value of preclinical research. Nature. 2012;490(7419):187–91. https://doi.org/10.1038/nature11556 Epub 2012/10/13. PubMed PMID: 23060188; PubMed Central PMCID: PMC3511845.

  44. Kimmelman J, Anderson JA. Should preclinical studies be registered? Nat Biotechnol. 2012;30(6):488–9. https://doi.org/10.1038/nbt.2261 Epub 2012/06/09. PubMed PMID: 22678379.

  45. Wilkinson MD, Dumontier M, Aalbersberg IJ, Appleton G, Axton M, Baak A, et al. The FAIR guiding principles for scientific data management and stewardship. Sci Data. 2016;3:160018. https://doi.org/10.1038/sdata.2016.18.

  46. Moher D, Schulz KF, Simera I, Altman DG. Guidance for developers of health research reporting guidelines. PLoS Med. 2010;7(2):e1000217. https://doi.org/10.1371/journal.pmed.1000217 Epub 2010/02/20. PubMed PMID: 20169112.

  47. Chambers K, Collings A, Graf C, Kiermer V, Mellor DT, Macleod M, Swaminathan S, Sweet D, Vinson V. Towards minimum reporting standards for life scientists. MetaArXiv. 2019. https://doi.org/10.31222/osf.io/9sm4x.

  48. Rands SA. Inclusion of policies on ethical standards in animal experiments in biomedical science journals. J Am Assoc Lab Anim Sci. 2011;50(6):901–3 Epub 2012/02/15. PubMed PMID: 22330784; PubMed Central PMCID: PMCPMC3228928.

  49. Osborne NJ, Payne D, Newman ML. Journal editorial policies, animal welfare, and the 3Rs. Am J Bioeth. 2009;9(12):55–9. https://doi.org/10.1080/15265160903318343 Epub 2009/12/17. PubMed PMID: 20013503.

  50. Vasilevsky NA, Minnier J, Haendel MA, Champieux RE. Reproducible and reusable research: are journal data sharing policies meeting the mark? PeerJ. 2017;5. https://doi.org/10.7717/peerj.3208 PubMed PMID: WOS:000400303800002.

  51. Giofre D, Cumming G, Fresc L, Boedker I, Tressoldi P. The influence of journal submission guidelines on authors’ reporting of statistics and use of open research practices. PLoS ONE. 2017;12(4). https://doi.org/10.1371/journal.pone.0175583 PubMed PMID: WOS:000399874800038.

  52. Michel MC, Murphy TJ, Motulsky HJ. New author guidelines for displaying data and reporting data analysis and statistical methods in experimental biology. Mol Pharmacol. 2020;97(1):49–60. https://doi.org/10.1124/mol.119.118927 Epub 2019/12/29. PubMed PMID: 31882404.

  53. Rowan-Legg A, Weijer C, Gao J, Fernandez C. A comparison of journal instructions regarding institutional review board approval and conflict-of-interest disclosure between 1995 and 2005. J Med Ethics. 2009;35(1):74–8. https://doi.org/10.1136/jme.2008.024299 PubMed PMID: WOS:000261929300018.

  54. Ancker JS, Flanagin A. A comparison of conflict of interest policies at peer-reviewed journals in different scientific disciplines. Sci Eng Ethics. 2007;13(2):147–57. https://doi.org/10.1007/s11948-007-9011-z Epub 2007/08/25. PubMed PMID: 17717729.

  55. Updated RCUK guidance for funding applications involving animal research 2015 [18 Nov 2019]. Available from: https://mrc.ukri.org/news/browse/updated-rcuk-guidance-for-funding-applications-involving-animal-research/. Cited 16 June 2020.

  56. Prager EM, Chambers KE, Plotkin JL, McArthur DL, Bandrowski AE, Bansal N, et al. Improving transparency and scientific rigor in academic publishing. J Neurosci Res. 2018. https://doi.org/10.1002/jnr.24340 Epub 2018/12/07. PubMed PMID: 30506706.

  57. Enhancing reproducibility. Nat Methods. 2013;10(5):367. Epub 2013/06/14. PubMed PMID: 23762900.

  58. Curtis MJ, Alexander S, Cirino G, Docherty JR, George CH, Giembycz MA, et al. Experimental design and analysis and their reporting II: updated and simplified guidance for authors and peer reviewers. Br J Pharmacol. 2018;175(7):987–93. https://doi.org/10.1111/bph.14153 Epub 2018/03/10. PubMed PMID: 29520785; PubMed Central PMCID: PMCPMC5843711.

  59. Heaven D. AI peer reviewers unleashed to ease publishing grind. Nature. 2018;563(7733):609–10. https://doi.org/10.1038/d41586-018-07245-9 Epub 2018/11/30. PubMed PMID: 30482927.

Acknowledgements

We would like to thank the members of the expert panel for the Delphi exercise and the participants of the road testing for their time and feedback. We are grateful to the DelphiManager team for advice and use of their software. We would like to acknowledge the late Doug Altman’s contribution to this project; Doug was a dedicated member of the working group and his input to the guidelines’ revision has been invaluable. This article was originally published in PLOS Biology (https://doi.org/10.1371/journal.pbio.3000410) under a CC-BY license.

Funding

This work was supported by the National Centre for the Replacement, Refinement & Reduction of Animals in Research (NC3Rs, https://www.nc3rs.org.uk/). NPdS, KL, VH, and EJP are employees of the NC3Rs.

Author information

Contributions

NPdS: conceptualisation, data curation, formal analysis, funding acquisition, investigation, methodology, project administration, resources, supervision, visualisation, writing – original draft, writing – review and editing; VH: data curation, investigation, methodology, project administration, resources, writing – original draft, writing – review and editing; SEL, EJP: writing – review and editing; KL: investigation, project administration, writing – review and editing; AA, SA, MTA, MB, WJB, AC, ICC, UD, ME, PG, STH, DWH, NAK, CJMcC, MMcL, OHP, FR, PR, KR, ESS, SDS, TS, HW: investigation, methodology, resources, writing – original draft, writing – review and editing. The author(s) read and approved the final manuscript.

Corresponding author

Correspondence to Nathalie Percie du Sert.

Ethics declarations

Competing interests

AA: editor in chief of the British Journal of Pharmacology. WJB, ICC and ME: authors of the original ARRIVE guidelines. WJB: serves on the Independent Statistical Standing Committee of the funder CHDI foundation. AC: Senior Editor, PLOS ONE. AC, CJMcC, MMcL and ESS: involved in the IICARus trial. ME, MMcL and ESS: have received funding from NC3Rs. ME: sits on the MRC ERPIC panel. STH: chair of the NC3Rs board, trusteeship of the BLF, Kennedy Trust, DSRU and CRUK, member of Governing Board, Nuffield Council on Bioethics, member Science Panel for Health (EU H2020), founder and NEB Director Synairgen, consultant Novartis, Teva and AZ, chair MRC/GSK EMINENT Collaboration. VH, KL, EJP and NPdS: NC3Rs staff, role includes promoting the ARRIVE guidelines. SEL and UD: on the advisory board of the UK Reproducibility Network. CJMcC: shareholdings in Hindawi, on the publishing board of the Royal Society, on the EU Open Science policy platform. UD, MMcL, NPdS, CJMcC, ESS, TS and HW: members of EQIPD. MMcL: member of the Animals in Science Committee, on the steering group of the UK Reproducibility Network. NPdS and TS: associate editors of BMJ Open Science. OHP: vice president of Academia Europaea, editor in chief of Function, senior executive editor of the Journal of Physiology, member of the Board of the European Commission’s SAPEA (Science Advice for Policy by European Academies). FR: NC3Rs board member, shareholdings in GSK. FR and NAK: shareholdings in AstraZeneca. PR: member of the University of Florida Institutional Animal Care and Use Committee, editorial board member of Shock. ESS: editor in chief of BMJ Open Science. SDS: role is to provide expertise and does not represent the opinion of the NIH. TS: shareholdings in Johnson & Johnson. SA, MTA, MB, PG, DWH, and KR declared no conflict of interest.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Additional file 1.

Noteworthy changes in ARRIVE 2.0. This table recapitulates noteworthy changes in the ARRIVE guidelines 2.0, compared to the original ARRIVE guidelines published in 2010.

Additional file 2.

Delphi methods and results. Methodology and results of the Delphi study that was used to prioritise the items of the guidelines into the ARRIVE Essential 10 and Recommended Set.

Additional file 3.

Delphi data. Tabs 1, 2, and 3: Panel members’ scores for each of the ARRIVE items during rounds 1, 2, and 3, along with descriptive statistics. Tab 4: Qualitative feedback, collected from panel members during round 1, on the importance and the wording of each item. Tab 5: Additional items suggested for consideration in ARRIVE 2.0; similar suggestions were grouped together before processing. Tab 6: Justifications provided by panel members for changing an item’s score between round 1 and round 2.

Additional file 4.

Road testing methods and results. Methodology used to road test the revised ARRIVE guidelines and E&E (as published in preprint) and how this information was used in the development of ARRIVE 2.0.

Additional file 5.

Road testing data. Tab 1: Participants’ demographics and general feedback on the guidelines and the E&E preprints. Tab 2: Outcome of each manuscript’s assessment and justifications provided by participants for not including information covered in the ARRIVE guidelines.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Percie du Sert, N., Hurst, V., Ahluwalia, A. et al. The ARRIVE guidelines 2.0: Updated guidelines for reporting animal research. BMC Vet Res 16, 242 (2020). https://doi.org/10.1186/s12917-020-02451-y
