The number of original manuscripts submitted to the Sports Performance section of the Journal of Sports Sciences increased by 34% between 2017 and 2020 (637 vs 854). Many factors could be contributing to this rise in submissions, including an increase in the popularity of the Journal of Sports Sciences, “publish or perish” pressure (Brischoux & Angelier, 2015), increased data availability through routine monitoring of athletes (Robertson, 2020), and the perverse incentives, metrification, and hyper-competition that drive academics and researchers to prioritise quantity over quality (Edwards & Roy, 2017; Moore et al., 2017).

The rise in submissions to the Sports Performance section means that the bar for getting published has been raised. Unfortunately, the space we have within the Journal is not unlimited, and that space is not being increased by the publisher to accommodate the rise in submissions. We also have a dearth of reviewers, exacerbated by some researchers who want their own papers to receive high-quality reviews but are unwilling to review for others (Stafford, 2018). The bar to publication has also been raised by the emergence of Open Science (Munafò et al., 2017), a movement that comprises both principles (e.g., transparency, reuse, participation, accountability) and practices (e.g., open publications, data sharing, citizen science; National Academies of Sciences Engineering and Medicine, 2018). Practices that have emerged to increase the methodological rigour of scientific studies, such as pre-registration and, more recently, Registered Reports (Abt et al., 2021b; Caldwell et al., 2020), are now offered as submission types in the Journal of Sports Sciences.

Given this background, we now set out what we are looking for in manuscripts submitted to the Sports Performance section.

Open science and reporting guidelines

We value Open Science practices such as study pre-registration (Bosnjak et al., 2021), including sample-size estimation (Abt et al., 2020), data sharing, code sharing, and Registered Reports (Abt et al., 2021b; Impellizzeri et al., 2019). In line with these practices, we encourage authors to submit their data sets with their manuscripts so that, if needed, reviewers can evaluate study outcomes for themselves. We believe the trilogy of the study pre-registration, the results section of the manuscript, and access to the raw data provides the best way for reviewers to evaluate the robustness, precision, and interpretation of study outcomes. We have revised our online system so that downloading raw data is straightforward for reviewers. Alongside Open Science practices, we strongly encourage authors to use reporting guidelines appropriate to the research design, such as STROBE for observational designs (Von Elm et al., 2007) and CONSORT for intervention studies (Moher et al., 2010). Although more than a decade has passed since sports performance researchers were encouraged to use reporting guidelines (Atkinson et al., 2008), few studies in our discipline explicitly state that they use them (Twomey et al., 2021). Nevertheless, evidence suggests that the use of reporting guidelines improves reporting and study quality (Cobo et al., 2011; Plint et al., 2006; Turner et al., 2012). Therefore, we will prioritise studies that make use of Open Science practices and appropriate reporting guidelines.

A higher quality of evidence

Given the applied nature of sports performance research, many of the submissions we see are uncontrolled observational studies. That is, they describe a phenomenon rather than identifying causal relationships or evaluating the effectiveness of an intervention. To understand where these types of studies sit, Bishop’s (2008) Applied Research Model for the Sport Sciences (ARMSS) is useful. The ARMSS outlines three broad stages of research – description, experimentation, and implementation – subdivided into eight smaller stages progressing from descriptive to implementation studies. Applying the ARMSS to manuscripts submitted to the Sports Performance section reveals that the majority are descriptive studies (stages 1 to 3). Although the accepted view is that randomised controlled trials constitute a higher level of evidence than observational studies, research design is only one (albeit important) factor that determines the quality of evidence and confidence in population estimates (Murad et al., 2016). Study limitations, indirectness of evidence, and imprecision in population estimates all influence the quality of an individual study (Guyatt et al., 2008). The challenge for our discipline is to seek a higher quality of evidence from our research. This means two things. First, if the research question requires an observational design, then limitations need to be minimised and precision maximised. Second, we strongly encourage a move towards studies at the higher stages of the ARMSS – experimentation (stages 4 to 6) and implementation (stages 7 and 8). Although there are barriers (both real and perceived) to conducting efficacy and effectiveness trials in our discipline, it is a challenge that we all need to embrace. Maximising precision in population estimates through sample-size planning (Abt et al., 2020; see the sketch below), pre-registration and Registered Reports (Abt et al., 2021b), and collaboration across research groups and sports clubs (Ramírez-López et al., 2020) are all required to increase the quality of studies submitted to and published in the Sports Performance section. As such, we will prioritise studies that embrace these approaches.
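To make sample-size planning concrete, the sketch below shows an a priori power calculation for a simple two-group comparison, written in Python using statsmodels. It is a minimal illustration only: the effect size, alpha, and power values are assumptions chosen for the example, not recommendations from this editorial.

```python
# A minimal sketch of a priori sample-size estimation for a two-group
# comparison; all input values are illustrative assumptions.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(
    effect_size=0.5,          # assumed standardised mean difference (Cohen's d)
    alpha=0.05,               # two-sided type I error rate
    power=0.80,               # desired statistical power
    alternative="two-sided",
)
print(f"Required sample size per group: {n_per_group:.0f}")  # ~64 per group
```

Reporting a calculation of this kind in a pre-registration makes the planned precision of the study explicit before data collection begins.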

Sound theoretical and/or practical rationale

High-quality studies provide a concise, logical flow of information (Barroga & Matanguihan, 2021; Hotaling, 2020) regarding the theoretical underpinning, the causal chain, and the practical implications arising from the observed data. Unfortunately, many submitted manuscripts lack a theory-driven rationale and/or clear research questions based on that rationale. As an example, we often see statements such as “Despite x having been extensively studied in soccer, no studies have examined this in [other team sport].” In these cases, it is incumbent on the authors to provide a logical chain of evidence for why (and how) the new sport is sufficiently different from soccer that the dependent variable would respond differently than it does in soccer. If authors cannot outline that rationale in the introduction, then there is little basis for entering the study into peer review. Similarly, from a practical perspective, we often read manuscripts that conclude “this study has important implications for training.” Without detail on how such implications are to be realised, and the probability of their being implemented, the phrase is an empty one. Given that researchers often use language that exaggerates their findings (Vinkers et al., 2015), we expect authors to provide realistic implications for how their findings can be used and implemented, and to clarify future research directions.

Model validation

Predicting performance has been a goal of sports scientists and coaches for many years (Bar-Or, 1975). An important part of developing a predictive model is model validation: the process of checking the predictive accuracy of the model (Vehtari et al., 2017). This matters because a model may display exceptional accuracy when predicting the data on which it was built, but markedly reduced accuracy (called validity shrinkage) when predicting new data (Copas, 1983). Validating a model on an independent set of data not used to generate the model is therefore required (Ivanescu et al., 2016). For example, checking out-of-sample predictive accuracy is an important part of linear regression and is recommended as standard practice (Jan & Shieh, 2019). There are several approaches to model validation, including cross-validation and bootstrapping, which use data already collected, partitioned into training and testing sets (internal validation). The model is generated from the training set and then tested (validated) on the testing set. However, the criterion method of model validation remains external validation using an independent, external data set (Altman et al., 2009; Debray et al., 2015; Ivanescu et al., 2016). Unfortunately, the evidence suggests that most prediction studies focus on model development rather than external validation (Bouwmeester et al., 2012; Riley et al., 2016). Given this background, for studies involving prediction models we will prioritise those that outline a clear approach to model validation, with internal validation as a minimum requirement and external validation as the preferred goal.
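To illustrate the distinction between apparent, internal, and external validation, the sketch below fits a linear regression in Python with scikit-learn. The data are simulated and every variable is hypothetical; this is a minimal sketch of the workflow under those assumptions, not a prescribed analysis.

```python
# A minimal sketch of apparent vs internal vs external validation for a
# prediction model; the data and coefficients are simulated (hypothetical).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
true_beta = np.array([0.8, -0.5, 0.3, 0.0])

# Hypothetical development data: four predictors and one performance outcome.
X = rng.normal(size=(120, 4))
y = X @ true_beta + rng.normal(scale=0.5, size=120)

# Internal validation: 5-fold cross-validation within the development data.
model = LinearRegression()
cv_r2 = cross_val_score(model, X, y, cv=5, scoring="r2")
print(f"Internal (5-fold CV) R^2: {cv_r2.mean():.2f} ± {cv_r2.std():.2f}")

# Apparent performance: scoring the model on the data used to fit it
# overstates accuracy -- the validity shrinkage described above.
model.fit(X, y)
print(f"Apparent R^2 (same data): {model.score(X, y):.2f}")

# External validation: data collected independently (e.g., a different
# squad, season, or laboratory), never used during model development.
X_ext = rng.normal(size=(60, 4))
y_ext = X_ext @ true_beta + rng.normal(scale=0.5, size=60)
print(f"External R^2: {r2_score(y_ext, model.predict(X_ext)):.2f}")
```

In practice the external set would be genuinely new data rather than a second simulation; the point of the sketch is that the apparent R^2 will typically exceed the cross-validated and external estimates.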

Focus on sports performance

Manuscripts submitted to the Sports Performance section often focus on other fields of study (e.g., computer science, statistics, engineering) or on training descriptions and non-performance outcomes. Examples include studies examining resistance training in sportspeople, or using a machine learning algorithm to classify soccer player characteristics. Although these aspects are of interest, there is often no clear link to sports performance and/or no criterion measure of sports performance. We will prioritise manuscripts that provide a clear link to sports performance, as operationally defined by the authors. Although sports performance is often difficult to define and measure (e.g., in team sports), and there is a trade-off between internal and external validity (Atkinson & Nevill, 2001), authors should outline a clear causal link between their research question(s) and sports performance.

“Impact” beyond a single sport

Research costs time, effort, and money. Therefore, if research evidence can be applied across a broad range of settings (e.g., different sports and organisations), there are clear benefits to both sports participants and researchers. In the UK, where the Journal of Sports Sciences is published, the Research Excellence Framework evaluates research case studies in terms of “impact”, which is subdivided into “significance” and “reach” (REF2021, 2020). Although these terms are contested (Sutton, 2020; Watermeyer, 2016), the implication is that maximising research “bang for buck” constitutes a good use of public money. As such, evidence that can be applied across a range of sports, through a contribution to theoretical development and/or changed practice, will demonstrate greater “impact”. Although it is not within our remit to define “impact”, we will prioritise studies that clearly have the potential for broad impact across multiple domains.

Mixed methods studies

Most complex problems require solutions based on evidence from multiple paradigms (Daviter, 2019). That is, solutions draw on a wide range of approaches, including both quantitative and qualitative research evidence. Because the quantitative and qualitative paradigms each have advantages and disadvantages, mixed methods studies combine quantitative and qualitative techniques, methods, approaches, concepts, or language within a single study (Johnson & Onwuegbuzie, 2004). Some argue this approach has many advantages over using quantitative or qualitative methods alone (Hadi & Closs, 2015). For example, it would not be enough to implement a new training method in the applied setting based solely on a quantitative efficacy study; qualitative evidence on barriers to uptake (stage 7 of the ARMSS) would also be required to gauge the probability of successful implementation (Bishop, 2008). We therefore encourage sports performance researchers to consider a mixed methods approach rather than relying solely on either quantitative or qualitative methods.

Diversity and description of participants

We recently published our policy on equality, diversity, and inclusion (Abt et al., 2021a). Although this policy relates to editors, we acknowledge that diversity in research participants is also important. Women are under-represented in sport and exercise science research (Costello et al., 2014), which has led to calls for more women to be recruited as participants (Elliott-Sale et al., 2021). Similarly, people of colour are under-represented in some areas, such as cancer exercise trials (Zuniga et al., 2020). We echo these calls and therefore encourage studies that recruit proportionate samples across genders and people of colour. Alongside diversity of participants, we also need better descriptions of participants, which are important for evaluating the external validity of study findings (Betts et al., 2020; Moher et al., 2010; Von Elm et al., 2007). For example, in team sports research, physiological and technical responses depend on the players’ standard of performance (Dellal et al., 2011). Adequately describing participant characteristics is also important for understudied groups such as women: consistency in the terms used to describe women participants and in inclusion/exclusion criteria (including hormonal parameters) is important for interpreting findings and their generalisability (Elliott-Sale et al., 2021).

Through our own initiatives (Abt et al., 2020, 2021b) and external factors (Brischoux & Angelier, 2015; Munafò et al., 2017), the bar is being raised for publishing in the Sports Performance section of the Journal of Sports Sciences. We hope sports performance researchers will accept this challenge with a view to improving the quality of the evidence created, even with a consequent decrease in the quantity of studies conducted (Altbach & De Wit, 2018). We argue this “less is more” approach should apply equally to postgraduate students, who are often the driving force behind tenured academics’ publishing empires. If we are to appropriately educate and train the next generation of high-quality researchers, we not only need to set the right example, we also need to train these future scientists to focus on quality over quantity (Button et al., 2020). Chambers and Tzavella (2021) report that 77% of Stage 1 Registered Reports submitted to the journal Cortex were first-authored by PhD students and postdoctoral researchers, showing that this method of publishing is not beyond the capability, or the constrained timelines, of doctoral and postdoctoral researchers.

We hope the guidelines outlined here for publishing in the Sports Performance section of the Journal of Sports Sciences will act as a call to arms for researchers in this area. We are all responsible for the state of academic publishing and therefore we are all responsible for improving standards. Although the actions required to improve both the evidence base and academic publishing will require substantial cultural change, we believe these efforts are worth it for individual researchers, sportspeople, organisations, and ultimately our scientific discipline.

Sports Performance section

Grant Abt

Simon Jobson

Jean-Benoit Morin

Louis Passfield

Jaime Sampaio

Caroline Sunderland

Craig Twist

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

The author(s) reported there is no funding associated with the work featured in this article.

References

  • Abt, G., Boreham, C., Davison, G., Jackson, R., Nevill, A., Wallace, E., & Williams, M. (2020). Power, precision, and sample size estimation in sport and exercise science research. Journal of Sports Sciences, 38(17), 1933–1935. https://doi.org/10.1080/02640414.2020.1776002
  • Abt, G., Boreham, C., Davison, G., Jackson, R., Wallace, E., & Williams, A. M. (2021a). Equality, diversity, and inclusion: Policy statement. Journal of Sports Sciences, 39(24), 1–3. https://doi.org/10.1080/02640414.2021.1967608
  • Abt, G., Boreham, C., Davison, G., Jackson, R., Wallace, E., & Williams, A. M. (2021b). Registered reports in the journal of sports sciences. Journal of Sports Sciences, 39(16), 1789–1790. https://doi.org/10.1080/02640414.2021.1950974
  • Altbach, P. G., & De Wit, H. (2018). Too much academic research is being published. International Higher Education, 2018(96), 2–3. https://doi.org/10.6017/ihe.2019.96.10767
  • Altman, D. G., Vergouwe, Y., Royston, P., & Moons, K. G. M. (2009). Prognosis and prognostic research: Validating a prognostic model. BMJ, 338, b605. https://doi.org/10.1136/bmj.b605
  • Atkinson, G., Batterham, A., & Drust, B. (2008). Is it time for sports performance researchers to adopt a clinical-type research framework? International Journal of Sports Medicine, 29(9), 703–705. https://doi.org/10.1055/s-2008-1038545
  • Atkinson, G., & Nevill, A. M. (2001). Selected issues in the design and analysis of sport performance research. Journal of Sports Sciences, 19(10), 811–827. https://doi.org/10.1080/026404101317015447
  • Bar-Or, O. (1975). Predicting athletic performance. The Physician and Sportsmedicine, 3(2), 80–85. https://doi.org/10.1080/00913847.1975.11948147
  • Barroga, E., & Matanguihan, G. J. (2021). Creating logical flow when writing scientific articles. Journal of Korean Medical Science, 36(40), 1–14. https://doi.org/10.3346/jkms.2021.36.e275
  • Betts, J. A., Gonzalez, J. T., Burke, L. M., Close, G. L., Garthe, I., James, L. J., Jeukendrup, A. E., Morton, J. P., Nieman, D. C., Peeling, P., Phillips, S. M., Stellingwerff, T., van Loon, L. J. C., Williams, C., Woolf, K., Maughan, R., & Atkinson, G. (2020). PRESENT 2020: Text expanding on the checklist for proper reporting of evidence in sport and exercise nutrition trials. International Journal of Sport Nutrition and Exercise Metabolism, 30(1), 2–13. https://doi.org/10.1123/ijsnem.2019-0326
  • Bishop, D. (2008). An applied research model for the sport sciences. Sports Medicine (Auckland, NZ), 38(3), 253–263. https://doi.org/10.2165/00007256-200838030-00005
  • Bosnjak, M., Fiebach, C. J., Mellor, D., Mueller, S., O’Connor, D. B., Oswald, F. L., & Sokol-Chang, R. I. (2021). A template for preregistration of quantitative research in psychology: Report of the joint psychological societies preregistration task force. American Psychologist. https://doi.org/10.1037/amp0000879
  • Bouwmeester, W., Zuithoff, N. P. A., Mallett, S., Geerlings, M. I., Vergouwe, Y., Steyerberg, E. W., Altman, D. G., & Moons, K. G. M. (2012). Reporting and methods in clinical prediction research: A systematic review. PLoS Medicine, 9(5), e1001221. https://doi.org/10.1371/journal.pmed.1001221
  • Brischoux, F., & Angelier, F. (2015). Academia’s never-ending selection for productivity. Scientometrics, 103(1), 333–336. https://doi.org/10.1007/s11192-015-1534-5
  • Button, K. S., Chambers, C. D., Lawrence, N., & Munafò, M. R. (2020). Grassroots training for reproducible science: A consortium-based approach to the empirical dissertation. Psychology Learning & Teaching, 19(1), 77–90. https://doi.org/10.1177/1475725719857659
  • Caldwell, A. R., Vigotsky, A. D., Tenan, M. S., Radel, R., Mellor, D. T., Kreutzer, A., Lahart, I. M., Mills, J. P., & Boisgontier, M. P. (2020). Moving sport and exercise science forward: A call for the adoption of more transparent research practices. Sports Medicine, 50(3), 449–459. https://doi.org/10.1007/s40279-019-01227-1
  • Chambers, C. D., & Tzavella, L. (2021). The past, present and future of registered reports. Nature Human Behaviour, 561–586. https://doi.org/10.1038/s41562-021-01193-7
  • Cobo, E., Cortes, J., Ribera, J. M., Cardellach, F., Selva-O’Callaghan, A., Kostov, B., Garcia, L., Cirugeda, L., Altman, D. G., Gonzalez, J. A., Sanchez, J. A., Miras, F., Urrutia, A., Fonollosa, V., Rey-Joly, C., & Vilardell, M. (2011). Effect of using reporting guidelines during peer review on quality of final manuscripts submitted to a biomedical journal: Masked randomised trial. BMJ, 343, d6783. https://doi.org/10.1136/bmj.d6783
  • Copas, J. B. (1983). Regression, prediction and shrinkage. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 45(3), 311–354. https://www.jstor.org/stable/2345402
  • Costello, J. T., Bieuzen, F., & Bleakley, C. M. (2014). Where are all the female participants in sports and exercise medicine research? European Journal of Sport Science, 14(8), 847–851. https://doi.org/10.1080/17461391.2014.911354
  • Daviter, F. (2019). Policy analysis in the face of complexity: What kind of knowledge to tackle wicked problems? Public Policy and Administration, 34(1), 62–83. https://doi.org/10.1177/0952076717733325
  • Debray, T. P. A., Vergouwe, Y., Koffijberg, H., Nieboer, D., Steyerberg, E. W., & Moons, K. G. M. (2015). A new framework to enhance the interpretation of external validation studies of clinical prediction models. Journal of Clinical Epidemiology, 68(3), 279–289. https://doi.org/10.1016/j.jclinepi.2014.06.018
  • Dellal, A., Hill-Haas, S., Lago-Penas, C., & Chamari, K. (2011). Small-sided games in soccer: Amateur vs. professional players’ physiological responses, physical, and technical activities. Journal of Strength and Conditioning Research, 25(9), 2371–2381. https://doi.org/10.1519/JSC.0b013e3181fb4296
  • Edwards, M. A., & Roy, S. (2017). Academic research in the 21st century: Maintaining scientific integrity in a climate of perverse incentives and hypercompetition. Environmental Engineering Science, 34(1), 51–61. https://doi.org/10.1089/ees.2016.0223
  • Elliott-Sale, K. J., Minahan, C. L., de Jonge, X. A. K. J., Ackerman, K. E., Sipilä, S., Constantini, N. W., Lebrun, C. M., & Hackney, A. C. (2021). Methodological considerations for studies in sport and exercise science with women as participants: A working guide for standards of practice for research on women. Sports Medicine, 51(5), 843–861. https://doi.org/10.1007/s40279-021-01435-8
  • Guyatt, G. H., Oxman, A. D., Kunz, R., Vist, G. E., Falck-Ytter, Y., & Schünemann, H. J. (2008). What is “quality of evidence” and why is it important to clinicians? BMJ, 336(7651), 995–998. https://doi.org/10.1136/bmj.39490.551019.BE
  • Hadi, M. A., & Closs, S. J. (2015). Applications of mixed-methods methodology in clinical pharmacy research. International Journal of Clinical Pharmacy, 38(3), 635–640. https://doi.org/10.1007/s11096-015-0231-z
  • Hotaling, S. (2020). Simple rules for concise scientific writing. Limnology and Oceanography Letters, 5(6), 379–383. https://doi.org/10.1002/lol2.10165
  • Impellizzeri, F. M., McCall, A., & Meyer, T. (2019). Registered reports coming soon: Our contribution to better science in football research. Science and Medicine in Football, 3(2), 87–88. https://doi.org/10.1080/24733938.2019.1603659
  • Ivanescu, A. E., Li, P., George, B., Brown, A. W., Keith, S. W., Raju, D., & Allison, D. B. (2016). The importance of prediction model validation and assessment in obesity and nutrition research. International Journal of Obesity, 40(6), 887–894. https://doi.org/10.1038/ijo.2015.214
  • Jan, S.-L., & Shieh, G. (2019). Sample size calculations for model validation in linear regression analysis. BMC Medical Research Methodology, 19(1), 54. https://doi.org/10.1186/s12874-019-0697-9
  • Johnson, R. B., & Onwuegbuzie, A. J. (2004). Mixed methods research: A research paradigm whose time has come. Educational Researcher, 33(7), 14–26. https://doi.org/10.3102/0013189X033007014
  • Moher, D., Hopewell, S., Schulz, K. F., Montori, V., Gøtzsche, P. C., Devereaux, P. J., Elbourne, D., Egger, M., & Altman, D. G. (2010). CONSORT 2010 explanation and elaboration: Updated guidelines for reporting parallel group randomised trials. Journal of Clinical Epidemiology, 63(8), e1–e37. https://doi.org/10.1016/j.jclinepi.2010.03.004
  • Moore, S., Neylon, C., Paul Eve, M., Paul O’Donnell, D., & Pattinson, D. (2017). “Excellence R Us”: University research and the fetishisation of excellence. Palgrave Communications, 3(1), 16105. https://doi.org/10.1057/palcomms.2016.105
  • Munafò, M. R., Nosek, B. A., Bishop, D. V. M., Button, K. S., Chambers, C. D., Percie Du Sert, N., Simonsohn, U., Wagenmakers, E.-J., Ware, J. J., & Ioannidis, J. P. A. (2017). A manifesto for reproducible science. Nature Human Behaviour, 1(1), 0021. https://doi.org/10.1038/s41562-016-0021
  • Murad, M. H., Asi, N., Alsawas, M., & Alahdab, F. (2016). New evidence pyramid. Evidence Based Medicine, 21(4), 125–127. https://doi.org/10.1136/ebmed-2016-110401
  • National Academies of Sciences Engineering and Medicine. (2018). Open Science by Design. National Academies Press. https://doi.org/10.17226/25116
  • Plint, A. C., Moher, D., Morrison, A., Schulz, K., Altman, D. G., Hill, C., & Gaboury, I. (2006). Does the CONSORT checklist improve the quality of reports of randomised controlled trials? A systematic review. Medical Journal of Australia, 185(5), 263–267. https://doi.org/10.5694/j.1326-5377.2006.tb00557.x
  • Ramírez-López, C., Till, K., Boyd, A., Bennet, M., Piscione, J., Bradley, S., Giuliano, P., Leduc, C., & Jones, B. (2020). Coopetition: Cooperation among competitors to enhance applied research and drive innovation in elite sport. British Journal of Sports Medicine, 55(10), 522–523. https://doi.org/10.1136/bjsports-2020-102901
  • REF2021. (2020). REF2021: Guidance on submissions. Research Excellence Framework. https://www.ref.ac.uk/media/1447/ref-2019_01-guidance-on-submissions.pdf
  • Riley, R. D., Ensor, J., Snell, K. I. E., Debray, T. P. A., Altman, D. G., Moons, K. G. M., & Collins, G. S. (2016). External validation of clinical prediction models using big datasets from e-health records or IPD meta-analysis: Opportunities and challenges. BMJ, 353, i3140. https://doi.org/10.1136/bmj.i3140
  • Robertson, P. S. (2020). Man & machine: Adaptive tools for the contemporary performance analyst. Journal of Sports Sciences, 38(18), 2118–2126. https://doi.org/10.1080/02640414.2020.1774143
  • Stafford, T. F. (2018). Reviews, reviewers, and reviewing: The “tragedy of the commons” in the scientific publication process. Communications of the Association for Information Systems, 42(1), 624–629. https://doi.org/10.17705/1CAIS.04225
  • Sutton, E. (2020). The increasing significance of impact within the Research Excellence Framework (REF). Radiography, 26, S17–S19. https://doi.org/10.1016/j.radi.2020.02.004
  • Turner, L., Shamseer, L., Altman, D. G., Schulz, K. F., & Moher, D. (2012). Does use of the CONSORT statement impact the completeness of reporting of randomised controlled trials published in medical journals? A Cochrane review. Systematic Reviews, 1(1), 60. https://doi.org/10.1186/2046-4053-1-60
  • Twomey, R., Yingling, V., Warne, J., Schneider, C., McCrum, C., Atkins, W., Romero Medina, C., Harlley, S., & Caldwell, A. (2021). Nature of our literature. Communications in Kinesiology, 1(3), 5. https://doi.org/10.51224/cik.v1i3.43
  • Vehtari, A., Gelman, A., & Gabry, J. (2017). Practical Bayesian model evaluation using leave-one-out cross-validation and WAIC. Statistics and Computing, 27(5), 1413–1432. https://doi.org/10.1007/s11222-016-9696-4
  • Vinkers, C. H., Tijdink, J. K., & Otte, W. M. (2015). Use of positive and negative words in scientific PubMed abstracts between 1974 and 2014: Retrospective analysis. BMJ, 351(December), h6467. https://doi.org/10.1136/bmj.h6467
  • Von Elm, E., Altman, D. G., Egger, M., Pocock, S. J., Gøtzsche, P. C., & Vandenbroucke, J. P. (2007). The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: Guidelines for reporting observational studies. Bulletin of the World Health Organization, 85(11), 867–872. https://doi.org/10.2471/BLT.07.045120
  • Watermeyer, R. (2016). Impact in the REF: Issues and obstacles. Studies in Higher Education, 41(2), 199–214. https://doi.org/10.1080/03075079.2014.915303
  • Zuniga, K. B., Borno, H., Chan, J. M., Van Blarigan, E. L., Friedlander, T. W., Wang, S., Zhang, L., & Kenfield, S. A. (2020). The problem of underrepresentation: Black participants in lifestyle trials among patients with prostate cancer. Journal of Racial and Ethnic Health Disparities, 7(5), 996–1002. https://doi.org/10.1007/s40615-020-00724-8
