
Insights into student assessment outcomes in rural clinical campuses

Abstract

Background

There is an ongoing debate about the impact of studying medicine in rural vs. metropolitan campuses on student assessment outcomes. The UNSW Medicine Rural Clinical School has five main campuses: Albury-Wodonga, Coffs Harbour, Griffith, Port Macquarie and Wagga Wagga. Historical data on student assessment outcomes at these campuses raised concerns regarding potential biases in the assessments undertaken, as well as the availability and quality of learning resources. The current study aims to identify the extent to which the location of examination (rural versus metropolitan) has an impact on student marks in OSCEs.

Methods

This study employed assessment data from 275 medical students who sat their final examinations in Year 3 (2015) and Year 6 (2018) of the undergraduate Medicine program at UNSW. The data consist of matched student assessment results from the Year 3 (Y3) MCQ examination and OSCE, and from the Year 6 (Y6) MCQ, OSCE and management viva examinations. The analysis used Univariate Analysis of Variance and linear regression models to identify the impact of site of learning and site of examination on assessment outcomes.

Results

The results demonstrate that neither site of learning nor site of examination had any significant impact on OSCE or Management Viva assessment outcomes when potential confounders were controlled.

Conclusion

It is suggested that some of the supposed disadvantages inherent in rural campuses are effectively mitigated by perceived advantages: more intensive interaction with patients and with the general and medical communities at those sites, as well as effective e-learning resources and moderation of assessment grades.


Background

The need to enhance and sustain the medical workforce in rural communities has been one of the most important health objectives of many countries, including Australia, New Zealand and Canada. Traditionally, medical schools have addressed rural communities’ need for more doctors by implementing student selection processes that aimed to identify those who were likely to pursue a medical career in rural regions [1]. Despite these efforts, the evidence regarding the effectiveness of selection processes alone in boosting the number of doctors settling in rural communities is not promising [1, 2].

An alternative or complementary method for attracting medical students to pursue their careers in rural communities has been deliberate exposure to rural medicine during medical studies [3,4,5]. A recent study suggests that even a short clinical learning experience in a rural community has a positive effect on medical students from metropolitan communities and increases the likelihood that they will choose to practise medicine in rural regions [6]. It has also been suggested that the longer students stay in rural placements, the more likely they are to practise rurally [7].

Many medical schools in Australia, New Zealand, Canada and elsewhere have developed rural clinical schools where students live and study in rural communities for a significant length of time during their medical training (for example, [1, 8,9,10]). Consequently, enhancing medical students’ experience in rural placements has become an important objective for these schools. With this in mind, there is evidence suggesting that the clinical learning experience of students in rural settings also has positive impacts on their performance in clinical skills assessments [11]. It has also been reported that students’ learning experience in rural settings is associated with subsequent rural career choices [11, 12]. Possible explanations for this impact of rural clinical experiences might be related to the nature of rural settings, whereby students typically learn in smaller groups within more intimate clinical settings, fostering greater personal interaction with clinicians and the community, and experience a lower ratio of medical students to available patients [13,14,15].

Despite the evidence supporting the efficacy of rural clinical learning experiences on outcomes for medical students, it remains open to conjecture whether the grades of students who study and are assessed in rural clinical settings are related more to the learning experience, or whether the clinical skills assessment itself plays a role [16]. For example, compared with metropolitan settings, student engagement with clinical and academic staff in rural communities is more intensive, since students live and work within the same community and daily interaction, whether in the working/learning environment or after hours, is often unavoidable [17, 18]. On the other hand, studies of potential biases in examiner marking in OSCEs suggest that such examiner familiarity with students positively biases student grades [19, 20].

The current study aims to identify the extent to which the location of examination (rural as compared with metropolitan) has an impact on student marks in OSCEs.

Methods

Setting

The UNSW Medicine Rural Clinical School has five main campuses – at Albury-Wodonga, Coffs Harbour, Griffith, Port Macquarie and Wagga Wagga. With the exception of Port Macquarie, where students can undertake their entire course, students can spend up to 4 years of the 6-year Medicine program at a rural campus, where they are taught predominantly by local clinicians and a smaller group of clinical academics. In total, approximately 50 students complete Year 3 at a rural campus and around 70 students undertake the last 2 years of their studies and sit their final exams in these country settings. UNSW Medicine has four main metropolitan campuses, all based in Sydney.

This study employed assessment data from one cohort of medical students (N = 275) who sat their final examinations in Year 3 (in 2015) and Year 6 (in 2018) of the undergraduate Medicine program at UNSW. The data were received from the student administration office in February 2019 and consist of matched student assessment results (total examination marks) from the Year 3 (Y3) MCQ examination and OSCE, as well as from the Year 6 (Y6) MCQ, OSCE and management viva examinations. The MCQ examinations were all conducted and answered online, and all students sat the MCQ examination at the same time, independent of whether their site was rural or metropolitan. For the OSCE and management viva examinations, questions are selected from an Assessment Item Bank so that all questions are the same at all rural and metropolitan sites for examinations held at the same date and time. Guidelines for examination supervisors and examiners are prepared and distributed to all examination sites in advance. The rural OSCE and management viva examinations are always held at the same date and time as those at metropolitan sites. Most examiners remain in their teaching/clinical settings, although a few examiners move sites. For the OSCE and viva examinations, grading is performed on iPads and the grading criteria are identical for all metropolitan and rural sites. Following completion of the examinations, the iPads are synchronised and all grading data and comments are electronically downloaded to the UNSW Medicine administration site in Sydney. In addition, the data include the site of clinical learning (home site) in Y3 and Y6, as well as the site of each examination, including whether the site was rural or metropolitan. Due to ethical requirements, the data do not include any individual characteristics of either students or examiners; such data were not necessary for the analysis.

Statistical analysis

The analysis employed Univariate Analysis of Variance to identify the impact of site of learning and site of examination on assessment outcomes. Partial η2 [21] was used to compare the relative effect size (impact) of the independent variables on each assessment outcome (dependent variable). It is noted that partial η2 is not the most accurate measure of effect size compared with Cohen’s d when more than one category is assessed; however, as a relative measure, partial η2 is practical and acceptable [22]. The large number of sites reduces statistical power; therefore, an additional analysis was undertaken in which the type of site (rural/metropolitan) was used rather than the named site. This was done under the assumption that rural campuses share common features which make them different from metropolitan campuses [4, 6, 17]. This also required adding a variable indicating whether students were assessed at the same site where they had studied or at a different one. Using these features (rural/metropolitan and same/different site) increased the statistical power and made the analysis more generalisable.
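For readers who wish to reproduce this style of analysis outside SPSS, the sketch below shows a univariate ANOVA with partial η2, computed as SS_effect / (SS_effect + SS_error). It is a minimal illustration only: the file name and column names (osce_y3, home_site, exam_site, mcq_y3) are hypothetical, not taken from the study data.

```python
# Minimal sketch of a univariate ANOVA with partial eta-squared.
# File and column names below are hypothetical placeholders.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

df = pd.read_csv("assessment_data.csv")  # hypothetical data file

# OSCE score modelled by home site and examination site (categorical
# factors) with the MCQ score as a covariate; Type III sums of squares.
model = ols("osce_y3 ~ C(home_site) + C(exam_site) + mcq_y3", data=df).fit()
anova = sm.stats.anova_lm(model, typ=3)

# Partial eta-squared per effect: SS_effect / (SS_effect + SS_error).
ss_error = anova.loc["Residual", "sum_sq"]
anova["partial_eta_sq"] = anova["sum_sq"] / (anova["sum_sq"] + ss_error)
print(anova[["sum_sq", "F", "PR(>F)", "partial_eta_sq"]])
```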

Multiple linear regressions were used to identify the impact of sites of learning and examination (rural/metropolitan) on assessment outcomes.

It is acknowledged that education data may not always fully meet the assumption of normality. However, if the data distribution is not extreme, the risk of Type 1 and Type 2 errors is negligible [23]. The MCQ, OSCE and Management Viva data were tested for normality; the OSCE and Management Viva data did not meet the normal distribution, but the breach was minimal (kurtosis < |4.65|; skewness < |1.53|). This deviation from the normal distribution is small and should not have impacted the adequacy of the analysis [23]. The independent variables were tested for collinearity and found not to be collinear (VIF < 2.4). The analysis was undertaken using SPSS v24 [24].
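The checks described above can be reproduced with standard tools. The following is a sketch, shown in Python rather than SPSS, of the skewness/kurtosis screening and the VIF-based collinearity test; all file and column names are hypothetical.

```python
# Sketch of the normality screening (skewness, kurtosis) and VIF checks.
# File and column names are hypothetical placeholders.
import pandas as pd
from scipy import stats
from statsmodels.stats.outliers_influence import variance_inflation_factor
from statsmodels.tools.tools import add_constant

df = pd.read_csv("assessment_data.csv")  # hypothetical data file

for col in ["mcq_y6", "osce_y6", "viva_y6"]:
    x = df[col].dropna()
    print(col, "skewness =", round(stats.skew(x), 2),
          "kurtosis =", round(stats.kurtosis(x), 2))

# Variance inflation factors for the continuous predictors.
X = add_constant(df[["mcq_y3", "osce_y3", "mcq_y6"]].dropna())
for i, name in enumerate(X.columns):
    if name != "const":  # skip the intercept column
        print(name, "VIF =", round(variance_inflation_factor(X.values, i), 2))
```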

Results

The impact of clinical assessment site on OSCE results was the largest among the variables considered, yet was not statistically significant (partial η2 = .745; p = .08) once the impacts of home site and MCQ results were accounted for. Home site did not have any significant impact on OSCE results. However, MCQ results had a small yet statistically significant impact (partial η2 = .157; p < .001) on the OSCE results once home and examination sites were controlled for (Table 1). It was also found that home site in either Y3 or Y6 had no significant impact on Y6 MCQ results. The only significant impact on Y6 MCQ results was that of the Y3 MCQ results (partial η2 = .223, p < .001) (Table 2).

Table 1 Factors impacting Y3 OSCE results: univariate analysis, tests of between-subjects effects
Table 2 Factors impacting Y6 MCQ results: univariate analysis, tests of between-subjects effects

The next three analyses aimed to identify factors impacting each of the three main assessments undertaken at the end of Y6. These are high-stakes examinations, since passing all three of them is required for graduation.

The impact of examination site in Y6 was estimated using univariate analysis as follows: dependent variable: score in the type of examination (OSCE, MCQ, Viva); fixed factor: site of the examination for each type of examination (as above); random factors: Y3 home site and Y6 home site; covariates: Y3 MCQ results, Y3 OSCE results, and the two of the three Y6 results that are not the dependent variable (i.e. two of Y6 MCQ, OSCE and Management Viva). In this way, the impact of the examination site on the examination results was assessed while most of the other important variables were controlled.
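As an illustration of this specification, the sketch below recasts the Y6 OSCE model as a mixed model in Python: examination site as a fixed factor, Y6 home site as a random intercept, and the remaining scores as covariates. It is a simplification under stated assumptions — statsmodels fits a single grouping factor, whereas the study entered two random factors in SPSS — and all column names are hypothetical.

```python
# Simplified sketch of the Y6 OSCE model: fixed examination site,
# random intercept for Y6 home site, other assessment scores as
# covariates. Only one random factor is modelled here; the study
# used two (Y3 and Y6 home sites). Column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("assessment_data.csv")  # hypothetical data file
m = smf.mixedlm(
    "osce_y6 ~ C(exam_site_osce) + mcq_y3 + osce_y3 + mcq_y6 + viva_y6",
    data=df,
    groups=df["home_site_y6"],
).fit()
print(m.summary())
```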

The results demonstrate that once all other variables are held constant (i.e. controlled) the examination site had no significant impact on the examination results (Tables 3, 4 and 5).

Table 3 Factors impacting Y6 OSCE results: univariate analysis, tests of between-subjects effects
Table 4 Factors impacting Y6 Management Viva results: univariate analysis, tests of between-subjects effects
Table 5 Factors impacting Y6 MCQ results: univariate analysis, tests of between-subjects effects

The final set of analyses consisted of two linear regression models, which aimed to identify whether having the location of the respective examination match the student’s home site (that is, the student both learned and was examined at the same location) had any meaningful impact on the student’s performance as reflected in exam results. This analysis also included Y3 MCQ and OSCE results as control variables.
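A sketch of this regression is shown below: the matched-site indicator is derived by comparing (hypothetical) home-site and examination-site columns, and the Y6 OSCE score is regressed on the earlier results plus that indicator. Column names are again invented for illustration.

```python
# Sketch of the linear regression with a same-site indicator.
# File and column names are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("assessment_data.csv")  # hypothetical data file

# 1 if the student was examined at their Y6 home site, else 0.
df["same_site"] = (df["home_site_y6"] == df["exam_site_osce_y6"]).astype(int)

m = smf.ols(
    "osce_y6 ~ osce_y3 + mcq_y3 + mcq_y6 + viva_y6 + same_site",
    data=df,
).fit()
print(m.summary())  # a non-significant same_site term mirrors Table 6
```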

The results demonstrate that the Y6 OSCE results were positively impacted only by Y3 OSCE, Y6 Management Viva and Y6 MCQ results, whereas Y3 and Y6 home site, and whether the Y6 home site differed from the Y6 OSCE site, did not have any statistically significant impact on grades (Table 6).

Table 6 Factors impacting Y6 OSCE results: linear regression

Similar results were found regarding the factors impacting the Y6 Management Viva results. Only Y6 OSCE and MCQ outcomes had positive and statistically significant impacts on Y6 Management Viva grades, whereas Y3 and Y6 home site, and whether the Y6 home site differed from the Y6 Management Viva site, did not have any statistically significant impact on grades (Table 7).

Table 7 Factors impacting Y6 Management Viva results: linear regression

Discussion

The current study aimed to identify how sites of students’ clinical learning and examinations (rural as compared with metropolitan), as well as other related factors, may impact final-year medical students’ results across three assessment types: MCQ, OSCE and Management Viva.

The results demonstrate that neither site of learning nor site of examination had any significant impact on the outcomes in any of these three assessments. These results are important from a number of perspectives, most relating to medical schools which operate rural clinical campuses. The main message is that students who study in rural clinical schools are neither advantaged nor disadvantaged in terms of examination performance compared with their counterparts studying in metropolitan clinical schools. The results also support the notion that the selection and use of questions from a standardised Assessment Item Bank, combined with the use of identical grading criteria and Examiner Guidelines, can minimise variation between examination sites. The literature regarding the impact of clinical setting on learning outcomes suggests that students may obtain better clinical experience due to exposure to more diverse cases in rural settings compared with metropolitan settings [11]. However, concerns regarding access to learning resources for students in rural clinical settings have also been raised, particularly in Australia [25]. It is important to note that studying at rural campuses did not affect student performance in any of the assessment types undertaken in students’ final year of study. This is a critical finding, as it alleviates concerns regarding limited access to learning resources in rural clinical settings [25]. A plausible explanation for this finding is the extensive and effective e-learning resources made available to students, which may mitigate the lack of local resources at rural clinical campuses [26,27,28,29].

From an assessment perspective, it is interesting that common biases impacting Management Viva and OSCE assessors, i.e. familiarity with the student or coming from a similar background [19], were not observed in the current study. A possible explanation is that examiners overall exhibited little bias, or that biases were reduced or nullified by the strategies discussed above. In addition, it is also possible that OBM2 [30], a new assessment moderation technique recently introduced to the OSCEs and Management Viva at UNSW, moderated any bias that might previously have existed. OBM2 is a method that moderates examiner bias around borderline performance: borderline marks awarded by lenient examiners are more likely to be converted to fail grades, whereas borderline marks awarded by stringent examiners tend to be converted to pass grades. Since this study used only the final (post-OBM2) OSCE and Management Viva results, it was impossible to identify to what extent examiners’ bias was eliminated by OBM2. Further studies are required to examine that issue.
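To make the borderline-moderation idea concrete, the toy sketch below converts borderline marks according to a crude examiner-stringency estimate. It is emphatically not the published OBM2 procedure [30], whose actual rules differ; the column names and the stringency heuristic are invented purely for illustration.

```python
# Toy illustration of borderline-mark moderation; NOT the OBM2 algorithm.
# Assumes rows with an examiner id, a raw mark and a borderline flag.
import pandas as pd

def moderate_borderline(df: pd.DataFrame, pass_mark: float = 50.0) -> pd.DataFrame:
    # Crude stringency estimate: examiner's mean mark vs the cohort mean.
    stringency = df.groupby("examiner")["mark"].mean() - df["mark"].mean()

    out = df.copy()
    lenient = out["examiner"].map(stringency) > 0
    borderline = out["borderline"].astype(bool)

    # Borderline marks from lenient examiners fall just below the pass
    # mark; those from stringent examiners are lifted to a pass.
    out.loc[borderline & lenient, "mark"] = pass_mark - 1
    out.loc[borderline & ~lenient, "mark"] = pass_mark
    return out
```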

Unlike previous similar studies, which did not control for the site of examination (for example, [31,32,33]), this study considers in the analysis both the site of study and the site of examination, particularly seeking to identify whether student-examiner familiarity had any impact on examination outcomes [19]. The results demonstrate that, at both the site level and the setting level (rural/metropolitan; Tables 6 and 7), students’ performance in assessments of clinical practice and knowledge (OSCE, MCQ and Management Viva) is independent of both the site of learning and the site of examination. These results suggest that the quality of teaching, learning and assessment is similar across all campuses of the UNSW Medicine program, a finding that may be generalisable to other similar medical programs. Further support for this conclusion is provided by the significant associations across all main assessments (OSCE, MCQ and Management Viva) in both Year 3 and Year 6.

Despite the encouraging results, this study has a number of limitations, one being the sample size. The analysis included two categorical variables, each consisting of 12 categories, and the sample size (N = 275) could be considered too small for such data. However, increasing the size of the dataset by adding more cohorts would further reduce the statistical power of the study, because examinations and some examiners differ across cohorts. The remedy employed in this study was additional analyses, which collapsed sites into two categories (rural/metropolitan) and added a binary variable indicating whether the examination site was the same as the learning site (home site). The results of the additional analyses were in line with those of the underpowered analysis, which enhances our confidence that this study was not susceptible to either Type 1 or Type 2 statistical errors.

It is noted that examining the reliability or any other psychometric properties of the assessments was outside the scope of this study, as its purpose was to compare overall scores rather than the psychometric characteristics of the various measures.

It is also acknowledged that a number of factors that could have produced differences across rural/metropolitan sites were outside the scope of the current investigation. For example, mechanisms or influential factors relating to the association between location (rural vs. metropolitan) and assessment outcomes may involve differences in examiners’ characteristics across sites, such as seniority (senior examiners award lower marks than junior examiners) and experience in assessing (the more experienced the examiners, the lower the marks awarded) [34]. Furthermore, differences in assessment outcomes may also be related to differences in the technical practices applied by different clinicians, which may relate to professional experience [35]. Nonetheless, exploring the mechanisms underlying potential biases in assessment outcomes across examination sites was not within the scope of the current study; further research is required to address these issues.

Conclusions

This study demonstrated that studying in rural clinical schools neither advantages nor disadvantages medical students’ learning outcomes across the main assessment types in the UNSW Medicine program (MCQ, OSCE and Management Viva). It is suggested that some of the supposed disadvantages inherent in rural campuses are effectively mitigated by perceived advantages: more intensive interaction with patients and with the general and medical communities at those sites, as well as effective e-learning resources and moderation of assessment grades. The results of this study also indicate that the selection and use of questions from a standardised Assessment Item Bank, combined with identical grading criteria and Examiner Guidelines, can minimise variation between examination sites. Further studies may examine in more detail the specific factors that enable the success of rural medical training.

Availability of data and materials

Data may be made available on request to the corresponding author, subject to the approval of the Human Research Ethics Committee of the University of New South Wales.

Abbreviations

MCQ: Multiple Choice Questions
OSCE: Objective Structured Clinical Examination
UNSW: University of New South Wales

References

  1. Hay M, Mercer A, Lichtwark I, Tran S, Hodgson W, Aretz H, Armstrong E, Gorman D. Selecting for a sustainable workforce to meet the future healthcare needs of rural communities in Australia. Adv Health Sci Educ. 2017;22:533. https://doi.org/10.1007/s10459-016-9727-0.
  2. Raghavan M, Martin B, Burnett M, Aoki F, Christensen H, MacKalski B, Young D, Ripstein I. Multiple mini-interview scores of medical school applicants with and without rural attributes. Rural Remote Health. 2013;13(2):2362.
  3. Raymond G, Vinson J. Clinical medical education in rural and underserved areas and eventual practice outcomes: a systematic review and meta-analysis. Educ Health. 2017;30(2):146–55.
  4. Somporn P, Ash J, Walters L. Stakeholder views of rural community-based medical education: a narrative review of the international literature. Med Educ. 2018;52(8):791–802.
  5. O'Sullivan B, McGrail MR, Russell DJ, Walker J, Chambers H, Major L, Langham R. Duration and setting of rural immersion during the medical degree relates to rural work outcomes. Med Educ. 2018;52(8):803–15.
  6. Crump A, Jeter K, Mullins S, Shadoan A, Ziegler C, Crump W. Rural medicine realities: the impact of immersion on urban-based medical students. J Rural Health. 2019;35(1):42–8.
  7. Playford D, Ngo H, Gupta S, Puddey I. Opting for rural practice: the influence of medical student origin, intention and immersion experience, vol. 207; 2017.
  8. Tate R, Aoki F. Rural practice and the personal and educational characteristics of medical students: survey of 1269 graduates of the University of Manitoba. Can Fam Physician. 2012;58(11):e641–8.
  9. McKillop A, Webster C, Bennett W, O'Connor B, Bagg W. Encouragers and discouragers affecting medical graduates’ choice of regional and rural practice locations. Rural Remote Health. 2017;17(4):4247.
  10. Greenhill J, Walker J, Playford D. Outcomes of Australian rural clinical schools: a decade of success building the rural medical workforce through the education and training continuum. Rural Remote Health. 2015;15(3):2991.
  11. Barrett F, Lipsky M, Nawal L. The impact of rural training experiences on medical students: a critical review. Acad Med. 2011;82(2):259–63.
  12. Kent M, Verstappen A, Wilkinson T, Poole P. Keeping them interested: a national study of factors that change medical student interest in working rurally. Rural Remote Health. 2018;18(4):4872.
  13. Crump W, Fricker S, Ziegler C, Wiegman D, Rowland M. Rural track training based at a small regional campus: equivalency of training, residency choice, and practice location of graduates. Acad Med. 2013;88(8):1122–8.
  14. Crampton P, McLachlan J, Illing J. A systematic literature review of undergraduate clinical placements in underserved areas. Med Educ. 2013;47(10):969–78.
  15. Condon BP, Worley P, Condon J, Prideaux D. Student academic performance in rural clinical schools: the impact of cohort size and competition. Med Teach. 2017;39(3):262–8.
  16. Waters B, Hughes J, Forbes K, Wilkinson D. Comparative academic performance of medical students in rural and urban clinical settings. Med Educ. 2006;40(2):117–20.
  17. Strasser R. Community engagement: a key to successful rural clinical education. Rural Remote Health. 2010;10(1543):1–8.
  18. Page S, Birden H. Twelve tips on rural medical placements: what has worked to make them successful. Med Teach. 2008;30(6):592–6.
  19. Chong L, Taylor S, Haywood M, Adelstein B, Shulruf B. The sights and insights of examiners in objective structured clinical examinations. J Educ Eval Health Prof. 2017;14:34.
  20. Jefferies A, Simmons B, Regehr G. The effect of candidate familiarity on examiner OSCE scores. Med Educ. 2007;41(9):888–91.
  21. Cohen J. Eta-squared and partial eta-squared in fixed factor ANOVA designs. Educ Psychol Meas. 1973;33(1):107–12.
  22. Richardson J. Eta squared and partial eta squared as measures of effect size in educational research. Educ Res Rev. 2011;6(2):135–47.
  23. Sawilowsky S, Blair C. A more realistic look at the robustness and type II error properties of the t test to departures from population normality. Psychol Bull. 1992;111(2):352.
  24. Denis D. SPSS data analysis for univariate, bivariate, and multivariate statistics. London: Wiley; 2019.
  25. Commonwealth of Australia. What makes for success in medical education? Synthesis report. Canberra: Department of Education, Employment and Workplace Relations; 2008.
  26. Shih P, Velan G, Shulruf B. Shared values and socio-cultural norms: e-learning technologies from a social practice perspective. Issues Educ Res. 2017;27(3):550–66.
  27. Velan G, Jones P, McNeil P, Kumar R. Integrated online formative assessments in the biomedical sciences for medical students: benefits for learning. BMC Med Educ. 2008;8(1):52.
  28. Velan G, Goergen S, Grimm J, Shulruf B. Impact of interactive eLearning modules for appropriate imaging referrals: a multicentre randomised crossover study. J Am Coll Radiol. 2015;12(11):1207–14.
  29. Wynter L, Burgess A, Kalman E, Heron J, Bleasel J. Medical students: what educational resources are they using? BMC Med Educ. 2019;19(1):36.
  30. Shulruf B, Damodaran A, Jones P, Kennedy S, Mangos G, O’Sullivan A, Rhee J, Taylor S, Velan G, Harris P. Enhancing the defensibility of examiners’ marks in high stakes OSCEs. BMC Med Educ. 2018;18(10):1–9.
  31. Power DV, Harris I, Swentko W, Halaas, Benson B. Comparing rural-trained medical students with their peers: performance in a primary care OSCE. Teach Learn Med. 2006;18(3):196–202.
  32. Birden H, Wilson I. Rural placements are effective for teaching medicine in Australia: evaluation of a cohort of students studying in rural placements; 2012.
  33. Bianchi F, Stobbe K, Eva K. Comparing academic performance of medical students in distributed learning sites: the McMaster experience. Med Teach. 2008;30(1):67–71.
  34. Chong L, Taylor S, Haywood M, Adelstein B, Shulruf B. Examiner seniority and experience are associated with bias when scoring communication but not examination skills in Objective Structured Clinical Examinations (OSCEs) in Australia. J Educ Eval Health Prof. 2018;15:17.
  35. Naumann F, Marshall S, Shulruf B, Jones P. Exploring examiner judgement of professional competence in rater based assessment. Adv Health Sci Educ. 2016;21(4):775–88.


Acknowledgements

The authors express their gratitude to Ms. Petrina Choong for helping with the data management.

Funding

This study was not funded by any agency.

Author information


Contributions

BS initiated the study, designed the methodology, analysed the data, interpreted the results and contributed to the writing. GV, LF, AO, PH and ST contributed to the interpretation of the results and to the writing of the manuscript. All authors have read and approved the manuscript.

Corresponding author

Correspondence to Boaz Shulruf.

Ethics declarations

Ethics approval and consent to participate

Ethics approval for this study was granted by the Human Research Ethics Committee of the University of New South Wales (reference HC15421; HREAPG: Health, Medical, Community and Social). The ethics approval waives the need to seek participants’ consent.

Consent for publication

This study used administrative data held by UNSW Medicine. The ethics approval did not require participants’ consent for either the use of the data for research or for publication; thus no consent was obtained.

Competing interests

Boaz Shulruf is a member of the editorial board (Associate Editor) of this journal; All other authors report no conflicts of interest. The authors alone are responsible for the content and writing of this article.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.


About this article


Cite this article

Shulruf, B., Velan, G., Forster, L. et al. Insights into student assessment outcomes in rural clinical campuses. BMC Med Educ 19, 380 (2019). https://doi.org/10.1186/s12909-019-1828-z


DOI: https://doi.org/10.1186/s12909-019-1828-z
