
Adapting the supervisory relationship measure for general medical practice



Abstract

The relationship between general practice (GP) supervisors and registrars is a critical component in effective training for the next generation of medical practitioners. Despite the importance of the relational aspect of clinical education, most evaluation has traditionally occurred from the perspective of the registrar only. As such, no validated tools exist to measure the quality of the supervisory relationship from the perspective of the supervisor. This paper presents an adaptation and validation of the clinical psychology supervisory relationship measure (Pearce et al, Br J Clin Psychol 52:249–68, 2013) for GP supervisors in an Australian context.


Following an Expert Group review and adaptation of the items, 338 GP supervisors completed the adapted tool.


Using principal components analysis and Procrustes confirmatory rotation, an optimal three-component model of the supervisory relationship was identified, reflecting measures of Safe base (α = .96), Supervisor investment (α = .85), and Registrar professionalism (α = .94).


The general practice supervisory relationship measure for supervisors (GP-SRMS) demonstrated excellent model fit and high internal consistency, and was theoretically consistent with the original tool. Implications for clinical education and future research are presented.



Background

General practice (GP) supervisors are critical to the training of the next generation of general practitioners. GP supervisors provide “guidance and feedback on matters of personal, professional and educational development in the context of a trainee’s experience of providing safe and appropriate patient care” [1]. The feedback that GP supervisors provide to registrars is recognised as a critical component in Australian specialist medical education [2] and is explicitly part of the Australian College of Rural and Remote Medicine [3] and the Australian Medical Council Specialist Education Accreditation Committee [4] supervision standards for general practice training. Yet, despite this importance, little attention is paid to the experiences of GP supervisors, and in particular to the experiences of the educational relationship between supervisors and registrars.

Supervisor satisfaction is a critical motivator to engage and retain supervisors in clinical and educational supervision [5, 6]. The quality of the relationship between supervisors and registrars underpins supervisor satisfaction. Despite this, most studies of the educational environment, clinical learning environment or educational alliance have tended to focus on the relationship from the perspective of the registrar only [7,8,9,10].

Registrar perceptions of their training experience are used as part of the quality assurance process for accrediting anaesthetic training programs in the UK [11]. In the USA, the Clinical Learning Environment Review was introduced by the Accreditation Council for Graduate Medical Education to provide feedback to hospitals and medical centres on how successful they are at engaging residents to improve quality and safety systems in the clinical learning environment [12].

The relationship between a registrar and their supervisor has been suggested as the platform for all other aspects of learning. The concept of educational alliance is emergent in the literature as a central component of supervision and commonly included in definitions used for GP supervisors. Wearne et al. [13] described an effective supervisor as “a general practitioner who establishes and maintains an educational alliance that supports the clinical, educational and personal development of a resident” (p. 1169). The Australian General Practice Training (AGPT) program describes GP supervisors as “experienced general practitioners [who] establish and maintain educational alliances that support the clinical, educational and personal development of registrars who come to work in their practices” [14].

Yet evaluation of this relationship, or alliance, has focused entirely on registrar satisfaction with the supervision and training received. No measurement tools are available that capture the perceptions and experiences of the GP supervisor in their relationship with supervised registrars. Evaluating the educational alliance from the point of view of the GP supervisor holds great potential as a feedback framework [15], and thereafter as a measure and predictor of supervisor and registrar satisfaction and of supervisor retention, providing important insights into the type of educational supports required by junior supervisors joining GP training.

In other disciplines, tools have been developed to measure supervisory relationships from the point of view of supervisors. For example, a study of medical faculty members’ perceptions of their educational environment when teaching undergraduate medical students led to the development of a 50-item inventory, the Assessment of Medical Education Environment by Teachers [16]. Clinical psychology is a field where the relationship between supervisor and registrar is considered so important that its formation and management is included as a core competency internationally [17].

Reconsideration of the role of the supervisor in providing feedback within the context of the educational alliance has also been suggested by others [18]. It has been proposed that medical education could benefit from examining the therapeutic alliance in psychotherapy training (the relationship between supervising psychologists and trainee psychologists, a significant mechanism by which positive outcomes are achieved) as analogous to the educational alliance in medical training, particularly with respect to feedback [15]. This perspective offers an innovative framework from which to reconceptualise feedback, and suggests that new research questions should explore the educational alliance between registrars and supervisors, particularly in other disciplines.

Despite the lack of tools to measure the supervisory relationship in medical education, there is one validated tool designed for use in a parallel context. The Supervisory Relationship Measure (SRM) is a 51-item questionnaire which was developed as a measure of the supervisory relationship between clinical psychology supervisors and their registrars, from the perspective of the supervisor [19]. In the original SRM development study, principal components analysis identified five aspects of supervisor experience and perceptions: “Safe base”, “Supervisor commitment”, “Trainee contribution”, “External influences”, and “Supervisor investment” [19]. The SRM was demonstrated to be a valid and reliable measure of the supervisory relationship from the supervisor’s perspective. A recent independent review of supervisory relationship measures found that the SRM is “a sound measure of the supervisory relationship”, with a large sample used for initial verification, and application of several tests for validity and reliability [20].

While a number of studies have examined registrar satisfaction with supervision, none have attempted to measure the GP-supervisor perspective of the relationship. The current study addresses this dearth of research by adapting and implementing an existing reliable and validated tool to identify key relationship deficits and professional development opportunities for GP supervisors within the Australian context.


Methods

In this study, we adapted and validated the Supervisory Relationship Measure (SRM) for use with GP supervisors. The original SRM, developed for use with clinical psychology supervisors [19], consists of 51 seven-point Likert-scale statements regarding the supervisory relationship with a particular clinical trainee, rated from strongly disagree to strongly agree, such as “My trainee and I have a good professional relationship” and “My trainee is open about any difficulties they are experiencing”. Responses to negatively worded statements, such as “My registrar has a poor professional approach”, were reverse-coded for consistency. Where used, the word “trainee” was replaced with “registrar”, which is the more appropriate term in the GP training context. An Expert Group reviewed and adapted the items, followed by pilot testing. Ethical approval for the study was provided by the Monash University Human Research Ethics Committee (project number 10977).
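The reverse-coding step can be sketched as follows (a minimal illustration only, assuming the seven-point scale is coded 1–7; the function name is ours, not part of the published instrument):

```python
import numpy as np

def reverse_code(responses, scale_min=1, scale_max=7):
    """Reverse-score Likert responses so all items point the same way.

    On a 1-7 scale, a response of 7 ("strongly agree") to a negatively
    worded item becomes 1, 6 becomes 2, and so on.
    """
    return (scale_max + scale_min) - np.asarray(responses)

# Responses to a negatively worded item, e.g. "My registrar has a poor
# professional approach": agreement signals a *worse* relationship, so
# scores are flipped before subscale totals are computed.
print(reverse_code([7, 6, 2, 1, 4]))  # [1 2 6 7 4]
```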

Expert group review and pilot testing

The 51 original SRM items [19] were assessed by an Expert Group of four experienced GP supervisors, recruited from General Practice Training Tasmania. The Expert Group were current supervisors with between three and 20 years’ experience, each having supervised six or more registrars. Through iterative expert consensus [21], the Expert Group determined the appropriateness of each item, adapted items where appropriate, deleted inappropriate or irrelevant items, and generated new items for use with GP supervisors, creating the General Practice Supervisory Relationship Measure for Supervisors (GP-SRMS).

Most items were deemed appropriate for retention. The Expert Group made only minor changes to these items; for example, replacing “trainee” with “registrar” and “placement” with “practice term”.

Six items were removed:

  1. My trainee is open to new experiences on placement

  2. I set up regular supervision for my trainee

  3. My trainee produces good quality work

  4. My trainee’s past experiences of supervision interfere with our relationship

  5. I sense that my trainee worries because I am evaluating them

  6. My trainee is too anxious to engage in supervision

When electing to remove items, the Expert Group considered a number of criteria. Items judged inaccurate in the GP training context included those referring to regular supervision (which is generally provided less regularly in general practice) or to concerns around evaluation (because evaluation is unavoidable). The Expert Group considered items relating to a registrar being open to new experiences, past experiences, or mood to be unrelated to the educational alliance. Five new items were added by the Expert Group:

  1. I’m enthusiastic about my registrar’s practice term with me

  2. I provide the environment and opportunities for my registrar to give me open and honest feedback

  3. My registrar is not clinically competent

  4. My registrar’s practice is safe

  5. My registrar’s self-directed learning interferes with their clinical practice.

These adapted items, along with questions on the demographic characteristics of the supervisor and registrar, were piloted in an online survey using SurveyMonkey. The pilot testers were six GP supervisors with a range of supervisory experience, all recruited from General Practice Training Tasmania. Pilot participants gave written feedback on item clarity and appropriateness, and on the time taken to complete the survey. One aim of the pilot testing was to estimate typical completion time, with a target of under 15 min to support higher response rates [22]. Five of the six pilot testers reported completing the survey in 10–15 min, with the remaining tester reporting 25 min. All stated that they found the items clear and appropriate, so no further changes were made to the survey.

Data collection

Once the survey items were finalised, GP Supervisors Australia (GPSA) sent an email to all 3200 members inviting them to participate in a short anonymous online survey on the supervisory experience, provided they had supervised at least one GP registrar in the previous 3 months. The email included the aim of the survey, the estimated completion time, and a link to the survey, which was conducted through SurveyMonkey. A total of 365 GPSA members answered the GP-SRMS survey demographic questions, and a slightly smaller subset of 338 answered both the demographic questions and all 50 GP-SRMS items (an 11% response rate). Overall, the mean missing-data rate across the GP-SRMS items was 5.50%, ranging from 0.27 to 8.28%. The validation was conducted on the 338 participants who provided complete answers.

Demographic characteristics of survey participants

Demographic characteristics of survey supervisor participants and their registrars are shown in Table 1. The median and modal age group of supervisors responding to the survey was 50–54 years, with ages ranging from 30–34 years to 65+ years. Just over half of survey participants were male. Around half had been supervising registrars for more than 10 years, and just over 40% had supervised ten or more registrars. One-quarter were the sole supervisor of the registrar who was the subject of the SRM items, 55% were the primary supervisor, and 20% were the secondary supervisor in a group of supervisors. Of the registrars who were the subject of the SRM items, half were aged 25–29 years, with another 29% aged 30–34 years. Sixty per cent were female and 42% were at the first level of training.

Table 1 Demographic characteristics of GP-SRMS supervisor-participants and their registrars

Statistical analysis

Building on the previous statistical validation of the SRM, principal components analysis (PCA) with Procrustes transformation [23, 24] was used. This method has been described as a confirmatory PCA and provides a measure of model fit. The Procrustes transformation compares the rotated solution to an ideal matrix in which items either load completely or not at all, providing an estimate of how well items fit. Using SPSS version 24 [25], PCA with direct oblimin rotation was conducted with all GP-SRMS items, followed by Procrustes transformation using Orthosim version 2.01 [26]. Items with low communality, low primary loading, significant cross-loading, or poor fit were systematically removed until a stable component structure and robust model fit were achieved.
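The analyses themselves were run in SPSS and Orthosim. Purely as an illustration of the underlying idea, a rough Python sketch of rotating a loading matrix toward an ideal binary target and computing an overall congruence coefficient (here Tucker’s phi, and using an orthogonal Procrustes rotation rather than the oblique pipeline used in the study) might look like:

```python
import numpy as np
from scipy.linalg import orthogonal_procrustes

def procrustes_congruence(loadings, target):
    """Rotate `loadings` toward the ideal `target` matrix (each item
    loads fully on one component, zero elsewhere) and return the
    rotated solution plus Tucker's congruence coefficient overall."""
    rotation, _ = orthogonal_procrustes(loadings, target)
    rotated = loadings @ rotation
    phi = np.sum(rotated * target) / np.sqrt(
        np.sum(rotated ** 2) * np.sum(target ** 2))
    return rotated, phi

# Toy example: 6 items, 2 components, near-simple structure.
rng = np.random.default_rng(0)
target = np.repeat(np.eye(2), 3, axis=0)   # items 1-3 -> C1, items 4-6 -> C2
loadings = 0.8 * target + rng.normal(0, 0.05, target.shape)
_, phi = procrustes_congruence(loadings, target)
print(round(phi, 2))  # close to 1, i.e. high similarity to the target
```

Values of phi near 1 correspond to the "similarity" thresholds discussed in the Results.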


Results

A Kaiser-Meyer-Olkin value of .95 [27, 28] and a significant Bartlett’s Test of Sphericity [29] supported the factorability of the correlation matrix. While eight components had eigenvalues exceeding 1, five components were initially extracted, consistent with the factor structure of the SRM. A review of the model using Cattell’s scree test [30] and parallel analysis [31] suggested that the five-component model was an overextraction, with three or four components being more appropriate.
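Parallel analysis compares the observed eigenvalues against those obtained from random data of the same dimensions, retaining only components whose eigenvalues exceed the random average. A minimal sketch of the procedure (ours, not the SPSS macro used in the study [31]):

```python
import numpy as np

def parallel_analysis(data, n_iter=100, seed=0):
    """Horn's parallel analysis: retain components whose observed
    eigenvalue exceeds the mean eigenvalue of same-shaped random data."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    observed = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
    random_eigs = np.empty((n_iter, p))
    for i in range(n_iter):
        sim = rng.normal(size=(n, p))
        random_eigs[i] = np.sort(np.linalg.eigvalsh(np.corrcoef(sim, rowvar=False)))[::-1]
    return int(np.sum(observed > random_eigs.mean(axis=0)))

# Toy data: two independent blocks of three correlated items each,
# so two components should be retained.
rng = np.random.default_rng(1)
factors = rng.normal(size=(300, 2))
data = np.hstack([0.9 * factors[:, [0]] + rng.normal(0, 0.4, (300, 3)),
                  0.9 * factors[:, [1]] + rng.normal(0, 0.4, (300, 3))])
print(parallel_analysis(data))
```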

Three-, four-, and five-component models of the GP-SRMS items were examined, and poor items were systematically removed. Optimal model fit was achieved with a three-component model which retained 45 items. The overall solution congruence with an ideal target matrix was .97, with values of .85 and above indicating similarity [32, 33]. Two additional measures of congruence were also calculated, with the Double-Scaled Euclidean Distance (.90) and the Kernel Smoothed Distance (.84) both indicating similarity [34, 35]. Component fit was also very high. The final principal components loading matrix, model fit statistics, and reliability coefficients are presented in Table 2.

Table 2 Principal Components Analysis (PCA) of GP-SRMS items
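The reliability coefficients reported above are Cronbach’s alpha values. As a minimal, self-contained illustration of how alpha is computed from an item-score matrix (toy data, not the study data):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

# Toy subscale: four highly consistent 7-point items, five respondents.
scores = np.array([[7, 6, 7, 6],
                   [5, 5, 6, 5],
                   [3, 2, 3, 3],
                   [6, 6, 6, 7],
                   [2, 3, 2, 2]])
alpha = cronbach_alpha(scores)
print(round(alpha, 2))  # 0.98
```

Because the toy items move together across respondents, the item variances are small relative to the variance of the totals, yielding a high alpha.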


Discussion

The aim of the current study was to adapt and validate the SRM for use with GP supervisors within the general medical practice training context. The relationship between clinical supervisors and registrars has been demonstrated to be vital in fields such as clinical psychology [19], and it was our contention that the therapeutic alliance between supervisors and trainees in psychology was analogous to the educational alliance in medical training [17,18,19], highlighting the need to consider supervisory relationships in medical training.

In the original SRM development study, principal components analysis identified five aspects of supervisor experience and perceptions. The results of the current study suggest that a three-component model comprising “Safe base”, “Supervisor investment”, and “Registrar professionalism” is more appropriate in the general medical practice training context, following the removal of several poorly performing items. Building on the statistical methodology used to develop the SRM, the three-component GP-SRMS demonstrated excellent model fit overall, as well as within components. The original SRM demonstrated subscale reliabilities ranging from .71 to .96; the GP-SRMS subscales matched or exceeded these, ranging from .85 to .96. The fourth SRM subscale, “External influences”, and the fifth, “Supervisor investment”, likely represent an overextraction in the original study, which also found some evidence for a two- or three-component model.

In the original SRM study, the “Safe base” subscale contained the largest number of items and had the highest reliability. The final GP-SRMS “Safe base” subscale retained the equivalent items and gained two items, one of which had previously been included under “Supervisor investment” in the SRM. The item “I feel safe giving my registrar negative feedback” had no direct equivalent in the SRM; however, it is semantically and theoretically consistent with “Safe base”. The item “I have a good idea about what my registrar wants to gain from this practice term” could contribute to a registrar feeling valued and understood, which is consistent with other “Safe base” items. Scores on the “Safe base” subscale reflect an enthusiastic, open, collaborative GP supervisory relationship.

The “Supervisor investment” subscale in the GP-SRMS reflects a combination of the items which originally comprised “Supervisor investment” and “Supervisor commitment” in the SRM. The original subscales separated more clearly into professional commitment and emotional investment, and it is possible that the Expert Group item changes reflect a subtle shift away from emotional investment, or that there is less differentiation between professional and emotional aspects of supervision in the medical context. Scores on the “Supervisor investment” subscale reflect a GP supervisor’s effort to support the registrar through resources, preparation, and being interested in the registrar.

“Trainee contribution” in the SRM was largely subsumed into “Registrar professionalism” in the GP-SRMS. While nearly all of the equivalent items were retained, the addition of items such as “My registrar is not clinically competent” and “My registrar’s practice is safe”, together with the Expert Group item changes, reflects a more explicit focus on professionalism rather than contribution in the broader sense. Scores on the “Registrar professionalism” subscale reflect a GP supervisor’s perceptions of how competent, responsible, organised, and committed a registrar is.

The “External influences” subscale of the original SRM did not emerge clearly in any of the component models of the GP-SRMS. The poorly fitting item “My registrar wants me to be their GP as well as their supervisor” perhaps reflects a difference in the supervisory relationship between general practice medicine and clinical psychology. Alternatively, it is possible that a GP supervisor would not consider a registrar who discloses or seeks out medical consultation to be acting inappropriately. Two items relating to life stressors (“My registrar has other life stressors which distract them from their work” and “I have stressors in my life which make it difficult for me to focus on supervision”) did not demonstrate adequate model fit, which suggests that life stressors are typically less of a factor in the GP supervisory relationship, regardless of whether the stressors are experienced by the supervisor or the registrar. Finally, two items relating to education and learning (“My registrar’s educational training requirements interfere with their clinical practice” and “My registrar’s self-directed learning interferes with their clinical practice”) also failed to demonstrate adequate model fit. The latter item was suggested by the Expert Group; however, it is possible that this experience is not widely encountered in supervision, or is not perceived to be damaging to the supervisory relationship. Education and learning requirements may also vary between training programs and placements, reducing the consistency with which this aspect is experienced. While we cannot be certain why “External influences” failed to emerge in the GP-SRMS, there was evidence that a model with more than three components was an overextraction, and the three-component model demonstrated better model fit and reliability without this subscale and its associated items.

The current study is not without limitations. With a response rate of approximately 10%, questions remain about the representativeness of the sample. However, there is evidence to suggest that response-rate bias is not overly problematic for most quantitative analyses, suggesting that the results are robust [36]. GP supervisors’ selection of registrars was also not randomised or stratified, which may have affected the distribution of responses. Test-retest reliability has not yet been established, nor has the convergent validity between GP supervisor reports and GP registrar reports; however, this research is ongoing.


Conclusions

The aim of the current study was to adapt and validate the Supervisory Relationship Measure for use with general practitioner supervisors. Following expert review and psychometric evaluation, the revised GP-SRMS demonstrated excellent psychometric properties across the three domains of “Safe base”, “Supervisor investment”, and “Registrar professionalism”. Given the lack of research considering the supervisory relationship from the perspective of the GP supervisor, the GP-SRMS will be of significant interest to clinical educators. Future research will focus on determining the test-retest reliability of the GP-SRMS; establishing convergent validity between supervisor and registrar versions of the tool; and identifying the training needs and guidelines for the most effective use of the GP-SRMS in clinical education.



Abbreviations

AGPT: Australian General Practice Training

GP: General practice or general practitioner

GP-SRMS: General Practice Supervisory Relationship Measure for Supervisors

PCA: Principal Components Analysis

SRM: Supervisory Relationship Measure


References

1. Kilminster S, Cottrell D, Grant J, Jolly B. AMEE guide no. 27: effective educational and clinical supervision. Med Teach. 2007;29(1):2–19.

2. The Royal Australian College of General Practitioners. Standards for general practice training. 2nd ed. East Melbourne: RACGP; 2015.

3. Australian College of Rural and Remote Medicine. Primary rural and remote training – standards for supervisors and teaching posts. Brisbane: ACRRM; 2013.

4. Australian Medical Council Specialist Education Accreditation Committee. Standards for assessment and accreditation of specialist medical programs and professional development programs. Canberra: Australian Medical Council Limited; 2015.

5. Ingham G, Fry J, O'Meara P, Tourle V. Why and how do general practitioners teach? An exploration of the motivations and experiences of rural Australian general practitioner supervisors. BMC Med Educ. 2015;15:190.

6. Thomson JS, Anderson KJ, Mara PR, Stevenson AD. Supervision – growing and building a sustainable general practice supervisor system. Med J Aust. 2011;194(11):S101–4.

7. Mulrooney A. Development of an instrument to measure the practice vocational training environment in Ireland. Med Teach. 2005;27(4):338–42.

8. Bloomfield L. Evaluation of the clinical learning environment for radiation oncology specialty training. Focus Health Prof Educ: A Multi-Disciplinary J. 2007;9(2):71–82.

9. Bloomfield L, Subramaniam R. Development of an instrument to measure the clinical learning environment in diagnostic radiology. J Med Imaging Radiat Oncol. 2008;52(3):262–8.

10. Miles S, Leinster SJ. Comparing staff and student perceptions of the student experience at a new medical school. Med Teach. 2009;31(6):539–46.

11. Holt MC, Roff S. Development and validation of the Anaesthetic Theatre Educational Environment Measure (ATEEM). Med Teach. 2004;26(6):553–8.

12. Wagner R, Weiss KB, Passiment ML, Nasca TJ. Pursuing excellence in clinical learning environments. J Grad Med Educ. 2016;8(1):124–7.

13. Wearne S, Dornan T, Teunissen PW, Skinner T. General practitioners as supervisors in postgraduate clinical education: an integrative review. Med Educ. 2012;46(12):1161–73.

14. Australian General Practice Training: Snippets for Supervisors []. Accessed 18 Nov 2017.

15. Telio S, Ajjawi R, Regehr G. The “educational alliance” as a framework for reconceptualizing feedback in medical education. Acad Med. 2015;90(5):609–14.

16. Shehnaz SI, Premadasa G, Arifulla M, Sreedharan J, Gomathi KG. Development and validation of the AMEET inventory: an instrument measuring medical faculty members’ perceptions of their educational environment. Med Teach. 2015;37(7):660–9.

17. Watkins CE Jr. The supervisory alliance: a half century of theory, practice, and research in critical perspective. Am J Psychother. 2014;68(1):19–55.

18. Weinstein DF. Feedback in clinical education: untying the Gordian knot. Acad Med. 2015;90(5):559–61.

19. Pearce N, Beinart H, Clohessy S, Cooper M. Development and validation of the supervisory relationship measure: a self-report questionnaire for use with supervisors. Br J Clin Psychol. 2013;52(3):249–68.

20. Tangen JL, Borders D. The supervisory relationship: a conceptual and psychometric review of measures. Couns Educ Superv. 2016;55(3):159–81.

21. Fink A, Kosecoff J, Chassin M, Brook RH. Consensus methods: characteristics and guidelines for use. Am J Public Health. 1984;74(9):979–83.

22. Dykema J, Jones NR, Piche T, Stevenson J. Surveying clinicians by web: current issues in design and administration. Eval Health Prof. 2013;36(3):352–81.

23. Hopwood CJ, Donnellan MB. How should the internal structure of personality inventories be evaluated? Personal Soc Psychol Rev. 2010;14(3):332–46.

24. McCrae RR, Zonderman AB, Costa PT, Bond MH, Paunonen SV. Evaluating replicability of factors in the revised NEO personality inventory: confirmatory factor analysis versus Procrustes rotation. J Pers Soc Psychol. 1996;70(3):552–66.

25. IBM Corp. IBM SPSS Statistics for Windows, version 24. Armonk: IBM Corp; 2015.

26. Barrett P. Orthosim v.2.01. Auckland: Author; 2006.

27. Kaiser H. A second generation little jiffy. Psychometrika. 1970;35:401–15.

28. Kaiser H. An index of factorial simplicity. Psychometrika. 1974;39:31–6.

29. Bartlett M. A note on the multiplying factors for various chi square approximations. J R Stat Soc. 1954;16(Series B):296–8.

30. Cattell R. The scree test for the number of factors. Multivar Behav Res. 1966;1:245–76.

31. Costello S. MonteCarlo parallel analysis for principal components analysis: an SPSS macro. Melbourne: figshare; 2016.

32. Mulaik S. The foundations of factor analysis. New York: McGraw-Hill; 1972.

33. ten Berge JF. Rotation to perfect congruence and the cross validation of component weights across populations. Multivar Behav Res. 1986;21(1):41–64.

34. Barrett P. Person-target profiling. In: Beauducel A, Biehl B, Bosniak M, Conrad W, Schönberger G, Wagener D, editors. Multivariate research strategies: a festschrift for Werner Wittmann. Germany: Shaker Verlag GmbH; 2005. p. 63–118.

35. Hastie T, Tibshirani R, Friedman J. The elements of statistical learning: data mining, inference, and prediction. 2nd ed. New York: Springer; 2009.

36. Rindfuss RR, Choe MK, Tsuya NO, Bumpass LL, Tamaki E. Do low survey response rates bias results? Evidence from Japan. Demogr Res. 2015;32(26):797–828.

37. Bentley M, Burns J, Costello S, Kippen R, Rawlin M, Sampson M, Wallace G, Warrington A. General practice supervisory relationship measure (supervisors) – handscored instrument. figshare; 2018.



Acknowledgements

The authors would like to acknowledge the support and assistance of the Expert Group; Pilot Group testers; GP Supervisors Australia members; and project team members Michael Bentley, Simon Morgan, Morton Rawlin, Marisa Sampson, Gina Smith, Glen Wallace, Cathy Ward, and Allyson Warrington.


Funding

This research project was supported by the Royal Australian College of General Practitioners with funding from the Australian Government under the Australian General Practice Training program.

Availability of data and materials

The datasets used and/or analysed during the current study are available from the corresponding author on reasonable request.

Author information




Authors’ contributions

JB collected the data used in this research project. RK conducted the initial data preparation, and SC conducted the statistical analyses. All authors were involved in constructing the manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Shane Costello.

Ethics declarations

Ethics approval and consent to participate

Completing the anonymous survey implied consent by the participants. Ethical approval for the study was provided by the Monash University Human Research Ethics Committee (project number 10977).

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver applies to the data made available in this article, unless otherwise stated.


About this article


Cite this article

Costello, S., Kippen, R. & Burns, J. Adapting the supervisory relationship measure for general medical practice. BMC Med Educ 18, 284 (2018).



Keywords

  • Supervisory relationship
  • Educational alliance
  • General practice
  • Clinical education