
Distance education perception scale for medical students: a validity and reliability study

Abstract

Background

There are major changes in education strategies as higher education institutions urgently need to adopt distance education tools and practices due to the Covid-19 pandemic. Medical education is also trying to get out of this emergency using distance education. In this study, we aimed to develop a reliable and valid scale in order to evaluate the perceptions of medical students towards distance education.

Methods

The students taking part in the study were in the first five academic years of the medical faculty in Bursa in Turkey. At first, 57 items were determined to evaluate students’ perceptions. Content validity was examined according to the assessment of the expert team. Construct validity of these items was examined by exploratory and confirmatory factor analysis. Also, Cronbach’s alpha coefficients were calculated for reliability analysis. The medical students’ responses were scored using a five-point Likert scale.

Results

When the content validity was examined, the number of items was determined to be 38 items. Construct validity of these items was examined by exploratory and confirmatory factor analysis. Because of the exploratory factor analysis performed on the responses of 429 medical students, 22 items were included in four factors. This four-factor model was applied to 286 medical students and validated by confirmatory factor analysis. Also, Cronbach’s alpha coefficients were calculated for reliability analysis and values were between 0.713 and 0.930.

Conclusions

This study demonstrated validation and reliability of perceptions of distance education for medical students. We suggest a 22-item model with a four-factorial scale.


Background

In December 2019, the world was shaken by the emergence of Covid-19. Much that we took for granted lost its meaning, and a new order had to be improvised, one that affected education profoundly at every level. It has been warned that, after this difficult period caused by the pandemic, the world may face climate change and food security problems, and even famine. The economical use of resources therefore seems far more important from now on. COVID-19 has led to the global testing of emergency distance education software programs. Schools, institutions, and universities face the challenge of continuing teaching and learning while protecting their educators, staff, and students from the pandemic [1]. It is therefore realistic to expect these systems to displace at least some costly face-to-face education [2].

Distance education is an educational experience in which students and educators are separated in time and place [3]; it means that education can take place remotely at an academic institution and that a degree or qualification certificate can still be awarded [4]. Distance education takes different forms, but in this study the assessment was conducted through online learning, encompassing synchronous, asynchronous, blended, massive open online courses, and open-program online courses. Unlike asynchronous teaching, synchronous teaching requires instructors and students to meet for a session (usually online) at a predetermined time. According to Watts, live streaming video and/or audio are used for simultaneous interaction [5]. Although video-conferencing allows participants to see each other, it is not considered face-to-face interaction because of the physical separation.

With the onset of the pandemic in Turkey, asynchronous education was initiated for one month, from March to April 2020. In this period, educational materials were presented to students in digital media. After the learning management system infrastructure was established, the synchronous education period began and ran from April to July 2020; during this period, education was provided online via a video-conferencing application. In July, exams were held remotely via the learning management system.

Many studies examine perceptions of distance education according to graduate/undergraduate status, lecturer/instructor, department/faculty/institution, courses, and the country where participants study [6,7,8,9,10,11,12,13]. A large number of studies have applied questionnaires to measure the perception of distance education among medical students [14,15,16,17,18,19]. In addition to studies using questionnaires, some studies develop and use scales for this purpose [20,21,22,23]. Following these scales developed for other student groups, we aimed to develop a reliable and valid distance education perception scale for medical students.

Materials and methods

Study design and participants

This study aimed to examine the reliability and validity of the scale that we developed, which we called the “Distance Education Perception Scale - Medical Students” (DEPS - Medical Students). The subjects participating in the study were medical students in the first five academic years of Bursa Uludağ University Faculty of Medicine. The faculty is located in the South Marmara region, a metropolitan area, and its annual intake is approximately 400 students. The Bursa Uludağ University pre-graduate medical education program consists of three stages. Stage 1 covers the preclinical education and training process, comprising the 1st, 2nd, and 3rd academic years. Stage 2 covers the clinical education and training process, comprising the clerkship programs of the 4th and 5th academic years. Stage 3 consists of the 6th academic year, also known as the internship period. Medical students in the 6th academic year did not participate in distance education during the internship; they carried out their internship face to face, with the necessary personal protective equipment and institutional measures, so their participation in the research was not planned. The sample size for this study was 429 medical students. Many researchers have discussed sample size requirements for studies using factor analysis for construct validity. Some literature suggests minimum sample sizes in absolute numbers, such as 100–250 [24, 25] or 300 [26]. In addition, Yeo and Kim [27] stated that their sample size of 290 met the minimum requirement, based on Kang’s [28] statement that when n > 200 the risk of distorting the results is low. The sample size of 429 medical students in our study meets these criteria.

Development of the distance education perception scale for medical students (DEPS - medical students)

First, a literature review was performed by searching the TR Dizin journal database (https://trdizin.gov.tr/), Google Scholar, PubMed, and the Cochrane databases to define items for the first scale version. Many studies develop and examine distance education perception scales [20,21,22,23], and these scales address different education levels and departments/programs. One of these studies was particularly notable: Gok and Cakmak [20] developed a distance education perception scale for faculty staff, measuring the distance education perceptions of lecturers at the university. They created an item pool of 82 scale items; following experts’ opinions, some items were omitted, and the final item pool consisted of 57 items. After examining the reliability and validity of these 57 items, they arrived at a 21-item, three-factor scale scored on a five-point Likert scale (from “strongly agree” to “strongly disagree”). In our study, the 57-item pool used by Gok and Cakmak [20] was taken as the starting point. The statements referring to faculty staff were adapted for medical students, and all items were positively worded. The items were then sent to 10 experts in medical education for content validity assessment.

The content validity of questionnaires can be examined using the assessments of experts. The recommended number of experts to review a scale ranges from two to twenty [29]; at least five experts have been proposed to provide adequate control over chance agreement [30]. Content validity of the DEPS - Medical Students was determined using ten experts, all senior academicians in medical education. We asked the experts to rate the items for relevance to the underlying construct on a four-point ordinal scale (1: not relevant, 2: somewhat relevant, 3: quite relevant, 4: highly relevant). We also asked the experts to make recommendations for improving the scale. The content validity index (CVI) is the most commonly reported index of content validity in scale development. The CVI measures the proportion of experts who agree on an item and can be computed at the item level as the Item-CVI (I-CVI): the number of experts rating the item “highly relevant” divided by the total number of experts. I-CVI ranges from 0 to 1; if I-CVI > 0.79 the item is relevant, if it is between 0.70 and 0.79 the item needs revision, and if I-CVI is below 0.70 the item is omitted [30,31,32].
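As a worked illustration of the index just described, the I-CVI for a single item can be computed directly from the panel's ratings. This is a sketch with hypothetical expert ratings; the `relevant` parameter is an assumption, defaulting to the "highly relevant only" rule stated above, while some conventions also count "quite relevant" (3).

```python
def i_cvi(ratings, relevant=(4,)):
    """Item-level content validity index: the proportion of experts
    whose rating falls in `relevant`. By default only 4 ("highly
    relevant") counts, as described in the text; pass relevant=(3, 4)
    for the convention that also counts "quite relevant"."""
    return sum(r in relevant for r in ratings) / len(ratings)

# Hypothetical ratings from a 10-expert panel for one item:
ratings = [4, 4, 4, 3, 4, 4, 4, 2, 4, 4]
print(i_cvi(ratings))  # 0.8 -> item retained (I-CVI > 0.79)
```

Applied to all 57 items with the cut-offs above, this kind of computation yields the retain/revise/omit decision for each item.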

Data collection

Because of the pandemic, we preferred to administer the scale in web-based rather than paper-based form. We administered the scale in July 2020 after acquiring permission from the Bursa Uludağ University, Faculty of Medicine, Clinical Researches Ethics Committee (Date: 10 June 2020; Number: 2020–10/25). We informed all participating medical students of the purpose and methods of the study and assured them of personal confidentiality.

Data analysis

Descriptive statistics of demographic variables were presented by frequency and percentage. Descriptive statistics of items were given as mean and standard deviation (SD). To check for the normality of items, we calculated the skewness and kurtosis coefficients.

Exploratory factor analysis (EFA) was performed for the construct validity of the DEPS-Medical Students scale. Before examining factor structures with EFA, Bartlett’s sphericity test and the Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy were performed [33]. Thereafter, EFA was performed with principal components analysis (PCA) and an oblique rotation method (Promax rotation). In testing the construct of the DEPS-Medical Students items using EFA, we followed four rules often used as criteria for deciding about items: (i) retain factors with eigenvalues larger than 1 (Kaiser criterion); (ii) remove items with factor loadings below 0.40; (iii) remove items loading on more than one factor; (iv) remove single-item factors [34,35,36]. Confirmatory factor analysis (CFA) was also performed to investigate construct validity; it was carried out to measure the consistency of the rotated factor-loading structure resulting from EFA. To examine the reliability of the scale, internal consistency was calculated with Cronbach’s alpha coefficient; Cronbach’s α coefficients larger than 0.70 were considered acceptable [32, 37].
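The internal consistency measure mentioned above can be sketched directly from its definition. This is a minimal NumPy implementation of Cronbach's alpha on toy data, not the exact SPSS routine the authors used:

```python
import numpy as np

def cronbach_alpha(responses):
    """Cronbach's alpha for an (n_respondents, n_items) matrix of
    Likert responses: k/(k-1) * (1 - sum of item variances /
    variance of the total scores)."""
    x = np.asarray(responses, dtype=float)
    k = x.shape[1]
    item_var = x.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = x.sum(axis=1).var(ddof=1)    # variance of row totals
    return k / (k - 1) * (1 - item_var / total_var)

# Toy data: two perfectly consistent items give alpha = 1.0.
print(cronbach_alpha([[1, 1], [2, 2], [3, 3], [4, 4]]))
```

Values above 0.70, as in the Results below 0.713–0.930, indicate acceptable internal consistency under the criterion cited in the text.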

EFA was performed with IBM SPSS Statistics for Windows, version 23.0 (IBM Corp., Armonk, NY, USA). CFA was performed using the lavaan package, version 0.6-8 (release date: 10 March 2021) [38], in RStudio version 1.4.1103 (RStudio, PBC, Boston, MA, USA). Statistical significance was set at p < 0.05.

Results

A total of 429 medical students completed the web-based scale for EFA; 202 were male (47.1%) and 227 were female (52.9%). For CFA, 286 medical students completed the web-based scale. The gender and academic year distributions of the medical students are given in Table 1.

Table 1 Demographic variables of medical students

Content validity

The initial number of items for the DEPS - Medical Students was 57. For content validity, we excluded 19 items according to the assessments of the expert group because their I-CVI values were below 0.70. The DEPS - Medical Students scale has adequate content validity, as the remaining 38 items have I-CVI values greater than 0.80.

Exploratory factor analysis (EFA)

EFA was performed for the construct validity of the 38-item DEPS-Medical Students scale. Bartlett’s sphericity test was significant (χ2 = 4969.778; p < 0.001), showing that the inter-correlation matrix contained enough common variance to make factor analysis worthwhile. The KMO measure of sampling adequacy was 0.943, indicating that the data were highly suitable for factor analysis. The 429 medical students’ responses were analysed using PCA and an oblique rotation method (Promax rotation). Based on previous studies, 0.40 was used as the cut-off factor loading for retaining items: items with loadings exceeding 0.40 and no cross-loading were assigned to factors. EFA showed that 16 items either loaded on more than one factor with a difference between loadings smaller than 0.10, or failed to load on any factor (loading < 0.40). The retained 22 items loaded on four factors (Table 2). The eigenvalues of the four factors were λ1 = 9.243, λ2 = 1.991, λ3 = 1.180, and λ4 = 1.084. The PCA findings show that the four factors with eigenvalues greater than 1.00 accounted for 61.35% of the total extracted variance in the DEPS-Medical Students scale. The four factors were labelled “Students’ perception”, “Equipment facility”, “Time Management”, and “Facility and support of the institution”, respectively (Table 2). Students’ perception (Factor I) consisted of 12 items, Equipment facility (Factor II) of five items, Time Management (Factor III) of two items, and Facility and support of the institution (Factor IV) of three items.
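The reported variance figure follows directly from the eigenvalues: with standardized items, the total variance in a PCA equals the number of items, so the proportion explained by the retained factors is their eigenvalue sum divided by 22. A quick check with the values above:

```python
eigenvalues = [9.243, 1.991, 1.180, 1.084]  # the four retained factors
n_items = 22  # total variance of 22 standardized items in PCA
explained = sum(eigenvalues) / n_items
print(f"{explained:.2%}")  # -> 61.35%
```

This reproduces the 61.35% of extracted total variance reported for the four-factor solution.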

Table 2 Rotated factor loadings for the 22-item DEPS - Medical Students scale

Confirmatory factor analysis (CFA)

As the 22 items loaded on four factors in EFA, a CFA testing the same four-factor structure was performed on the other group of medical students (n = 286). Examination of the skewness and kurtosis coefficients showed that the normality of the items in the scale was suitable for CFA (Table 3). Maximum likelihood CFA was used to test the goodness of fit of the four-factor model. The chi-square value of the four-factor structure was significant (χ2 = 409.526, degrees of freedom (df) = 203, p < 0.001). Overall fit indices were also calculated. The relative chi-square (chi-square value/df) was 2.017 (acceptable values between 1 and 5) [39, 40]. The standardized root mean square residual (SRMR) was 0.055 (acceptable value < 0.08) [39, 41]. The root mean square error of approximation (RMSEA) was 0.06 (acceptable value < 0.10) [39, 41]. The Comparative Fit Index (CFI) was 0.926 (acceptable value > 0.90) [41]. The Tucker-Lewis Index (TLI) was 0.916 (acceptable value > 0.90) [40]. Thus, the measures of model fit for the CFA were satisfactory according to the acceptable values.
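Two of these indices can be reproduced from the reported chi-square and sample size. This sketch uses the standard RMSEA point-estimate formula, sqrt((χ² − df)/(df·(N − 1))), which may differ slightly from lavaan's exact output:

```python
import math

chi2, df, n = 409.526, 203, 286  # values reported above for the CFA sample
relative_chi2 = chi2 / df
rmsea = math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))
print(round(relative_chi2, 3), round(rmsea, 2))  # -> 2.017 0.06
```

Both recomputed values agree with the reported relative chi-square of 2.017 and RMSEA of 0.06.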

Table 3 Descriptive statistics of items for CFA and Reliability

Reliability

The internal consistency reliability of the 22-item DEPS - Medical Students scale and of its factors was calculated with Cronbach’s alpha coefficient. Corrected item-total correlation coefficients and Cronbach’s alpha coefficients if an item was deleted were at acceptable levels (Table 3). The Cronbach’s alpha coefficient of the DEPS - Medical Students scale was 0.930 (Table 4). The Cronbach’s alpha values of the four factors were 0.921 (Factor I), 0.774 (Factor II), 0.713 (Factor III), and 0.749 (Factor IV), which were also at acceptable levels.

Table 4 Descriptive statistics and Cronbach’s alpha values of four-factorial scale

Discussion

In this study, we aimed to develop a new scale (DEPS - Medical Students) to evaluate the perceptions of distance education among medical students. This scale was needed because of shortcomings in previously developed scales and because of the possibilities that have emerged with the rapid advancement of technology.

There are many articles on the perception of distance education in the literature. For example, while Fidalgo et al. [6] applied a questionnaire to students, Bagriacik [7] and Gaytan [10] evaluated the perception of distance education with qualitative data analysis. In some articles, mixed research models were used in which both quantitative and qualitative data were analysed [8, 9, 11,12,13]; these used questionnaires to collect the quantitative data. In these studies, evaluation of the perception of distance education varies by participant, department, education level, country, and other circumstances. As an example of studies evaluating medical students’ perceptions of distance education, Tuma et al. [14] applied a questionnaire to students and faculty staff. Wang et al. [15], Ibrahim et al. [16], Ibrahim et al. [17], Srinivasan [18], and Gismalla et al. [19] also applied questionnaires to evaluate medical students’ perceptions of distance education. There are also studies in which scales, rather than questionnaires, are used. Bhagat et al. [21] developed a scale to evaluate the perception of distance education among Taiwanese students at different education levels (undergraduate, master’s, or doctoral). In addition, while Gunduz and Isman [22] developed a scale for pre-service teachers, Karaca and Yuksekdag [23] developed a scale for nursing education.

The methods and participants in these articles differ from ours. We obtained our items from the item pool of the study of Gok and Cakmak [20]. The items in their scale, developed to measure the distance education perceptions of academic staff at the university, were adapted as necessary for medical faculty students. Our reason for building on a study conducted with faculty members was to allow the views of students and faculty members, both education stakeholders, to be examined with similar items. In total, 715 medical students in the first five academic years were evaluated. After analysing the students’ answers, sufficient evidence was found that the DEPS - Medical Students scale is reliable and valid.

The content validity of the DEPS was supported by the assessments of the expert group; with the examination of the I-CVI index, the number of items decreased from 57 to 38. The evidence for the construct validity of the DEPS was supported by EFA and CFA. As a result of EFA, 22 items formed four factors: “Students’ perception”, “Equipment facility”, “Time Management”, and “Facility and support of the institution”, accounting for 61.35% of the variance. Further CFA with the maximum likelihood method supported a good model fit, as shown by the RMSEA, SRMR, CFI, and TLI. The χ2 value for the model was significant, possibly due to the large sample size; the calculated χ2/df value nonetheless showed that the fit of the model was good. The results show that the DEPS scale, through its four factors, can be used to reflect the distance education perception of medical students. The total internal consistency reliability of the DEPS - Medical Students scale is quite high, and the reliability of each of the four factors (students’ perception, equipment facility, time management, and facility and support of the institution) is at a satisfactory level of Cronbach’s alpha.

In our study, the scale developed for medical students consisted of 22 items and 4 factors, while Bhagat et al. [21] developed a 4-factor scale (Instructor characteristics, Social presence, Instructional design, and Trust) with 16 items to evaluate the perception of distance education among Taiwanese students with different education levels (undergraduate, master, or doctoral). The distance education perception scale developed by Gok and Cakmak [20] for faculty staff consists of 3 factors (Basic perception of distance education, Access to resources, and Education planning) with 21 items. The scale developed by Gunduz and Isman [22] for pre-service teachers consists of one factor with 17 items, while the scale developed by Karaca and Yuksekdag [23] for nursing education consists of 4 factors (learning, technology, communication-evaluation, and management affect) with 16 items. We could not find any studies evaluating the distance education perception of medical students using a scale in the databases we used while conducting literature review.

Limitation

Due to the rapid advancement of communication technology in both software and hardware, the learning tools used are continuously developing. This technological change is also reflected in studies related to distance education [7, 8]. As a result, students’ perceptions of distance education change dynamically. Longitudinal studies should be conducted to determine how students’ perceptions of distance education change; we plan to carry out such studies in future work.

Conclusions

The findings for content validity, construct validity, and reliability suggest that the DEPS - Medical Students scale is a valid and reliable scale to assess the distance education perceptions of medical students. The 22-item DEPS scale is also easy to use. Future studies may use the DEPS - Medical Students scale together with other related scales to examine factors associated with distance education perception in medical students. We recommend further studies with larger samples to validate the findings reported here. Medical schools can use the DEPS to better understand their students’ perceptions of distance education and thus apply methods that encourage effective learning and inform teaching strategies.

Availability of data and materials

The datasets used and/or analysed during the current study are available from the corresponding author upon reasonable request.

Abbreviations

DEPS - Medical Students:

Distance Education Perception Scale - Medical Students

EFA:

Exploratory Factor Analysis

CFA:

Confirmatory Factor Analysis

CVI:

Content validity index

I-CVI:

Item-content validity index

KMO:

Kaiser-Meyer-Olkin

RMSEA:

Root mean square error of approximation

SRMR:

Standardized root mean square residual

CFI:

Comparative Fit Index

TLI:

Tucker-Lewis Index

df:

degree of freedom

References

  1. Hodges C, Moore S, Lockee B, Trust T, Bond A. The difference between emergency remote teaching and online learning. 2020. https://er.educause.edu/articles/2020/3/the-difference-between-emergency-remote-teaching-and-online-learning. Accessed 30 Jun 2021.


  2. Moorhouse BL. Adaptations to a face-to-face initial teacher education course ‘forced’ online due to the COVID-19 pandemic. J Educ Teach. 2020;1:609–11. https://doi.org/10.1080/02607476.2020.1755205.

  3. Keegan D. The future of learning: from eLearning to mLearning. Hagen: Fern Universität; 2002.


  4. Jonassen D, Driscoll M, editors. Handbook of research on educational communications and technology. 2nd ed. London: Routledge; 2004.

  5. Watts L. Synchronous and asynchronous communication in distance learning: a review of the literature. Q Rev Distance Educ. 2016;17:23–32.


  6. Fidalgo P, Thormann J, Kulyk O, Lencastre JA. Students’ perceptions on distance education: a multinational study. Int J Educ Technol High Educ. 2020;17:18. https://doi.org/10.1186/s41239-020-00194-2.

  7. Bagriacik YA. Distance and face-to-face students’ perceptions towards distance education: a comparative metaphorical study. Turk Online J Dist Educ. 2019;20:191–207. https://doi.org/10.17718/tojde.522705.

  8. van Rooyen A. Distance education accounting students’ perceptions of social media integration. Procedia Soc Behav Sci. 2015;176:444–50. https://doi.org/10.1016/j.sbspro.2015.01.495.

  9. Cacheiro-Gonzalez ML, Medina-Rivilla A, Dominguez-Garrido MC, Medina-Dominguez M. The learning platform in distance higher education: students’ perceptions. Turk Online J Dist Educ. 2019;20:71–95. https://doi.org/10.17718/tojde.522387.

  10. Gaytan J. Comparing faculty and student perceptions regarding factors that affect student retention in online education. Am J Dist Educ. 2015;29(1):56–66. https://doi.org/10.1080/08923647.2015.994365.

  11. Mishra L, Gupta T, Shree A. Online teaching-learning in higher education during lockdown period of COVID-19 pandemic. Int J Educ Res Open. 2020;1:100012. https://doi.org/10.1016/j.ijedro.2020.100012.

  12. Chen JC, Dobinson T, Kent S. Lecturers’ perceptions and experiences of Blackboard Collaborate as a distance learning and teaching tool via Open Universities Australia (OUA). Open Learn J Open Distance e-Learn. 2020;35(3):222–35. https://doi.org/10.1080/02680513.2019.1688654.

  13. Adanır GA, Borkoev B, Saliyeva K, Muhametjanova G. Kyrgyz learners’ and teachers’ experiences and perceptions related to ICT use in high school courses. Educ Inf Technol. 2020;25(6):4765–80. https://doi.org/10.1007/s10639-020-10196-2.

  14. Tuma F, Nassar AK, Kamel MK, Knowlton LM, Jawad NK. Students and faculty perception of distance medical education outcomes in resource-constrained system during COVID-19 pandemic. A cross-sectional study. Ann Med Surg. 2021;62:377–82. https://doi.org/10.1016/j.amsu.2021.01.073.

  15. Wang C, Xie A, Wang W, Wu H. Association between medical students’ prior experiences and perceptions of formal online education developed in response to COVID-19: a cross-sectional study in China. BMJ Open. 2020;10:1–10.


  16. Ibrahim NK, Al Raddadi R, AlDarmasi M, Al Ghamdi A, Gaddoury M, AlBar HM, et al. Medical students’ acceptance and perceptions of e-learning during the Covid-19 closure time in King Abdulaziz University, Jeddah. J Infect Public Health. 2021;14(1):17–23. https://doi.org/10.1016/j.jiph.2020.11.007.

  17. Ibrahim G, Luzinge H, Kapanda G. Teaching and learning experiences in medical education during the covid-19 pandemic: the case of Kilimanjaro christian medical university college (kcmuco), Tanzania. J Learn Dev. 2020;7:433–46.


  18. Srinivasan DK. Medical students’ perceptions and an anatomy teacher’s personal experience using an e-learning platform for tutorials during the Covid-19 crisis. Anat Sci Educ. 2020;13(3):318–9. https://doi.org/10.1002/ase.1970.

  19. Gismalla MDA, Mohamed MS, Mohamed MN, MMA E, Ibrahim O. Students Perception Towards Challenges and Difficulties to Established E-learning Medical Education in a High Burden Developing Country; 2020.


  20. Gok B, Kilic CE. The perception of distance education faculty members regarding distance education. Kastamonu Educ J. 2020;28:1915–31.


  21. Bhagat KK, Wu LY, Chang C-Y. Development and validation of the perception of students towards online learning (POSTOL). Educ Technol Soc. 2016;19:350–9.


  22. Gunduz AY, Isman A. Pre-service teachers’ perception of distance education. TOJET Turkish Online J Educ Technol. 2018;17:125–9.


  23. Karaca E, Boz YB. Açık ve uzaktan öğrenmede bir ölçek geliştirme çalışması: uzaktan hemşirelik eğitimine ilişkin algı çalışması örneği [A scale development study in open and distance learning: the example of a perception study on distance nursing education]. Açıköğretim Uygulamaları ve Araştırmaları Derg. 2017;3:177–97.

  24. Cattell RB. The scientific use of factor analysis in behavioral and life sciences. New York: Plenum; 1978. https://doi.org/10.1007/978-1-4684-2262-7.

  25. Gorsuch R. Factor analysis. Hillsdale: L. Erlbaum Associates; 1983.


  26. Tabachnick B, Fidell L. Using multivariate statistics. Boston: Pearson Education Inc.; 2013.

  27. Yeo S, Kim KJ. A validation study of the Korean version of the Toronto empathy questionnaire for the measurement of medical students’ empathy. BMC Med Educ. 2021;21:1–8.


  28. Kang H. A guide on the use of factor analysis in the assessment of construct validity. J Korean Acad Nurs. 2013;43(5):587–94. https://doi.org/10.4040/jkan.2013.43.5.587.

  29. Armstrong TS, Cohen MZ, Eriksen LCC. Content validity of self-report measurement instruments: an illustration from the development of the brain tumor module of the M.D. Anderson symptom inventory. Oncol Nurs Forum. 2005;32(3):669–76. https://doi.org/10.1188/05.ONF.669-676.

  30. Zamanzadeh V, Ghahramanian A, Rassouli M, Abbaszadeh A, Alavi-Majd H, Nikanfar A-R. Design and implementation content validity study: development of an instrument for measuring patient-centered communication. J Caring Sci. 2015;4(2):165–78. https://doi.org/10.15171/jcs.2015.017.

  31. Rubio DM, Berg-Weger M, Tebb SS, Lee ES, Rauch S. Objectifying content validity: conducting a content validity study in social work research. Soc Work Res. 2003;27(2):94–104. https://doi.org/10.1093/swr/27.2.94.

  32. dos Santos FC, Riner ME, Henriques SH. Brazilian questionnaire of competencies of oncology nurses: construction and test of content validation. Int J Nurs Sci. 2019;6(3):288–93. https://doi.org/10.1016/j.ijnss.2019.06.005.

  33. Spicer J. Making sense of multivariate data analysis. London: Sage; 2005. https://doi.org/10.4135/9781412984904.

  34. Straub DW. Validating instruments in MIS research. MIS Q Manag Inf Syst. 1989;13(2):147–65. https://doi.org/10.2307/248922.

  35. Hair J, Black WC, Babin BJ, Anderson RE. Multivariate data analysis. 7th ed. Upper Saddle River: Pearson Education International; 2010.

  36. Hair JF Jr, Anderson RE, Tatham RL, Black WC. Multivariate data analysis. 5th ed. Upper Saddle River: Prentice-Hall; 1998.

  37. Cortina JM. What is coefficient alpha? An examination of theory and applications. J Appl Psychol. 1993;78(1):98–104. https://doi.org/10.1037/0021-9010.78.1.98.

  38. Rosseel Y. lavaan: an R package for structural equation modeling. J Stat Softw. 2012;48:1–36. https://doi.org/10.18637/jss.v048.i02.

  39. Hooper D, Coughlan J, Mullen MR. Structural equation modelling: guidelines for determining model fit. Electron J Bus Res Methods. 2008;6:53–60.


  40. Tabachnick B, Fidell L. Using multivariate statistics. 7th ed. Boston: Pearson; 2018.

  41. Hu LT, Bentler PM. Cutoff criteria for fit indexes in covariance structure analysis: conventional criteria versus new alternatives. Struct Equ Model. 1999;6(1):1–55. https://doi.org/10.1080/10705519909540118.


Acknowledgements

None.

Funding

None.

Author information

Authors and Affiliations

Authors

Contributions

GO contributed to the design of the study, interpreted and analyzed the data. MOA and ZA contributed to design of the study and data collection. All authors drafted, read, and approved the final manuscript.

Corresponding author

Correspondence to Mevlüt Okan Aydin.

Ethics declarations

Ethics approval and consent to participate

Ethical approval was obtained from the Bursa Uludağ University, Faculty of Medicine, Clinical Researches Ethics Committee (Date: 10 June 2020; Number: 2020–10/25). We conducted this study according to the principles of the Declaration of Helsinki. Informed consent was obtained from all the participants.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Özkaya, G., Aydin, M.O. & Alper, Z. Distance education perception scale for medical students: a validity and reliability study. BMC Med Educ 21, 400 (2021). https://doi.org/10.1186/s12909-021-02839-w



  • DOI: https://doi.org/10.1186/s12909-021-02839-w
