Reflective Thinking Scale for Healthcare Students and Providers—Chinese version


Hung-Chang Liao
Ya-huei Wang
Cite this article:  Liao, H.-C., & Wang, Y.-h. (2019). Reflective Thinking Scale for Healthcare Students and Providers—Chinese version. Social Behavior and Personality: An international journal, 47(2), e7671.


Abstract

We developed the Reflective Thinking Scale for Healthcare Students and Providers (RTS-HSP) in Chinese by conducting a systematic literature review and consulting with a panel of experts. Participants were 579 randomly selected healthcare students and professionals from Taiwan. Using exploratory factor analysis, we retained 22 items rated on a 9-point Likert scale and extracted 4 factors: reflective skepticism (6 items explaining 29.48% of the variance), self-examination (6 items explaining 11.41% of the variance), empathetic reflection (5 items explaining 8.96% of the variance), and critical open-mindedness (5 items explaining 6.81% of the variance). Cronbach’s alpha reliability values for the 4 dimensions and the overall scale ranged from .77 to .87. An assessment of the RTS-HSP’s concurrent validity with the Questionnaire for Reflective Thinking also supported our scale’s external validity. Thus, the RTS-HSP can be considered a reliable instrument for measuring reflective thinking among healthcare students and professionals.

Reflective thinking is defined as a means of thinking back on one’s own experiences and integrating these experiences to make sense of them in the future (Knapp, 1993). Dewey (1933) defined reflection as an active, deliberate thinking process occurring during or after experiences, and commented that reflective thinking is “an active, persistent, and careful consideration of any belief or supposed form of knowledge in the light of the grounds that support it and the conclusion to which it tends” (p. 9). By drawing conclusions from reliving experiences, people can develop new insights for the future.

Danielson (1996) regarded reflection as an activity or process through which individuals consciously recall past experiences to comprehend, examine, evaluate, and seek solutions for future planning and actions. Though scholars regard it as a part of the critical thinking process, reflective thinking differs from critical thinking. While critical thinking usually refers to identifying the common flaws found in an argument rather than looking for reasons to trust a conclusion, reflective thinking refers not only to looking for flaws but also to finding reasons to trust a conclusion (Cottrell, 2005). In addition, instead of seeking desirable outcomes, which is the goal of critical thinking, reflective thinking is focused on the process of thinking back on one’s life or experiences to analyze, evaluate, and make sense of what has happened (Dewey, 1933).

Because reflective thinking is a skill that involves actively, constantly, and carefully thinking about a subject to induce positive feelings and behaviors (Dewey, 1933), scholars have considered this type of thinking an essential component in the development of independent medical and healthcare professionals. These professionals often encounter ambiguous and conflicting problems in clinical and healthcare situations, and must use reflective thinking to recall and apply their prior knowledge, skills, and experiences to enhance their recognition and understanding of complex clinical and healthcare issues (Gallagher et al., 2017; Naber, Hall, & Schadler, 2014).

Over the past two decades, reflective thinking has come to be considered an important tool to integrate professional knowledge into healthcare practice (Bethune & Brown, 2007; Crowe & O’Malley, 2006; Forneris & Peden-McAlpine, 2007; Kember et al., 2000; Sanford, 2010). There is an increasing trend toward using reflective applications in medical/healthcare education and training to develop students’ and practitioners’ professionalism, critical thinking, and insightfulness, thereby facilitating positive interactions between physicians, medical care professionals, patients, and patients’ families. Further, because reflective thinking involves thoughtful consideration and integration of previous knowledge and experience to reach better clinical/healthcare and patient outcomes, such thinking has become requisite in medical and healthcare education/training. Therefore, to assess whether or not medical or healthcare professionals and students can reflect on their professional actions, an objective and reliable instrument is needed.

Kitchener and King (1981) developed the Reflective Judgment Interview (RJI) to investigate intellectual growth and thinking in adolescents and adults; it has also been used to examine healthcare students’ epistemological thinking (Navedo, 2006; Pittman, 2006; Saltzberg, 2002). The RJI is based on the reflective judgment model, which covers seven stages fitting within three levels: prereflective thinking (Stages 1 to 3), quasireflective thinking (Stages 4 and 5), and reflective thinking (Stages 6 and 7). Although Kitchener and King (1981) reported a Cronbach’s alpha internal consistency reliability of .96, in Welfel’s (1982) study the Cronbach’s alpha was only .62, which is relatively low and indicates questionable reliability. Moreover, the RJI is an interview-type instrument rather than a questionnaire and takes an hour to complete. The interview process alone is time consuming, even before considering the time required for transcription and analysis of the data collected.

Kember et al.’s (2000) Questionnaire for Reflective Thinking (QRT) is a publicly available scale used to assess the development of healthcare students’ reflective thinking. The researchers developed the QRT based on Mezirow’s (1991) theory of reflective and nonreflective action (Kember et al., 2000). The scale comprises 16 items divided across four dimensions: habitual action, understanding, reflection, and critical reflection (four items each). Responses are made on a 5-point Likert scale ranging from 1 = definitely disagree to 5 = definitely agree, so that total scores range from 16 to 80. Kember et al. conducted their study with 303 students in a health sciences program and reported Cronbach’s alpha values for the instrument ranging from .62 to .76. Subsequent researchers using the QRT reported Cronbach’s alphas ranging from .58 to .91 (Leung & Kember, 2003; Mann, Gordon, & MacLeod, 2009; Phan, 2006). However, as Tavakol and Dennick (2011) stated, Cronbach’s alpha should be at least .70 to demonstrate sufficient internal consistency reliability.

Aukes, Geertsma, Cohen-Schotanus, Zwierstra, and Slaets (2007) developed the Groningen Reflection Ability Scale (GRAS) in Dutch to measure the self-reported self-reflection of medical students across three dimensions: self-reflection (10 items), empathetic reflection (six items), and reflective communication (seven items). The 23-item scale, which is rated on a 5-point Likert scale ranging from 1 = totally disagree to 5 = totally agree, is considered easy to administer and reliable for measuring medical students’ personal reflection, with Cronbach’s alpha for the first measurement being .83 and that for the second measurement being .74 (Aukes et al., 2007). However, the GRAS was developed to measure medical students’ personal reflection ability; thus, it is not suitable for assessing reflective thinking in other groups, such as healthcare students and professionals.

Therefore, each of the existing reflective thinking instruments has limitations. Kitchener and King’s (1981) RJI is time consuming, difficult to score, and has questionable internal consistency reliability. The sample size used by Kember et al. (2000) when developing the QRT was relatively small, and weak internal consistency reliability was reported for some subscales. In addition, there is no affective component for reflective thinking in the QRT because Kember et al. intended to measure the level of reflective thinking that can be observed (Steur, 2016). Finally, Aukes et al.’s (2007) GRAS was developed based on the Dutch context to measure the personal reflection ability of medical students; therefore, it is not suitable for use with other groups.

Because all the above scales for assessing reflective thinking are based on Western medical care contexts, they may not be suitable for use in Taiwanese healthcare situations. Therefore, we sought to develop a comprehensive and psychometrically adequate scale grounded in Taiwanese healthcare situations that can be used to measure healthcare students’ and providers’ reflective thinking, and can help educators use reflection practice as a teaching and learning tool to improve healthcare students’ and professionals’ intellectual capacity, thus allowing them to deal appropriately with patients’ needs in practical healthcare and clinical situations.

Method

Procedure and Participants

We developed the Reflective Thinking Scale for Healthcare Students and Providers (RTS-HSP) by using the following steps: (1) conducting a systematic literature review to generate scale items, (2) purifying the initial scale items via expert review to confirm face and content validity, (3) collecting data using a pilot questionnaire to examine the scale’s psychometric properties, (4) deriving items and extracting subscales, (5) testing the scale’s internal consistency reliability, and (6) testing the scale’s construct validity and external validity.

After completing the literature review, we conducted an expert panel discussion in which each item was rated in terms of its relevance to the underlying construct until the panel members reached a consensus; that is, until the content validity index for each item was 1.00. After the expert panel discussion, we reduced the 65 items from the initial RTS-HSP to 49 items, which were rated on a 9-point Likert scale ranging from 1 = never to 9 = always. A higher item score indicates that a participant places more importance on that component of reflective thinking. The scale was developed in English, then translated into Chinese and reviewed by two bilingual English teachers. It was then translated back into English by a Taiwanese doctoral student majoring in English. The original and back-translated versions were compared by a native English speaker with a doctoral degree in English, and minor modifications were made where necessary.
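
The 9-point response format and the presence of reverse-scored items (flagged in Table 2) imply a simple scoring procedure. The sketch below is illustrative only: the item-to-dimension mapping and the reverse-scored item index are hypothetical placeholders, not the published item list.

```python
# Illustrative RTS-HSP scoring sketch. Item numbers and the reverse-scored
# set are hypothetical; the actual assignments appear in Tables 1 and 2.
# Responses are on a 9-point scale (1 = never ... 9 = always); a
# reverse-scored item is recoded as 10 - response.

REVERSED = {5}  # hypothetical reverse-scored item index

def score_item(response, item, reversed_items=REVERSED):
    """Recode a single 1-9 response, reversing where required."""
    if not 1 <= response <= 9:
        raise ValueError("responses must lie in 1..9")
    return 10 - response if item in reversed_items else response

def dimension_score(responses, items, reversed_items=REVERSED):
    """Mean item score for one dimension; a higher score indicates more
    importance placed on that component of reflective thinking."""
    scored = [score_item(responses[i], i, reversed_items) for i in items]
    return sum(scored) / len(scored)
```

For example, with responses `{1: 9, 2: 7, 5: 2}` on a hypothetical three-item dimension `[1, 2, 5]`, item 5 is recoded to 8 and the dimension score is the mean 8.0.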

The institutional review board of Chung Shan Medical University Hospital reviewed and approved the study procedures. In addition, we followed the ethical data-collection guidelines set out in the Declaration of Helsinki (World Medical Association, 2000). All data were anonymized, and participants were identified by number only.

The derived Chinese scale was administered to 607 healthcare students and professionals in Taiwan who agreed to respond to the survey. Surveys returned with missing data were regarded as invalid, resulting in a 95.38% response rate. Among the 579 valid respondents, 39% were men (n = 226) and 61% were women (n = 353), and their mean age was 29.8 years (SD = 4.3, range 18–55 years).

Data Analysis

We used SPSS version 14.0 to perform an exploratory factor analysis, test the internal consistency reliability of the RTS-HSP, and assess its concurrent validity by calculating Pearson product–moment correlations. To assess the RTS-HSP’s external validity, we compared our scale with Kember et al.’s (2000) QRT.

The Kaiser–Meyer–Olkin (KMO) test and Bartlett’s test of sphericity were used to examine the adequacy of the sample for exploratory factor analysis. Cronbach’s alpha was used to test the internal consistency reliability within each dimension of the scale, and Pearson correlation coefficients were calculated between each pair of dimensions. To confirm the stability of the RTS-HSP, we readministered the scale to 287 of the participants 4 weeks after baseline, which allowed us to measure test–retest reliability.
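
The correlational analyses were run in SPSS; as an illustration of what the Pearson product–moment correlation computes for the test–retest and concurrent-validity checks, here is a minimal from-scratch sketch (any statistics package provides an equivalent):

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length samples:
    covariance of x and y divided by the product of their standard deviations."""
    n = len(x)
    if n != len(y) or n < 2:
        raise ValueError("need two equal-length samples with n >= 2")
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / sqrt(sxx * syy)
```

A perfectly linear relationship yields r = 1.0 (or −1.0 when decreasing), while the test–retest coefficients reported below (.73 to .84) indicate strong but imperfect stability.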

Results

In terms of the RTS-HSP’s structural validity, the KMO value was .89, well above the .60 threshold for sampling adequacy (Kaiser, 1970, 1974). Bartlett’s test of sphericity was also significant, χ²(231) = 4864.30, p < .05. These results demonstrate the adequacy of our sample for exploratory factor analysis. A scree plot also showed that a four-factor structure was optimal for the RTS-HSP (see Figure 1).
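
One detail of the reported test can be checked directly: for p items, Bartlett’s test of sphericity compares the p(p − 1)/2 off-diagonal correlations against zero, so its degrees of freedom are p(p − 1)/2. The 22 retained items therefore give the 231 degrees of freedom reported above.

```python
def bartlett_df(n_items):
    """Degrees of freedom for Bartlett's test of sphericity:
    the number of distinct off-diagonal correlations, p(p - 1) / 2."""
    return n_items * (n_items - 1) // 2

# 22 retained items -> df = 231, matching the reported chi-square test.
```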


Figure 1. Scree plot for factor analysis.

Exploratory Factor Analysis

Exploratory factor analysis was conducted to test the construct validity of the RTS-HSP, using principal component analysis with Promax oblique rotation and retaining factors with eigenvalues > 1.0. Items were retained when they loaded above .50 on their primary factor and below .50 on all other factors. The analysis yielded 22 items across four dimensions (see Table 1): reflective skepticism (six items), self-examination (six items), empathetic reflection (five items), and critical open-mindedness (five items). The eigenvalues of the four factors from the principal component analysis were all greater than 1 (see Table 1). These results support the four-factor structure of the RTS-HSP.
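
The item-retention rule (primary loading above .50, all cross-loadings below .50) can be expressed as a small filter over a loading matrix. This is a sketch under assumed inputs: loadings are supplied as a mapping from item label to its row of factor loadings, and the numbers in the example are invented, not taken from Table 1.

```python
def retain_items(loadings, primary_cut=0.50, cross_cut=0.50):
    """Keep an item when its largest absolute loading exceeds primary_cut
    and every other absolute loading stays below cross_cut.
    `loadings` maps item label -> list of factor loadings.
    Returns {kept item: index of its primary factor}."""
    kept = {}
    for item, row in loadings.items():
        abs_row = [abs(l) for l in row]
        primary = max(range(len(row)), key=abs_row.__getitem__)
        if abs_row[primary] > primary_cut and all(
            l < cross_cut for i, l in enumerate(abs_row) if i != primary
        ):
            kept[item] = primary
    return kept
```

With invented rows such as `{"q1": [0.72, 0.10, 0.05, 0.12], "q2": [0.45, 0.48, 0.30, 0.20]}`, only `q1` survives (clean loading on the first factor); `q2` is dropped because no loading exceeds .50.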

Table 1. Rotated Factor Loadings for the Reflective Thinking Scale for Healthcare Students and Providers


Note. Overall α = .87, total variance explained = 56.66%.

Descriptive Item Means, Standard Deviations, and Item–Total Correlations for RTS-HSP Dimensions

Table 2 shows the mean item scores, standard deviations, and item–total correlations for the four dimensions of the RTS-HSP.

Table 2. Mean Scores, Standard Deviations, and Item–Total Correlations for the Reflective Thinking Scale for Healthcare Students and Providers


Note. (-) = reverse-scored item.

Correlation Analysis of the RTS-HSP Subscales

We calculated the Pearson correlation coefficient between each pair of dimensions in the RTS-HSP (see Table 3). Researchers have shown that scale dimensions measuring substantially different aspects of a common trait should still be moderately correlated (i.e., between .40 and .60; Dancey & Reidy, 2004). The correlation coefficients for the RTS-HSP ranged from .40 to .54 (p < .01), indicating significant but moderate correlations between each pair of dimensions. The strongest correlation was between the reflective skepticism and self-examination dimensions, and the weakest was between self-examination and critical open-mindedness.

Table 3. Correlation Analysis Results for the Reflective Thinking Scale for Healthcare Students and Providers Subscales


Note. ** p < .01.

Concurrent Validity

We used Kember et al.’s (2000) QRT and calculated Pearson product-moment correlations to test the RTS-HSP’s concurrent validity with a sample of 213 participants. The RTS-HSP was found to correlate significantly with the QRT (p < .05), providing evidence of the RTS-HSP’s external validity (see Table 4).

Table 4. Pearson Product-Moment Correlation Coefficients and Descriptive Statistics for the Reflective Thinking Scale for Healthcare Students and Providers and the Questionnaire for Reflective Thinking


Note. n = 213. * p < .05.

Reliability

Cronbach’s alpha was used to test the internal consistency reliability within each dimension of the RTS-HSP. Generally speaking, .70 is the minimum acceptable level of reliability, and a level of .80 or greater is preferable (Tavakol & Dennick, 2011). Cronbach’s alpha values for our four subscales were .84, .84, .80, and .77, and that for the entire questionnaire was .87, indicating that the RTS-HSP has satisfactory reliability for assessing participants’ reflective thinking. Further, the 4-week test–retest correlation coefficients obtained from a sample of 287 respondents for the overall scale and subscales were .84, .82, .78, .74, and .73, indicating that the scale has high reliability and stability.
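
Cronbach’s alpha is straightforward to compute from item-level data: α = k/(k − 1) × (1 − Σ item variances / variance of the total score), for k items. The study used SPSS; the sketch below is a minimal from-scratch version, with data supplied as one score column per item.

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns (one list per item,
    aligned across respondents):
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance; any consistent variant works here
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))
```

Perfectly parallel items yield α = 1.0; values such as the .77 to .87 reported for the RTS-HSP subscales indicate that the items within each dimension covary strongly but not redundantly.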

Discussion

We have developed a scale to measure reflective thinking in the healthcare context to help educators use reflection practice as a teaching and learning tool to improve students’ and practitioners’ intellectual capacity in practical healthcare and clinical situations. We found support for the RTS-HSP’s reliability and validity across the four dimensions of reflective skepticism, self-examination, empathetic reflection, and critical open-mindedness. The participants scored highest on empathetic reflection, followed by self-examination, then reflective skepticism, and finally critical open-mindedness. This indicates that healthcare students and providers are inclined to listen to and justify others’ feelings. However, they still have difficulty assessing events from different perspectives and thinking of alternative solutions.

Compared to Kitchener and King’s (1981) RJI, which is an interview-type instrument covering seven stages and three levels that takes an hour to administer, our newly developed RTS-HSP, a 22-item, four-factor scale, is simple to administer and quick to complete, making it more cost effective than the RJI. It is also difficult to score the RJI because well-trained raters must give stage scores responding to each interview question and further summarize the scores into three-digit codes. The RTS-HSP does require some time to design; however, data can be collected and analyzed in a standardized way that is more objective than interviews (Meadows, 2003).

Further, Kember et al. (2000) reported that the QRT’s Cronbach’s alpha values ranged from .62 to .76 when tested with a sample of 303 students in a health sciences program, and Leung and Kember (2003) and Phan (2006) reported reliability scores for this instrument ranging from .58 to .91. Compared to the 16-item QRT (Kember et al., 2000), the RTS-HSP has higher internal consistency reliability based on a larger sample. Moreover, instead of using a 5-point Likert scale as in the QRT and GRAS, the RTS-HSP is rated on a 9-point Likert scale, thereby measuring participants’ reflective thinking to a more sensitive degree.

In terms of limitations in this study, our participants were healthcare students and professionals, and the scale was developed based on the Taiwanese cultural context of healthcare. Those interested in using the scale for assessment outside Taiwan should consider potential cultural differences. To verify the feasibility of using the present scale in many other situations, future researchers could use different sample populations. Further, future researchers could use confirmatory factor analysis to verify the construct validity of the RTS-HSP, or use the RTS-HSP as a pretest–posttest measure to assess the feasibility of the scale for helping healthcare educators gain valuable learning outcomes and identify students’ weaknesses when implementing reflective thinking practices.

References

Aukes, L. C., Geertsma, J., Cohen-Schotanus, J., Zwierstra, R. P., & Slaets, J. P. J. (2007). The development of a scale to measure personal reflection in medical practice and education. Medical Teacher, 29, 177−182. https://doi.org/ctbstp

Bethune, C., & Brown, J. B. (2007). Residents’ use of case-based reflection exercises. Canadian Family Physician, 53, 470–476.

Cottrell, S. (2005). Critical thinking skills: Developing effective analysis and argument. Basingstoke, UK: Palgrave Macmillan.

Crowe, M. T., & O’Malley, J. (2006). Teaching critical reflection skills for advanced mental health nursing practice: A deconstructive–reconstructive approach. Journal of Advanced Nursing, 56, 79–87. https://doi.org/bz9z76

Dancey, C. P., & Reidy, J. (2004). Statistics without maths for psychology: Using SPSS for Windows (3rd ed.). Harlow, UK: Prentice Hall.

Danielson, C. (1996). Enhancing professional practice: A framework for teaching. Alexandria, VA: Association for Supervision and Curriculum Development.

Dewey, J. (1933). How we think: A restatement of the relation of reflective thinking to the educative process. Lexington, KY: D.C. Heath.

Forneris, S. G., & Peden-McAlpine, C. (2007). Evaluation of a reflective learning intervention to improve critical thinking in novice nurses. Journal of Advanced Nursing, 57, 410–421. https://doi.org/fq6fpt

Gallagher, L., Lawler, D., Brady, V., Oboyle, C., Deasy, A., & Muldoon, K. (2017). An evaluation of the appropriateness and effectiveness of structured reflection for midwifery students in Ireland. Nurse Education in Practice, 22, 7–14. https://doi.org/f9wb4b

Kaiser, H. F. (1970). A second generation little jiffy. Psychometrika, 35, 401–415. https://doi.org/fvm79q

Kaiser, H. F. (1974). An index of factorial simplicity. Psychometrika, 39, 31–36. https://doi.org/cdm

Kember, D., Leung, D. Y. P., Jones, A., Loke, A. Y., McKay, J., Sinclair, K., … Yeung, E. (2000). Development of a questionnaire to measure the level of reflective thinking. Assessment & Evaluation in Higher Education, 25, 381–395. https://doi.org/fd2ngg

Kitchener, K. S., & King, P. M. (1981). Reflective judgment: Concepts of justification and their relationship to age and education. Journal of Applied Developmental Psychology, 2, 89–116. https://doi.org/b2b86t

Knapp, C. E. (1993). Lasting lessons: A teacher’s guide to reflecting on experience. Charleston, WV: ERIC Press.

Leung, D. Y. P., & Kember, D. (2003). The relationship between approaches to learning and reflection upon practice. Educational Psychology, 23, 61–71. https://doi.org/dn9p5h

Mann, K., Gordon, J., & MacLeod, A. (2009). Reflection and reflective practice in health professions education: A systematic review. Advances in Health Sciences Education, 14, 595–621. https://doi.org/cb86g5

Meadows, K. A. (2003). So you want to do research? 5: Questionnaire design. British Journal of Community Nursing, 8, 562–570. https://doi.org/cskm

Mezirow, J. (1991). Transformative dimensions of adult learning. San Francisco, CA: Jossey-Bass.

Naber, J. L., Hall, J., & Schadler, C. M. (2014). Narrative thematic analysis of baccalaureate nursing students’ reflections: Critical thinking in the clinical education context. Journal of Nursing Education, 53, S90–S96. https://doi.org/f6qhjs

Navedo, D. D. (2006). A descriptive study of nursing judgment in senior nursing students and the relationship with reflective judgment (Unpublished doctoral dissertation). Boston College, Boston, MA.

Phan, H. P. (2006). Examination of student learning approaches, reflective thinking, and epistemological beliefs: A latent variables approach. Journal of Research in Educational Psychology, 4, 577–610.

Pittman, D. M. (2006). Applying the reflective judgment model to nursing students. Lawrence, KS: The University of Kansas.

Saltzberg, C. W. (2002). Nursing students’ uncertainty experiences and epistemological perspectives (Unpublished doctoral dissertation). Cornell University, New York, NY.

Sanford, P. (2010). Simulation in nursing education: A review of the research. The Qualitative Report, 15, 1006–1011.

Steur, J. M. (2016). It makes you think, or does it? Exploring graduateness in university education. Groningen, The Netherlands: Rijksuniversiteit Groningen.

Tavakol, M., & Dennick, R. (2011). Making sense of Cronbach’s alpha. International Journal of Medical Education, 2, 53–55. https://doi.org/c29fhh

Welfel, E. R. (1982). How students make judgments: Do educational level and academic major make a difference? Journal of College Student Personnel, 23, 430–497.

World Medical Association. (2000). WMA Declaration of Helsinki: Ethical principles for medical research involving human subjects. Retrieved from https://bit.ly/2rJdF3M




Ya-huei Wang, Department of Applied Foreign Languages, Chung-Shan Medical University, 110, Sec. 1, Jian-Koa North Road, Taichung 402, Taiwan, ROC. Email: [email protected]


© 2019 Scientific Journal Publishers Limited. All Rights Reserved.