Assessing a Chinese version of the Runco Ideational Behavior Scale
I assessed the reliability and validity of a Chinese version of the Runco Ideational Behavior Scale (RIBS). I recruited 107 Taiwanese children (46 boys and 61 girls) for this study. The results indicated that the Chinese version of the RIBS is valid and reliable to some extent. A 2-factor construct was confirmed by confirmatory factor analysis, which is congruent with the statistical observations in the original study by Runco and colleagues. Nevertheless, the major difference between the current and original studies is that, in order to attain measurement model validity, 6 items were dropped from the Chinese version of the RIBS. Overall, the results obtained in the current study indicate that this abridged Chinese-language version of the RIBS has promise for future use. Limitations and implications of the study are discussed.
In cognitive psychology, both theory and empirical research support the notion that divergent thinking is closely related to creative thinking (Cheung, Lau, Chan, & Wu, 2004; Runco, Dow, & Smith, 2006). The concept of divergent thinking, as distinct from convergent thinking (which is often viewed as a criterion of intelligence), was first proposed by Guilford (1950). Guilford speculated that divergent thinking can serve as an important indicator of creative thinking and, on that basis, developed creativity tests comparable to convergent-thinking-based intelligence tests. Several creativity researchers have questioned the divergent-thinking framework (e.g., Cropley, 2000), but as Runco (1993, p. 223) stated, “a careful reading of the literature suggests that the dismissal of divergent thinking is premature.” The importance of ideation and divergent thinking to the understanding of creativity should not be underestimated (Runco, 2007). Indeed, findings in a number of studies have confirmed that tests of divergent thinking are useful indicators of people’s potential for creative thinking (e.g., Plucker, 1999; Runco, Millar, Acar, & Cramond, 2010). Additionally, tests of divergent thinking are often considered useful tools for identifying gifted children (Lemons, 2011). In short, divergent thinking is a useful estimate of creative potential.
Most tests of divergent thinking focus on fluency, originality, and flexibility (Kim, 2011). Under the scoring systems of divergent-thinking tests, individuals are compared according to the quality and quantity of the ideas or solutions they generate. More specifically, fluency is defined as the number of ideas produced; originality as the uniqueness of an idea relative to those produced by others in the sample; and flexibility as the number of distinct categories into which the responses fall (Kim, 2006). Put another way, fluency represents productivity, originality indicates the uniqueness of the ideas, and flexibility indicates the generation of ideas from different perspectives (Runco et al., 2011). Among these three variables, fluency has arguably been given disproportionate attention. Runco and other scholars (Plucker, Runco, & Lim, 2006; Runco & Acar, 2012) suggest that ideational behavior, consisting of the use of ideas and the ability to generate them, is at the heart of divergent thinking.
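To make these three indices concrete, the toy sketch below scores a small set of hypothetical responses in Python. The task, ideas, and categories are illustrative assumptions, as is the rule that an idea counts as original when it appears only once in the sample; this is not a standardized scoring rubric.

```python
from collections import Counter

# Hypothetical "uses for a brick" responses: each child produces (idea, category) pairs.
sample = {
    "child_a": [("use a brick as a doorstop", "household"),
                ("grind a brick into red pigment", "art")],
    "child_b": [("use a brick as a doorstop", "household"),
                ("use a brick as a paperweight", "household")],
}

# How often each idea occurs across the whole sample (used for originality).
idea_counts = Counter(idea for responses in sample.values() for idea, _ in responses)

for child, responses in sample.items():
    fluency = len(responses)                                            # number of ideas
    originality = sum(idea_counts[idea] == 1 for idea, _ in responses)  # ideas unique in the sample
    flexibility = len({category for _, category in responses})          # distinct categories used
    print(f"{child}: fluency={fluency}, originality={originality}, flexibility={flexibility}")
```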
Runco and Okuda (1991) found that originality and flexibility scores could be enhanced by explicit instructions. In practice, an important implication of this is that explicit instructions from teachers could be employed in the classroom to maximize children’s ability to generate ideas (ideation). After all, as Runco and Okuda (1991) put it, “performance on divergent thinking tests requires both the ability to generate ideas and metacognitive and strategic skills” (p. 439); therefore, teachers cannot ask students to generate ideas without providing salient instructions. Runco (2009) hypothesizes that two important processes are at work in the generation of creative ideas: assimilation of information and original interpretation. Based on his hypothesis, teachers could help students to construct a unique interpretation that, in turn, might lead to the production of creative ideas by those students.
As noted, for Runco and his colleagues, ideation is the key to understanding creative thinking. As such, they have attempted to identify a more appropriate criterion to replace traditional divergent-thinking tests. This quest is grounded in the belief that such a criterion should be one that “emphasizes ideation: the use of, appreciation of, and skill of generating ideas” (Kaufman, Plucker, & Baer, 2008, p. 119). As a consequence, Runco, Plucker, and Lim (2001) created a pool of 100 items for initial pilot testing, which they later reduced to 23 items. All of the items describe actual behaviors related to ideation. In another empirical study, Plucker et al. (2006) found that scores for divergent-thinking tests were a significant predictor of scores from the Runco Ideational Behavior Scale (RIBS; Runco et al., 2001) and, on that basis, concluded that the RIBS is a useful measure for creativity research.
My objective in the current study was to examine the reliability and validity of a Chinese version of the RIBS. Through the use of factor analysis, it was my hope that any problematic items could be identified for further investigation and modification. This research was rooted in the notion that identifying students’ ideational behavior or creative thinking is an important task for teachers, because making this identification can function as a starting point for the implementation of alternative strategies to encourage students to generate ideas. As a result, in the current study my aim was to answer two research questions:
Research Question 1: Does the RIBS fit with a Chinese sample?
Research Question 2: Is the Chinese version of the RIBS a reliable and valid measure?
Method
Participants
Convenience sampling was used in this study and the sample comprised 107 children in Taiwan, 46 boys and 61 girls. Their average age was 11.19 years (SD = 1.07). All the participants were enrolled at the same elementary school in Taipei; the group comprised 29 third graders (27.1%), 8 fourth graders (7.5%), 24 fifth graders (22.4%), and 46 sixth graders (43%). The study was conducted during the second semester of the 2013-2014 academic year.
Instruments
The RIBS (Runco et al., 2001) was developed to measure individual ideational behavior. The instrument is conceptually close to a divergent-thinking test but differs significantly from such tests in its use of self-report to capture personal creative activities and attainment. Plucker et al. (2006) argue that the RIBS was developed primarily to serve as a criterion of creative potential and, it was hoped, as an alternative measure of divergent thinking. All 23 items in the RIBS describe actual, overt behavior related to ideation. Runco et al. (2001) found the RIBS to be a reliable instrument, but its construct validity was somewhat ambiguous: they identified two factors in the RIBS but, because of a lack of theoretical justification, suggested that a one-factor structure should be used to interpret RIBS results.
Procedure
In the current study, a Chinese language version of the RIBS was employed to measure the ideational behavior of the children who were the participants. I translated the 23 items in the English version of the scale into Chinese. This translation was then back-translated by a bilingual instructor who teaches English at the City University of Macau. In addition, two elementary school teachers checked the accuracy of the translation, to ensure that it would be correctly understood by Chinese-speaking children. They made some minor modifications to the Chinese translation before the instrument was distributed to the children.
The children were informed of the purpose of the study, and all of them participated in it voluntarily. After the RIBS was distributed, the children were first asked to complete the demographic items (age, gender, and grade). Including these demographic items, all of the children completed the RIBS within 10 minutes.
Results
Because this was the first use of the Chinese version of the RIBS, examining the internal consistency reliability of the RIBS scores was one of the main objectives of the study, and Cronbach’s coefficient alpha was calculated for this purpose. In assessing the validity of the 23 items, the correlations between the items were first checked, and possible dimensions of the RIBS were then investigated via exploratory factor analysis. To gain further understanding of how well the measured variables represented the instrument, confirmatory factor analysis was also performed.
Descriptive Analysis
All coefficients, as presented in Table 1, were significant at p < .01, with the magnitude of the relationships ranging from weak to moderate. Overall, the intercorrelations indicated a meaningful degree of shared variance among the items, which is desirable here because the purpose of factor analysis is to identify sets of interrelated variables.
Internal Consistency Reliability
To assess scale reliability, Cronbach’s alpha was computed; the overall value of .949 reflects good internal consistency among the 23 items. This is similar to the value reported by Runco et al. (2001), α = .92.
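For readers who wish to reproduce this reliability estimate from raw item responses, a minimal sketch in Python follows, assuming the responses are loaded as a data frame with one column per item; the file name is hypothetical.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: (k / (k - 1)) * (1 - sum of item variances / variance of total scores)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# responses: one row per child, one column per RIBS item (hypothetical file name)
# responses = pd.read_csv("ribs_chinese.csv")
# print(round(cronbach_alpha(responses), 3))  # the study reports .949
```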
Construct Validity
Exploratory factor analysis (EFA). Principal factor extraction with varimax rotation was performed through SPSS FACTOR on the 23 items for the sample of 107 Taiwanese children. As shown in Table 2, the four extracted factors explained a total of 66.5% of the variance, and the communalities of most of the variables were over .60. In addition, the Kaiser-Meyer-Olkin measure of sampling adequacy (.917) and Bartlett’s test of sphericity (p < .001) both indicated the appropriateness of factor analysis.
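The EFA just described can be sketched in Python, assuming the third-party factor_analyzer package; the extraction method is an approximation of the SPSS FACTOR settings, and the file name is hypothetical.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

responses = pd.read_csv("ribs_chinese.csv")  # hypothetical: 107 rows x 23 item columns

# Sampling adequacy and sphericity (the study reports KMO = .917, Bartlett p < .001).
chi2, p_value = calculate_bartlett_sphericity(responses)
_, kmo_total = calculate_kmo(responses)

# Four-factor extraction with varimax rotation; 'principal' is the closest
# available analogue here to SPSS principal factor extraction.
efa = FactorAnalyzer(n_factors=4, rotation="varimax", method="principal")
efa.fit(responses)

loadings = pd.DataFrame(efa.loadings_, index=responses.columns)
print(efa.get_factor_variance())  # per-factor and cumulative variance explained
```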
Table 1. Means, Standard Deviations, and Intercorrelations of the Runco Ideational Behavior Scale
Table 2. Factor Loadings for Varimax Orthogonal Four-Factor Solution for the 23 Items of the Runco Ideational Behavior Scale
Note. Boldface indicates highest factor loadings. h2 = communality.
When factor loadings were examined with a cutoff of .45 for practical significance, nine items cross-loaded on two factors, which may be problematic for construct validity and further interpretation. These nine items were: (1) I have many wild ideas, (2) I think about ideas more often than most people, (4) I come up with a lot of ideas or solutions to problems, (5) I come up with an idea or solution other people have never thought of, (8) I would rate myself highly in being able to come up with ideas, (9) I have always been an active thinker-I have lots of ideas, (10) I enjoy having leeway in the things I do and room to make up my own mind, (18) Some people might think me scatterbrained or absentminded because I think about a variety of things at once, and (19) I try to exercise my mind by thinking things through. The cross-loadings were too substantial to be ignored, and the problem persisted when other orthogonal rotation methods (quartimax and equamax) were employed. The items were therefore retained for further scrutiny in the confirmatory analysis.
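The cross-loading screen described here amounts to counting, for each item, how many factors its absolute loading meets the cutoff on. A small helper, assuming the loadings matrix from the previous sketch, might look like this:

```python
import pandas as pd

def flag_cross_loadings(loadings: pd.DataFrame, cutoff: float = 0.45) -> pd.Series:
    """Return items whose absolute loadings meet the cutoff on two or more factors."""
    hits = (loadings.abs() >= cutoff).sum(axis=1)
    return hits[hits >= 2]

# flag_cross_loadings(loadings)  # with the EFA loadings above; the study flags nine items
```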
Confirmatory factor analysis (CFA). As described above, the EFA yielded a four-factor solution, but several issues were also identified. In the study by Runco et al. (2001), a two-factor construct was supported statistically, but because this lacked a theoretical foundation, the authors suggested that a one-factor solution was more appropriate for interpreting the results. I therefore investigated one-, two-, and four-factor constructs by using CFA to compare them. The results in Table 3 show that the two-factor construct fit better than the one-factor construct, and that the four-factor construct fit no better than the two-factor construct. The two-factor solution was therefore chosen for further modification.
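A comparison of competing factor structures of this kind can be sketched with the third-party semopy package in Python. The item-to-factor assignments below are placeholders only (the paper’s final assignments appear in Table 4), and the file and column names are hypothetical.

```python
import pandas as pd
from semopy import Model, calc_stats

responses = pd.read_csv("ribs_chinese.csv")  # hypothetical; item columns i1..i23

# One-factor specification: all 23 items load on a single ideation factor.
one_factor = "ideation =~ " + " + ".join(f"i{k}" for k in range(1, 24))

# Two-factor specification with a placeholder split of the items.
two_factor = ("F1 =~ " + " + ".join(f"i{k}" for k in range(1, 13)) + "\n"
              "F2 =~ " + " + ".join(f"i{k}" for k in range(13, 24)))

for name, desc in (("one-factor", one_factor), ("two-factor", two_factor)):
    model = Model(desc)
    model.fit(responses)
    stats = calc_stats(model)  # data frame of fit indices
    print(name, stats[["DoF", "chi2", "CFI", "RMSEA"]])
```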
Table 3. Goodness-of-Fit Indices of Models
Note. AGFI = adjusted goodness-of-fit index; CFI = comparative fit index; RMSEA = root mean square error of approximation. *** p < .001.
After the measurement model was modified, a satisfactory result was obtained for overall model fit and construct validity, as shown in Figure 1 and Table 3. As Table 4 shows, six of the original 23 RIBS items were excluded in fitting the two-factor model. The results in Table 4 also show that all standardized loading estimates were over .50. Additionally, the average variance extracted was over .50 for each factor, which suggests adequate convergent validity, and construct reliability was over .60, which is within the acceptable range (Hair, Black, Babin, Anderson, & Tatham, 2006).
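These two checks reduce to standard formulas over the standardized loadings: average variance extracted (AVE) is the mean of the squared loadings, and construct (composite) reliability is the squared sum of the loadings divided by that quantity plus the summed error variances. A sketch follows, with hypothetical loadings for a single factor.

```python
import numpy as np

def ave(loadings) -> float:
    """Average variance extracted: mean of the squared standardized loadings."""
    lam = np.asarray(loadings, dtype=float)
    return float(np.mean(lam ** 2))

def construct_reliability(loadings) -> float:
    """Composite reliability, assuming uncorrelated measurement errors:
    (sum lam)^2 / ((sum lam)^2 + sum(1 - lam^2))."""
    lam = np.asarray(loadings, dtype=float)
    num = lam.sum() ** 2
    return float(num / (num + np.sum(1.0 - lam ** 2)))

lam = [0.62, 0.71, 0.68, 0.55, 0.74]  # hypothetical standardized loadings
print(round(ave(lam), 3), round(construct_reliability(lam), 3))
```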
Figure 1. Standardized coefficients for a modified two-factor model.
Table 4. Standardized Factor Loadings, Variance Extracted, and Reliability Estimates by Confirmatory Factor Analysis for the Modified Two-Factor Model
Discussion
The overall results for the Chinese version of the RIBS indicate that the instrument is valid to some extent. EFA was first conducted to examine the robustness of the factor solution and revealed a four-factor solution, but further analysis via CFA showed that a two-factor construct was a better fit for the purposes of construct validity and reliability. This finding is congruent with the statistical observations of Runco et al. (2001) who, in developing the overall measurement model of the RIBS, found that a two-factor construct better fit the statistical results. The major difference between the current study and the original study by Runco et al. is that six items were dropped from the Chinese version of the RIBS in order to attain measurement model validity. Additionally, items (1) I have many wild ideas, (6) I like to play around with ideas for the fun of it, (7) It is important to be able to think of bizarre and wild possibilities, (11) My ideas are often considered “impractical” or even “wild”, and (18) Some people might think me scatterbrained or absentminded because I think about a variety of things at once, were located in the second factor in the Chinese version, whereas in the original study by Runco et al. items 11, 14, 15, 16, 17, and 18 were located in the second factor. This salient difference raises concerns about the specification of the factors across the two versions. Although the factor loadings of the 17 retained items in the Chinese version were acceptable (> .50), the six items dropped from the original RIBS should be either excluded or revised before the scale is used by researchers and educators in Chinese-speaking contexts.
Although the results of the current study demonstrate that the abridged Chinese version of the RIBS is a valid instrument, several discrepancies were identified between the results of the present study and those of the previous research by Runco et al. (2001). There are several possible reasons for these discrepancies. First, the sample size in the current study (N = 107) was smaller than that used by Runco et al. (2001; N = 224). Hair et al. (2006) suggest that a minimum sample size of 100 is acceptable for securing adequate CFA estimates, but they also concluded that a sample size in the range of 150 to 400 better ensures stable EFA solutions. It would therefore be advisable to recruit a larger sample to gain a greater understanding of the issues identified in the current study. Second, cultural issues should be taken into account. The cohort in the current study differed from that of Runco et al. (2001), and it is possible that Taiwanese children’s perceptions of ideational behavior and divergent thinking differ from those of children in Western cultures. In fact, several scholars have argued that the environment of a Confucian-heritage culture (e.g., China, Hong Kong, Taiwan, and Singapore) has a detrimental influence on the development of creativity among East Asian students (Ho & Ho, 2008; Kim, Lee, Chae, Anderson, & Laurence, 2011; Ng, 2003; Ng & Smith, 2004). In light of this, more studies are needed to answer the question of whether the RIBS is suitable for children in East Asian cultures. Third, the results of the current study might have been affected by other variables, such as the linguistic skills of the children in the sample; the RIBS may be verbally biased, and this possibility cannot be excluded here. Finally, although I recruited students from third to sixth grade, the majority were sixth graders, which might have produced different results from a sample distributed more evenly across all four grades, or across a wider range of grades; this suggests a possible direction for future study.
Before turning to the broader implications of this study, its limitations should be mentioned. First, the children in the sample were relatively uniform in age and quite young (from 9 to 12 years old); future studies should address this limitation by including older children. Additionally, the study was conducted with a homogeneous group of students from a single institution who shared the same cultural background; future work will need to include multiple learning sites and participants with multicultural heritage. Lastly, external validity was not examined in the present study; future research should address this limitation as well.
Implications
From the point of view of teachers, the most important practical implication of the present study is that the RIBS could be used in East Asian cultures to understand individual students’ ideational behavior. The speed with which the RIBS can be completed makes it especially appealing in this regard. If teachers find low levels of ideational behavior among some or all of their students, this could be a signal that they should employ different strategies to help the children explore new possibilities and generate ideas. At the same time, teachers should cultivate a classroom environment in which raising questions is welcomed, and delving beyond the surface (deep thinking) is encouraged.
Active learning is facilitated in a continuous loop via two important channels: teachers posing questions, and students rethinking and reflecting on those questions. Unfortunately, in the typical East Asian classroom, rote learning and memorization remain the predominant mode of instruction, within a system that may overestimate the relative value of knowledge acquisition (Ho & Ho, 2008). I argue here that critical and creative thinking should form some part of students’ learning. The first step toward this is to support students in generating various ideas about the questions they are asked, and then to examine each of these ideas critically. Although this process can be time-consuming, its potential rewards, in the form of enhanced thinking skills and more efficient construction of knowledge, are too important to ignore.
References
Cheung, P. C., Lau, S., Chan, D. W., & Wu, W. Y. H. (2004). Creative potential of school children in Hong Kong: Norms of the Wallach-Kogan Creativity Tests and their implications. Creativity Research Journal, 16, 69–78. http://doi.org/bvrsmb
Cropley, A. J. (2000). Defining and measuring creativity: Are creativity tests worth using? Roeper Review, 23, 72–79. http://doi.org/bqccpt
Guilford, J. P. (1950). Creativity. American Psychologist, 5, 444–454.
Hair, J. F., Black, W. C., Babin, B. J., Anderson, R. E., & Tatham, R. L. (2006). Multivariate data analysis. Upper Saddle River, NJ: Pearson Education.
Ho, D. Y. F., & Ho, R. T. H. (2008). Knowledge is a dangerous thing: Authority relations, ideological conservatism, and creativity in Confucian-heritage cultures. Journal for the Theory of Social Behavior, 38, 67–86. http://doi.org/b57j9f
Kaufman, J. C., Plucker, J. A., & Baer, J. (2008). Essentials of creativity assessment. Hoboken, NJ: Wiley.
Kim, K. H. (2006). Can we trust creativity tests? A review of the Torrance Tests of Creative Thinking (TTCT). Creativity Research Journal, 18, 3–14. http://doi.org/dzfzx3
Kim, K. H. (2011). The creativity crisis: The decrease in creative thinking scores on the Torrance Tests of Creative Thinking. Creativity Research Journal, 23, 285–295. http://doi.org/dqc7z8
Kim, K. H., Lee, H. E., Chae, K.-B., Anderson, L., & Laurence, C. (2011). Creativity and Confucianism among American and Korean educators. Creativity Research Journal, 23, 357–371. http://doi.org/dw86s3
Lemons, G. (2011). Diverse perspectives of creativity testing: Controversial issues when used for inclusion into gifted programs. Journal for the Education of the Gifted, 34, 742–772. http://doi.org/bttkm4
Ng, A. K. (2003). A cultural model of creative and conforming behavior. Creativity Research Journal, 15, 223–233. http://doi.org/cgnx49
Ng, A. K., & Smith, I. (2004). The paradox of promoting creativity in the Asian classroom: An empirical investigation. Genetic, Social & General Psychology Monographs, 130, 307–332. http://doi.org/fmw8vj
Plucker, J. A. (1999). Is the proof in the pudding? Reanalyses of Torrance’s (1958 to present) longitudinal data. Creativity Research Journal, 12, 103–114. http://doi.org/bn7kgp
Plucker, J. A., Runco, M. A., & Lim, W. (2006). Predicting ideational behavior from divergent thinking and discretionary time on task. Creativity Research Journal, 18, 55–63. http://doi.org/bs5nhb
Runco, M. A. (1993). Creativity, causality, and the separation of personality and cognition. Psychological Inquiry, 4, 221–225. http://doi.org/dstrc3
Runco, M. A. (2007). Achievement sometimes requires creativity. High Ability Studies, 18, 75–77. http://doi.org/c7kx33
Runco, M. A. (2009). Simplifying theories of creativity and revisiting the criterion problem: A comment on Simonton’s (2009) hierarchical model of domain-specific disposition, development, and achievement. Perspectives on Psychological Science, 4, 462–465. http://doi.org/ff7678
Runco, M. A., & Acar, S. (2012). Divergent thinking as an indicator of creative potential. Creativity Research Journal, 24, 66–75. http://doi.org/3s2
Runco, M. A., Dow, G., & Smith, W. R. (2006). Information, experience, and divergent thinking: An empirical test. Creativity Research Journal, 18, 269–277. http://doi.org/b8n73h
Runco, M. A., Millar, G., Acar, S., & Cramond, B. (2010). Torrance Tests of Creative Thinking as predictors of personal and public achievement: A fifty-year follow-up. Creativity Research Journal, 22, 361–368. http://doi.org/fbggr3
Runco, M. A., Noble, E. P., Reiter-Palmon, R., Acar, S., Ritchie, T., & Yurkovich, J. M. (2011). The genetic basis of creativity and ideational fluency. Creativity Research Journal, 23, 376–380. http://doi.org/bhpb94
Runco, M. A., & Okuda, S. M. (1991). The instructional enhancement of the flexibility and originality scores of divergent thinking tests. Applied Cognitive Psychology, 5, 435–441. http://doi.org/ds4kx2
Runco, M. A., Plucker, J. A., & Lim, W. (2001). Development and psychometric integrity of a measure of ideational behavior. Creativity Research Journal, 13, 393–400. http://doi.org/fdrwb2
Kuan Chen Tsai, City University of Macau, 81-121 Av. Xian Xing Hai, Golden Dragon Centre, 19 Andares, Macau. Email: [email protected]