Dimensionality of Chemistry Teachers' Effectiveness Scale (CTES) in secondary schools in Osun State, Nigeria
Keywords: Chemistry Teachers' Effectiveness Scale, Dimensionality, Item Response Theory, Multidimensionality, Unidimensionality
The study assessed the dimensionality of the Chemistry Teachers' Effectiveness Scale (CTES) in secondary schools in Osun State, Nigeria. It determined the extent to which the CTES satisfies the unidimensionality assumption of the Item Response Theory (IRT) model, and the extent to which the observed dimensionality of the CTES was confirmed when the scalability of the individual items and of the overall scale was assessed. The study employed a survey research design. Thirty-five (35) Chemistry teachers, rated by their Heads of Department and by their Chemistry students, made up the sample. A multistage sampling procedure was used to select the sample in two phases: validation (conducted in Oyo State) and pilot testing (conducted in Osun State). A self-developed instrument titled "Chemistry Teachers' Effectiveness Scale (CTES)" was used for data collection. Items of the CTES were rated on a four-point Likert-type scale anchored at 1 = very poor, 2 = poor, 3 = moderate, and 4 = good. The initial version of the CTES contained 206 items, which were reduced to 96 items after validation. The 96-item second version was reduced to 62 items after pilot testing and reliability analysis (yielding a Cronbach's alpha of 0.92). A more robust statistical analysis was then conducted on the 62-item third version of the CTES: Mokken Scaling Analysis (MSA) was used to analyse the data via the mokken package. Results showed that all items of the CTES had scalability coefficients in the 0.20–0.39 range and violated the IRT unidimensionality assumption. The study concluded that the items of the CTES are multidimensional.
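The scalability coefficients reported above are Loevinger's H values from Mokken Scaling Analysis; under Mokken's conventional benchmarks, items with H below 0.30 are considered unscalable and values between 0.30 and 0.40 indicate only a weak scale, which is why the 0.20–0.39 range reported here argues against a single dimension. As an illustration of what H measures, the following is a minimal Python sketch. It is not the study's procedure (the study used the mokken package in R, which also handles polytomous Likert items); this sketch assumes dichotomous 0/1 items for simplicity.

```python
import numpy as np

def scalability_H(X):
    """Total-scale Loevinger H for a 0/1 item matrix (rows = respondents).

    H = 1 - (observed Guttman errors) / (Guttman errors expected under
    marginal independence), accumulated over all item pairs. H = 1 for a
    perfect Guttman scale; H near 0 for unrelated items.
    """
    n, k = X.shape
    p = X.mean(axis=0)                      # item popularities
    obs_err = exp_err = 0.0
    for i in range(k):
        for j in range(i + 1, k):
            # order the pair: the "hard" item is the less popular one
            hard, easy = (i, j) if p[i] <= p[j] else (j, i)
            # Guttman error: passing the hard item while failing the easy one
            obs_err += np.sum((X[:, hard] == 1) & (X[:, easy] == 0))
            exp_err += n * p[hard] * (1 - p[easy])
    return 1 - obs_err / exp_err
```

For example, a response matrix forming a perfect Guttman pattern (everyone who passes a harder item also passes all easier ones) contains no Guttman errors, so the function returns H = 1.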
Copyright (c) 2023 Deborah Adamu
This work is licensed under a Creative Commons Attribution 4.0 International License.
Authors retain copyright and grant the journal the right of first publication, with the work simultaneously licensed under a Creative Commons Attribution 4.0 International License that allows others to share the work with an acknowledgment of the work's authorship and initial publication in this journal.