Redefining Assessment Standards: A Framework for Examination Guidelines in South African Basic Education

Authors

DOI:

https://doi.org/10.38140/obp2-2024-07

Keywords:

Assessment standards, quality assurance framework, Umalusi, National Senior Certificate, examination guidelines

Abstract

Umalusi quality assures assessments for exit qualifications such as the National Senior Certificate (NSC) through various processes, including the evaluation of examination guidelines. The NSC is examined by three assessment bodies, each of which must develop its own examination guidelines, and these guidelines must be comparable across the assessment bodies. Previous research by Umalusi identified differences in the components contained in the examination guidelines of the three assessment bodies. These differences arose from the absence of a common framework for developing examination guidelines and pose a threat to the maintenance of NSC assessment standards over time, which could undermine the credibility of the qualification. This study aimed to address this gap by developing a framework that specifies compulsory components for NSC examination guidelines. Data were collected through a qualitative methodology employing document analysis and a systematic literature review. Purposive sampling was used to select six countries and four subjects for evaluation; the sampled subjects were also used to pilot the framework. The findings identified five compulsory components: general information, subject-specific details, examinable content specifications and weighting, item specifications, and scoring and response specifications. A common framework is crucial for assessment bodies to produce comparable examination guidelines, thereby ensuring the maintenance of NSC assessment standards. The study recommends that Umalusi adopt the proposed framework and use it as the standard for developing NSC examination guidelines across assessment bodies. Furthermore, education researchers should consider conducting further research to extend this framework to other qualifications within and outside the Umalusi sub-framework.

Published

2024-12-20