Assessment of model fit for 2016 and 2017 Biology multiple choice test items of the National Business and Technical Examination Board

Authors

  • HANNAH JUDITH OSARUMWENSE University of Benin, Nigeria
  • CHISOM PERPETUAL DURU University of Benin, Nigeria

DOI:

https://doi.org/10.31686/ijier.vol7.iss4.1319

Keywords:

Item Response Theory, Biology multiple choice test items, model fit, unidimensionality

Abstract

This study assessed model fit for the 2016 and 2017 Biology multiple choice test items of the National Business and Technical Examination Board (NABTEB). It aimed at empirically investigating the fit of the 1-, 2-, and 3-Parameter Logistic Models (PLM) to the examinations using Item Response Theory. Three research questions were raised, and two hypotheses were formulated and tested. The ex post facto research design was adopted for the study. The population comprised 5,115 and 4,600 candidates in public and private schools in the South-South geo-political zone of Nigeria for 2016 and 2017 respectively, from which a total of 2,000 students were sampled using the simple random sampling technique. The instruments for data collection were the NABTEB 2016 and 2017 Biology multiple choice question papers; they were taken to be valid and reliable because they were developed by a standard examination body. The responses to the instruments were used for data analysis. The results revealed that the 1, 2 and 3 PLM all fit the 2016 and 2017 NABTEB May/June Biology multiple choice test items; however, the 1PLM provided a better fit to the data than the other models. Based on the findings of the study, it was recommended, among others, that examining bodies should ensure that a model fits the data well before it is used to make inferences about the data.
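For readers unfamiliar with the three models compared in the abstract, the sketch below shows the standard logistic item response functions; the 2PLM and 1PLM are nested special cases of the 3PLM. The function name `irf` and the example parameter values are illustrative, not taken from the study.

```python
import math

def irf(theta, a=1.0, b=0.0, c=0.0):
    """Three-parameter logistic (3PL) item response function.

    theta: candidate ability
    a: item discrimination, b: item difficulty, c: pseudo-guessing.
    Setting c=0 gives the 2PL; a=1 and c=0 gives the 1PL (Rasch form).
    """
    return c + (1.0 - c) / (1.0 + math.exp(-a * (theta - b)))

# Under the 1PL, a candidate whose ability equals the item difficulty
# (theta == b) has a 0.5 probability of a correct response.
p_1pl = irf(0.0)                 # 1PL: a=1, c=0
p_3pl = irf(0.0, c=0.25)         # 3PL: guessing lifts the lower asymptote
```

Model-fit assessment of the kind the study reports asks which of these nested forms reproduces the observed item responses adequately without unnecessary parameters.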


Author Biographies

  • HANNAH JUDITH OSARUMWENSE, University of Benin, Nigeria

    Department of Educational Evaluation and Counselling Psychology
    Faculty of Education

  • CHISOM PERPETUAL DURU, University of Benin, Nigeria

    Master’s Student
    Department of Educational Evaluation and Counselling Psychology
    Faculty of Education

Published

2019-04-01

How to Cite

OSARUMWENSE, H. J., & DURU, C. P. (2019). Assessment of model fit for 2016 and 2017 biology multiple choice test items of the national business and technical examination board. International Journal for Innovation Education and Research, 7(4), 12-22. https://doi.org/10.31686/ijier.vol7.iss4.1319