Understanding the Country’s Underachievement in International Assessment: Differential Item and Bundle Functioning Approach

Journal: CADMO
Author: Josip Sabic
Publishing year: 2016; Issue: 2016/1
Language: English
Pages: 15 (pp. 5-19)
DOI: 10.3280/CAD2016-001002



In educational contexts, Differential Item Functioning (Dif) and Differential Bundle Functioning (Dbf) analyses are commonly used to assess item bias. The present study demonstrates how the results of Dif and Dbf analyses can help to explain a country's underachievement in international assessments. In the Timss 2011 mathematics assessment, Croatian fourth graders achieved a mean result below the international average, while Slovenian and Serbian students achieved mean results above it. The Dif and Dbf analyses indicate that a large portion of the items favouring Slovenian and Serbian students concern topics that are not taught in Croatian schools during the first four grades. The described methodology can be applied to investigate other countries' (under)achievement in international assessments.
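The study itself does not publish code, and its specific detection procedure is not reproduced here. As a minimal illustrative sketch of Dif screening in general, the following Python snippet computes the Mantel-Haenszel statistic, a standard Dif index for dichotomous items, on entirely synthetic (hypothetical) data in which one item is made easier for the reference group:

```python
import math
import random

def mantel_haenszel_dif(ref, foc, item, n_items):
    """Mantel-Haenszel Dif statistic for one studied item.

    ref, foc: lists of 0/1 response vectors for the reference and
    focal groups. Examinees are matched on total test score; the
    function returns the common odds ratio alpha_MH and the ETS
    delta-scale value (-2.35 * ln(alpha_MH)).
    """
    num = den = 0.0
    for k in range(n_items + 1):            # one stratum per total score
        r = [v for v in ref if sum(v) == k]
        f = [v for v in foc if sum(v) == k]
        n = len(r) + len(f)
        if not r or not f:
            continue
        a = sum(v[item] for v in r)         # reference group, correct
        b = len(r) - a                      # reference group, incorrect
        c = sum(v[item] for v in f)         # focal group, correct
        d = len(f) - c                      # focal group, incorrect
        num += a * d / n
        den += b * c / n
    alpha = num / den
    return alpha, -2.35 * math.log(alpha)

# Synthetic illustration (hypothetical data, not from the study):
# item 0 is easier for the reference group, all other items identical.
random.seed(0)
N_ITEMS = 10

def simulate(n, p_items):
    return [[int(random.random() < p) for p in p_items] for _ in range(n)]

ref = simulate(1000, [0.7] + [0.6] * (N_ITEMS - 1))
foc = simulate(1000, [0.4] + [0.6] * (N_ITEMS - 1))
alpha, delta = mantel_haenszel_dif(ref, foc, item=0, n_items=N_ITEMS)
print(f"alpha_MH = {alpha:.2f}, delta = {delta:.2f}")
```

An alpha_MH above 1 flags the studied item as favouring the reference group after matching on ability; on the ETS delta scale, values of |delta| of roughly 1.5 or more are conventionally treated as large Dif. Bundle-level (Dbf) analyses apply the same logic to sets of items grouped by content, as the study does for curriculum topics.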

Keywords: International differences, differential item functioning, differential bundle functioning, mathematics, Timss

