Abstract
The Trends in International Mathematics and Science Study (TIMSS) aims to provide a broad perspective for evaluating and improving education. The assessment also ranks participating countries by performance and draws inferences about factors affecting achievement and learning. However, the study may not function as expected because of curricular, cultural, and language differences among countries, which in turn challenges assumptions about measurement equivalence. The present study assesses the equivalence of the mathematics items in TIMSS 2007 across Australia and Indonesia. Students’ responses were subjected to Rasch analysis to identify items exhibiting differential item functioning (DIF). The results revealed that many of the mathematics items are problematic because they show significant bias. The study also found that Australian students performed better and found the mathematics items easier than their Indonesian counterparts did. Several factors, such as curricular differences, the methods used to solve mathematics problems, textbook availability, and teacher quality, might explain the DIF between the two countries. These findings point to serious limitations in using TIMSS results to compare student performance across countries. Thus, further empirical evidence is needed before the TIMSS 2007 results can be meaningfully used in research.
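For illustration, a minimal sketch of Rasch-based DIF screening is given below, assuming dichotomously scored (0/1) response matrices for each country. This is not the estimation procedure used in the study, which would typically rely on dedicated Rasch software; here item difficulties are approximated by the centred log-odds of item facility, and items whose between-group difficulty contrast exceeds 0.5 logits (a common rule of thumb) are flagged. All data below are simulated placeholders.

    import numpy as np

    def rasch_item_difficulty(responses):
        # Crude Rasch difficulty estimate: logit of item facility, centred.
        # responses: (n_persons, n_items) array of 0/1 scores.
        p = responses.mean(axis=0)           # item facility (proportion correct)
        p = np.clip(p, 1e-3, 1 - 1e-3)       # guard against 0% / 100% items
        b = np.log((1 - p) / p)              # harder items get larger logits
        return b - b.mean()                  # centre difficulties at zero

    def flag_dif_items(group_a, group_b, threshold=0.5):
        # Flag items whose difficulty contrast between the two groups
        # exceeds `threshold` logits.
        contrast = rasch_item_difficulty(group_a) - rasch_item_difficulty(group_b)
        return np.flatnonzero(np.abs(contrast) > threshold), contrast

    # Simulated 0/1 responses: 500 Australian and 400 Indonesian students, 25 items.
    rng = np.random.default_rng(0)
    australia = (rng.random((500, 25)) < 0.65).astype(int)
    indonesia = (rng.random((400, 25)) < 0.45).astype(int)
    indonesia[:, [3, 17]] = (rng.random((400, 2)) < 0.15).astype(int)  # inject DIF

    dif_items, contrast = flag_dif_items(australia, indonesia)
    print("Items flagged for DIF:", dif_items)

Because the difficulties are centred within each group, a uniform performance gap between the countries does not by itself produce DIF; only items that are disproportionately harder for one group are flagged.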
Recommended Citation
Fitriati, F. (2014). Differential Item Functioning: Item Level Analysis of TIMSS Mathematics Test Items Using Australian and Indonesian Database. Makara Human Behavior Studies in Asia, 18(2), 127-139. https://doi.org/10.7454/mssh.v18i2.3467