Abstract

The Trends in International Mathematics and Science Study (TIMSS) aims to provide a broad perspective for evaluating and improving education. The assessment also ranks participating countries by performance and draws inferences about factors affecting achievement and learning. However, the study may not function as expected because of differences in curricular, cultural, or language settings among countries, which challenges assumptions about measurement equivalency. The present study assesses the equivalency of mathematics items in the TIMSS 2007 study across Australia and Indonesia. Students' responses were subjected to Rasch analysis to identify items showing differential item functioning (DIF). The results revealed that many items on the mathematics test are problematic because they showed significant bias. The study also found that Australian students performed better and found the mathematics items easier than their Indonesian counterparts did. Several factors, such as curricular differences, methods used to solve mathematics problems, availability of textbooks, and teacher quality, might explain the existence of DIF between the countries. These findings indicate serious limitations in using TIMSS results to compare the performance of students across countries. Thus, further empirical evidence is needed before TIMSS 2007 results can be meaningfully used in research.
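For readers unfamiliar with the method named above, the Rasch model expresses the probability that a student with ability θ answers an item of difficulty b correctly, and DIF is typically examined as a between-group difference in the estimated difficulty of the same item. The group superscripts below are illustrative notation for this abstract, not symbols taken from the study itself:

\[
P(X_{pi} = 1 \mid \theta_p, b_i) = \frac{\exp(\theta_p - b_i)}{1 + \exp(\theta_p - b_i)},
\qquad
\mathrm{DIF}_i = \hat{b}_i^{\,\text{AUS}} - \hat{b}_i^{\,\text{IDN}}
\]

Under this framing, an item is flagged as exhibiting DIF when the difficulty contrast is statistically and substantively larger than zero, i.e., when students of equal ability in the two countries do not have the same probability of answering the item correctly.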
