Using bilingual respondents to evaluate translated-adapted items


Sireci S., Berberoglu G.

APPLIED MEASUREMENT IN EDUCATION, vol.13, no.3, pp.229-248, 2000 (SSCI)

  • Publication Type: Article
  • Volume: 13 Issue: 3
  • Publication Date: 2000
  • Doi Number: 10.1207/s15324818ame1303_1
  • Journal Name: APPLIED MEASUREMENT IN EDUCATION
  • Journal Indexes: Social Sciences Citation Index (SSCI), Scopus
  • Page Numbers: pp.229-248
  • Middle East Technical University Affiliated: No

Abstract

Translating and adapting tests and questionnaires across languages is a common strategy for comparing people who operate in different languages with respect to their achievement, attitudes, personality, or other psychological constructs. Unfortunately, when tests and questionnaires are translated from one language to another, there is no guarantee that the different language versions are equivalent. In this study, we present and evaluate a methodology for investigating the equivalence of translated-adapted items using bilingual test takers. The methodology involves applying item response theory models to data obtained from randomly equivalent groups of bilingual respondents. The technique was applied to an English-Turkish version of a course evaluation form. The results indicate that the methodology is effective for flagging items that function differentially across languages, as well as for informing the test development and test adaptation processes. The utility and limitations of the procedure for evaluating translation equivalence are discussed.
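
The study itself fits item response theory models to randomly equivalent groups of bilingual respondents, each group answering a different language form. As a minimal self-contained sketch of the general idea of flagging items that function differentially across language versions, the code below uses a Mantel-Haenszel DIF screen (a simpler, non-IRT technique) on simulated dichotomous responses; the data, group labels, strata count, and flagging threshold are illustrative assumptions, not values from the article.

import numpy as np

def mantel_haenszel_dif(item, total, group, n_strata=5):
    """Mantel-Haenszel DIF screen for one dichotomous item.

    item  : 0/1 responses to the studied item
    total : matching criterion (e.g., score on the remaining items)
    group : 0 = reference form (e.g., English), 1 = focal form (e.g., Turkish)

    Returns the MH common odds ratio and the ETS delta-MH statistic;
    values of |delta-MH| >= 1.5 are commonly flagged as large DIF.
    """
    item, total, group = map(np.asarray, (item, total, group))
    # Stratify respondents on the matching score.
    edges = np.quantile(total, np.linspace(0, 1, n_strata + 1))
    strata = np.clip(np.searchsorted(edges, total, side="right") - 1, 0, n_strata - 1)

    num, den = 0.0, 0.0
    for k in range(n_strata):
        m = strata == k
        if m.sum() == 0:
            continue
        a = np.sum((group[m] == 0) & (item[m] == 1))  # reference, endorsed
        b = np.sum((group[m] == 0) & (item[m] == 0))  # reference, not endorsed
        c = np.sum((group[m] == 1) & (item[m] == 1))  # focal, endorsed
        d = np.sum((group[m] == 1) & (item[m] == 0))  # focal, not endorsed
        n = a + b + c + d
        num += a * d / n
        den += b * c / n
    alpha_mh = num / den
    delta_mh = -2.35 * np.log(alpha_mh)  # ETS delta scale
    return alpha_mh, delta_mh

if __name__ == "__main__":
    # Hypothetical data: respondents randomly assigned to one language form,
    # with one item made harder on the focal-language form.
    rng = np.random.default_rng(0)
    n = 400
    group = rng.integers(0, 2, n)
    ability = rng.normal(size=n)
    total = (rng.normal(ability[:, None], 1.0, (n, 10)) > 0).sum(axis=1)
    p = 1 / (1 + np.exp(-(ability - 0.8 * group)))
    item = (rng.random(n) < p).astype(int)
    print(mantel_haenszel_dif(item, total, group))

In the study's design, random assignment of bilinguals to language forms plays the same role as the matching step here: it removes group ability differences as an explanation, so a large delta-MH (or, in the article, a significant difference in IRT item parameters) points to the translation-adaptation of that item rather than to the respondents.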