Analyzing the Measurement Equivalence of a Translated Test in a Statewide Assessment Program
DOI:
https://doi.org/10.15359/ree.20-3.9
Keywords:
Measurement equivalence, structural equation modeling, confirmatory factor analysis
Abstract
When tests are translated into one or more languages, the question arises of whether items are equivalent across language forms. This equivalence can be assessed at the scale level by means of a multiple-group confirmatory factor analysis (CFA) in the context of structural equation modeling. This study examined the measurement equivalence of a Spanish-translated version of a statewide Mathematics test originally constructed in English, using a multi-group CFA approach. The study used samples of native speakers of the target language of the translation taking the test in both the source and target languages: Hispanic examinees taking the test in English and in Spanish. Test items were grouped into twelve facet-representative parcels. The parceling was accomplished by grouping items with similar content and computing an average score for each parcel. Four models were fitted to examine the equivalence of the test across groups. The multi-group CFA fixed factor loadings across groups, and the results supported the equivalence of the two language versions (English and Spanish) of the test. The statistical techniques implemented in this study can also be used to examine test performance across groups defined by dichotomous or dichotomized variables such as gender, socioeconomic status, geographic location, and other variables of interest.
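The parceling step described above can be sketched in code. This is a minimal illustration, not the authors' implementation: the parcel names, item groupings, and data are hypothetical, and it simply averages the item scores assigned to each content-based parcel to produce the indicator matrix a CFA would take as input.

```python
import numpy as np

def build_parcels(item_scores, parcel_map):
    """Average item scores into facet-representative parcels.

    item_scores: (n_examinees, n_items) array of scored responses.
    parcel_map:  dict mapping a parcel name to the column indices of
                 items judged to share similar content (hypothetical).
    """
    # Each parcel score is the mean of its member items per examinee.
    return np.column_stack(
        [item_scores[:, cols].mean(axis=1) for cols in parcel_map.values()]
    )

# Toy illustration: 6 examinees, 6 dichotomously scored items,
# grouped into 2 parcels (the real study used twelve).
rng = np.random.default_rng(0)
scores = rng.integers(0, 2, size=(6, 6)).astype(float)
parcel_map = {"algebra": [0, 1, 2], "geometry": [3, 4, 5]}
X = build_parcels(scores, parcel_map)
print(X.shape)  # (6, 2)
```

The resulting matrix of parcel scores, one column per parcel, would then serve as the observed indicators in the multi-group CFA, with factor loadings constrained equal across the English and Spanish groups.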
License
1. If the submitted paper is accepted for publication, the author(s) FREELY, AT NO COST, EXCLUSIVELY AND FOR AN INDEFINITE TERM transfer the copyright and patrimonial rights to Universidad Nacional (UNA, Costa Rica). For more details, check the Originality Statement and Copyright Transfer Agreement.
2. REUSE RIGHTS: UNA authorizes the authors to use, for any purpose (including self-archiving or auto-archiving), and to publish on the Internet on any electronic site, the paper's final approved and published version (post-print), as long as it is done for noncommercial purposes, no derivative works are generated without prior consent, and both the publisher's name and the authorship are acknowledged.
3. The submission and possible publication of the paper in the Educare Electronic Journal are governed by the Journal's editorial policies, the institutional rules of Universidad Nacional, and the laws of the Republic of Costa Rica. Additionally, any difference of opinion or future dispute shall be settled in accordance with the mechanisms of Alternative Dispute Resolution and the Costa Rican Jurisdiction.
4. In all cases, it is understood that the opinions issued are those of the authors and do not necessarily reflect the position and opinion of Educare, CIDE, or Universidad Nacional, Costa Rica. It is also understood that, in the exercise of academic freedom, the authors have carried out a rigorous scientific-academic process of research, reflection, and argumentation that lies within the thematic scope of interest of the Journal.
5. The papers published by Educare Electronic Journal use a Creative Commons License: