BMBF Framework Programme for the Promotion of Empirical Educational Research

Literature database

Authors Haberkorn, Kerstin; Pohl, Steffi; Hardt, Katinka; Wiegand, Elena
Title NEPS technical report for reading - scaling results of starting cohort 4 in ninth grade.
URL https://www.neps-data.de/Portals/0/Working Papers/WP_XVI.pdf  
Year of publication 2012
Pages 29 pp.
Publisher Bamberg: Otto-Friedrich-Universität
Series NEPS Working Paper. Vol. 16
Document type Monograph; discussion paper / working paper / conference contribution; online
Supplements Bibliography, tables, appendix
Language English
Research focus National Educational Panel Study (NEPS)
Keywords Educational research; Reading competence; Item response theory; Reading test; Pupils; Grade 9; Research design; Scaling; Empirical study; Quantitative method; Germany
Abstract The National Educational Panel Study (NEPS) aims at investigating the development of competences across the whole life span; to this end, tests for assessing the different competence domains are developed. In order to evaluate the quality of these competence tests, a wide range of analyses has been performed based on Item Response Theory (IRT). This paper describes the reading competence data of starting cohort 4 in ninth grade. In addition to descriptive statistics of the data, it presents the scaling model applied to estimate competence scores, the analyses performed to investigate the quality of the scale, and the results of these analyses. The reading test in ninth grade consisted of 33 items, which represented different cognitive requirements and text functions and used different response formats; 13,933 subjects took the test. For scaling the competence test, a partial credit model was applied to the data. Item fit statistics, differential item functioning, Rasch homogeneity, the test’s dimensionality, and local item independence were evaluated to ensure the quality of the test. The results show that the items fitted the model well and that test fairness could be confirmed. The test’s high reliability ensures precise and well-differentiated ability estimates for the students. However, many items target a lower reading ability. While the different comprehension requirements seem to form a unidimensional structure, the findings point to some multidimensionality based on text functions. Altogether, the reading test exhibited good psychometric properties, supporting the estimation of a reliable reading competence score. The data available in the Scientific Use File are described, and ConQuest syntax for scaling the data is provided. (DIPF/Orig.)
Funding reference number 01GJ0888
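
For orientation, a minimal sketch of the partial credit model referred to in the abstract, in standard partial-credit-model notation (the notation and parameterization are assumed here and not taken from the working paper itself): the probability that a person with ability $\theta$ obtains score $x$ on item $i$ with step parameters $\delta_{i1}, \ldots, \delta_{im_i}$ is

$$
P(X_i = x \mid \theta) \;=\; \frac{\exp\!\left(\sum_{k=0}^{x} (\theta - \delta_{ik})\right)}{\sum_{h=0}^{m_i} \exp\!\left(\sum_{k=0}^{h} (\theta - \delta_{ik})\right)}, \qquad x = 0, 1, \ldots, m_i,
$$

with the convention $\sum_{k=0}^{0} (\theta - \delta_{ik}) \equiv 0$. For dichotomous items ($m_i = 1$) this reduces to the Rasch model, which is why checks such as Rasch homogeneity and the polytomous scaling can be handled within the same framework.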