BMBF framework programme for the promotion of empirical educational research

Literature database

Full record

Authors Hardt, Katinka; Pohl, Steffi; Haberkorn, Kerstin; Wiegand, Elena  
Title NEPS technical report for reading. Scaling results of starting cohort 6 for adults in main study 2010/2011.  
URL https://www.neps-data.de/Portals/0/Working Papers/WP_XXV.pdf  
Year of publication 2013  
Number of pages 29 pp.  
Publisher Bamberg: Otto-Friedrich-Univ.  
Series NEPS Working Paper. Vol. 25  
Document type Monograph; discussion paper / working paper / conference contribution; online  
Supplementary material References, figures, tables  
Language English  
Research focus National Educational Panel Study (NEPS)  
Keywords Educational research; item response theory; scaling; reading competence; reading test; adults; research design; quantitative methods; Germany  
Abstract The National Educational Panel Study (NEPS) investigates the development of competencies across the whole life span, and tests are developed for assessing the different competence domains. To evaluate the quality of these competence tests, a wide range of analyses was performed based on Item Response Theory (IRT). This paper describes the data and scaling procedures for the adult reading competence test in starting cohort 6. After reporting descriptive statistics of the data, the paper presents the scaling model applied to estimate competence scores, the analyses performed to investigate the quality of the scale, and the results of these analyses. The reading competence test for the adult cohort consisted of 32 items, which represented different cognitive requirements and text functions and used different response formats. The test was administered to 5,349 persons. A partial credit model was used for scaling the data. Item fit statistics, differential item functioning, Rasch homogeneity, the test's dimensionality, and local item independence were evaluated to ensure the quality of the test. The results showed that the test exhibits high reliability and that the items fit the model. Moreover, measurement invariance could be confirmed for various subgroups. Dimensionality analyses showed that the different cognitive requirements form a unidimensional construct, while there is some evidence for multidimensionality based on text functions. It should be noted that a considerable number of items were not reached by the test takers within the assessment time, and that many items are targeted toward a lower reading ability. Altogether, the results show good psychometric properties of the reading competence test and support the estimation of a reliable reading competence score. 
In addition to the scaling results, the data available in the Scientific Use File are described and the ConQuest syntax for scaling the data is provided. (DIPF/Orig.)  
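The partial credit model mentioned in the abstract can be sketched as follows. This is the standard Masters-type formulation of the model, given here only as an illustration; the notation is not taken from the report itself:

```latex
% Probability that person j with ability \theta_j obtains score x
% (x = 0, 1, \ldots, m_i) on item i with step parameters \delta_{ik}:
P(X_{ij} = x \mid \theta_j)
  = \frac{\exp\!\left(\sum_{k=0}^{x} (\theta_j - \delta_{ik})\right)}
         {\sum_{h=0}^{m_i} \exp\!\left(\sum_{k=0}^{h} (\theta_j - \delta_{ik})\right)},
\qquad \text{with } \sum_{k=0}^{0} (\theta_j - \delta_{ik}) \equiv 0.
```

For dichotomous items (m_i = 1) this reduces to the Rasch model, which is why Rasch homogeneity is among the properties checked in the report.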
Grant number 01GJ0888