
Literature Database

Full Record

Authors Jordan, Anne-Katrin; Duchhardt, Christoph  
Title NEPS technical report for mathematics. Scaling results of starting cohort 6 - adults.  
URL https://www.neps-data.de/Portals/0/Working Papers/WP_XXXII.pdf  
Year of publication 2013  
Pages 23 pp.  
Publisher Bamberg: Otto-Friedrich-Univ.  
Series NEPS Working Papers. Volume 32  
Document type Monograph; discussion paper / working paper / conference contribution; online  
Supplementary material References, figures, tables, appendix  
Language English  
Research focus National Educational Panel Study (NEPS)  
Keywords Educational research; Item response theory; Scaling; Mathematical competence; Adult; Competence measurement; Research design; Quantitative method; Empirical study; Rasch model; Germany  
Abstract The National Educational Panel Study (NEPS) aims at investigating the development of competencies across the whole life span and designs tests for assessing these different competence domains. In order to evaluate the quality of the competence tests, a wide range of analyses have been performed on the basis of item response theory (IRT). This paper describes the data and scaling procedures on mathematical competence for Starting Cohort 6 – Adults. Besides presenting descriptive statistics for the data, the paper explains the scaling model applied to estimate competence scores, the analyses performed to investigate the quality of the scale, and the results of these analyses. The mathematics test for adults consisted of 22 items representing different content areas as well as different cognitive components and using different response formats. The test was administered to 5,245 adults. A Rasch model was applied to scale the data. Item fit statistics, differential item functioning, Rasch-homogeneity, and the test's dimensionality were evaluated to ensure the quality of the test. The results show that the items exhibited good item fit and measurement invariance across various subgroups. Moreover, the test showed high reliability. As the correlations between the four content areas are very high in a multidimensional model, the assumption of unidimensionality seems adequate. Among the challenges of this test are the relatively high omission rates for some items and the lack of very difficult items. Overall, however, the results revealed good psychometric properties of the mathematics test, thus supporting the estimation of a reliable mathematics competence score. This paper describes the data available in the Scientific Use File and provides ConQuest syntax for scaling the data, including the necessary item parameters. (DIPF/Orig.)  
Funding reference number 01GJ0888
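
Note For context, the dichotomous Rasch model referred to in the abstract specifies the probability that person p solves item i as a function of the person's ability \theta_p and the item's difficulty \beta_i. This is the standard textbook formulation with generic notation, not an excerpt from the report itself:

\[
P(X_{pi} = 1 \mid \theta_p, \beta_i) = \frac{\exp(\theta_p - \beta_i)}{1 + \exp(\theta_p - \beta_i)}
\]

Because all items share the same discrimination under this model, a respondent's number of correct responses is a sufficient statistic for \theta_p; checks of Rasch-homogeneity, such as those mentioned in the abstract, typically examine whether this equal-discrimination assumption holds across the items.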