Targeted assessment of hypothesis testing skills using cognitive diagnostic models: Implications for formative practice

Bibliographic Details
Published in International journal of educational research, Vol. 134, p. 102801
Main Author Im, Seongah
Format Journal Article
Language English
Published Elsevier Ltd 2025
ISSN 0883-0355
DOI 10.1016/j.ijer.2025.102801

Summary:
•Diagnoses student learning gaps in hypothesis testing using classroom assessment.
•Compares six cognitive diagnostic models from the Generalized DINA (G-DINA) family.
•Attribute-level feedback identifies gaps in conceptual reasoning and computation.
•Supports formative assessment in large or hybrid undergraduate statistics courses.
•Provides actionable insights to inform targeted teaching and feedback.

Formative assessment is essential for identifying student learning gaps and supporting meaningful feedback, particularly in subjects that require multi-step reasoning, such as statistical hypothesis testing. This study demonstrates how Cognitive Diagnostic Models (CDMs) can enhance assessment practices by offering detailed, attribute-level feedback on student proficiency. Analyzing item response data from 219 undergraduate students in an introductory statistics course, the study employed the Generalized DINA model and its reduced variants to identify the most suitable and interpretable model. Through expert evaluation and Q-matrix validation procedures, four attributes involved in hypothesis testing were specified and refined. Among the six models, the Linear Logistic Model (LLM) yielded the best fit. The attribute classification results revealed that while most students mastered procedural aspects of hypothesis testing, distinct groups struggled either with selecting appropriate statistical methods or with managing multi-step computations. The study underscores the potential of CDM-based assessments to provide actionable diagnostic information for tailored instruction, targeted feedback, and pinpointing specific learning hurdles. While this approach is readily applicable in large classes, its effectiveness can also extend to smaller groups by aggregating data across multiple cohorts. CDMs offer a flexible and scalable framework for improving assessment-for-learning practices across structured subject areas.
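To illustrate the kind of attribute-level diagnosis the abstract describes, the sketch below implements the DINA model, the simplest member of the G-DINA family named above. The Q-matrix, attribute labels, and slip/guess values are hypothetical examples for illustration only; they are not taken from the study, and the study's best-fitting model was the LLM, not DINA.

```python
import numpy as np

# Hypothetical Q-matrix: 3 items x 4 hypothesis-testing attributes
# (e.g., stating hypotheses, selecting a test, computing the statistic,
# interpreting the result). Entries are invented for this sketch.
Q = np.array([
    [1, 1, 0, 0],  # item 1 requires attributes 1 and 2
    [0, 0, 1, 0],  # item 2 requires attribute 3
    [0, 1, 1, 1],  # item 3 requires attributes 2, 3, and 4
])

slip, guess = 0.1, 0.2  # assumed, not estimated, item parameters

def dina_probs(alpha, Q, slip, guess):
    """P(correct answer) per item for a student with attribute profile alpha.

    Under DINA, eta_j = 1 iff the student has mastered every attribute
    item j requires (a conjunctive rule); the success probability is then
    1 - slip when eta_j = 1 and guess otherwise.
    """
    eta = np.all(alpha >= Q, axis=1).astype(int)
    return np.where(eta == 1, 1 - slip, guess)

# Student who has mastered attributes 1 and 2 only: item 1 is answered
# correctly with high probability, items 2 and 3 only by guessing.
alpha = np.array([1, 1, 0, 0])
print(dina_probs(alpha, Q, slip, guess))  # -> [0.9 0.2 0.2]
```

In a full CDM analysis these probabilities feed a likelihood that is maximized over all candidate attribute profiles, yielding the per-attribute mastery classifications that the study uses for targeted feedback.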