REAL

An adaptive testing system for programming proficiency using Item Response Theory

Apró, Anikó and Tajti, Tibor (2025) An adaptive testing system for programming proficiency using Item Response Theory. ANNALES MATHEMATICAE ET INFORMATICAE, 61. pp. 31-42. ISSN 1787-6117


Abstract

This paper presents the design and implementation of an adaptive testing system for assessing university students’ programming skills in Python, C#, Java, JavaScript, and SQL. Adaptive testing dynamically adjusts question difficulty based on individual performance, enabling more precise and efficient assessment compared to traditional fixed-form tests. We provide an overview of adaptive testing principles and the Item Response Theory (IRT) models (1PL–3PL) that underpin the system. Our approach integrates continuous, categorical, and accelerated adaptive methodologies to optimize both accuracy and test length. The system is implemented as a Flask-based web application that selects questions from a customizable bank, adapting to the learner’s estimated knowledge level in real time. Key features include topic-based item selection, immediate scoring, detailed post-test analytics, and end-of-test formative recommendations (tailored by language/level with estimated study time). The system demonstrates how IRT-based adaptive programming assessment supports personalized, data-driven evaluation in higher education and hiring.
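The abstract refers to the IRT models (1PL–3PL) that underpin the system and to selecting questions matched to the learner's estimated ability. A minimal illustrative sketch of these ideas is shown below; the function and parameter names, and the `(a, b, c)` item-bank format, are hypothetical and not taken from the paper's implementation.

```python
import math

def p_correct(theta, a=1.0, b=0.0, c=0.0):
    """3PL item response function: probability that a learner with ability
    theta answers the item correctly. a = discrimination, b = difficulty,
    c = guessing. Setting c = 0 gives the 2PL model; a = 1 and c = 0
    reduces it to the 1PL (Rasch) model."""
    return c + (1.0 - c) / (1.0 + math.exp(-a * (theta - b)))

def item_information(theta, a=1.0, b=0.0, c=0.0):
    """Fisher information of one item at ability theta (3PL form).
    Adaptive testing commonly administers the item that maximizes this,
    which shortens the test while keeping the ability estimate precise."""
    p = p_correct(theta, a, b, c)
    q = 1.0 - p
    return (a ** 2) * (q / p) * ((p - c) / (1.0 - c)) ** 2

def pick_next_item(theta, bank):
    """Pick the item from a bank of (a, b, c) tuples with maximal
    information at the current ability estimate."""
    return max(bank, key=lambda params: item_information(theta, *params))

# With equal discrimination and no guessing, the most informative item
# is the one whose difficulty b is closest to the current ability:
bank = [(1.0, -2.0, 0.0), (1.0, 0.1, 0.0), (1.0, 2.0, 0.0)]
print(pick_next_item(0.0, bank))  # the item with b = 0.1
```

Maximum-information selection is only one common strategy; the paper's system combines continuous, categorical, and accelerated adaptive methods, whose exact selection rules are not reproduced here.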

Item Type: Article
Uncontrolled Keywords: adaptive testing, Item Response Theory, programming proficiency, computer science education
Subjects: Q Science / természettudomány > QA Mathematics / matematika > QA75 Electronic computers. Computer science / számítástechnika, számítógéptudomány
Depositing User: Tibor Gál
Date Deposited: 11 Nov 2025 11:02
Last Modified: 11 Nov 2025 11:02
URI: https://real.mtak.hu/id/eprint/228822
