Automated Analysis of Algorithm Descriptions Quality, Through Large Language Models

Sterbini, A.; Temperini, M.
2024

Abstract

In this paper we propose a method to classify students' textual descriptions of algorithms. The work is based on a wealth of data (programming tasks, related algorithm descriptions, and Peer Assessment data) coming from 6 years of use of the Q2A system in a "Fundamentals of Computer Programming" course, taught in the first year of our university's Computer Science curriculum. The descriptions are submitted through Q2A, as part of the answer to a computer programming task, and are subject to (formative) Peer Assessment. The proposed classification method aims to support the teacher in analysing the quite numerous students' descriptions, in our system as well as in other similar ones. We 1) process the students' submissions by automated topic extraction (BERTopic) and by separate Large Language Models, 2) compute their degree of suitability as "algorithm descriptions" on a scale from BAD to GOOD, and 3) compare the obtained classification with those coming from the teacher's direct assessment (expert: one of the authors) and from the Peer Assessment. The automated classification does correlate with both the expert classification and the grades given by the peers to the "clarity" of the descriptions. This result is encouraging in view of the production of a Q2A subsystem allowing the teacher to analyse the students' submissions guided by an automated classification, and ultimately supporting fully automated grading.
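For illustration only, the following Python sketch outlines the kind of pipeline the abstract describes (topic extraction with BERTopic and an LLM-embedding similarity score mapped to a BAD-to-GOOD scale). The embedding model, reference description, and thresholds are assumptions made for this sketch, not the configuration actually used in the paper.

# A minimal, hypothetical sketch of the kind of pipeline the abstract describes:
# BERTopic topic extraction plus an embedding-similarity score mapped to a
# coarse BAD..GOOD scale. Model name, reference text, and thresholds are
# illustrative assumptions, not the authors' actual configuration.
from bertopic import BERTopic
from sentence_transformers import SentenceTransformer, util

# Toy stand-ins for the students' algorithm descriptions submitted through Q2A.
student_descriptions = [
    "We scan the list once, keeping the largest value seen so far.",
    "idk, the program just works",
]

# 1) Automated topic extraction over the corpus of submissions.
#    BERTopic needs a reasonably large corpus, so it is skipped for this toy list.
topics = [None] * len(student_descriptions)
if len(student_descriptions) >= 50:
    topics, _ = BERTopic().fit_transform(student_descriptions)

# 2) Suitability as an "algorithm description": here approximated by embedding
#    similarity to a teacher-written reference description (assumed approach).
embedder = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedding model
reference = "Iterate over the array, tracking the maximum element found so far."
ref_emb = embedder.encode(reference, convert_to_tensor=True)

def suitability(text: str) -> str:
    """Map cosine similarity to a BAD/FAIR/GOOD label (assumed thresholds)."""
    sim = util.cos_sim(embedder.encode(text, convert_to_tensor=True), ref_emb).item()
    if sim >= 0.60:
        return "GOOD"
    if sim >= 0.35:
        return "FAIR"
    return "BAD"

# 3) The resulting labels would then be compared with the expert's and the
#    peers' assessments (e.g. via a correlation measure), as the paper reports.
for desc, topic in zip(student_descriptions, topics):
    print(topic, suitability(desc), desc)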
2024
20th International Conference on Generative Intelligence and Intelligent Tutoring Systems, ITS 2024
Automated Assessment; Large Language Models; LLM-based Text Similarity; Peer Assessment
04 Publication in conference proceedings::04b Conference paper in a volume
Sterbini, A.; Temperini, M. (2024). Automated Analysis of Algorithm Descriptions Quality, Through Large Language Models. In: 20th International Conference on Generative Intelligence and Intelligent Tutoring Systems (ITS 2024), Thessaloniki, Greece, vol. 14798, pp. 258-271. DOI: 10.1007/978-3-031-63028-6_20.
Files attached to this item
File: Sterbini_Automated-Analysis_2024.pdf (access restricted to archive administrators)
Type: Publisher's version (published version with the publisher's layout)
License: All rights reserved
Size: 2.85 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1728615
Citations
  • PMC: not available
  • Scopus: 2
  • Web of Science (ISI): 1