Classifying the difficulty levels of working memory tasks by using pupillary response

Bibliographic Details
Published in: PeerJ (San Francisco, CA), Vol. 10, p. e12864
Main Authors: Mitre-Hernandez, Hugo; Sanchez-Rodriguez, Jorge; Nava-Muñoz, Sergio; Lara-Alvarez, Carlos
Format: Journal Article
Language: English
Published: United States, PeerJ Inc., 29.03.2022
ISSN: 2167-8359
DOI: 10.7717/peerj.12864

Summary: Knowing the difficulty of a given task is crucial for improving learning outcomes. This paper studies the classification of memorization-task difficulty levels from pupillary response data. Developing a difficulty-level classifier from pupil size features is challenging because of the inter-subject variability of pupil responses. The eye-tracking data used in this study were collected while students solved memorization tasks classified as low, medium, or high difficulty. Statistical analysis shows that the values of pupillometric features (such as peak dilation and pupil diameter change) differ significantly across difficulty levels. A wrapper method was used to select the pupillometric features that work best with the most common classifiers: Support Vector Machine (SVM), Decision Tree (DT), Linear Discriminant Analysis (LDA), and Random Forest (RF). Despite the statistical differences, experiments showed that a random forest classifier trained with only five features obtained the best F1-score (82%). This result is important because it describes a method to evaluate the cognitive load of a subject performing a task using only pupil size features.
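
The summary outlines a wrapper-style feature selection over pupillometric features followed by a random forest classifier evaluated with the F1-score. Below is a minimal sketch of that kind of pipeline; it uses scikit-learn's SequentialFeatureSelector as a stand-in for the paper's wrapper method, and the synthetic data, feature names, and hyperparameters are illustrative assumptions rather than the authors' actual setup.

```python
# Illustrative sketch only: wrapper feature selection + random forest,
# as outlined in the summary. Data, feature names, and hyperparameters
# are assumptions, not the authors' setup.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical pupillometric features per trial (names are examples only).
feature_names = [
    "peak_dilation", "pupil_diameter_change", "mean_diameter",
    "latency_to_peak", "dilation_slope", "baseline_diameter",
    "diameter_variance", "blink_rate",
]
X = rng.normal(size=(300, len(feature_names)))  # 300 synthetic trials
y = rng.integers(0, 3, size=300)                # 0 = low, 1 = medium, 2 = high difficulty

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

rf = RandomForestClassifier(n_estimators=200, random_state=0)

# Forward wrapper selection keeping five features, mirroring the
# five-feature random forest reported in the summary.
selector = SequentialFeatureSelector(
    rf, n_features_to_select=5, direction="forward",
    scoring="f1_macro", cv=5)
selector.fit(X_train, y_train)
selected = [feature_names[i] for i in np.flatnonzero(selector.get_support())]

rf.fit(selector.transform(X_train), y_train)
y_pred = rf.predict(selector.transform(X_test))
print("selected features:", selected)
print("macro F1:", round(f1_score(y_test, y_pred, average="macro"), 3))
```

With real pupillometry data, the random feature matrix would be replaced by per-trial features extracted from the eye-tracking signal, and the selector could likewise wrap SVM, DT, or LDA estimators to compare classifiers as the summary describes.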