Real-Time Workload Estimation Using Eye Tracking: A Bayesian Inference Approach


Bibliographic Details
Published in: International Journal of Human-Computer Interaction, Vol. 40, No. 15, pp. 4042-4057
Main Authors: Luo, Ruikun; Weng, Yifan; Jayakumar, Paramsothy; Brudnak, Mark J.; Paul, Victor; Desaraju, Vishnu R.; Stein, Jeffrey L.; Ersal, Tulga; Yang, X. Jessie
Format: Journal Article
Language: English
Published: Norwood: Taylor & Francis, 02.08.2024 (Lawrence Erlbaum Associates, Inc.)
ISSN: 1044-7318, 1532-7590
DOI: 10.1080/10447318.2023.2205274


More Information
Summary: Workload management is a critical concern in shared control of unmanned ground vehicles. In response to this challenge, prior studies have developed methods to estimate human operators' workload by analyzing their physiological data. However, these studies have primarily adopted a single-model-single-feature or a single-model-multiple-feature approach. The present study proposes a Bayesian inference model to estimate workload, which leverages different machine learning models for different features. We conducted a human subject experiment with 24 participants, in which a human operator teleoperated a simulated High Mobility Multipurpose Wheeled Vehicle (HMMWV) with help from an autonomy while simultaneously performing a surveillance task. Participants' eye-related features, including gaze trajectory and pupil size change, were used as the physiological input to the proposed Bayesian inference model. Results show that the Bayesian inference model achieves a 0.823 F1 score, 0.824 precision, and 0.821 recall, outperforming the single models.
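The summary describes fusing the outputs of per-feature machine learning models (one for gaze trajectory, one for pupil size change) through Bayesian inference. The sketch below illustrates the general idea under a naive conditional-independence assumption; the function name, priors, and likelihood values are hypothetical and are not taken from the paper.

```python
def bayes_fuse(prior, likelihoods):
    """Fuse per-feature class likelihoods with a prior via Bayes' rule.

    prior: P(class) for each workload class, e.g. [low, high]
    likelihoods: one list per feature model, each giving
        P(feature observation | class) for every class
    Returns the normalized posterior P(class | all features),
    assuming features are conditionally independent given the class.
    """
    post = [float(p) for p in prior]
    for lik in likelihoods:
        post = [p * float(l) for p, l in zip(post, lik)]
    total = sum(post)
    return [p / total for p in post]

# Illustrative values only: a gaze-trajectory model and a pupil-size model
# each emit likelihoods over {low, high} workload.
prior = [0.5, 0.5]        # uniform prior over workload classes
gaze_lik = [0.3, 0.7]     # gaze model favors high workload
pupil_lik = [0.4, 0.6]    # pupil model mildly favors high workload
posterior = bayes_fuse(prior, [gaze_lik, pupil_lik])
```

Because each feature gets its own model, a feature whose classifier is unreliable contributes a flatter likelihood and thus pulls the fused posterior less than a confident, well-calibrated one.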