An Optimized Uncertainty-Aware Training Framework for Neural Networks

Bibliographic Details
Published in: IEEE Transactions on Neural Networks and Learning Systems, Vol. 35, No. 5, pp. 6928-6935
Main Authors: Tabarisaadi, Pegah; Khosravi, Abbas; Nahavandi, Saeid; Shafie-Khah, Miadreza; Catalao, Joao P. S.
Format: Journal Article
Language: English
Published: United States: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.05.2024
ISSN: 2162-237X, 2162-2388
DOI: 10.1109/TNNLS.2022.3213315

Summary: Uncertainty quantification (UQ) for predictions generated by neural networks (NNs) is of vital importance in safety-critical applications. An ideal model should produce low uncertainty for correct predictions and high uncertainty for incorrect ones. State-of-the-art training algorithms focus on optimizing NN parameters to improve accuracy-related metrics, while training based on uncertainty metrics has been largely overlooked in the literature. This article introduces a novel uncertainty-aware training algorithm for classification tasks. A predictive-uncertainty-based objective function is defined and optimized using stochastic gradient descent. This multiobjective loss function accounts for both accuracy and uncertainty accuracy (UA) simultaneously during training. The performance of the proposed training framework is compared from different aspects with other UQ techniques on several benchmarks. The results demonstrate the effectiveness of the proposed framework for developing NN models capable of generating reliable uncertainty estimates.
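
The abstract does not spell out the form of the multiobjective loss, so the sketch below is only an illustration of the idea it describes: a standard cross-entropy (accuracy) term combined with a hypothetical entropy-based uncertainty term that is low when correct predictions have low predictive entropy and incorrect predictions have high predictive entropy. The function name uncertainty_aware_loss, the normalized-entropy uncertainty estimate, and the weighting coefficient lam are assumptions for demonstration, not the formulation given in the paper (DOI: 10.1109/TNNLS.2022.3213315).

    # Minimal illustrative sketch in Python/PyTorch; not the paper's actual loss.
    import torch
    import torch.nn.functional as F

    def uncertainty_aware_loss(logits, targets, lam=0.5):
        probs = F.softmax(logits, dim=1)
        log_probs = F.log_softmax(logits, dim=1)

        # Accuracy-related term: standard cross-entropy.
        ce = F.nll_loss(log_probs, targets)

        # Predictive entropy as the uncertainty estimate (an assumption here),
        # normalized to [0, 1] by the maximum entropy log(C) for C classes.
        entropy = -(probs * log_probs).sum(dim=1)
        entropy = entropy / torch.log(torch.tensor(float(logits.size(1))))

        # Uncertainty-accuracy term: penalize high entropy on correct
        # predictions and low entropy on incorrect ones.
        correct = (logits.argmax(dim=1) == targets).float()
        ua_term = (correct * entropy + (1.0 - correct) * (1.0 - entropy)).mean()

        # Multiobjective loss: weighted sum of the two terms, trained with SGD.
        return ce + lam * ua_term

In a standard SGD training loop this function would simply replace the usual cross-entropy criterion, e.g. loss = uncertainty_aware_loss(model(x), y); loss.backward(); optimizer.step().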