Time-Efficient Neural Architecture Search for Autonomous Underwater Vehicle Fault Diagnosis
Bibliographic Details
Published in: IEEE Transactions on Instrumentation and Measurement, Vol. 72, p. 1
Main Authors: Pei, Shicheng; Wang, Huan; Han, Te
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.01.2023
ISSN: 0018-9456, 1557-9662
DOI: 10.1109/TIM.2023.3327477

More Information
Summary: Autonomous underwater vehicles (AUVs) can replace humans in complex underwater environments, so they must be capable of self-fault diagnosis. Existing deep learning-based diagnostic methods achieve excellent performance, but designing effective neural network structures is a time-consuming and difficult task. Although neural architecture search (NAS) can automatically discover effective network structures within a given search space, NAS algorithms are usually slow and expensive. This paper therefore introduces a time-efficient NAS-based AUV fault diagnosis framework (TENAS-FD). TENAS-FD constructs a novel scoring algorithm that provides a metric characterizing the performance of an untrained network. The metric is computed from the overlap of activation patterns produced by the untrained network for different inputs. This allows TENAS-FD to search for superior network architectures in seconds on a single GPU. Experiments on a real AUV dataset show that TENAS-FD quickly obtains excellent network architectures for AUV fault diagnosis and achieves better diagnostic performance than hand-designed models.