Fabric defect detection under complex illumination based on an improved recurrent attention model


Bibliographic Details
Published in: Journal of the Textile Institute, Vol. 112, No. 8, pp. 1273-1279
Main Authors: Wang, Huang; Duan, Fajie; Zhou, Weiti
Format: Journal Article
Language: English
Published: Manchester: Taylor & Francis, 03.08.2021
ISSN: 0040-5000, 1754-2340
DOI: 10.1080/00405000.2020.1809918

Summary: To solve the problem of fabric defect detection under complex illumination conditions, the Recurrent Attention Model (RAM), which is insensitive to illumination and noise differences, has been introduced. However, the policy gradient algorithm in the RAM suffers from slow convergence and inefficiency caused by its per-episode (round) updating. In this paper, the Deep Deterministic Policy Gradient-Recurrent Attention Model (DDPG-RAM) algorithm is proposed to address these problems. Although decoupling the reinforcement learning task from the classification task introduces some inconsistency in the data, it reduces the gradient variance and accelerates convergence while improving stability. Experimental results show that the proposed DDPG-RAM algorithm can detect fabric defects under complex illumination conditions. Compared with the RAM and a Convolutional Neural Network (CNN), the decoupled algorithm achieves an accuracy of 95.24% and converges 50% faster than the RAM.
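The two ideas the summary combines - a glimpse sensor that observes only a small patch per step (the source of the RAM's illumination robustness) and a deterministic location policy in place of REINFORCE-style sampled locations (the DDPG side) - can be sketched as follows. This is a minimal NumPy illustration under assumed shapes and function names, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def extract_glimpse(image, loc, size=8):
    # Glimpse sensor: crop a size x size patch whose position is given by
    # `loc` in normalized [-1, 1] image coordinates. Observing only a
    # small local patch per step is what makes attention models
    # comparatively robust to global illumination differences.
    h, w = image.shape
    top = int((loc[0] + 1) / 2 * (h - size))
    left = int((loc[1] + 1) / 2 * (w - size))
    return image[top:top + size, left:left + size]

def deterministic_actor(hidden, weights):
    # DDPG-style location network: maps the recurrent hidden state to a
    # deterministic glimpse location, instead of sampling a location from
    # a Gaussian as REINFORCE-based RAM does. A deterministic policy
    # avoids the high-variance log-probability gradient estimator, which
    # is the variance reduction the summary refers to. tanh keeps the
    # output inside the valid [-1, 1] coordinate range.
    return np.tanh(weights @ hidden)

fabric = rng.random((64, 64))        # stand-in for a fabric image
hidden = rng.standard_normal(16)     # stand-in for the RNN hidden state
w_loc = rng.standard_normal((2, 16)) * 0.1  # hypothetical actor weights

loc = deterministic_actor(hidden, w_loc)
patch = extract_glimpse(fabric, loc)
```

In a full model, the patch would be fed back into the recurrent core, and a separate classification head (trained with ordinary supervised gradients, decoupled from the location actor) would output the defect/no-defect decision.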