A Novel Learning Approach to Remove Oscillations in First‐Order Takagi–Sugeno Fuzzy System: Gradient Descent‐Based Neuro‐Fuzzy Algorithm Using Smoothing Group Lasso Regularization

Bibliographic Details
Published in: Advanced Theory and Simulations, Vol. 7, No. 2
Main Authors: Liu, Yan; Wang, Rui; Liu, Yuanquan; Shao, Qiang; Lv, Yan; Yu, Yan
Format: Journal Article
Language: English
Published: 01.02.2024
ISSN: 2513-0390
DOI: 10.1002/adts.202300545


More Information
Summary: As a universal approximator, the first-order Takagi–Sugeno fuzzy system can approximate a wide range of nonlinear systems through a group of IF–THEN fuzzy rules. Although group lasso regularization has the advantage of inducing group sparsity and handling variable selection, applying it directly during training can cause numerical oscillations and poses a theoretical difficulty: the gradient is undefined at the origin. The paper addresses this obstacle by invoking a smoothing function to approximate group lasso regularization. On this basis, a gradient-based neuro-fuzzy learning algorithm with smoothing group lasso regularization for the first-order Takagi–Sugeno fuzzy system is proposed, and its convergence is rigorously proved under mild conditions. Experimental results on two approximation and two classification simulations demonstrate that the proposed algorithm outperforms algorithms with the original group lasso regularization and with L2 regularization in terms of error, pruned neurons, and accuracy; the gains in pruned neurons, driven by group sparsity, are especially pronounced. Compared with the L2-regularized algorithm, the proposed algorithm exhibits improvements of 6.3, 5.3, and 142.6 pruned neurons in the sin(πx) function, Gabor function, and Sonar benchmark dataset simulations, respectively. In summary, a gradient-based neuro-fuzzy learning algorithm with smoothing group lasso regularization is proposed for the first-order Takagi–Sugeno fuzzy system, in which the smoothed regularizer optimizes the network structure by inducing group sparsity; the two approximation and two classification simulations illustrate its superior sparsity, convergence, and classification ability.
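The summary does not specify the paper's exact smoothing function, so the sketch below uses one common choice: replacing each group norm ||w_g||₂ with sqrt(||w_g||² + ε²), which agrees with the group lasso penalty up to O(ε) while being differentiable everywhere, including at the origin. The function names and the parameter `eps` are illustrative, not taken from the paper.

```python
import numpy as np

def group_lasso(groups, lam):
    # Original group lasso penalty: lam * sum_g ||w_g||_2.
    # Non-differentiable wherever a whole group vector w_g = 0,
    # which is exactly the sparse solution the penalty encourages.
    return lam * sum(np.linalg.norm(w) for w in groups)

def smoothed_group_lasso(groups, lam, eps=1e-3):
    # Smoothing approximation: sqrt(||w_g||^2 + eps^2) in place of
    # ||w_g||_2, differentiable everywhere; error per group is <= eps.
    return lam * sum(np.sqrt(np.dot(w, w) + eps**2) for w in groups)

def smoothed_group_lasso_grad(groups, lam, eps=1e-3):
    # Gradient of the smoothed penalty for each group:
    # lam * w_g / sqrt(||w_g||^2 + eps^2), well-defined even at w_g = 0,
    # so a plain gradient-descent update can be applied directly.
    return [lam * w / np.sqrt(np.dot(w, w) + eps**2) for w in groups]
```

With this substitute, the regularized loss stays smooth, so the gradient-descent update needs no subgradient or special case at zero, which is the oscillation-free behavior the abstract attributes to the smoothing step.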