Efficient Discrete Clustering With Anchor Graph

Bibliographic Details
Published in: IEEE Transactions on Neural Networks and Learning Systems, Vol. 35, No. 10, pp. 15012-15020
Main Authors: Wang, Jingyu; Ma, Zhenyu; Nie, Feiping; Li, Xuelong
Format: Journal Article
Language: English
Published: United States, IEEE, 01.10.2024
ISSN: 2162-237X, 2162-2388
DOI: 10.1109/TNNLS.2023.3279380

More Information
Summary: Spectral clustering (SC) has been applied to analyze a wide variety of data structures over the past few decades owing to its breakthroughs in graph learning. However, the time-consuming eigenvalue decomposition (EVD) and the information lost during relaxation and discretization impair its efficiency and accuracy, especially on large-scale data. To address these issues, this brief proposes a simple and fast method named efficient discrete clustering with anchor graph (EDCAG), which circumvents postprocessing through binary label optimization. First, sparse anchors are adopted to accelerate graph construction and to obtain a parameter-free anchor similarity matrix. Subsequently, inspired by intraclass similarity maximization in SC, an intraclass similarity maximization model between the anchor and sample layers is designed to handle the anchor graph cut problem and to exploit more explicit data structures. Meanwhile, a fast coordinate rising (CR) algorithm is employed to alternately optimize the discrete labels of samples and anchors in the designed model. Experimental results demonstrate the excellent speed and competitive clustering performance of EDCAG.
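
The summary sketches a three-step pipeline: build a sparse, parameter-free sample-anchor similarity matrix, pose an intraclass similarity maximization problem between the sample and anchor layers, and optimize the two sets of discrete labels alternately. The Python sketch below illustrates one plausible reading of that pipeline; it assumes k-means anchor selection, CAN-style k-nearest-anchor weights, and a simple alternating argmax as the coordinate-wise update. The function names (anchor_similarity, edcag_like) and every modeling detail here are illustrative assumptions, not the authors' exact formulation.

import numpy as np
from sklearn.cluster import KMeans

def anchor_similarity(X, anchors, k=5):
    # Sparse sample-anchor similarities: each sample keeps its k nearest
    # anchors, weighted by the parameter-free closed form used in CAN-style
    # constructions (an assumption; the paper only states the matrix is
    # sparse and parameter-free).
    d2 = ((X[:, None, :] - anchors[None, :, :]) ** 2).sum(-1)  # squared distances
    n, m = d2.shape
    S = np.zeros((n, m))
    idx = np.argsort(d2, axis=1)[:, :k + 1]    # k nearest anchors plus one extra
    for i in range(n):
        di = d2[i, idx[i]]                     # sorted distances for sample i
        denom = k * di[k] - di[:k].sum()       # normalizer from the closed form
        S[i, idx[i, :k]] = (di[k] - di[:k]) / max(denom, 1e-12)
    return S

def edcag_like(X, n_clusters, n_anchors=100, k=5, n_iter=30, seed=0):
    # Alternating (coordinate-wise) discrete-label optimization on the
    # bipartite anchor graph: each step assigns every sample (or anchor) to
    # the cluster maximizing its summed similarity to that cluster on the
    # other layer, a simple reading of the intraclass similarity objective.
    rng = np.random.default_rng(seed)
    anchors = KMeans(n_anchors, n_init=4, random_state=seed).fit(X).cluster_centers_
    S = anchor_similarity(X, anchors, k)          # n x m similarity matrix
    g = rng.integers(n_clusters, size=n_anchors)  # random initial anchor labels
    y = np.zeros(len(X), dtype=int)
    for _ in range(n_iter):
        G = np.eye(n_clusters)[g]              # m x c one-hot anchor labels
        y = (S @ G).argmax(1)                  # update sample labels
        Y = np.eye(n_clusters)[y]              # n x c one-hot sample labels
        g_new = (S.T @ Y).argmax(1)            # update anchor labels
        if np.array_equal(g_new, g):           # converged: labels stabilized
            break
        g = g_new
    return y

labels = edcag_like(np.random.rand(500, 8), n_clusters=3)

Because no eigenvalue decomposition is involved, each iteration costs only two sparse matrix-vector-style products, which is consistent with the summary's emphasis on speed for large-scale data.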