k-ATTRACTORS: A PARTITIONAL CLUSTERING ALGORITHM FOR NUMERIC DATA ANALYSIS

Bibliographic Details
Published in: Applied Artificial Intelligence, Vol. 25, No. 2, pp. 97-115
Main Authors: Kanellopoulos, Y.; Antonellis, P.; Tjortjis, C.; Makris, C.; Tsirakis, N.
Format: Journal Article
Language: English
Published: Philadelphia: Taylor & Francis Group, 28.02.2011
ISSN: 0883-9514, 1087-6545
DOI: 10.1080/08839514.2011.534590

Summary: Clustering is a data analysis technique, particularly useful when there are many dimensions and little prior information about the data. Partitional clustering algorithms are efficient but suffer from sensitivity to the initial partition and to noise. We propose here k-attractors, a partitional clustering algorithm tailored to numeric data analysis. As a preprocessing (initialization) step, it uses maximal frequent item-set discovery and partitioning to define the number of clusters k and the initial cluster "attractors." During its main phase, the algorithm uses a distance measure adapted with high precision to the way the initial attractors are determined. We applied k-attractors, as well as the k-means, EM, and FarthestFirst clustering algorithms, to several datasets and compared the results. The comparison favored k-attractors in terms of convergence speed and cluster formation quality in most cases: it outperformed the other three algorithms except on datasets of very small cardinality containing only a few frequent item sets. On the downside, its initialization phase adds an overhead that can be deemed acceptable only when it contributes significantly to the algorithm's accuracy.
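
The summary only outlines the two phases (item-set-based initialization, then attractor-seeded partitioning). The following is a minimal sketch of that overall structure, not the paper's actual method: it assumes the attractor seeds (and hence k) are already given rather than mined from maximal frequent item sets, and it uses plain Euclidean distance as a stand-in for the paper's attractor-adapted distance measure. The function name and parameters (kattractors_sketch, seed_attractors, max_iter, tol) are hypothetical.

import numpy as np

def kattractors_sketch(X, seed_attractors, max_iter=100, tol=1e-6):
    """Attractor-seeded partitional clustering loop (illustrative only).

    X               : (n_samples, n_features) numeric array.
    seed_attractors : (k, n_features) initial attractors; in the paper these
                      come from maximal frequent item-set discovery, here they
                      are simply supplied by the caller.
    Euclidean distance is a placeholder for the attractor-adapted measure.
    """
    X = np.asarray(X, dtype=float)
    attractors = np.asarray(seed_attractors, dtype=float).copy()
    k = attractors.shape[0]
    labels = np.zeros(len(X), dtype=int)
    for _ in range(max_iter):
        # Assign every point to its nearest attractor.
        dists = np.linalg.norm(X[:, None, :] - attractors[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Recompute each attractor as the mean of its assigned points
        # (k-means-style update; empty clusters keep their old attractor).
        new_attractors = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else attractors[j]
            for j in range(k)
        ])
        # Stop once the attractors have effectively converged.
        if np.linalg.norm(new_attractors - attractors) < tol:
            attractors = new_attractors
            break
        attractors = new_attractors
    return labels, attractors

if __name__ == "__main__":
    # Tiny synthetic example: two well-separated blobs, two seed attractors.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
    seeds = np.array([[0.0, 0.0], [3.0, 3.0]])
    labels, attractors = kattractors_sketch(X, seeds)
    print("final attractors:\n", attractors)

Seeding the loop with precomputed attractors mirrors the abstract's claim about convergence speed: a well-chosen initial partition reduces the number of reassignment iterations compared with random initialization.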