Deep k-Means: Jointly clustering with k-Means and learning representations
| Published in | Pattern Recognition Letters, Vol. 138, pp. 185-192 |
|---|---|
| Main Authors | |
| Format | Journal Article |
| Language | English |
| Published | Amsterdam: Elsevier B.V., 01.10.2020 |
| ISSN | 0167-8655, 1872-7344 |
| DOI | 10.1016/j.patrec.2020.07.028 |
Summary:

•Differentiable reformulation of the k-Means problem in a learned embedding space.
•An alternative to pretraining based on deterministic annealing.
•Straightforward training algorithm based on stochastic gradient descent.
•Careful comparison against k-Means-related and deep clustering approaches.

We study in this paper the problem of jointly clustering and learning representations. As several previous studies have shown, learning representations that are both faithful to the data to be clustered and adapted to the clustering algorithm can lead to better clustering performance, especially when the two tasks are performed jointly. We propose such an approach for k-Means clustering, based on a continuous reparametrization of the objective function that leads to a truly joint solution. The behavior of our approach is illustrated on various datasets, showing its efficacy in learning representations for objects while clustering them.
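The abstract only outlines the approach; as a minimal sketch of what a differentiable k-Means term jointly optimized with an autoencoder can look like, the PyTorch snippet below replaces the hard assignment to the closest cluster center with a softmax-weighted distance whose sharpness is controlled by a parameter `alpha` that is gradually increased, in the spirit of deterministic annealing rather than pretraining. The network sizes, the trade-off weight `lam`, and the annealing schedule are illustrative assumptions, not the authors' exact formulation.

```python
# Hypothetical sketch: autoencoder trained jointly with a differentiable
# k-Means-style objective in the embedding space. Architecture, alpha
# schedule, and loss weighting are assumptions for illustration only.
import torch
import torch.nn as nn

class DeepKMeansSketch(nn.Module):
    def __init__(self, input_dim=784, embed_dim=10, n_clusters=10):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(input_dim, 500), nn.ReLU(),
                                     nn.Linear(500, embed_dim))
        self.decoder = nn.Sequential(nn.Linear(embed_dim, 500), nn.ReLU(),
                                     nn.Linear(500, input_dim))
        # Cluster representatives live in the embedding space and are
        # learned jointly with the encoder/decoder parameters.
        self.centers = nn.Parameter(torch.randn(n_clusters, embed_dim))

    def forward(self, x, alpha):
        z = self.encoder(x)                      # embeddings
        x_hat = self.decoder(z)                  # reconstructions
        d = torch.cdist(z, self.centers) ** 2    # squared distances to centers
        # Differentiable surrogate for the hard min over clusters: the softmax
        # weights concentrate on the closest center as alpha grows.
        w = torch.softmax(-alpha * d, dim=1)
        cluster_loss = (w * d).sum(dim=1).mean()
        recon_loss = ((x - x_hat) ** 2).sum(dim=1).mean()
        return recon_loss, cluster_loss

model = DeepKMeansSketch()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
lam = 0.1                                        # clustering/reconstruction trade-off (assumed)
x = torch.rand(64, 784)                          # stand-in mini-batch

# Annealing loop: increasing alpha drives the soft assignment toward the hard
# k-Means assignment while everything is optimized by stochastic gradients.
for alpha in [0.1, 0.5, 1.0, 5.0, 20.0]:
    for _ in range(10):
        recon, clust = model(x, alpha)
        loss = recon + lam * clust
        opt.zero_grad()
        loss.backward()
        opt.step()
```

Because the soft-min is differentiable in both the embeddings and the cluster centers, a single SGD-style loop updates representation and clustering parameters together, which is the joint behavior the highlights describe.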