DeLTa: Deep local pattern representation for time-series clustering and classification using visual perception
| Published in | Knowledge-Based Systems Vol. 212; p. 106551 |
|---|---|
| Main Authors | , |
| Format | Journal Article |
| Language | English |
| Published | Amsterdam: Elsevier B.V, 05.01.2021 |
| ISSN | 0950-7051, 1872-7409 |
| DOI | 10.1016/j.knosys.2020.106551 |
| Summary | Time-series analysis is of great significance to many domains, such as the Internet-of-Things (IoT), prognostics, health, and robotics. Machine learning tasks require time-series data to be represented as features before (un)supervised algorithms can be applied. Existing feature representation methods, especially those based on supervised learning, lack generality and are domain-specific. In this paper, we propose a novel time-series feature representation method based on feature transformation and feature learning. The feature transformation step is inspired by the human cognitive process used in visual recognition: the 1-D time-series data are transformed into a 2-D image dataset. A feature set is then learned by applying a pre-trained convolutional neural network to the transformed data. This generates two complementary high-dimensional feature sets: (1) one matching the overall 2-D layout of the time series; and (2) another matching the activation of local 2-D patterns irrespective of the overall layout. Empirical analysis on a large number of benchmark datasets shows the advantage of DeLTa's domain-agnostic nature, achieving higher accuracy than relevant benchmark methods. Source code is publicly available at https://github.com/technophyte/DeLTa. |
|---|---|
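The abstract describes the pipeline only at a high level, so the following is a minimal, illustrative sketch of the idea rather than the authors' implementation (which is available at the linked repository). The 1-D-to-2-D transform used here is a recurrence-plot-style pairwise-distance image, and torchvision's pre-trained VGG16 stands in for the unspecified CNN; treating spatially max-pooled convolutional activations as the layout-invariant "local pattern" features and fully connected activations as the "overall layout" features is likewise an assumption made for illustration.

```python
# Illustrative sketch only: the exact 1-D -> 2-D transform and CNN layers used by
# DeLTa are not specified in the abstract; a recurrence-plot-style image and
# pre-trained VGG16 features are assumptions chosen for illustration.
import numpy as np
import torch
import torchvision.models as models


def series_to_image(x, size=224):
    """Turn a 1-D series into a 2-D image via pairwise distances (recurrence-plot style)."""
    x = (x - x.min()) / (x.max() - x.min() + 1e-8)            # normalise to [0, 1]
    img = np.abs(x[:, None] - x[None, :])                      # |x_i - x_j| distance matrix
    img = torch.tensor(img, dtype=torch.float32)[None, None]   # shape (1, 1, L, L)
    img = torch.nn.functional.interpolate(
        img, size=(size, size), mode="bilinear", align_corners=False
    )
    return img.repeat(1, 3, 1, 1)                               # 3 channels for the CNN


# Pre-trained CNN used as a fixed feature extractor (no fine-tuning).
# Requires torchvision >= 0.13 for the weights enum; ImageNet normalisation is
# omitted to keep the sketch short.
vgg = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1).eval()


@torch.no_grad()
def delta_like_features(x):
    img = series_to_image(np.asarray(x, dtype=np.float32))
    conv = vgg.features(img)                                    # local 2-D pattern activations
    local_feats = conv.amax(dim=(2, 3)).flatten()               # layout-invariant: max over space
    pooled = torch.flatten(vgg.avgpool(conv), 1)
    global_feats = vgg.classifier[:4](pooled).flatten()         # overall-layout features
    return torch.cat([global_feats, local_feats])               # complementary feature sets


# Example: extract features for a toy series.
feats = delta_like_features(np.sin(np.linspace(0, 20, 300)))
```

The concatenated vector can then be fed to any off-the-shelf clustering or classification algorithm, which is the domain-agnostic usage the abstract describes.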