Quantum K‐Nearest Neighbor Classification Algorithm via a Divide‐and‐Conquer Strategy
| Published in | Advanced Quantum Technologies (Online), Vol. 7, No. 6 |
|---|---|
| Main Authors | |
| Format | Journal Article |
| Language | English |
| Published | 01.06.2024 |
| ISSN | 2511-9044 |
| DOI | 10.1002/qute.202300221 |
| Summary: | The K‐nearest neighbor (KNN) algorithm is one of the most frequently applied supervised machine learning algorithms. Similarity computation is considered the most crucial and time‐consuming step of the classical KNN algorithm. A quantum K‐nearest neighbor (QKNN) algorithm is proposed based on a divide‐and‐conquer strategy. A quantum circuit is designed to calculate the fidelity between the test sample and each feature vector of the training dataset. The QKNN algorithm offers higher classification efficiency for high‐dimensional data processing. On the IRIS dataset, the classification accuracy of the proposed algorithm is equivalent to that of the classical KNN algorithm. In addition, compared with typical QKNN algorithms, the proposed classification method achieves higher classification accuracy with less computation time, which gives it wide applications in industry. An efficient QKNN classification scheme is designed based on a divide‐and‐conquer strategy. The QKNN algorithm makes full use of parallel computation. The proposed QKNN classification algorithm provides an accurate approximation to the classical KNN classification algorithm and achieves more accurate classification with less running time than typical QKNN classification algorithms. |
|---|---|
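The record describes the similarity step as computing the fidelity between the test sample and each training feature vector, combined with a divide‐and‐conquer search. The following is a minimal classical sketch of that idea, not the authors' quantum circuit: it assumes unit‐norm real‐valued feature vectors, for which the fidelity of the amplitude encodings reduces to the squared inner product |⟨x|y⟩|². All names (fidelity_knn_predict, n_chunks, etc.) are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from collections import Counter


def fidelity(x, y):
    """Squared overlap |<x|y>|^2 of two unit-norm feature vectors."""
    return float(np.abs(np.dot(x, y)) ** 2)


def top_k_in_chunk(test, chunk_X, chunk_y, k):
    """Return the (fidelity, label) pairs of the k most similar samples in one chunk."""
    scored = [(fidelity(test, xi), yi) for xi, yi in zip(chunk_X, chunk_y)]
    scored.sort(key=lambda t: t[0], reverse=True)
    return scored[:k]


def fidelity_knn_predict(test, X, y, k=5, n_chunks=4):
    """Divide-and-conquer KNN: local top-k per chunk, then a global merge and majority vote."""
    test = test / np.linalg.norm(test)
    X = X / np.linalg.norm(X, axis=1, keepdims=True)
    candidates = []
    for chunk_X, chunk_y in zip(np.array_split(X, n_chunks),
                                np.array_split(np.asarray(y), n_chunks)):
        candidates.extend(top_k_in_chunk(test, chunk_X, chunk_y, k))
    candidates.sort(key=lambda t: t[0], reverse=True)
    labels = [label for _, label in candidates[:k]]
    return Counter(labels).most_common(1)[0][0]
```

On IRIS‐like data this classical emulation behaves like an ordinary cosine‐similarity KNN; the efficiency gains claimed in the paper come from evaluating the fidelities with a quantum circuit in parallel, which this sketch does not model.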