BiKA: Binarized KAN-inspired Neural Network for Efficient Hardware Accelerator Designs
| Published in | Proceedings ... Annual IEEE Symposium on Field-Programmable Custom Computing Machines (Online), p. 276 |
|---|---|
| Main Authors | , , |
| Format | Conference Proceeding |
| Language | English |
| Published | IEEE, 04.05.2025 |
| ISSN | 2576-2621 |
| DOI | 10.1109/FCCM62733.2025.00036 |
| Summary: | The continuously growing size of Neural Network (NN) models makes the design of lightweight neural network accelerators for edge devices an emerging subject in recent research. Previous works have explored various lightweight techniques and even emerging neural network structures, such as quantization, approximate computing, and neuromorphic computing, to reduce hardware resource consumption in accelerator designs. This inspired our interest in exploring the potential of other emerging network structures for hardware accelerator design. The Kolmogorov-Arnold Network (KAN) [1] is a recently proposed neural network structure that replaces the weight multiplications and activation functions of Artificial Neural Networks (ANNs) with learnable nonlinear functions, and it has the potential to transform the paradigm of neural network design. However, given the complexity of implementing nonlinear functions in hardware, lightweight hardware accelerator designs for KAN still lack thorough research. |
|---|---|
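The summary describes KAN as replacing an ANN's weight multiplications and fixed activations with learnable univariate nonlinear functions on each edge. The following is a minimal NumPy sketch of that idea only, not the paper's BiKA design: it assumes a toy parameterization where each edge function is a linear combination of fixed Gaussian basis functions (the original KAN uses B-splines), with made-up dimensions and names.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_basis(x, centers, width=0.5):
    """Evaluate fixed Gaussian basis functions at each scalar input.

    x: (n_in,) input vector; centers: (n_basis,) grid points.
    Returns a (n_in, n_basis) array of basis activations.
    """
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * width**2))

class ToyKANLayer:
    """Sketch of a KAN-style layer: y_j = sum_i phi_ij(x_i), where each
    edge function phi_ij is a learnable combination of shared bases."""

    def __init__(self, n_in, n_out, n_basis=8):
        self.centers = np.linspace(-2.0, 2.0, n_basis)
        # One learnable coefficient vector per (input, output) edge.
        self.coef = rng.normal(scale=0.1, size=(n_in, n_out, n_basis))

    def __call__(self, x):
        B = gaussian_basis(x, self.centers)          # (n_in, n_basis)
        # phi[i, j] = sum_k coef[i, j, k] * B[i, k]
        phi = np.einsum("ik,ijk->ij", B, self.coef)  # (n_in, n_out)
        return phi.sum(axis=0)                       # (n_out,)

layer = ToyKANLayer(n_in=4, n_out=3)
y = layer(np.array([0.1, -0.5, 1.2, 0.0]))
print(y.shape)  # (3,)
```

In a hardware context, the cost the summary alludes to is visible here: each edge evaluates a nonlinear function rather than a single multiply, which is why lightweight (e.g., binarized) formulations are of interest.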