BiKA: Binarized KAN-inspired Neural Network for Efficient Hardware Accelerator Designs

Bibliographic Details
Published in: Proceedings ... Annual IEEE Symposium on Field-Programmable Custom Computing Machines (Online), p. 276
Main Authors: Liu, Yuhao; Ullah, Salim; Kumar, Akash
Format: Conference Proceeding
Language: English
Published: IEEE, 04.05.2025
ISSN: 2576-2621
DOI: 10.1109/FCCM62733.2025.00036

Summary: The continuously growing size of Neural Network (NN) models makes the design of lightweight neural network accelerators for edge devices an emerging subject in recent research. Previous works have explored various lightweight techniques and even emerging neural network structures, such as quantization, approximate computing, and neuromorphic computing, to reduce hardware resource consumption in accelerator designs. This inspired our interest in exploring the potential of other emerging network structures for hardware accelerator designs. The Kolmogorov-Arnold Network (KAN) [1] is a recently proposed neural network structure that replaces the multiplications and activation functions of Artificial Neural Networks (ANNs) with learnable nonlinear functions, and it has the potential to transform the paradigm of neural network design. However, given the complexity of implementing these nonlinear functions in hardware, lightweight hardware accelerator designs for KAN have not yet been thoroughly researched.
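To make the layer structure described in the summary concrete, below is a minimal sketch of a KAN-style layer in Python/NumPy: each input-output edge carries its own learnable univariate function, approximated here by piecewise-linear interpolation over a fixed grid, and the layer output is the sum of these per-edge functions. The class name, grid parameterization, and dimensions are illustrative assumptions only; this is not the binarized construction (BiKA) proposed in the paper.

```python
# Illustrative sketch of a KAN-style layer (not the paper's BiKA design):
# each edge (i, j) has its own learnable univariate function phi_{j,i},
# approximated here by piecewise-linear interpolation over a fixed grid.
import numpy as np

class KANLayerSketch:
    def __init__(self, in_dim, out_dim, grid_points=8, x_min=-1.0, x_max=1.0):
        # Shared input grid for all edges (an assumption for simplicity).
        self.grid = np.linspace(x_min, x_max, grid_points)
        # One set of learnable grid values per edge: (out_dim, in_dim, grid_points).
        self.values = np.random.randn(out_dim, in_dim, grid_points) * 0.1

    def forward(self, x):
        # x: shape (in_dim,). Output y_j = sum_i phi_{j,i}(x_i).
        out_dim, in_dim, _ = self.values.shape
        y = np.zeros(out_dim)
        for j in range(out_dim):
            for i in range(in_dim):
                # Piecewise-linear interpolation stands in for the learnable
                # nonlinear function on edge (i, j).
                y[j] += np.interp(x[i], self.grid, self.values[j, i])
        return y

# Contrast with a conventional ANN layer, y = sigma(W x): one scalar weight per
# edge followed by a fixed activation, rather than a learnable function per edge.
layer = KANLayerSketch(in_dim=4, out_dim=2)
print(layer.forward(np.array([0.3, -0.5, 0.1, 0.9])))
```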