Design of an energy-efficient accelerator for training of convolutional neural networks using frequency-domain computation

Bibliographic Details
Published in: 2017 54th ACM/EDAC/IEEE Design Automation Conference (DAC), pp. 1-6
Main Authors: Jong Hwan Ko, Burhan Mudassar, Taesik Na, Saibal Mukhopadhyay
Format: Conference Proceeding
Language: English
Published: IEEE, 18.06.2017
DOI: 10.1145/3061639.3062228

More Information
Summary: Training convolutional neural networks (CNNs) demands substantial computation and memory. This paper presents the design of a frequency-domain accelerator for energy-efficient CNN training. With Fourier representations of parameters, we replace convolutions with simpler pointwise multiplications. To eliminate the Fourier transforms at every layer, we train the network entirely in the frequency domain using approximate frequency-domain nonlinear operations. We further reduce computation and memory requirements using sinc interpolation and Hermitian symmetry. The accelerator is designed and synthesized in 28 nm CMOS, as well as prototyped on an FPGA. The simulation results show that the proposed accelerator significantly reduces training time and energy for a target recognition accuracy.
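The core substitution the summary describes follows from the convolution theorem: (circular) convolution in the spatial domain equals pointwise multiplication of the spectra, and real-valued inputs have Hermitian-symmetric spectra, so only about half the frequency coefficients need be stored. The sketch below illustrates both points with NumPy's real FFT; the function names are illustrative and not from the paper.

```python
import numpy as np

def circular_conv2d_spatial(x, k):
    """Direct 2-D circular convolution (slow reference implementation)."""
    h, w = x.shape
    out = np.zeros_like(x)
    for i in range(h):
        for j in range(w):
            s = 0.0
            for a in range(h):
                for b in range(w):
                    s += x[a, b] * k[(i - a) % h, (j - b) % w]
            out[i, j] = s
    return out

def circular_conv2d_freq(x, k):
    """Same result via pointwise multiplication of rfft2 spectra.

    rfft2 exploits Hermitian symmetry of real inputs, keeping only
    roughly half of the frequency coefficients.
    """
    return np.fft.irfft2(np.fft.rfft2(x) * np.fft.rfft2(k), s=x.shape)

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 8))
k = rng.standard_normal((8, 8))

# Pointwise multiplication in frequency matches spatial convolution.
assert np.allclose(circular_conv2d_spatial(x, k), circular_conv2d_freq(x, k))

# Hermitian symmetry: an 8x8 real image needs only an 8x5 complex spectrum.
assert np.fft.rfft2(x).shape == (8, 5)
```

Note this demonstrates only the convolution-theorem substitution; the paper's contributions (approximate frequency-domain nonlinearities and sinc interpolation, which avoid transforming back at every layer) are not reproduced here.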