Resistive memory device requirements for a neural algorithm accelerator


Bibliographic Details
Published in: 2016 International Joint Conference on Neural Networks (IJCNN), pp. 929-938
Main Authors: Agarwal, Sapan; Plimpton, Steven J.; Hughart, David R.; Hsia, Alexander H.; Richter, Isaac; Cox, Jonathan A.; James, Conrad D.; Marinella, Matthew J.
Format: Conference Proceeding
Language: English
Published: IEEE, 01.07.2016
ISSN: 2161-4407
DOI: 10.1109/IJCNN.2016.7727298


Summary: Resistive memories enable dramatic energy reductions for neural algorithms. We propose a general-purpose neural architecture that can accelerate many different algorithms, and we determine the device properties needed to run backpropagation on it. To maintain high accuracy, the read noise standard deviation should be less than 5% of the weight range. The write noise standard deviation should be less than 0.4% of the weight range, and up to 300% of a characteristic update (for the datasets tested). Asymmetric nonlinearities in the change in conductance versus pulse number cause weight decay and significantly reduce accuracy, while moderate symmetric nonlinearities have no effect. To allow parallel reads and writes, the write current should also be less than 100 nA.
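The noise tolerances quoted in the summary can be made concrete with a small sketch. This is not the authors' simulator; it simply models a resistive weight array whose reads and writes are corrupted by Gaussian noise, with the noise levels set at the abstract's thresholds (read noise at 5% of the weight range, write noise at 0.4%). The weight span of [-1, 1] is an assumption for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed weight span for illustration: [-1, 1], so the range is 2.0.
WEIGHT_RANGE = 2.0
READ_NOISE_STD = 0.05 * WEIGHT_RANGE    # abstract: < 5% of weight range
WRITE_NOISE_STD = 0.004 * WEIGHT_RANGE  # abstract: < 0.4% of weight range

def noisy_read(weights):
    """Each read returns the stored conductance plus Gaussian read noise."""
    return weights + rng.normal(0.0, READ_NOISE_STD, weights.shape)

def noisy_write(weights, delta):
    """Each programming pulse applies the update plus Gaussian write noise,
    clipped to the device's conductance limits."""
    noise = rng.normal(0.0, WRITE_NOISE_STD, weights.shape)
    return np.clip(weights + delta + noise, -1.0, 1.0)

w = rng.uniform(-1.0, 1.0, size=(4, 4))
w = noisy_write(w, delta=0.01 * np.ones_like(w))
print(noisy_read(w).shape)  # (4, 4)
```

A training loop built on such a model would route every forward-pass read through `noisy_read` and every backprop update through `noisy_write`, which is how one could probe accuracy sensitivity to the two noise terms independently.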