PyGeNN: A Python Library for GPU-Enhanced Neural Networks

Bibliographic Details
Published in: Frontiers in Neuroinformatics, Vol. 15, p. 659005
Main Authors: Knight, James C.; Komissarov, Anton; Nowotny, Thomas
Format: Journal Article
Language: English
Published: Switzerland: Frontiers Research Foundation / Frontiers Media S.A., 22.04.2021
ISSN: 1662-5196
DOI: 10.3389/fninf.2021.659005

More Information
Summary: More than half of the Top 10 supercomputing sites worldwide use GPU accelerators and they are becoming ubiquitous in workstations and edge computing devices. GeNN is a C++ library for generating efficient spiking neural network simulation code for GPUs. However, until now, the full flexibility of GeNN could only be harnessed by writing model descriptions and simulation code in C++. Here we present PyGeNN, a Python package which exposes all of GeNN's functionality to Python with minimal overhead. This provides an alternative, arguably more user-friendly, way of using GeNN and allows modelers to use GeNN within the growing Python-based machine learning and computational neuroscience ecosystems. In addition, we demonstrate that, in both Python and C++ GeNN simulations, the overheads of recording spiking data can strongly affect runtimes and show how a new spike recording system can reduce these overheads by up to 10×. Using the new recording system, we demonstrate that by using PyGeNN on a modern GPU, we can simulate a full-scale model of a cortical column faster even than real-time neuromorphic systems. Finally, we show that long simulations of a smaller model with complex stimuli and a custom three-factor learning rule defined in PyGeNN can be simulated almost two orders of magnitude faster than real-time.
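To make the workflow described in the summary concrete, the following is a minimal sketch of how a model might be defined and simulated through the pygenn interface, assuming the GeNN 4.x-era Python API; the population size, LIF parameters, and simulation length are illustrative placeholders, not values taken from the paper.

from pygenn.genn_model import GeNNModel

# Build a model using single-precision scalars (names and values are illustrative)
model = GeNNModel("float", "pygenn_example")
model.dT = 1.0  # simulation time step in ms

# Parameters and initial state for GeNN's built-in "LIF" neuron model
lif_params = {"C": 1.0, "TauM": 20.0, "Vrest": -65.0, "Vreset": -65.0,
              "Vthresh": -50.0, "Ioffset": 0.0, "TauRefrac": 5.0}
lif_init = {"V": -65.0, "RefracTime": 0.0}

pop = model.add_neuron_population("neurons", 100, "LIF", lif_params, lif_init)
pop.spike_recording_enabled = True  # opt in to the GPU spike recording system

# Generate and compile the simulation code, then load it, reserving recording buffers
model.build()
model.load(num_recording_timesteps=1000)

# Simulate 1000 timesteps, then copy the recorded spikes back to the host
while model.t < 1000.0:
    model.step_time()
model.pull_recording_buffers_from_device()
spike_times, spike_ids = pop.spike_recording_data

In this sketch the recording buffer is sized to hold the entire run; in longer simulations it would typically be pulled from the device and processed at regular intervals.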
Reviewed by: Mikael Djurfeldt, Royal Institute of Technology, Sweden; Alexander K. Kozlov, Royal Institute of Technology, Sweden
Edited by: Gaute T. Einevoll, Norwegian University of Life Sciences, Norway