Temporal single spike coding for effective transfer learning in spiking neural networks
| Published in | Scientific Reports Vol. 15; no. 1; pp. 34094 - 25 |
|---|---|
| Main Authors | , , |
| Format | Journal Article |
| Language | English |
| Published | London: Nature Publishing Group UK, 30.09.2025 |
| Subjects | |
| Online Access | Get full text |
| ISSN | 2045-2322 |
| DOI | 10.1038/s41598-025-14619-3 |
| Summary: | In this work, a supervised learning rule based on Temporal Single Spike Coding for Effective Transfer Learning (TS4TL) is presented: an efficient approach for training multilayer fully connected Spiking Neural Networks (SNNs) as classifier blocks within a Transfer Learning (TL) framework. A new target assignment method named “Absolute Target” is proposed, which utilizes a fixed, non-relative target signal specifically designed for single-spike temporal coding. In this approach, the firing time of the correct output neuron is treated as the target spike time, while no spikes are assigned to the other neurons. Unlike existing relative target strategies, this method minimizes computational complexity, reduces training time, and decreases energy consumption by limiting the number of spikes required for classification, all while ensuring a stable and computationally efficient training process. By seamlessly integrating this learning rule into the TL framework, TS4TL effectively leverages pre-trained feature extractors, demonstrating robust performance even with limited labelled data and varying data distributions. The proposed learning rule scales efficiently across both shallow and deep network architectures while maintaining consistent accuracy and reliability. Extensive evaluations on benchmark datasets highlight the strength of this approach, achieving state-of-the-art accuracies, including 98.91% on Eth80, surpassing previous works, and 91.89% on Fashion-MNIST, outperforming all fully connected structures in the literature. Additionally, high accuracies of 98.45% and 97.75% were recorded on the MNIST and Caltech101-Face/Bike datasets, respectively. Furthermore, TS4TL addresses a critical challenge by effectively reducing neuron misfires, ensuring that neurons respond correctly based on first-spike coding, a significant improvement over manually imposed solutions seen in prior works. These contributions collectively highlight the potential of TS4TL as a scalable and high-performance solution for temporal learning in SNNs. |
|---|---|
| Bibliography: | ObjectType-Article-1; SourceType-Scholarly Journals-1; ObjectType-Feature-2 |
| ISSN: | 2045-2322 |
| DOI: | 10.1038/s41598-025-14619-3 |
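The summary's “Absolute Target” assignment can be made concrete with a small, hedged sketch: the correct output neuron keeps its own first-spike time as the target, while every other output neuron is assigned “no spike”. Everything below (the function names, the `T_MAX` horizon, and the squared-error loss) is an illustrative assumption rather than the authors' implementation.

```python
# Minimal sketch of an "Absolute Target" assignment for single-spike temporal coding.
# Assumption: spikes are encoded as first-spike times in [0, T_MAX], and a time of
# T_MAX stands for "no spike". Names here are hypothetical, not from the paper.

import numpy as np

T_MAX = 256  # assumed simulation horizon (time steps)

def assign_absolute_targets(first_spike_times: np.ndarray, label: int) -> np.ndarray:
    """Build the target spike-time vector for one sample.

    first_spike_times: shape (num_classes,), earliest spike time of each output
                       neuron (T_MAX if the neuron never fired).
    label:             index of the correct class.
    """
    targets = np.full_like(first_spike_times, T_MAX)   # wrong classes: no spike
    targets[label] = first_spike_times[label]          # correct class: its own firing time
    return targets

def temporal_loss(first_spike_times: np.ndarray, targets: np.ndarray) -> float:
    """Squared error on spike times, a common choice for single-spike coding
    (the paper's exact loss may differ)."""
    return 0.5 * float(np.sum((first_spike_times - targets) ** 2))

# Example: 4-class output layer, class 2 is correct.
times = np.array([37.0, 120.0, 15.0, 80.0])
targets = assign_absolute_targets(times, label=2)
print(targets)                        # [256. 256.  15. 256.]
print(temporal_loss(times, targets))  # only the wrong neurons' spikes are penalized
```

Under first-spike coding, the predicted class would simply be the neuron with the earliest spike (`argmin` of the output spike times); the actual learning rule, loss, and simulation details should be taken from the full text.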