A Supervised Learning Algorithm for Multilayer Spiking Neural Networks Based on Temporal Coding Toward Energy-Efficient VLSI Processor Design

Yusuke Sakemi, Kai Morino, Takashi Morie, Kazuyuki Aihara

Research output: Contribution to journal › Article › peer-review

8 Citations (Scopus)

Abstract

Spiking neural networks (SNNs) are brain-inspired mathematical models that process information in the form of spikes. SNNs are expected to provide not only new machine-learning algorithms but also energy-efficient computational models when implemented in very-large-scale integration (VLSI) circuits. In this article, we propose a novel supervised learning algorithm for SNNs based on temporal coding. The spiking neuron in this algorithm is designed to facilitate analog VLSI implementations with analog resistive memory, through which ultrahigh energy efficiency can be achieved. We also propose several techniques to improve performance on recognition tasks and show that the classification accuracy of the proposed algorithm matches that of state-of-the-art temporal-coding SNN algorithms on the MNIST and Fashion-MNIST datasets. Finally, we discuss the robustness of the proposed SNNs against variations that arise from the device manufacturing process and are unavoidable in analog VLSI implementation, and we propose a technique to suppress the effects of such variations on recognition performance.
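The abstract does not spell out the neuron model, so the following is a minimal sketch under one common temporal-coding assumption: a time-to-first-spike code with a non-leaky integrate-and-fire neuron whose membrane potential ramps linearly with slope w_ij after each input spike at time t_j. Under that assumption the firing time has the closed form t_i = (v_th + sum_{j in C} w_ij * t_j) / (sum_{j in C} w_ij), where C is the causal set of inputs arriving before t_i. The function name firing_times, the threshold v_th, and the weight values below are illustrative, not the paper's actual API or parameterization.

    import numpy as np

    def firing_times(t_in, W, v_th=1.0):
        """Closed-form time-to-first-spike pass for one fully connected layer.

        Assumes v_i(t) = sum_{j: t_j <= t} W[i, j] * (t - t_j); the neuron
        fires when v_i reaches v_th (a sketch, not the paper's exact model).
        """
        n_out, _ = W.shape
        t_out = np.full(n_out, np.inf)
        order = np.argsort(t_in)              # process inputs in arrival order
        for i in range(n_out):
            w_sum, wt_sum = 0.0, 0.0
            for k, j in enumerate(order):
                w_sum += W[i, j]              # current slope of the membrane potential
                wt_sum += W[i, j] * t_in[j]
                if w_sum <= 0.0:
                    continue                  # non-positive slope: threshold unreachable yet
                t_cand = (v_th + wt_sum) / w_sum
                nxt = t_in[order[k + 1]] if k + 1 < len(order) else np.inf
                if t_cand <= nxt:             # consistent causal set: fires before next input
                    t_out[i] = t_cand
                    break
        return t_out

    # Hypothetical usage: earliest-firing output neuron encodes the predicted class.
    rng = np.random.default_rng(0)
    t_in = rng.uniform(0.0, 1.0, size=784)      # input spike times (e.g., encoded pixels)
    W = rng.normal(0.05, 0.5, size=(10, 784))   # illustrative weights, one row per class
    print(firing_times(t_in, W))

Because t_i in this formulation is differentiable with respect to the weights and the input spike times, such a closed form admits standard gradient-based training, which is what makes this class of temporal-coding algorithms amenable to supervised learning with backpropagation.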

Original language: English
Pages (from-to): 394-408
Number of pages: 15
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 34
Issue number: 1
DOIs
Publication status: Published - Jan 1 2023

All Science Journal Classification (ASJC) codes

  • Software
  • Computer Science Applications
  • Computer Networks and Communications
  • Artificial Intelligence
