Seminar Computational Intelligence A (708.111)

WS 2018/2019

Institut für Grundlagen der Informationsverarbeitung (708)

Lecturer:

Assoc. Prof. Dr. Robert Legenstein

Office hours: by appointment (via e-mail)

E-mail: robert.legenstein@igi.tugraz.at
Homepage: www.tugraz.at/institute/igi/team/prof-legenstein/




Location: IGI-seminar room, Inffeldgasse 16b/I, 8010 Graz
Date: starting on Tuesday, Oct 2, 2018, 15:15-17:00 (TUGonline)

Content of the seminar: Training Spiking Neural Networks

"In recent years, deep learning has revolutionized the field of machine learning, for computer vision in particular. In this approach, a deep (multilayer) artificial neural network (ANN) is trained in a supervised manner using backpropagation. Vast amounts of labeled training examples are required, but the resulting classification accuracy is truly impressive, sometimes outperforming humans. Neurons in an ANN are characterized by a single, static, continuous-valued activation. Yet biological neurons use discrete spikes to compute and transmit information, and the spike times, in addition to the spike rates, matter. Spiking neural networks (SNNs) are thus more biologically realistic than ANNs, and arguably the only viable option if one wants to understand how the brain computes. SNNs are also more hardware friendly and energy-efficient than ANNs, and are thus appealing for technology, especially for portable devices. However, training deep SNNs remains a challenge. Spiking neurons’ transfer function is usually non-differentiable, which prevents using backpropagation." [Tavanaei et al. Deep Learning in Spiking Neural Networks. arXiv 2018.].

Spiking neural networks are an important alternative to artificial neural networks, in particular in view of future low-energy hardware implementations of neural networks. In this seminar, we will discuss training methods for spiking neural networks, with an emphasis on supervised training and with a brief outlook on spiking neuromorphic hardware.
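Several of the scheduled talks (numbers 2-4) revolve around spike-timing-dependent plasticity (STDP). As a first taste of the material, here is a minimal sketch of the standard pair-based STDP rule as studied, e.g., by Song et al. (2000); the function name and parameter values are illustrative, not those of any specific paper.

    # Minimal sketch of pair-based STDP (cf. Song et al. 2000).
    # Parameter values are illustrative placeholders.
    import numpy as np

    def stdp_weight_change(t_pre, t_post, a_plus=0.01, a_minus=0.012,
                           tau_plus=20.0, tau_minus=20.0):
        """Weight change for a single pre/post spike pair (times in ms)."""
        dt = t_post - t_pre
        if dt > 0:
            # Pre fires before post (causal pairing): potentiation,
            # decaying exponentially with the spike-time difference.
            return a_plus * np.exp(-dt / tau_plus)
        else:
            # Post fires before (or together with) pre: depression.
            return -a_minus * np.exp(dt / tau_minus)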


How to prepare and hold your talk:

The guide presented in the seminar: How to prepare and hold your talk


TALKS:

Date   | #  | Topic / paper title | Presenter 1 | Presenter 2 | Presentation
13.11. | 1  | Maass (1997). Networks of spiking neurons: the third generation of neural network models. PDF | Malic | Zach | PDF
13.11. | 2  | Caporale, Dan (2008). Spike timing-dependent plasticity: a Hebbian learning rule. PDF | Schober | Simon | PDF
20.11. | 3  | Song et al. (2000). Competitive Hebbian learning through spike-timing-dependent synaptic plasticity. PDF | Gruber | Kurz | PDF
20.11. | 4  | Masquelier, Thorpe (2007). Unsupervised learning of visual features through spike timing dependent plasticity. PDF | Hadrovic | Schlüsselbauer | PDF
27.11. | 5  | Ponulak, Kasiński (2010). Supervised learning in spiking neural networks with ReSuMe: sequence learning, classification, and spike shifting. PDF | Schlacher | Spataru | PDF
27.11. | 6  | Gütig, Sompolinsky (2006). The tempotron: a neuron that learns spike timing-based decisions. PDF | Loiseau | Maiti | PDF
11.12. | 7  | Florian (2012). The chronotron: a neuron that learns to fire temporally precise spike patterns. PDF | Srisrisawang | - | PDF
11.12. | 8  | Bohte et al. (2002). Error-backpropagation in temporally encoded networks of spiking neurons. PDF | Rauscher | - | PDF
8.1.   | 9  | Diehl et al. (2015). Fast-classifying, high-accuracy spiking deep networks through weight and threshold balancing. PDF | Mittendrein | - | PDF
8.1.   | 10 | Esser et al. (2016). Convolutional networks for fast, energy-efficient neuromorphic computing. PDF | Schwarzl | Walch | PDF
15.1.  | 11 | Bellec et al. (2018). Long short-term memory and Learning-to-learn in networks of spiking neurons. PDF | Lackner | - | PDF

Papers:

General:

Maass, W. (1997). Networks of spiking neurons: the third generation of neural network models. Neural Networks, 10(9), 1659-1671. PDF

Bohte, S. M. (2004). The evidence for neural information processing with precise spike-times: A survey. Natural Computing, 3(2), 195-206. PDF

Caporale, N., & Dan, Y. (2008). Spike timing-dependent plasticity: a Hebbian learning rule. Annual Review of Neuroscience, 31, 25-46. PDF

Single layer unsupervised:

Song, S., Miller, K. D., & Abbott, L. F. (2000). Competitive Hebbian learning through spike-timing-dependent synaptic plasticity. Nature Neuroscience, 3(9), 919. PDF

Masquelier, T., & Thorpe, S. J. (2007). Unsupervised learning of visual features through spike timing dependent plasticity. PLoS Computational Biology, 3(2), e31. PDF

Single layer supervised:

Pfister, J. P., Toyoizumi, T., Barber, D., & Gerstner, W. (2006). Optimal spike-timing-dependent plasticity for precise action potential firing in supervised learning. Neural Computation, 18(6), 1318-1348. PDF

Ponulak, F., & Kasiński, A. (2010). Supervised learning in spiking neural networks with ReSuMe: sequence learning, classification, and spike shifting. Neural Computation, 22(2), 467-510. PDF

Gütig, R., & Sompolinsky, H. (2006). The tempotron: a neuron that learns spike timing-based decisions. Nature Neuroscience, 9(3), 420. PDF

Florian, R. V. (2012). The chronotron: a neuron that learns to fire temporally precise spike patterns. PLoS ONE, 7(8), e40233. PDF

Multilayer supervised:

Bohte, S. M., Kok, J. N., & La Poutre, H. (2002). Error-backpropagation in temporally encoded networks of spiking neurons. Neurocomputing, 48(1-4), 17-37. PDF

Booij, O., & tat Nguyen, H. (2005). A gradient descent rule for spiking neurons emitting multiple spikes. Information Processing Letters, 95(6), 552-558. PDF

Lee, J. H., Delbruck, T., & Pfeiffer, M. (2016). Training deep spiking neural networks using backpropagation. Frontiers in Neuroscience, 10, 508. PDF

Zenke, F., & Ganguli, S. (2018). SuperSpike: Supervised learning in multilayer spiking neural networks. Neural Computation, 30(6), 1514-1541. PDF

Huh, D., & Sejnowski, T. J. (2017). Gradient descent for spiking neural networks. arXiv preprint arXiv:1706.04698. PDF

Mostafa, H. (2018). Supervised learning based on temporal coding in spiking neural networks. IEEE Transactions on Neural Networks and Learning Systems, 29(7), 3227-3235. PDF

Wu, Y., Deng, L., Li, G., Zhu, J., & Shi, L. (2018). Spatio-temporal backpropagation for training high-performance spiking neural networks. Frontiers in Neuroscience, 12. PDF

Tavanaei, A., & Maida, A. S. (2017). BP-STDP: Approximating backpropagation using spike timing dependent plasticity. arXiv preprint arXiv:1711.04214. PDF

Bellec, G., Salaj, D., Subramoney, A., Legenstein, R., & Maass, W. (2018). Long short-term memory and learning-to-learn in networks of spiking neurons. arXiv preprint arXiv:1803.09574. PDF

Misc:

O'Connor, P., & Welling, M. (2016). Deep spiking networks. arXiv preprint arXiv:1602.08323. PDF

Hardware:

Diehl, P. U., Neil, D., Binas, J., Cook, M., Liu, S. C., & Pfeiffer, M. (2015). Fast-classifying, high-accuracy spiking deep networks through weight and threshold balancing. In Neural Networks (IJCNN), 2015 International Joint Conference on (pp. 1-8). IEEE. PDF

Rueckauer, B., Lungu, I. A., Hu, Y., Pfeiffer, M., & Liu, S. C. (2017). Conversion of continuous-valued deep networks to efficient event-driven networks for image classification. Frontiers in Neuroscience, 11, 682. PDF

Esser, S. K., et al. (2016). Convolutional networks for fast, energy-efficient neuromorphic computing. PNAS, 113(41), 11441-11446. PDF

See also:

Esser, S. K., Appuswamy, R., Merolla, P., Arthur, J. V., & Modha, D. S. (2015). Backpropagation for energy-efficient neuromorphic computing. In Advances in Neural Information Processing Systems (pp. 1117-1125). PDF

Talks should be no longer than 35 minutes, and should be clear, interesting, and informative rather than a reprint of the material. Select which parts of the material you want to present and which to leave out, and then present the selected material well (including definitions not given in the material: look them up on the web, or if that is unsuccessful, ask the seminar organizers). Diagrams and figures are often useful in a talk. On the other hand, citing numbered references that are listed at the end is a no-no (a talk is an online process, not meant to be read). For the same reason, you may also briefly repeat earlier definitions if you suspect that the audience does not remember them.


Talks will be assigned at the first seminar meeting on October 2, 15:15-17:00. Students are asked to take a quick look at the papers before this meeting in order to determine their preferences.

General rules:

Participation in the seminar meetings is obligatory. We also request your courtesy and attention for the seminar speaker: no smartphones, laptops, etc. during a talk. Furthermore, your active attention, questions, and discussion contributions are expected.

After your talk (and possibly some corrections), send a PDF of your talk to Charlotte Rumpf, who will post it on the seminar webpage.