Institut für Grundlagen der Informationsverarbeitung (708)
Assoc. Prof. Dr. Robert Legenstein
Office hours: by appointment (via email)
Email: robert.legenstein@igi.tugraz.at
Homepage: www.igi.tugraz.at/legi/
Date          Speaker              Paper
Mar 28, 2012  Robert Legenstein    A quick introduction to Boltzmann Machines
Apr 25, 2012  Daniel Markl         Reducing the dimensionality of data with neural networks, Slides
May 23, 2012  Teresa Klatzer       Learning Deep Architectures for AI (2), Slides
Jun 6, 2012   Florian Hubner       Unsupervised learning of image transformations, Slides
Jun 13, 2012  Markus Eger          The Recurrent Temporal Restricted Boltzmann Machine, Slides
Jun 20, 2012  Gernot Griesbacher   Neural sampling: A model for stochastic computation in recurrent networks of spiking neurons, Slides
Jun 20, 2012  Michael Rath         Probabilistic inference in general graphical models through sampling in stochastic networks of spiking neurons, Slides
Jul 04, 2012  Philipp Singer       Discovering Binary Codes for Documents by Learning Deep Generative Models, Slides
Hinton, G. E. and Salakhutdinov, R. R.
Reducing the dimensionality of data with neural networks. Science, Vol. 313, No. 5786, pp. 504–507, 28 July 2006. The Science paper that made deep networks popular. [full paper] [supporting online material (pdf)] [Matlab code]
Hinton, G. E., Osindero, S. and Teh, Y.
A fast learning algorithm for deep belief nets. Neural Computation 18, pp. 1527–1554, 2006. The basis for deep learning: the contrastive divergence learning algorithm. [pdf]
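The contrastive divergence update at the heart of this paper can be sketched for a binary restricted Boltzmann machine. The following is an illustrative NumPy sketch of one CD-1 step, not code from the paper; function names and the learning rate are our own choices:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample_bernoulli(p):
    """Sample binary units from their activation probabilities."""
    return (rng.random(p.shape) < p).astype(float)

def cd1_update(W, b_vis, b_hid, v0, lr=0.1):
    """One CD-1 step for a binary RBM on a batch of visible vectors v0."""
    # Positive phase: hidden probabilities given the data.
    ph0 = sigmoid(v0 @ W + b_hid)
    h0 = sample_bernoulli(ph0)
    # Negative phase: one Gibbs step back to a reconstruction.
    pv1 = sigmoid(h0 @ W.T + b_vis)
    v1 = sample_bernoulli(pv1)
    ph1 = sigmoid(v1 @ W + b_hid)
    # Approximate gradient: data correlations minus reconstruction correlations.
    n = v0.shape[0]
    W += lr * (v0.T @ ph0 - v1.T @ ph1) / n
    b_vis += lr * (v0 - v1).mean(axis=0)
    b_hid += lr * (ph0 - ph1).mean(axis=0)
    return W, b_vis, b_hid
```

Repeating this step over mini-batches trains one RBM layer; the paper stacks such layers greedily to initialize a deep belief net.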
Taylor, G. W., Hinton, G. E. and Roweis, S.
Modeling human motion using binary latent variables. Advances in Neural Information Processing Systems 19, MIT Press, Cambridge, MA, 2007. [pdf]
Memisevic, R. and Hinton, G. E.
Unsupervised learning of image transformations. IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2007.
Salakhutdinov, R. R., Mnih, A. and Hinton, G. E.
Restricted Boltzmann Machines for Collaborative Filtering. International Conference on Machine Learning, Corvallis, Oregon, 2007. [pdf]
Sutskever, I., Hinton, G. E. and Taylor, G. W.
The Recurrent Temporal Restricted Boltzmann Machine. Advances in Neural Information Processing Systems 21, MIT Press, Cambridge, MA. [pdf]
Memisevic, R. and Hinton, G. E.
Learning to represent spatial transformations with factored higher-order Boltzmann machines. Neural Computation, Vol. 22, pp. 1473–1492. [pdf]
Hinton, G. E. and Salakhutdinov, R.
Discovering Binary Codes for Fast Document Retrieval by Learning Deep Generative Models. Topics in Cognitive Science, Vol. 3, pp. 74–91. [pdf]
Salakhutdinov, R., Tenenbaum, J. and Torralba, A.
Learning to Learn with Compound Hierarchical-Deep Models. Neural Information Processing Systems (NIPS 25), 2012. [pdf]
Salakhutdinov, R. and Hinton, G. E.
An Efficient Learning Procedure for Deep Boltzmann Machines. MIT Technical Report MIT-CSAIL-TR-2010-037, 2010. [pdf]
Bengio, Y.
Learning Deep Architectures for AI. Foundations and Trends in Machine Learning, Vol. 2, No. 1, pp. 1–127, 2009. [pdf]
Büsing, L., Bill, J., Nessler, B. and Maass, W.
Neural dynamics as sampling: A model for stochastic computation in recurrent networks of spiking neurons. PLoS Computational Biology, published 03 Nov 2011. doi:10.1371/journal.pcbi.1002211 [pdf]
This paper shows how Boltzmann machines can be implemented by networks of spiking neurons.
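The computation these neural-sampling papers realize with spiking neurons is, at its core, Gibbs sampling from a Boltzmann distribution p(z) ∝ exp(bᵀz + ½zᵀWz) over binary states. A minimal sketch of that underlying sampler (an illustration under our own parameter choices, not the papers' spiking-neuron model):

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gibbs_sample(W, b, n_steps=2000, burn_in=200):
    """Sequential Gibbs sampling over binary states z for a Boltzmann
    distribution with symmetric weights W (zero diagonal) and biases b."""
    n = len(b)
    z = rng.integers(0, 2, size=n).astype(float)
    samples = []
    for t in range(n_steps):
        for k in range(n):
            # Each unit turns on with probability sigmoid of its local field;
            # in the neural-sampling model this quantity corresponds to a
            # neuron's membrane potential and the update to a spike.
            z[k] = float(rng.random() < sigmoid(b[k] + W[k] @ z))
        if t >= burn_in:
            samples.append(z.copy())
    return np.array(samples)
```

With all weights zero, each unit's empirical firing rate should converge to sigmoid of its bias, which is a quick sanity check on the sampler.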
Pecevski, D., Büsing, L. and Maass, W.
Probabilistic inference in general graphical models through sampling in stochastic networks of spiking neurons. PLoS Computational Biology, 7(12):e1002294, 2011. [pdf]