Now available from MIT Press

PULSED NEURAL NETWORKS
by Wolfgang Maass and Christopher M. Bishop (eds.)

General Information about the Book

Pulsed Neural Networks

MIT Press 
ISBN 0-262-13350-4 
408 pp., 195 illus.
$45.00 (cloth)

Pulsed Neural Networks
 

Most practical applications of artificial neural networks are based on a computational model involving the propagation of continuous variables from one processing unit to the next. In recent years, data from neurobiological experiments have made it increasingly clear that biological neural networks, which communicate through pulses, use the timing of the pulses to transmit information and perform computation. This realization has stimulated significant research on pulsed neural networks, including theoretical analyses and model development, neurobiological modeling, and hardware implementation.

This book presents the complete spectrum of current research in pulsed neural networks and includes the most important work from many of the key scientists in the field. Terrence J. Sejnowski's foreword, "Neural Pulse Coding," presents an overview of the topic. The first half of the book consists of longer tutorial articles spanning neurobiology, theory, algorithms, and hardware. The second half contains a larger number of shorter research chapters that present more advanced concepts. The contributors use consistent notation and terminology throughout the book.


Preface

The majority of artificial neural network models are based on a computational paradigm involving the propagation of continuous variables from one processing unit to the next. In recent years, however, data from neurobiological experiments have made it increasingly clear that biological neural networks, which communicate through pulses, use the timing of these pulses to transmit information and to perform computation. This realization has stimulated a significant growth of research activity in the area of pulsed neural networks ranging from neurobiological modeling and theoretical analyses, to algorithm development and hardware implementations. Such research is motivated both by the desire to enhance our understanding of information processing in biological networks, as well as by the goal of developing new information processing technologies. 

Our aim in producing this book has been to provide a comprehensive treatment of the field of pulsed neural networks, which will be accessible to researchers from diverse disciplines such as electrical engineering, signal processing, computer science, physics, and computational neuroscience. By virtue of its pedagogical emphasis, it will also find a place in the advanced undergraduate and graduate courses in neural networks now taught at many universities. 

The Isaac Newton Institute

This book originated from a two-day workshop entitled Pulsed Neural Networks that we organized in August 1997 at the Isaac Newton Institute for Mathematical Sciences in Cambridge.1 The workshop formed part of the six-month Newton Institute program Neural Networks and Machine Learning, organized by Chris Bishop, David Haussler, Geoffrey Hinton, Mahesan Niranjan and Leslie Valiant. This research program was the largest international event of its kind to have taken place in the field of neural computing, and attracted several hundred participants for visits ranging from one or two weeks up to six months. 

The workshop on Pulsed Neural Networks comprised two days of invited presentations by many of the foremost researchers in the field, and proved to be a very timely event. In view of the interdisciplinary nature of this subject, the workshop included a number of tutorials that introduced pulsed neural networks from the point of view of different disciplines. As a result of the success of the workshop, there was considerable enthusiasm to capture the highlights of the meeting in book form and thereby make the workshop contributions, including both tutorials and research presentations, accessible to a much wider audience. All contributions were rewritten to take into account the special context of this book, and to use consistent terminology and notation across the different disciplines. We hope this book will convey some of the excitement of the workshop and of the field of pulsed neural networks. 

Overview of the Book

The Foreword by Terry Sejnowski sets the stage for the book. The core of the book consists of three parts. The first part (Basic Concepts and Models) comprises four tutorial chapters. The tutorial Spiking Neurons (Chapter 1) by Wulfram Gerstner introduces the neurophysiological background and motivations for computing with pulses. It discusses a simple mathematical model for a spiking neuron, the spike response model, which provides the basis for later chapters. The tutorial Computing with Spiking Neurons (Chapter 2) by Wolfgang Maass analyzes the computational power of networks of spiking neurons, and compares them with traditional neural network models. Hardware implementations of pulsed neural nets are discussed in the tutorial Pulse-Based Computation in VLSI Neural Networks (Chapter 3) by Alan Murray. The tutorial Encoding Information in Neuronal Activity (Chapter 4) by Michael Recce surveys results about the way in which biological systems encode information in spatio-temporal patterns of pulses. 
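The flavor of the spike response model mentioned above can be conveyed in a few lines: the membrane potential is a sum of postsynaptic-potential kernels triggered by incoming spikes plus a refractory kernel triggered by the neuron's own past spikes, and the neuron emits a spike whenever this sum crosses a threshold. The following sketch is illustrative only; the kernel shapes, weights, time constants, and spike times are assumptions for demonstration, not values taken from the book.

```python
import numpy as np

DT = 0.1      # time step (ms)
T = 50.0      # simulated duration (ms)
THETA = 1.0   # firing threshold
TAU_M = 4.0   # membrane time constant (ms), assumed
TAU_R = 8.0   # refractory time constant (ms), assumed

def epsilon(s):
    """Postsynaptic potential kernel: response to one input spike at s = 0."""
    return np.where(s > 0, (s / TAU_M) * np.exp(1 - s / TAU_M), 0.0)

def eta(s):
    """Refractory kernel: negative after-potential following an output spike."""
    return np.where(s > 0, -2.0 * np.exp(-s / TAU_R), 0.0)

def simulate(input_spikes, weights):
    """input_spikes: one array of spike times per presynaptic neuron."""
    times = np.arange(0.0, T, DT)
    u = np.zeros_like(times)
    output_spikes = []
    for i, t in enumerate(times):
        # Weighted sum of postsynaptic potentials from all input spikes.
        v = sum(w * epsilon(t - ts).sum()
                for w, ts in zip(weights, input_spikes))
        # Refractory contribution from the neuron's own past spikes.
        v += sum(eta(t - tf) for tf in output_spikes)
        u[i] = v
        if v >= THETA:
            output_spikes.append(t)
    return times, u, output_spikes

if __name__ == "__main__":
    # Two coincident input spikes push the potential over threshold;
    # the same spikes spread far apart in time do not.
    _, _, out = simulate([np.array([5.0]), np.array([5.0])], [0.7, 0.7])
    print("coincident inputs ->", len(out), "output spike(s)")
```

Because each input contributes a kernel with peak amplitude below threshold, the model fires only when input spikes arrive close together in time, which is one way the timing of pulses, rather than just their rate, can carry information.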

The chapters in the second part of the book (Implementations) review a number of options for implementing pulsed neural nets in electronic hardware. Chapters 5 to 8 discuss approaches, and first results, for implementing artificial pulsed neural nets in analog VLSI. Chapter 9 reviews the state of the art regarding digital simulations of pulsed neural nets. 

The third part of the book (Design and Analysis of Pulsed Neural Systems) surveys current research on the design and analysis of pulsed neural networks, in both biological and artificial systems. 

Each of the chapters in the second and third parts should be comprehensible to anyone who has worked through the four tutorials in the first part of the book. Together these chapters constitute a survey of current research issues across all aspects of pulsed neural networks, including mathematical analyses, algorithms, hardware and software implementations, and neurobiology.

Christopher M. Bishop and Wolfgang Maass

1Further information about the Isaac Newton Institute can be found at http://www.newton.cam.ac.uk.


Contributors

Christopher M. Bishop     Preface
Microsoft Research, Cambridge
Cambridge CB2 3NH, England, UK
cmbishop@microsoft.com
http://research.microsoft.com/~cmbishop/
Peter S. Burge     Chapter 13
Department of Computer Science
Royal Holloway, University of London
Egham, England, UK
peter@neurocolt.com
Max R. van Daalen     Chapter 13
Department of Computer Science
Royal Holloway, University of London
Egham, England, UK
max@dcs.rhbnc.ac.uk
Stephen R. Deiss    Chapter 6
Applied Neurodynamics
Encinitas, CA,  92024-5354, USA
deiss@sba.cerf.net
 
Rodney J. Douglas        Chapter 6
Institut für Neuroinformatik
Universität Zürich & ETH Zürich
Zürich, Switzerland
rjd@ini.phys.ethz.ch
John G. Elias     Chapter 5
Department of Electrical and Computer Engineering
University of Delaware
Newark, Delaware 19716, USA
elias@udel.edu
Wulfram Gerstner     Chapters 1, 10, 14
Center for Neuromimetic Systems
Swiss Federal Institute of Technology, EPFL
CH-1015 Lausanne, Switzerland
Wulfram.Gerstner@di.epfl.ch
Alister Hamilton    Chapter 8
Dept. of Electrical Engineering
University of Edinburgh
Edinburgh, Scotland, UK
Alister.Hamilton@ee.ed.ac.uk
J. Leo van Hemmen     Chapter 14
Physik Department, TU München
D-85747 Garching bei München, Germany
Leo.van.Hemmen@Physik.TU-Muenchen.DE
David Horn     Chapter 11
School of Physics and Astronomy
Tel Aviv University
Tel Aviv, Israel
horn@neuron.tau.ac.il
Axel Jahnke     Chapter 9
Institute of Microelectronics
TU Berlin
Berlin, Germany
jahnke@mikro.ee.tu-berlin.de
Richard Kempter     Chapter 14
Institut für Theoretische Physik
Physik-Department der TU München
München, Germany
Richard.Kempter@Physik.TU-Muenchen.DE
Wolfgang Maass     Preface, Chapters 2, 12
Institute of Theoretical Computer Science
Technische Universität Graz
A-8010 Graz, Austria
maass@igi.tu-graz.ac.at
Alessandro Mortara     Chapter 7
Advanced Microelectronics Division 
Centre Suisse d'Electronique 
et de Microtechnique
Neuchâtel, Switzerland 
mortara@csemne.ch
Alan F. Murray     Chapter 3
Dept. of Electrical Engineering 
University of Edinburgh
Edinburgh EH9 3JL, Scotland, UK
Alan.Murray@ee.ed.ac.uk
David P. M. Northmore     Chapter 5
Department of Psychology
University of Delaware
Newark, Delaware 19716, USA
northmor@udel.edu
Irit Opher     Chapter 11
School of Physics and Astronomy
Tel Aviv University
Tel Aviv, Israel
irit@neuron.tau.ac.il
Kostas A. Papathanasiou   Chapter 8
Department of Electrical Engineering
University of Edinburgh
Edinburgh, Scotland, UK
Kostas.Papathanasiou@ee.ed.ac.uk
Michael Recce     Chapter 4
Department of Computer and Information Science
New Jersey Institute of Technology
Newark, NJ 07102, USA
recce@homer.njit.edu
Barry J. P. Rising     Chapter 13
Department of Computer Science
Royal Holloway, University of London
Egham, England, UK
barry@dcs.rhbnc.ac.uk
Ulrich Roth     Chapter 9
Institute of Microelectronics
TU Berlin
Berlin, Germany
roth@mikro.ee.tu-berlin.de
Tim Schönauer     Chapter 9
Institute of Microelectronics
TU Berlin
Berlin, Germany
tim@mikro.ee.tu-berlin.de
Terrence J. Sejnowski     Foreword
The Salk Institute
La Jolla, CA 92037, USA
terry@salk.edu
John S. Shawe-Taylor     Chapter 13
Department of Computer Science
Royal Holloway, University of London
Egham, England, UK
john@dcs.rhbnc.ac.uk
Philippe Venier     Chapter 7
Advanced Microelectronics Division 
Centre Suisse d'Electronique 
et de Microtechnique
Neuchâtel, Switzerland
venier@csemne.ch
Hermann Wagner     Chapter 14
Institut für Biologie       
Lehrstuhl für Zoologie/Tierphysiologie
RWTH Aachen
D-52074 Aachen, Germany
wagner@tyto.bio2.rwth-aachen.de
Adrian M. Whatley     Chapter 6
Institut für Neuroinformatik
Universität Zürich & ETH Zürich
Zürich, Switzerland
amw@ini.phys.ethz.ch
Anthony M. Zador    Chapter 12
The Salk Institute
La Jolla, CA 92037, USA
zador@salk.edu

Table of Contents

for Pulsed Neural Networks by Wolfgang Maass and Christopher M. Bishop (eds.)

Foreword by Terrence J. Sejnowski

Preface

Contributors to the book
 

Basic Concepts and Models

1   Spiking Neurons
1.1 The Problem of Neural Coding
1.2 Neuron Models
1.3 Conclusions
References
2   Computing with Spiking Neurons
2.1 Introduction
2.2 A Formal Computational Model for a Network of Spiking Neurons
2.3 McCulloch-Pitts Neurons versus Spiking Neurons
2.4 Computing with Temporal Patterns
2.5 Computing with a Space-Rate Code
2.6 Computing with Firing Rates
2.7 Firing Rates and Temporal Correlations
2.8 Networks of Spiking Neurons for Storing and Retrieving Information
2.9 Computing on Spike Trains
2.10 Conclusions
References
3   Pulse-Based Computation in VLSI Neural Networks
3.1 Background
3.2 Pulsed Coding: A VLSI Perspective
3.3 A MOSFET Introduction
3.4 Pulse Generation VLSI
3.5 Pulsed Arithmetic in VLSI
3.6 Learning in Pulsed Systems
3.7 Summary and Issues Raised
References
4   Encoding Information in Neuronal Activity
4.1 Introduction
4.2 Synchronization and Oscillations
4.3 Temporal Binding
4.4 Phase Coding
4.5 Dynamic Range and Firing Rate Codes
4.6 Interspike Interval Variability
4.7 Synapses and Rate Coding
4.8 Summary and Implications
References

Implementations

5   Building Silicon Nervous Systems with Dendritic Tree Neuromorphs
5.1 Introduction
5.2 Implementation in VLSI
5.3 Neuromorphs in Action
5.4 Conclusions
References
6   A Pulse-Coded Communications Infrastructure
6.1 Introduction
6.2 Neuromorphic Computational Nodes
6.3 Neuromorphic aVLSI Neurons
6.4 Address Event Representation (AER)
6.5 Implementations of AER
6.6 Silicon Cortex
6.7 Functional Tests of Silicon Cortex
6.8 Future Research on AER Neuromorphic Systems
References
7   Analog VLSI Pulsed Networks for Perceptive Processing
7.1 Introduction
7.2 Analog Perceptive Nets Communication Requirements
7.3 Analysis of the NAPFM Communication System
7.4 Address Coding
7.5 Silicon Retina Equipped with the NAPFM Communication System
7.6 Projective Field Generation
7.7 Description of the Integrated Circuit for Orientation Enhancement
7.8 Display Interface
7.9 Conclusion
References
8   Preprocessing for Pulsed Neural VLSI Systems
8.1 Introduction
8.2 A Sound Segmentation System
8.3 Signal Processing in Analog VLSI
8.4 Palmo - Pulse Based Signal Processing
8.5 Conclusions
8.6 Further Work
8.7 Acknowledgements
References
9   Digital Simulation of Spiking Neural Networks
9.1 Introduction
9.2 Implementation Issues of Pulse-Coded Neural Networks
9.3 Programming Environment
9.4 Concepts of Efficient Simulation
9.5 Mapping Neural Networks on Parallel Computers
9.6 Performance Study
References

Design and Analysis of Pulsed Neural Systems

10   Populations of Spiking Neurons
10.1 Introduction
10.2 Model
10.3 Population Activity Equation
10.4 Noise-Free Population Dynamics
10.5 Locking
10.6 Transients
10.7 Incoherent Firing
10.8 Conclusions
References
11   Collective Excitation Phenomena and Their Applications
11.1 Introduction
11.2 Synchronization of Pulse Coupled Oscillators
11.3 Clustering via Temporal Segmentation
11.4 Limits on Temporal Segmentation
11.5 Image Analysis
11.6 Solitary Waves
11.7 The Importance of Noise
11.8 Conclusions
References
12   Computing and Learning with Dynamic Synapses
12.1 Introduction
12.2 Biological Data on Dynamic Synapses
12.3 Quantitative Models
12.4 On the Computational Role of Dynamic Synapses
12.5 Implications for Learning in Pulsed Neural Nets
12.6 Conclusions
References
13   Stochastic Bit-Stream Neural Networks
13.1 Introduction
13.2 Basic Neural Modelling
13.3 Feedforward Networks and Learning
13.4 Generalization Analysis
13.5 Recurrent Networks
13.6 Applications to Graph Colouring
13.7 Hardware Implementation
13.8 Conclusions
References
14   Hebbian Learning of Pulse Timing in the Barn Owl Auditory System
14.1 Introduction
14.2 Hebbian Learning
14.3 Barn Owl Auditory System
14.4 Phase Locking
14.5 Delay Tuning by Hebbian Learning
14.6 Conclusions
References

created by Heike Graf, 1998-11-19
updated 2002-08-19