## Modeling Neural Circuits Made Simple with Python

by Robert Rosenbaum

ISBN: 9780262378741 | Copyright 2024



Understanding the brain is a major frontier of modern science. Given the complexity of neural circuits, advancing that understanding requires mathematical and computational approaches. This accessible undergraduate textbook in computational neuroscience provides an introduction to the mathematical and computational modeling of neurons and networks of neurons. Starting with the biophysics of single neurons, Robert Rosenbaum incrementally builds to explanations of neural coding, learning, and the relationship between biological and artificial neural networks. Examples with real neural data demonstrate how computational models can be used to understand phenomena observed in neural recordings. Based on years of classroom experience, the material has been carefully streamlined to provide all the content needed to build a foundation for modeling neural circuits in a one-semester course.

- Proven in the classroom
- Example-rich, student-friendly approach
- Includes Python code and a mathematical appendix reviewing the requisite background in calculus, linear algebra, and probability
- Ideal for engineering, science, and mathematics majors and for self-study

### Contents

- List of Figures
- Preface
- Acknowledgments
- 1. Modeling Single Neurons
  - 1.1. The Leaky Integrator Model
  - 1.2. The EIF Model
  - 1.3. Modeling Synapses
- 2. Measuring and Modeling Neural Variability
  - 2.1. Spike Train Variability, Firing Rates, and Tuning
  - 2.2. Modeling Spike Train Variability with Poisson Processes
  - 2.3. Modeling a Neuron with Noisy Synaptic Input
- 3. Modeling Networks of Neurons
  - 3.1. Feedforward Spiking Networks and Their Mean-Field Approximation
  - 3.2. Recurrent Spiking Networks and Their Mean-Field Approximation
  - 3.3. Modeling Surround Suppression with Rate Network Models
- 4. Modeling Plasticity and Learning
  - 4.1. Synaptic Plasticity
  - 4.2. Feedforward Artificial Neural Networks
- Appendix A: Mathematical Background
  - A.1. Introduction to ODEs
  - A.2. Exponential Decay as a Linear, Autonomous ODE
  - A.3. Convolutions
  - A.4. One-Dimensional Linear ODEs with Time-Dependent Forcing
  - A.5. The Forward Euler Method
  - A.6. Fixed Points, Stability, and Bifurcations in One-Dimensional ODEs
  - A.7. Dirac Delta Functions
  - A.8. Fixed Points, Stability, and Bifurcations in Systems of ODEs
- Appendix B: Additional Models and Concepts
  - B.1. Ion Channel Currents and the HH Model
  - B.2. Other Simplified Models of Single Neurons
  - B.3. Conductance-Based Synapse Models
  - B.4. Neural Coding
  - B.5. Derivations and Alternative Formulations of Rate Network Models
  - B.6. Hopfield Networks
  - B.7. Training Readouts from Chaotic RNNs
  - B.8. DNNs and Backpropagation
- References
- Index
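To give a flavor of the book's approach, two of its opening topics, the leaky integrator neuron model (section 1.1) and the forward Euler method (appendix A.5), can be sketched together in a few lines of Python. This is an illustrative sketch only; the parameter names and values below are assumptions, not taken from the book's code.

```python
import numpy as np

# Leaky integrator membrane model: tau * dV/dt = -(V - EL) + I,
# integrated with the forward Euler method. All parameters are
# illustrative placeholders.
tau = 10.0   # membrane time constant (ms)
EL = -72.0   # leak (resting) potential (mV)
I = 5.0      # constant input, in mV (current scaled by resistance)
dt = 0.1     # Euler time step (ms)
T = 100.0    # total simulated time (ms)

t = np.arange(0.0, T, dt)
V = np.empty_like(t)
V[0] = EL  # start at rest
for k in range(1, len(t)):
    # One forward Euler step: V(t+dt) ≈ V(t) + dt * dV/dt
    V[k] = V[k - 1] + (dt / tau) * (-(V[k - 1] - EL) + I)

# With constant input, V decays exponentially toward EL + I.
```

The same pattern, writing the model as a one-dimensional ODE and stepping it forward with Euler's method, recurs throughout the book for more elaborate models such as the EIF neuron.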

#### Robert Rosenbaum

Robert Rosenbaum is Associate Professor of Applied and Computational Mathematics and Statistics at the University of Notre Dame. His research in computational neuroscience focuses on using computational models of neural circuits to understand the dynamics and statistics of neural activity underlying sensory processing and learning.
