THEORETICAL NEUROSCIENCE
Peter Dayan and L.F. Abbott

Contents

Preface

PART I - ANALYZING AND MODELING NEURAL RESPONSES

Chapter 1 - Neural Encoding I: Firing Rates and Spike Statistics
  Introduction
  Properties of Neurons
  Recording Neuronal Responses
  From Stimulus to Response
  Spike Trains and Firing Rates
  Measuring Firing Rates
  Tuning Curves
  Spike-Count Variability
  What Makes a Neuron Fire?
  Describing the Stimulus
  The Spike-Triggered Average
  White-Noise Stimuli
  Multiple-Spike-Triggered Averages and Spike-Triggered Correlations
  Spike Train Statistics
  The Homogeneous Poisson Process
  The Spike-Train Autocorrelation Function
  The Inhomogeneous Poisson Process
  The Poisson Spike Generator
  Comparison with Data
  The Neural Code
  Independent-Spike, Independent-Neuron, and Correlation Codes
  Temporal Codes
  Chapter Summary
  Appendices: A) The Power Spectrum of White Noise; B) Moments of the Poisson Distribution; C) Inhomogeneous Poisson Statistics
  Annotated Bibliography

Chapter 2 - Neural Encoding II: Reverse Correlation and Receptive Fields
  Introduction
  Estimating Firing Rates
  The Most Effective Stimulus
  Static Nonlinearities
  Introduction to the Early Visual System
  The Retinotopic Map
  Visual Stimuli
  The Nyquist Frequency
  Reverse Correlation Methods - Simple Cells
  Spatial Receptive Fields
  Temporal Receptive Fields
  Response of a Simple Cell to a Counterphase Grating
  Space-Time Receptive Fields
  Nonseparable Receptive Fields
  Static Nonlinearities - Simple Cells
  Static Nonlinearities - Complex Cells
  Receptive Fields in the Retina and LGN
  Constructing V1 Receptive Fields
  Chapter Summary
  Appendices: A) The Optimal Kernel; B) The Most Effective Stimulus; C) Bussgang's Theorem
  Annotated Bibliography

Chapter 3 - Neural Decoding
  Encoding and Decoding
  Discrimination
  ROC Curves
  ROC Analysis of Motion Discrimination
  The Likelihood Ratio Test
  Population Decoding
  Encoding and Decoding Direction
  Optimal Decoding Methods
  Fisher Information
  Optimal Discrimination
  Spike Train Decoding
  Chapter Summary
  Appendices: A) The Neyman-Pearson Lemma; B) The Cramér-Rao Bound; C) The Optimal Spike-Decoding Filter
  Annotated Bibliography

Chapter 4 - Information Theory
  Entropy and Mutual Information
  Entropy
  Mutual Information
  Entropy and Mutual Information for Continuous Variables
  Information and Entropy Maximization
  Entropy Maximization for a Single Neuron
  Populations of Neurons
  Application to Retinal Ganglion Cell Receptive Fields
  The Whitening Filter
  Filtering Input Noise
  Temporal Processing in the LGN
  Cortical Coding
  Entropy and Information for Spike Trains
  Chapter Summary
  Appendix: Positivity of the Kullback-Leibler Divergence
  Annotated Bibliography

PART II - MODELING NEURONS AND NETWORKS

Chapter 5 - Model Neurons I: Neuroelectronics
  Levels of Neuron Modeling
  Electrical Properties of Neurons
  Intracellular Resistance
  Membrane Capacitance and Resistance
  Equilibrium and Reversal Potentials
  The Membrane Current
  Single-Compartment Models
  Integrate-and-Fire Models
  Spike-Rate Adaptation and Refractoriness
  Voltage-Dependent Conductances
  Persistent Conductances
  Transient Conductances
  Hyperpolarization-Activated Conductances
  The Hodgkin-Huxley Model
  Modeling Channels
  Synaptic Conductances
  The Postsynaptic Conductance
  Release Probability and Short-Term Plasticity
  Synapses on Integrate-and-Fire Neurons
  Regular and Irregular Firing Modes
  Chapter Summary
  Appendices: A) Integrating the Membrane Potential; B) Integrating the Gating Variables
  Annotated Bibliography

Chapter 6 - Model Neurons II: Conductances and Morphology
  Levels of Neuron Modeling
  Conductance-Based Models
  The Connor-Stevens Model
  Postinhibitory Rebound and Bursting
  The Cable Equation
  Linear Cable Theory
  An Infinite Cable
  An Isolated Branching Node
  The Rall Model
  The Morphoelectrotonic Transform
  Multi-Compartment Models
  Action Potential Propagation Along an Unmyelinated Axon
  Propagation Along a Myelinated Axon
  Chapter Summary
  Appendices: A) Gating Functions for Conductance-Based Models (Connor-Stevens Model; Transient Ca2+ Conductances; Ca2+-dependent K+ Conductances); B) Integrating Multi-Compartment Models
  Annotated Bibliography

Chapter 7 - Network Models
  Introduction
  Firing-Rate Models
  Feedforward and Recurrent Networks
  Continuously Labelled Networks
  Feedforward Networks
  Neural Coordinate Transformations
  Recurrent Networks
  Linear Recurrent Networks
  Selective Amplification
  Input Integration
  Continuous Linear Recurrent Networks
  Nonlinear Recurrent Networks
  Nonlinear Amplification
  A Recurrent Model of Simple Cells in Primary Visual Cortex
  A Recurrent Model of Complex Cells in Primary Visual Cortex
  Winner-Take-All Input Selection
  Gain Modulation
  Sustained Activity
  Maximum Likelihood and Network Recoding
  Network Stability
  Associative Memory
  Excitatory-Inhibitory Networks
  Homogeneous Excitatory and Inhibitory Populations
  Phase-Plane Methods and Stability Analysis
  The Olfactory Bulb
  Oscillatory Amplification
  Stochastic Networks
  Chapter Summary
  Appendix: Lyapunov Function for the Boltzmann Machine
  Annotated Bibliography

PART III - PLASTICITY AND LEARNING

Chapter 8 - Plasticity and Learning
  Introduction
  Stability and Competition
  Synaptic Plasticity Rules
  The Basic Hebb Rule
  The Covariance Rule
  The BCM Rule
  Synaptic Normalization
  Subtractive Normalization
  Multiplicative Normalization and the Oja Rule
  Timing-Based Rules
  Unsupervised Learning
  Single Postsynaptic Neuron
  Principal Component Projection
  Hebbian Development and Ocular Dominance
  Hebbian Development of Orientation Selectivity
  Temporal Hebbian Rules and Trace Learning
  Multiple Postsynaptic Neurons
  Fixed Linear Recurrent Connections
  Competitive Hebbian Learning
  Feature-Based Models
  Anti-Hebbian Modification
  Timing-Based Plasticity and Prediction
  Supervised Learning
  Supervised Hebbian Learning
  Classification and the Perceptron
  Function Approximation
  Supervised Error-Correcting Rules
  The Perceptron Learning Rule
  The Delta Rule
  Contrastive Hebbian Learning
  Chapter Summary
  Appendix: Convergence of the Perceptron Learning Rule
  Annotated Bibliography

Chapter 9 - Classical Conditioning and Reinforcement Learning
  Introduction
  Classical Conditioning
  Predicting Reward - The Rescorla-Wagner Rule
  Predicting Reward Timing - Temporal-Difference Learning
  Dopamine and Prediction of Reward
  Static Action Choice
  The Indirect Actor
  The Direct Actor
  Sequential Action Choice
  The Maze Task
  Policy Evaluation
  Policy Improvement
  Generalizations of Actor-Critic Learning
  Learning the Water Maze
  Chapter Summary
  Appendix - Markov Decision Problems: The Bellman Equation; Policy Iteration
  Annotated Bibliography

Chapter 10 - Representational Learning
  Introduction
  Density Estimation
  Factor Analysis
  Principal Components Analysis
  Clustering
  Sparse Coding
  Independent Components Analysis
  Multi-Resolution and Wavelet Models
  The Helmholtz Machine
  Chapter Summary
  Annotated Bibliography

Appendix - Mathematical Methods
  Introduction
  Linear Algebra
  Differential Equations
  Probability Theory
  Fourier Transforms
  Electrical Circuits
  The δ Function
  Lagrange Multipliers
  Annotated Bibliography


Preface

Theoretical analysis and computational modeling are important tools for characterizing what nervous systems do, determining how they function, and understanding why they operate in particular ways. Neuroscience encompasses approaches ranging from molecular and cellular studies to human psychophysics and psychology. Theoretical neuroscience encourages cross-talk among these sub-disciplines by constructing compact representations of what has been learned, building bridges between different levels of description, and identifying unifying concepts and principles. In this book, we present the basic methods used for these purposes and discuss examples in which theoretical approaches have yielded insight into nervous system function.

The questions what, how, and why are addressed by descriptive, mechanistic, and interpretive models, each of which we discuss in the following chapters. Descriptive models summarize large amounts of experimental data compactly yet accurately, thereby characterizing what neurons and neural circuits do. These models may be based loosely on biophysical, anatomical, and physiological findings, but their primary purpose is to describe phenomena, not to explain them. Mechanistic models, on the other hand, address the question of how nervous systems operate on the basis of known anatomy, physiology, and circuitry. Such models often form a bridge between descriptive models couched at different levels. Interpretive models use computational and information-theoretic principles to explore the behavioral and cognitive significance of various aspects of nervous system function, addressing the question of why nervous systems operate as they do.

It is often difficult to identify the appropriate level of modeling for a particular problem. A frequent mistake is to assume that a more detailed model is necessarily superior. Because models act as bridges between levels of understanding, they must be detailed enough to make contact with the lower level yet simple enough to yield clear results at the higher level.

Organization and Approach

This book is organized into three parts on the basis of general themes. Part I (chapters 1-4) is devoted to the coding of information by action potentials and the representation of information by populations of neurons with selective responses. Modeling of neurons and neural circuits on the basis of cellular and synaptic biophysics is presented in Part II (chapters 5-7).
The role of plasticity in development and learning is discussed in Part III (chapters 8-10). With the exception of chapters 5 and 6, which jointly cover neuronal modeling, the chapters are largely independent and can be selected and ordered in a variety of ways for a one- or two-semester course at either the undergraduate or graduate level.

Although we provide some background material, readers without previous exposure to neuroscience should refer to a neuroscience textbook such as Kandel, Schwartz & Jessell (2000); Nicholls, Martin & Wallace (1992); Bear, Connors & Paradiso (1996); Shepherd (1997); Zigmond, Bloom, Landis & Squire (1998); or Purves et al. (2000).

Theoretical neuroscience is based on the belief that methods of mathematics, physics, and computer science can elucidate nervous system function. Unfortunately, mathematics can sometimes seem more of an obstacle than an aid to understanding. We have not hesitated to employ the level of analysis needed to be precise and rigorous. At times, this may stretch the tolerance of some of our readers. We encourage such readers to consult the mathematical appendix, which provides a brief review of most of the mathematical methods used in the text, but also to persevere and attempt to understand the implications and consequences of a difficult derivation even if its steps are unclear.

Theoretical neuroscience, like any skill, can only be mastered with practice. We have provided exercises for this purpose on the web site for this book and urge the reader to do them. In addition, it will be highly instructive for the reader to construct the models discussed in the text and explore their properties beyond what we have been able to do in the available space.

Referencing

In order to maintain the flow of the text, we have kept citations within the chapters to a minimum. Each chapter ends with an annotated bibliography containing suggestions for further reading (which are denoted by a bold font), information about work cited within the chapter, and references to related studies. We concentrate on introducing the basic tools of computational neuroscience and discussing applications that we think best help the reader to understand and appreciate them. This means that a number of systems where computational approaches have been applied with significant success are not discussed. References given in the annotated bibliographies lead the reader toward such applications. In most of the areas we cover, many people have provided critical insights. The books and review articles in the further reading category provide more comprehensive references to work that we apologetically have failed to cite.

Acknowledgments

We are extremely grateful to a large number of students at Brandeis, the Gatsby Computational Neuroscience Unit, and MIT, and colleagues at many institutions, who have painstakingly read, commented on, and criticized numerous versions of all the chapters. We particularly thank Bard Ermentrout, Mark Kvale, Mark Goldman, John Hertz, Zhaoping Li, Eve Marder, and Read Montague for providing extensive discussion and advice on the whole book.
A number of people read significant portions of the text and provided valuable comments, criticism, and insight: Bill Bialek, Pat Churchland, Nathaniel Daw, Dawei Dong, Peter Földiák, Fabrizio Gabbiani, Zoubin Ghahramani, Geoff Goodhill, David Heeger, Geoff Hinton, Ken Miller, Tony Movshon, Phil Nelson, Sacha Nelson, Bruno Olshausen, Mark Plumbley, Alex Pouget, Fred Rieke, John Rinzel, Emilio Salinas, Sebastian Seung, Mike Shadlen, Satinder Singh, Rich Sutton, Nick Swindale, Carl van Vreeswijk, Chris Williams, David Willshaw, Charlie Wilson, Angela Yu, and Rich Zemel. We have received significant additional assistance and advice from: Greg DeAngelis, Matt Beal, Sue Becker, Tony Bell, Paul Bressloff, Emery Brown, Matteo Carandini, Frances Chance, Yang Dan, Kenji Doya, Ed Erwin, John Fitzpatrick, David Foster, Marcus Frean, Ralph Freeman, Enrique Garibay, Federico Girosi, Charlie Gross, Mike Jordan, Sham Kakade, Szabolcs Káli, Christof Koch, Simon Laughlin, John Lisman, Shawn Lockery, Guy Mayraz, Quaid Morris, Randy O'Reilly, Max Riesenhuber, Sam Roweis, Simon Osindero, Tomaso Poggio, Clay Reid, Dario Ringach, Horacio Rotstein, Lana Rutherford, Ken Sagino, Maneesh Sahani, Alexei Samsonovich, Idan Segev, Terry Sejnowski, Haim Sompolinsky, Fiona Stevens, David Tank, Alessandro Treves, Gina Turrigiano, David Van Essen, Martin Wainwright, Xiao-Jing Wang, Max Welling, Matt Wilson, Danny Young, and Kechen Zhang. We apologise to anyone we may have inadvertently omitted from these lists. Karen Abbott provided valuable help with the figures. From MIT Press, we thank Michael Rutter for his patience and consistent commitment, and Sara Meirowitz and Larry Cohen for picking up where Michael left off.

[...]

[...] must be half-wave rectified in these cases (see equation 1.13),

$$f(s) = \left[ r_0 + (r_{\max} - r_0)\cos(s - s_{\max}) \right]_+ \qquad (1.16)$$

Figure 1.7B shows how the average firing rate of a V1 neuron depends on retinal disparity and illustrates another important type of tuning curve.

[Figure 1.7: panels A and B; firing rate f (Hz) plotted against retinal disparity s (degrees); caption truncated in this excerpt.]

[...] process from the probabilities 1.29. For spikes counted over an interval of duration T, the variance of the spike count (derived in appendix B) is

$$\sigma_n^2 = \langle n^2 \rangle - \langle n \rangle^2 = rT \qquad (1.30)$$

[Figure 1.11: A) $P_T[n]$ plotted against rT for n = 0, 1, 2, and 5; B) $P_T[n]$ plotted against n; caption truncated: "A) The probability that a..."]

[...] prior to a spike being fired by this neuron.

[Figure 1.9: The spike-triggered average stimulus C(τ) (mV) plotted against τ (ms) for a neuron of the electrosensory lateral-line lobe of the weakly electric fish Eigenmannia; caption truncated: "The upper..."]

[...] computed by triggering on various combinations of spikes. Figure 1.10 shows [...]

[Figure 1.10: Single- and multiple-spike-triggered average stimuli (velocity in degrees/s) for a blowfly H1 neuron responding to a moving visual image; panels A (25 ms), B (10 ms), and C (5 ms); caption truncated: "A) The average..."]
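The spike-triggered averages shown in figures 1.9 and 1.10 reduce to a short computation: collect the stimulus segment that precedes each spike, then average the segments over spikes and trials. The sketch below is our own minimal illustration in Python with NumPy, not code from the book; the toy stimulus, the built-in 5 ms response lag, and all function and variable names are invented for the example.

```python
import numpy as np

def spike_triggered_average(stimulus, spike_bins, window, dt):
    """Average the `window` stimulus bins preceding each spike.
    Returns lags tau (in seconds) and the estimate of C(tau)."""
    segments = [stimulus[i - window:i] for i in spike_bins if i >= window]
    # Reverse each averaged segment so index k corresponds to k+1 bins before the spike.
    sta = np.mean(segments, axis=0)[::-1]
    taus = np.arange(1, window + 1) * dt
    return taus, sta

# Toy example: white-noise stimulus; spike probability follows the stimulus with a 5 ms lag.
rng = np.random.default_rng(0)
dt = 0.001                                   # 1 ms bins
s = rng.standard_normal(200_000)             # white-noise stimulus
p = 0.02 * (1.0 + 0.5 * np.clip(np.roll(s, 5), -1.0, 1.0))
spike_bins = np.where(rng.random(s.size) < p)[0]
taus, sta = spike_triggered_average(s, spike_bins, window=30, dt=dt)
print(f"{spike_bins.size} spikes; STA peaks {taus[np.argmax(sta)] * 1000:.0f} ms before the spike")
```

Averaged over several thousand spikes, the white-noise background cancels and the 5 ms lag built into the toy spike generator emerges as the peak of the estimate, the same signature as the roughly 15 ms latency of the H1 neuron in figure 1.10A.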
[...] occurring at time $t_i$, we determine $s(t_i - \tau)$, and then we sum over all $n$ spikes in a trial, $i = 1, 2, \ldots, n$, and divide the total by $n$. In addition, we average over trials. Thus,

$$C(\tau) = \left\langle \frac{1}{n} \sum_{i=1}^{n} s(t_i - \tau) \right\rangle \approx \frac{1}{\langle n \rangle} \left\langle \sum_{i=1}^{n} s(t_i - \tau) \right\rangle \qquad (1.19)$$

The approximate equality of the last [...]

[...] (Rieke et al., 1997.) The spike-triggered average stimulus can be expressed as an integral of the stimulus times the neural response function of equation 1.1. If we replace the sum over spikes by an integral, as in equation 1.2, and use the approximate expression for $C(\tau)$ in equation 1.19, we find

$$C(\tau) = \frac{1}{\langle n \rangle} \int_0^T dt \, \langle \rho(t) \rangle \, [\ldots]$$

[...] visual image with a time-varying velocity $s(t)$ is presented. Figure 1.10A, showing the spike-triggered average stimulus, indicates that this neuron responds to positive angular velocities after a latency of about 15 ms. Figure 1.10B is the average stimulus prior to the appearance of two spikes separated by 10 ± 1 ms. This two-spike average is approximately equal to the sum of two single-spike-triggered average [...]

[...] $\epsilon = -r\Delta t$, we find that

$$\lim_{\Delta t \to 0} (1 - r\Delta t)^{M - n} = \lim_{\epsilon \to 0} \left( (1 + \epsilon)^{1/\epsilon} \right)^{-rT} = \exp(-rT) \qquad (1.28)$$

because $\lim_{\epsilon \to 0} (1 + \epsilon)^{1/\epsilon}$ is, by definition, $e = \exp(1)$. For large $M$, $M!/(M - n)! \approx M^n = (T/\Delta t)^n$, so

$$P_T[n] = \frac{(rT)^n}{n!} \exp(-rT) \qquad (1.29)$$

This is called the Poisson distribution. The probabilities $P_T[n]$, for a few $n$ values, are plotted as a function of $rT$ in figure 1.11A. Note that, as $n$ increases, [...]

[...] Average firing rate of a cat V1 neuron plotted as a function of the orientation angle of the light bar stimulus. The curve is a fit using the function 1.14 with parameters $r_{\max} = 52.14$ Hz, $s_{\max} = 0°$, and $\sigma_f = 14.73°$. (A from Hubel and Wiesel, 1968; adapted from Wandell, 1995. B data points from Henry et al., 1974.)

$$f(s) = r_{\max} \exp\left( -\frac{1}{2} \left( \frac{s - s_{\max}}{\sigma_f} \right)^2 \right) \qquad (1.14)$$

where $s$ is the orientation [...]

[...] $\exp(-\alpha\tau)]_+$ (1.12), where $1/\alpha$ determines the temporal resolution of the resulting firing-rate estimate. The notation $[z]_+$ for any quantity $z$ stands for the half-wave rectification operation,

$$[z]_+ = \begin{cases} z & \text{if } z \ge 0 \\ 0 & \text{otherwise} \end{cases} \qquad (1.13)$$

Figure 1.4E shows the firing rate approximated by such a causal scheme. Note that this rate tends to peak later than the rate computed in figure 1.4D using [...]
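The Poisson count statistics above are easy to verify by simulation, using the same binned construction as the derivation: each bin of width Δt fires independently with probability rΔt. The following sketch is our own illustration (Python with NumPy; all names and parameter values are ours, not the book's). It compares the simulated count distribution with equation 1.29 and checks that the count mean and variance both approach rT, as in equation 1.30.

```python
import numpy as np
from math import exp, factorial

def poisson_spike_counts(rate, T, dt, trials, rng):
    """Count spikes per trial when each bin of width dt fires
    independently with probability rate * dt (requires rate * dt << 1)."""
    spikes = rng.random((trials, int(T / dt))) < rate * dt
    return spikes.sum(axis=1)

rng = np.random.default_rng(1)
rate, T, dt = 20.0, 0.5, 5e-4             # 20 Hz for 500 ms in 0.5 ms bins, so rT = 10
counts = poisson_spike_counts(rate, T, dt, trials=10_000, rng=rng)

rT = rate * T
print(f"rT = {rT:.1f}  mean = {counts.mean():.2f}  variance = {counts.var():.2f}")  # both ~ rT
for n in (5, 10, 15):
    predicted = (rT ** n) * exp(-rT) / factorial(n)   # equation 1.29
    print(f"P_T[{n}]: predicted {predicted:.4f}  simulated {np.mean(counts == n):.4f}")
```

Because each bin can hold at most one spike, the simulated variance is rT(1 − rΔt) rather than exactly rT; it converges to the Poisson value as Δt → 0, the same limit taken in equation 1.28.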
