
NENS220 Computational methods in Neuroscience


Presentation Transcript


  1. NENS220 Computational methods in Neuroscience John Huguenard and Terry Sanger

  2. Goals of the course • Overview of computational methods • Mathematical techniques for creating models of neural behavior - the tools of computational methods

  3. Computational Modeling • The ultimate purpose is to relate different levels (scales) of neural behavior • e.g.: how do properties of ion channels determine the spiking behavior in response to synaptic input? • e.g.: what is the relationship between spike activity in a population of M1 neurons and movement of the arm?

  4. Scope of the course • This is essentially an overview of some (but not all) of the general methods • Intended for graduate students in neuroscience • In order to learn how this is done, you will have to practice • Necessarily involves knowledge of statistics, mathematics, and some computer programming (MATLAB, NEURON)

  5. Background material • Probability theory • Information theory • Matrix algebra • Correlation integrals • Fourier analysis • MATLAB programming • Membrane potentials • Cable theory

  6. Background review • We will do much of this as we go. • Additional help in TA sessions • You may need to do extra reading

  7. Two major areas • I: Neurons • How information is processed at the level of synapses, membranes, and dendrites • Relationship between inputs, membrane potentials, and spike generation • II: Spikes • What information is carried in single spikes, temporal sequences of spikes, and spikes over populations • How learning results in changes in spike patterns

  8. Textbook • Theoretical Neuroscience, Peter Dayan and Larry Abbott (MIT Press: Cambridge, MA), 2001. • Available from Amazon.com and the Stanford bookstore, about $45 • Other useful references: • Neural Engineering, Eliasmith and Anderson • Spikes: Exploring the Neural Code, Rieke, Warland, de Ruyter van Steveninck, and Bialek • Computational Neuroscience, Churchland and Sejnowski • Handbook of Computational Neuroscience, Arbib • Foundations of Cellular Neurophysiology, Johnston and Wu • You must have access to a workstation with MATLAB/NEURON. • MATLAB available on cluster computers (firebirds, etc.) • NEURON available for multiple platforms via free download • We can set up accounts on Linux machines with NEURON installed.

  9. Class structure • Tuesdays and Thursdays, 3:15-5:00pm. Room H3150. • Tuesdays will be lectures • Lecture will usually follow the text chapters; you may want to read these in advance • A paper will be assigned, to be read before Thursday (first paper assigned next Tuesday) • Thursdays will be discussions of the assigned paper and the lecture, led by the TA. THESE ARE REQUIRED.

  10. Homework assignments • 4-6 problem sets during the quarter. They will be assigned on Tuesday and due the Thursday of the following week (9 days later) • Will usually require simulation of some component of the paper being discussed • Will require use of MATLAB/NEURON. You should submit the program output and source code with detailed comments • Should require 1-3 hours, depending on how good you are with MATLAB/NEURON

  11. Grading • 70% weekly assignments • Based on output plots, code, and comments • 30% Class participation • Based on contributions to discussion groups

  12. Syllabus, Part I

  13. Syllabus, Part II

  14. Part I Modeling of realistic neurons and networks John Huguenard

  15. Neuroelectronics, Part I John Huguenard

  16. The big picture, a la Terry Sanger • An "external signal" x(t) is something that the experimenter controls (either a sensory stimulus or a learned motor task) • We observe spikes that are the result of a transformation of the external signal • [Diagram: External World → Sensors → Spike Generator; input x(t), output spikes]

  17. The big picture, a la John Huguenard • Neurons receive synaptic input • Neurons produce output • The currency of neuronal communication is spikes (action potentials) • Spike generation is in many cases a nonlinear function of synaptic input • [Diagram: External World → Sensors → Spike Generator; input x(t), output spikes]

  18. Why it is important to consider neuronal properties: • STDP, dendritic back-propagation, dendritic signaling • Resonance • Oscillations • Synchronization • Gain control • Persistent activity • Phase precession • Coincidence detection vs. integration

  19. Electrochemical signaling

  20. Diversity of Neuronal Morphology

  21. Neocortical Networks

  22. Pyramidal Neurons in Layer V • thy1-YFP mouse (Feng et al., 2000, Neuron 28:41; scale bar 200 µm)

  23. Canonical Microcircuits • Recurrent excitatory connections are prominent • Function: amplification of signals for enhanced feature detection • After Rodney Douglas & Kevan Martin

  24. Inhibitory cells are sparse • [Figure: distribution of inhibitory cells across cortical layers I-VI]

  25. Inhibitory interneuron diversity • Modified from Karube et al. (2004) J Neurosci 24:2853-65

  26. Morphology can influence firing pattern (Mainen & Sejnowski, 1996)

  27. Electrical properties of neurons • Dominated by membrane capacitance • Neurons are integrators whose time constant is dynamically variable • Spike output depends on voltage-dependent gating of ion channels

  28. Passive properties of neurons • Semipermeable lipid bilayer membrane with high [K+]i maintained by an electrogenic pump (ATPase) • Equivalent radius ~ 25 µm, surface area ~ 8000 µm² = 0.008 mm² = 8e-5 cm²

  29. Electrical capacitance • Ability to store charge • Charge required to create a potential difference between two conductors • A 1 farad capacitor will store 1 coulomb per volt (Hille, 2001)

  30. Capacitance of cell membranes • Capacitance is a function of: surface area (A), dielectric constant (ε), and distance between plates (d) • For membranes, specific capacitance ≈ 1 µF/cm², and is for the most part invariant • For an 8e-5 cm² cell, C ≈ 80 pF (Hille, 2001)
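As a quick numerical check, total capacitance is just specific capacitance times area; a minimal MATLAB sketch using the "typical" cell from the previous slide:

    % Total membrane capacitance from specific capacitance and surface area
    Cm_specific = 1e-6;                % ~1 uF/cm^2, expressed in F/cm^2
    A = 8e-5;                          % surface area of the "typical" cell, cm^2
    C = Cm_specific * A;               % total capacitance, F
    fprintf('C = %.0f pF\n', C*1e12)   % prints: C = 80 pF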

  31. Resting potential, single permeant ion • Nernst potentials: EK = -75 mV, ENa = +50 mV, ECl = -60 mV, ECa = +100 mV • Nernst equation: E_S = (RT/zF)·ln([S]o/[S]i)
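The Nernst equation is easy to evaluate directly; a minimal MATLAB sketch at body temperature (T = 310 K), using the K+ concentrations that appear on the next slide:

    % Nernst potential: E_S = (RT/zF) * ln([S]o/[S]i)
    R = 8.314; T = 310; F = 96485;                      % J/(mol*K), K, C/mol
    nernst = @(z, So, Si) 1e3*(R*T/(z*F))*log(So/Si);   % returns mV
    EK = nernst(1, 3, 130)                              % ~ -100 mV for 3 mM out / 130 mM in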

  32. Uncompensated charge • [K+]i = 130 mM, [K+]o = 3 mM, so EK ≈ -100 mV • q = CV = 80 pF × 100 mV = 8 pC ≈ 50e6 K+ ions • Total intracellular K+ ≈ 5e12 ions • Fraction uncompensated ≈ 0.001% • Will vary with surface-to-volume ratio
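These numbers can be reproduced in a few lines; a minimal MATLAB sketch assuming the spherical 25 µm-radius cell and 80 pF capacitance from the earlier slides:

    % Ions moved to charge the membrane vs. total intracellular K+
    C = 80e-12; V = 0.100;                  % capacitance (F), potential (V)
    q = C * V;                              % 8 pC of charge
    n_moved = q / 1.602e-19                 % ~5e7 K+ ions
    vol_L = (4/3)*pi*(25e-4)^3 * 1e-3;      % sphere volume: cm^3 -> liters
    n_total = 0.130 * vol_L * 6.022e23      % ~5e12 K+ ions at 130 mM
    fraction = n_moved / n_total            % ~1e-5, i.e. ~0.001%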

  33. Membrane Resistance • Ion-selective pores • Ohm's law: E = IR • 1 V is the potential difference produced by 1 A passing through 1 ohm • Conductance is the reciprocal of resistance: 1 siemens = 1 ohm⁻¹ • Resistance depends on the length of the conductive path, the cross-sectional area, and the resistivity of the medium • Ion channels have conductances in the 2-250 pS range, but may open only briefly

  34. Input Resistance • "Leak" channels are open at rest and determine the input resistance • i.e. the impedance to extrinsic current injection • Specific input resistance for neurons is in the range of 1 MΩ·mm², or 1 µS/mm² • 50,000 20-pS leak channels/mm² = 1 channel / 20 µm² • Our "typical" cell of 0.008 mm² would have an input resistance of 125 MΩ, or an input conductance of 8 nS (equivalent to ~400 open leak channels).
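The same arithmetic in MATLAB form (a minimal sketch; values as on the slide):

    % Input resistance and conductance of the "typical" cell
    Rm = 1;                          % specific membrane resistance, MOhm*mm^2
    A = 0.008;                       % membrane area, mm^2
    Rin = Rm / A                     % 125 MOhm
    Gin = (1/Rin) * 1e3              % 1/MOhm = uS, so this is 8 nS
    n_open = Gin*1e-9 / 20e-12       % ~400 open 20-pS leak channels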

  35. Resting membrane potential, >1 permeant ion • Parallel conductance model: Vm = (gK·EK + gNa·ENa + gCl·ECl) / (gK + gNa + gCl)
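A minimal MATLAB sketch of the parallel conductance model; the reversal potentials are the Nernst values from above, while the relative conductances are illustrative (not taken from the slides):

    % Resting potential as a conductance-weighted average of reversal potentials
    g = [8, 1, 2];                   % gK, gNa, gCl in nS -- illustrative values
    E = [-75, 50, -60];              % EK, ENa, ECl in mV, from the Nernst slide
    Vm = sum(g .* E) / sum(g)        % ~ -61 mV; pulled toward EK, the largest g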

  36. Ohmic channels • Characterized by an open-channel I/V that is linear • I = (Vrev - Vm)/R = (Vrev - Vm)·g • On the I/V plot, g is the slope and Vrev is the zero-current (reversal) potential

  37. Non-ohmic channels • Goldman-Hodgkin-Katz (GHK) theory • Ions pass independently • The electrical field within the membrane is constant • Sometimes known as the constant field equations • GHK current equation (flux in two directions): I_S = P_S·z²·(F²·Vm/RT)·([S]i - [S]o·e^(-zFVm/RT)) / (1 - e^(-zFVm/RT)) • GHK voltage equation: Vm = (RT/F)·ln[(PK[K]o + PNa[Na]o + PCl[Cl]i) / (PK[K]i + PNa[Na]i + PCl[Cl]o)]

  38. Nonlinear driving force • When [S]i > [S]o, the GHK current equation predicts a rectifying (nonlinear) I/V relation • A better description than ohmic for some channels, e.g. Ca2+, K+
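To see the rectification, the GHK current equation can be plotted over voltage; a minimal MATLAB sketch for a monovalent cation with [S]i > [S]o (permeability set to 1, so current is in arbitrary units):

    % GHK current for a K+-like gradient ([S]i = 130 mM > [S]o = 3 mM)
    R = 8.314; T = 310; F = 96485; z = 1;
    Si = 130e-3; So = 3e-3;                 % concentrations, mol/L
    Vm = (-100:50).' * 1e-3;                % membrane potential, V
    Vm(Vm == 0) = 1e-9;                     % avoid the 0/0 at Vm = 0
    u = z*F*Vm / (R*T);
    I = z^2 * F^2/(R*T) * Vm .* (Si - So*exp(-u)) ./ (1 - exp(-u));
    plot(Vm*1e3, I)                         % nonlinear, outwardly rectifying I/V
    xlabel('V_m (mV)'), ylabel('I (arbitrary units)')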

  39. Voltage dependent conductances • Channel opening is a function of transmembrane voltage

  40. Latching, up- and down-states • Stable systems have a positive slope in the I/V curve, e.g. neurons with only leak currents • Voltage-dependent conductances can lead to regions of negative slope conductance, with two stable states

  41. Two types of channels • gK:gNa = 3:1, both linear

  42. Two types of channels, one with V-dependent conductance • gK:gNa = 3:5

  43. Integration, membrane time constant (τ = R·C)
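A minimal MATLAB sketch of passive integration, reusing the R and C of the "typical" cell (the 100 pA step is an illustrative stimulus):

    % Membrane charging in response to a current step; tau = R*C
    R = 125e6; C = 80e-12;           % input resistance (Ohm), capacitance (F)
    tau = R * C;                     % time constant = 10 ms
    t = (0:0.05:50) * 1e-3;          % time, s
    I = 100e-12;                     % 100 pA current step
    dV = I*R * (1 - exp(-t/tau));    % exponential approach to I*R = 12.5 mV
    plot(t*1e3, dV*1e3), xlabel('t (ms)'), ylabel('\DeltaV (mV)')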

  44. Signalling • Synaptic input • Transient inputs from other sources • i.e. sensors or lower level neurons • Spike output • Generation of action potentials, which will then propagate the signal to the neurons at the next level, again via synapses

  45. Chemical synapses • Excitatory (Es > Em) • Inhibitory (Es < Em), including shunting inhibition • Rapid increase in gs followed by exponential decay (τD = 1 - 100 ms) • Approximated by an alpha function • Or by a sum of exponentials (more realistic)
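The two conductance waveforms are easy to compare; a minimal MATLAB sketch with illustrative time constants (within the 1-100 ms range above):

    % Synaptic conductance: alpha function vs. sum of two exponentials
    t = 0:0.01:50;                            % time, ms
    tau = 2;                                  % alpha-function time-to-peak, ms
    g_alpha = (t/tau) .* exp(1 - t/tau);      % peaks at 1 when t = tau
    tau_r = 0.5; tau_d = 5;                   % rise and decay constants, ms
    g_2exp = exp(-t/tau_d) - exp(-t/tau_r);   % difference of exponentials
    g_2exp = g_2exp / max(g_2exp);            % normalize to unit peak
    plot(t, g_alpha, t, g_2exp), xlabel('t (ms)'), ylabel('g_s (normalized)')
    legend('alpha function', 'sum of exponentials')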

  46. Spike generation • Nonlinear, "all or none" response • Based on an avalanche-type reaction • Characterized by: threshold, high-conductance reset, refractory period • Can be simulated by an integrate-and-fire synthetic neuron
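All three features show up in even the simplest model; a minimal MATLAB sketch of a leaky integrate-and-fire neuron with illustrative parameters (passive properties carried over from the slides above):

    % Leaky integrate-and-fire: threshold, reset, and refractory period
    dt = 0.1e-3; t = 0:dt:0.2;               % time base, s
    R = 125e6; C = 80e-12; tau = R*C;        % passive properties, tau = 10 ms
    Vrest = -65e-3; Vth = -50e-3;            % rest and threshold, V (illustrative)
    Vreset = -70e-3; tref = 2e-3;            % reset potential, refractory period
    I = 150e-12;                             % constant 150 pA drive
    V = Vrest * ones(size(t)); tspike = -inf;
    for k = 1:numel(t)-1
        if t(k) - tspike < tref
            V(k+1) = Vreset;                 % refractory: hold at reset
        else
            V(k+1) = V(k) + dt*(-(V(k)-Vrest) + R*I)/tau;   % leaky integration
            if V(k+1) >= Vth                 % threshold crossing = spike
                V(k+1) = Vreset; tspike = t(k+1);
            end
        end
    end
    plot(t*1e3, V*1e3), xlabel('t (ms)'), ylabel('V_m (mV)')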
