Is the superposition of many random spike trains a Poisson process?
1 Is the superposition of many random spike trains a Poisson process? Benjamin Lindner, Max-Planck-Institut für Physik komplexer Systeme, Dresden. Reference: Phys. Rev. E 73, 022901 (2006)
2 Outline: point processes; spike-train statistics and interval statistics; renewal and nonrenewal processes; the Poisson process; superposition of many trains - the paradox; resolution of the paradox; conclusions
3 Stationary point processes. Given by an ordered set of random points {t_i}, with statistics depending only on time differences (stationarity). We can associate each point with a δ spike, yielding the spike train x(t) = Σ_i δ(t − t_i). Examples of series of events occurring at random instances: emission of electrons (shot noise), occurrences of earthquakes, neural firings (action potentials), breakdown of a power supply, ...
4 Interval statistics. Interspike intervals are I_i = t_{i+1} − t_i. The point process can be equivalently described by a hierarchy of n-th order interval sequences {T_{n,k}}.
5 Interval statistics. Interspike intervals (ISIs) are I_i = t_{i+1} − t_i; equivalently, the n-th order interval is T_{n,k} = t_{k+n} − t_k. The serial correlation coefficient measures how strongly the ISIs are correlated:

ρ_k = ⟨(I_i − ⟨I_i⟩)(I_{i+k} − ⟨I_{i+k}⟩)⟩ / ⟨(I_i − ⟨I_i⟩)²⟩

A special class of point processes are renewal processes, for which all ISIs are statistically independent; the PDF of the ISI, P(I), then determines the statistics of the point process completely.
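As a numerical sketch (not part of the slides; names and parameters are illustrative), the serial correlation coefficient ρ_k can be estimated directly from an ISI sequence; for a renewal process, whose ISIs are independent, the estimate is close to zero for every lag k ≥ 1:

```python
import numpy as np

def serial_correlation(isis, k):
    """Estimate the serial correlation coefficient rho_k of an ISI sequence."""
    x = np.asarray(isis, dtype=float)
    if k == 0:
        return 1.0
    d = x - x.mean()
    # rho_k = <dI_i dI_{i+k}> / <dI_i^2>, with dI = I - <I>
    return np.mean(d[:-k] * d[k:]) / np.mean(d**2)

rng = np.random.default_rng(0)
# Renewal process: independent ISIs (here gamma-distributed), so rho_k ~ 0 for k >= 1
isis = rng.gamma(shape=2.0, scale=0.5, size=200_000)
print(serial_correlation(isis, 1))  # close to 0
```

For the nonrenewal processes discussed later, the same estimator returns small but systematically nonzero values over many lags.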
6 Spike train statistics. The mean value of the spike train is the spike rate: the probability of a spike in [t, t + dt] is r dt. For a stationary process this is equivalent to the time average

⟨x(t)⟩ = lim_{T→∞} (1/T) ∫_0^T dt Σ_i δ(t − t_i) = lim_{T→∞} N(T)/T = r
7 Spike train statistics II: second-order statistics. Correlation function:

k(τ) = ⟨x(t)x(t+τ)⟩ − ⟨x(t)⟩⟨x(t+τ)⟩

The first term can be written as ⟨x(t)x(t+τ)⟩ = rδ(τ) + r p(τ|0), where p(τ|0) is the conditional probability density of having a spike at t = τ given that we observed another one at t = 0. Power spectrum:

S(f) = ∫ dτ e^{2πi f τ} k(τ)

which saturates at high frequency at r, i.e. S(f) → r for f → ∞.
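The high-frequency saturation S(f) → r is easy to check numerically. The sketch below (my own estimator, not from the talk) bins a Poisson spike train, averages windowed periodograms, and recovers the flat spectrum S(f) = r:

```python
import numpy as np

def spike_spectrum(spikes, t_max, dt, win):
    """Average periodogram of a binned spike train x(t) = sum_i delta(t - t_i).
    Returns frequencies and S(f), with the DC bin (mean-rate peak) removed."""
    counts, _ = np.histogram(spikes, bins=int(round(t_max / dt)),
                             range=(0.0, t_max))
    n = int(round(win / dt))              # bins per window
    m = counts.size // n                  # number of windows to average
    seg = counts[: m * n].reshape(m, n)
    xf = np.fft.rfft(seg, axis=1)         # = dt * Fourier transform of x(t)
    S = np.mean(np.abs(xf) ** 2, axis=0) / win
    f = np.fft.rfftfreq(n, d=dt)
    return f[1:], S[1:]

rng = np.random.default_rng(1)
r, t_max = 5.0, 2000.0
spikes = np.cumsum(rng.exponential(1.0 / r, size=int(2 * r * t_max)))
spikes = spikes[spikes < t_max]
f, S = spike_spectrum(spikes, t_max, dt=0.01, win=20.0)
print(S.mean())   # flat spectrum, everywhere close to r = 5
```

For a non-Poissonian train the same estimator resolves the low-frequency structure while still saturating at r for large f.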
8 Example: a process with reduced probability of short intervals. [Figure: ISI density P(I), correlation function k(τ), and power spectrum S(f) for this process.]
9 Relation between ISI and spike-train statistics. Given the ISI and n-th order interval densities, can we calculate the power spectrum (f > 0)?

S(f) = ∫ dt k(t) e^{2πi f t} = r + 2r Re ∫_0^∞ dt p(t|0) e^{2πi f t} = r [1 + Σ_{n=1}^∞ (P̃_n(f) + P̃_n(−f))]

with P̃_n(f) = ∫_0^∞ dt P_n(t) e^{2πi f t}. For a renewal process P̃_n(f) = [P̃_1(f)]^n, hence

S(f) = r [1/(1 − P̃_1(f)) + 1/(1 − P̃_1(−f)) − 1] = r (1 − |P̃_1(f)|²) / |1 − P̃_1(f)|²
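The renewal formula can be checked in a few lines. For the exponential ISI density P_1(I) = r e^{−rI} one has P̃_1(f) = r/(r − 2πi f), and the formula collapses to the flat Poisson spectrum S(f) = r (a sketch, using the e^{+2πift} transform convention of the slides):

```python
import numpy as np

def renewal_spectrum(p1_tilde, r, f):
    """S(f) = r (1 - |P1(f)|^2) / |1 - P1(f)|^2 for a renewal process."""
    P = p1_tilde(f)
    return r * (1.0 - np.abs(P) ** 2) / np.abs(1.0 - P) ** 2

r = 2.0
# Fourier transform (e^{+2 pi i f t} convention) of the exponential ISI density r e^{-r I}
p_exp = lambda f: r / (r - 2j * np.pi * f)

f = np.linspace(0.1, 10.0, 50)
S = renewal_spectrum(p_exp, r, f)
print(np.allclose(S, r))   # True: the Poisson process has the flat spectrum S(f) = r
```

Plugging in the transform of any other ISI density gives the spectrum of the corresponding renewal train.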
10 The simplest process: Poissonian shot noise. Completely random occurrence of points; no correlations between spikes, hence a flat power spectrum:

k(τ) = rδ(τ), S(f) = r

Exponential ISI density, and the ISIs are uncorrelated (a special renewal process):

P(I) = r exp(−rI), ρ_k = 0 for k > 0

PDF of the spike count (number of events in [0, T]):

P_T(N) = (rT)^N exp(−rT) / N!
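These properties are easy to verify; a minimal sketch (my own, with illustrative parameter values) checks the spike-count statistics of a train built from independent exponential ISIs:

```python
import numpy as np

rng = np.random.default_rng(2)
r = 3.0

# Poisson train from independent exponential ISIs
spikes = np.cumsum(rng.exponential(1.0 / r, size=500_000))

# Spike counts in disjoint windows of length T
T = 2.0
counts = np.histogram(spikes, bins=np.arange(0.0, spikes[-1], T))[0]

print(counts.mean())                 # ~ r*T = 6, the mean of P_T(N)
print(counts.var() / counts.mean())  # Fano factor ~ 1, as for Poisson counts
```

The Fano factor Var(N)/⟨N⟩ = 1 is a fingerprint of the Poisson count distribution P_T(N).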
11 Three ways of generating a Poisson process. (1) Distribute N points independently and uniformly on an interval [0, T̂]; for a time window [0, T] with T ≪ T̂ the resulting point process will be Poissonian. (2) Go through the interval [0, T] in steps of Δt ≪ 1/r; in each step draw an independent random number ξ_i uniformly distributed in [0, 1]; if ξ_i < r Δt, there is an event, otherwise not. (3) Draw a sequence of independent random numbers T_j according to the exponential PDF r exp(−rT); the sequence of spike times is then given by t_i = Σ_{j=1}^i T_j.
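The three recipes can be sketched as follows (variable names and parameter values are illustrative); each method reproduces the same rate r:

```python
import numpy as np

rng = np.random.default_rng(3)
r, T = 4.0, 2000.0

# (1) Uniform points on a much longer interval [0, T_hat], observed in [0, T]
T_hat = 20 * T
pts = rng.uniform(0.0, T_hat, size=int(r * T_hat))
train1 = np.sort(pts[pts < T])

# (2) Bernoulli events on a fine grid: event in a bin if xi < r*dt
dt = 0.001                      # dt << 1/r
grid = np.arange(0.0, T, dt)
train2 = grid[rng.uniform(size=grid.size) < r * dt]

# (3) Cumulative sum of independent exponential ISIs
train3 = np.cumsum(rng.exponential(1.0 / r, size=int(2 * r * T)))
train3 = train3[train3 < T]

for tr in (train1, train2, train3):
    print(len(tr) / T)          # empirical rate, each ~ r = 4
```

Method (3) is the one reused below for non-Poissonian trains: replacing the exponential ISI density by any other P(I) yields a general renewal process.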
12 Is the superposition of many trains Poissonian? Given a large number of independent non-Poissonian spike trains: is their superposition a Poisson process? Is there a central limit theorem for point processes? Important, for instance, for the problem of estimating the effective input of a neuron connected to 10⁴ other neurons.
13 Apparently it is... Classical proof by Khinchine (1960): if the total rate of X(t) = Σ_{j=1}^N x_j(t), which is r = Σ_{j=1}^N r_j, remains constant in the limit N → ∞, then X(t) approaches a Poisson process in this limit [additional assumptions on x_j(t) exclude some pathological cases]. More recent claims: "Similarly to the central limit theorem [for random variables]..., the sum of independent renewal point processes tends to a Poisson process..." (Shimokawa et al., PRE 1999). "Since the sum of independent renewal processes tends to a Poisson process, the superposition of a large number of output spike trains can be approximated by a... Poisson process..." (Hohn & Burkitt, PRE, 2001)
14 Numerical results seem to support it... Numerics shows that for N → ∞ we get an exponential ISI density and a flat power spectrum, i.e. a Poisson process. [Figure: power spectrum of the summed spike train (Hohn & Burkitt, PRE, 2001). Lower curve: spectrum of a single spike train; because of the refractory period it shows a dip at low frequencies. Upper curve: spectrum of the superposition of many independent spike trains; apparently the dip starts to vanish and the spectrum becomes flat also at low frequencies.]
15 My test model. The single spike train is a renewal but non-Poissonian process with ISI density and power spectrum

p_α(T) = 4r²T exp(−2rT), S_α(f) = r [1 − 2r²/(4r² + (πf)²)]

Consider the superposition

X(t) = (1/N) Σ_{n=1}^N x_n(t)

(the prefactor keeps the mean value of the spike train constant and is not of any relevance for the problem). Is X(t) a Poisson spike train?
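The single-train model is an Erlang-2 (gamma) renewal process: each ISI is the sum of two independent exponentials with rate 2r. A quick sketch (assumed parameterization, not code from the talk) confirms the rate, the reduced variability CV² = 1/2, and the low-frequency value S_α(0) = r/2:

```python
import numpy as np

rng = np.random.default_rng(4)
r = 1.0

# Erlang-2 ISIs: sum of two independent exponentials with rate 2r
isis = rng.exponential(1.0 / (2 * r), size=(400_000, 2)).sum(axis=1)

mean_isi = isis.mean()
cv2 = isis.var() / mean_isi**2
print(mean_isi)   # ~ 1/r: the train fires with rate r
print(cv2)        # ~ 1/2: more regular than a Poisson train (CV^2 = 1)

# Analytic spectrum at f = 0: S_alpha(0) = r (1 - 2r^2/(4r^2))
S0 = r * (1.0 - 2 * r**2 / (4 * r**2 + (np.pi * 0.0) ** 2))
print(S0)         # r/2, half the Poisson value S = r
```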
16 ISI statistics: why Poissonian statistics can be naively concluded. [Figure: spike trains, ISI PDFs, and ISI correlation coefficients ρ_k for increasing N.] For large N the ISI density approaches a simple exponential, and the ISI correlations at any finite lag vanish ⇒ X(t) appears to approach a renewal process with exponential ISI density, i.e. a Poisson process.
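This naive conclusion can be reproduced numerically; the sketch below (illustrative parameters) pools N independent Erlang-2 trains and finds exponential-like ISI variability and vanishing lag-one correlations:

```python
import numpy as np

rng = np.random.default_rng(5)
r, N = 1.0, 50

# Pool N independent Erlang-2 renewal trains (each of rate r)
trains = [np.cumsum(rng.exponential(1 / (2 * r), size=(20_000, 2)).sum(axis=1))
          for _ in range(N)]
t_max = min(tr[-1] for tr in trains)
pooled = np.sort(np.concatenate(trains))
isis = np.diff(pooled[pooled < t_max])

d = isis - isis.mean()
cv2 = np.mean(d**2) / isis.mean() ** 2
rho1 = np.mean(d[:-1] * d[1:]) / np.mean(d**2)
print(cv2)    # ~ 1: exponential-like ISI density
print(rho1)   # ~ 0: correlations at any single fixed lag vanish for large N
```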
17 Spectral statistics: why X(t) is not a Poissonian spike train. Correlation function of the summed spike train:

K_X(τ) = ⟨X(t)X(t+τ)⟩ − ⟨X(t)⟩⟨X(t+τ)⟩ = (1/N²) Σ_{n,l} [⟨x_n(t)x_l(t+τ)⟩ − ⟨x_n(t)⟩⟨x_l(t+τ)⟩]

Splitting the sum into diagonal (n = l) and off-diagonal (n ≠ l) terms, the latter vanish because the trains are independent. Hence

K_X(τ) = (1/N) K_x(τ), S_X(f) = (1/N) S_x(f)

[Figure: S_X(f) for different N; the spectra differ only by the factor 1/N and are not flat.] A similar recent finding: Câteau & Reyes, Phys. Rev. Lett. 96, 058101 (2006).
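The non-Poissonian character also shows up in the counts: for a stationary point process the long-window Fano factor Var(N_T)/⟨N_T⟩ approaches S(0) divided by the rate, and pooling independent trains leaves it at the single-train value 1/2 instead of the Poisson value 1. A sketch (illustrative parameters; the 1/N amplitude prefactor is ignored, since it does not affect count statistics):

```python
import numpy as np

rng = np.random.default_rng(6)
r, N, T = 1.0, 20, 50.0

# Pool N independent Erlang-2 trains, each of rate r
trains = [np.cumsum(rng.exponential(1 / (2 * r), size=(50_000, 2)).sum(axis=1))
          for _ in range(N)]
t_max = min(tr[-1] for tr in trains)
pooled = np.concatenate(trains)

# Long-window Fano factor of the pooled spike counts
counts = np.histogram(pooled, bins=np.arange(0.0, t_max, T))[0]
fano = counts.var() / counts.mean()
print(fano)   # ~ 1/2 = S(0)/rate of the single train, not the Poisson value 1
```

Since independent counts simply add, pooling cannot change the Fano factor, which is just another way of stating S_X(f) ∝ S_x(f).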
18 Result for the power spectrum by Hohn and Burkitt? Reminder: H&B considered the output of many leaky integrate-and-fire neurons driven by white noise and used X(t) = Σ x_n(t) [not X(t) = (1/N) Σ x_n(t)]. Hence S_X(f) = N S_x(f). [Figure: power spectra (dB) of the single train and of the superposition.] The shifted curve is in good agreement with the spectrum of the single spike train; the result by Hohn and Burkitt is a numerical artifact!
19 The paradox. On the one hand... the ISI statistics tell us that for large N we deal with a renewal process with exponential ISI density. A renewal process is completely determined by its ISI density, and thus if its density agrees with that of a Poisson process, it is a Poisson process. On the other hand... the power spectrum of X(t) for arbitrary N is exactly proportional to that of the single spike train; it is not flat for a non-Poissonian x(t), and thus X(t) is not a Poisson process. ???
20 Resolution of the paradox. X(t) is not a renewal process, even in the limit N → ∞, but a process with infinitesimally small correlations between the ISIs over infinitely many lags. Consider the spectrum at zero frequency: here S_α(0) = r/2, while for a Poisson shot noise we would have S_poi(0) = r. At zero frequency we have

S(0) = r³ ⟨(I_i − ⟨I_i⟩)²⟩ [1 + 2 Σ_{k=1}^∞ ρ_k]

(for exponential ISI statistics ⟨(I_i − ⟨I_i⟩)²⟩ = r^{−2}). The sum over correlation coefficients makes a finite impact: Σ_{k=1}^∞ ρ_k → −1/4 for N → ∞. [Figure: partial sums Σ_{k=1}^m ρ_k versus m for different N.]
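The cumulative sum of serial correlation coefficients of the pooled ISI sequence can be estimated directly; with Erlang-2 components (illustrative parameters) the partial sums settle near −1/4, tiny coefficients over many lags adding up to a finite value:

```python
import numpy as np

rng = np.random.default_rng(7)
r, N = 1.0, 20

trains = [np.cumsum(rng.exponential(1 / (2 * r), size=(25_000, 2)).sum(axis=1))
          for _ in range(N)]
t_max = min(tr[-1] for tr in trains)
pooled = np.sort(np.concatenate(trains))
isis = np.diff(pooled[pooled < t_max])

d = isis - isis.mean()
var = np.mean(d**2)
# Partial sum of serial correlation coefficients up to lag m = 10 N
rho = np.array([np.mean(d[:-k] * d[k:]) / var for k in range(1, 10 * N + 1)])
print(rho.sum())   # ~ -1/4 for large N, far from the renewal value 0
```

Each individual ρ_k is of order 1/N and would be dismissed in a lag-by-lag test, which is exactly how the renewal (and hence Poisson) conclusion sneaks in.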
21 Sticking to Khinchine's exact condition. Dilute the single spike trains as more and more trains are added, such that the total rate is conserved: X̂(t) = Σ_{n=1}^N x_n(t/N). Then X̂ does approach a Poisson process. [Figure: spectra of X̂ for increasing N, becoming flat.] This is plainly so because the spectrum of each diluted spike train tends to a flat spectrum at any fixed frequency (the interesting spectral features are shifted to lower and lower frequencies). However, this limit is not relevant for the neural (and other) applications!
22 Khinchine's exact conditions are not met for neurons! The input rate of a single presynaptic neuron ≈ the output rate of the postsynaptic neuron; it does not vanish as the number of inputs grows. The output rate is a functional of the superposed input spike trains.
23 Feed-forward network - transmission of transient stimuli (Câteau & Reyes, Phys. Rev. Lett. 96, 058101 (2006)). [Figure: simulated and predicted behavior of a feedforward network.] Left: simulation of the time-dependent firing rate for different layers; synchrony among neurons increases while progressing through the layers. Right: theoretical predictions of the firing rates. Grey: using the Poisson approximation, the rate flattens out (no synchrony). Black: taking the spectral features of the superposed spike train into account yields increasing synchrony, as in the simulations.
24 Conclusions. Be careful and cautious with limit theorems! Check all the models that are based on the Poisson assumption (diffusion approximation).
More informationTime Reversibility and Burke s Theorem
Queuing Analysis: Time Reversibility and Burke s Theorem Hongwei Zhang http://www.cs.wayne.edu/~hzhang Acknowledgement: this lecture is partially based on the slides of Dr. Yannis A. Korilis. Outline Time-Reversal
More informationUtility of Correlation Functions
Utility of Correlation Functions 1. As a means for estimating power spectra (e.g. a correlator + WK theorem). 2. For establishing characteristic time scales in time series (width of the ACF or ACV). 3.
More informationSession 1: Probability and Markov chains
Session 1: Probability and Markov chains 1. Probability distributions and densities. 2. Relevant distributions. 3. Change of variable. 4. Stochastic processes. 5. The Markov property. 6. Markov finite
More informationDYNAMICS OF CURRENT-BASED, POISSON DRIVEN, INTEGRATE-AND-FIRE NEURONAL NETWORKS
DYNAMICS OF CURRENT-BASED, POISSON DRIVEN, INTEGRATE-AND-FIRE NEURONAL NETWORKS KATHERINE A. NEWHALL, GREGOR KOVAČIČ, PETER R. KRAMER, DOUGLAS ZHOU, AADITYA V. RANGAN, AND DAVID CAI Abstract. Synchronous
More informationFitting a Stochastic Neural Network Model to Real Data
Fitting a Stochastic Neural Network Model to Real Data Christophe Pouzat, Ludmila Brochini, Pierre Hodara and Guilherme Ost MAP5 Univ. Paris-Descartes and CNRS Neuromat, USP christophe.pouzat@parisdescartes.fr
More information13. Power Spectrum. For a deterministic signal x(t), the spectrum is well defined: If represents its Fourier transform, i.e., if.
For a deterministic signal x(t), the spectrum is well defined: If represents its Fourier transform, i.e., if jt X ( ) = xte ( ) dt, (3-) then X ( ) represents its energy spectrum. his follows from Parseval
More informationAT2 Neuromodeling: Problem set #3 SPIKE TRAINS
AT2 Neuromodeling: Problem set #3 SPIKE TRAINS Younesse Kaddar PROBLEM 1: Poisson spike trains Link of the ipython notebook for the code Brain neuron emit spikes seemingly randomly: we will aim to model
More informationUniversal examples. Chapter The Bernoulli process
Chapter 1 Universal examples 1.1 The Bernoulli process First description: Bernoulli random variables Y i for i = 1, 2, 3,... independent with P [Y i = 1] = p and P [Y i = ] = 1 p. Second description: Binomial
More information1 Elementary probability
1 Elementary probability Problem 1.1 (*) A coin is thrown several times. Find the probability, that at the n-th experiment: (a) Head appears for the first time (b) Head and Tail have appeared equal number
More informationMultiple Random Variables
Multiple Random Variables Joint Probability Density Let X and Y be two random variables. Their joint distribution function is F ( XY x, y) P X x Y y. F XY ( ) 1, < x
More informationPoisson Processes. Particles arriving over time at a particle detector. Several ways to describe most common model.
Poisson Processes Particles arriving over time at a particle detector. Several ways to describe most common model. Approach 1: a) numbers of particles arriving in an interval has Poisson distribution,
More informationENSC327 Communications Systems 19: Random Processes. Jie Liang School of Engineering Science Simon Fraser University
ENSC327 Communications Systems 19: Random Processes Jie Liang School of Engineering Science Simon Fraser University 1 Outline Random processes Stationary random processes Autocorrelation of random processes
More informationExponential Distribution and Poisson Process
Exponential Distribution and Poisson Process Stochastic Processes - Lecture Notes Fatih Cavdur to accompany Introduction to Probability Models by Sheldon M. Ross Fall 215 Outline Introduction Exponential
More informationSignals and Spectra (1A) Young Won Lim 11/26/12
Signals and Spectra (A) Copyright (c) 202 Young W. Lim. Permission is granted to copy, distribute and/or modify this document under the terms of the GNU Free Documentation License, Version.2 or any later
More information