Is the superposition of many random spike trains a Poisson process?


Is the superposition of many random spike trains a Poisson process?
Benjamin Lindner, Max-Planck-Institut für Physik komplexer Systeme, Dresden
Reference: Phys. Rev. E 73, 022901 (2006)

Outline
- Point processes: spike-train statistics and interval statistics
- Renewal and non-renewal processes
- Poisson process
- Superposition of many trains - the paradox
- Resolution of the paradox
- Conclusions

Stationary point processes
Given by an ordered set of random points {t_i} (..., t_{i-1}, t_i, t_{i+1}, ...) with statistics depending only on time differences (stationarity).
We can associate each point with a δ spike, giving the spike train x(t) = Σ_i δ(t − t_i).
Examples of series of events occurring at random (or "random") instants: emission of an electron (shot noise), occurrences of earthquakes, neural firings (action potentials), breakdown of a power supply, ...

Interval statistics
Interspike intervals are I_i = t_{i+1} − t_i.
[Figure: spike times t_{i-1}, ..., t_{i+3} with the intervals I_{i-1}, ..., I_{i+3} marked.]
They can equivalently be described by a hierarchy of interval sequences {T_n,k} (first-, second-, third-order intervals T_1,i, T_2,i, T_3,i, ...).

Interval statistics
Interspike intervals (ISIs): I_i = t_{i+1} − t_i. Equivalent description: n-th order intervals T_n,i = t_{i+n} − t_i.
The serial correlation coefficient measures how strongly the ISIs are correlated:
ρ_k = ⟨(I_i − ⟨I_i⟩)(I_{i+k} − ⟨I_{i+k}⟩)⟩ / ⟨(I_i − ⟨I_i⟩)²⟩
Special class of point processes: renewal processes, for which all ISIs are statistically independent. For these, the ISI probability density P(I) determines the statistics of the point process completely.
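
To make these definitions concrete, here is a minimal numpy sketch (helper names and parameters are my own, not from the talk) that computes ISIs, n-th order intervals and the serial correlation coefficients ρ_k from a list of spike times; for a renewal process the estimated ρ_k should scatter around zero.

```python
import numpy as np

def nth_order_intervals(spike_times, n):
    """T_{n,i} = t_{i+n} - t_i, the n-th order intervals."""
    t = np.asarray(spike_times, dtype=float)
    return t[n:] - t[:-n]

def serial_correlation(isis, max_lag=5):
    """Serial correlation coefficients
    rho_k = <(I_i - <I>)(I_{i+k} - <I>)> / <(I_i - <I>)^2>."""
    dev = np.asarray(isis, dtype=float)
    dev = dev - dev.mean()
    var = np.mean(dev**2)
    return np.array([np.mean(dev[:-k] * dev[k:]) / var
                     for k in range(1, max_lag + 1)])

# Renewal example: i.i.d. exponential ISIs -> rho_k should be close to 0.
rng = np.random.default_rng(0)
spikes = np.cumsum(rng.exponential(scale=1.0, size=100_000))
isis = np.diff(spikes)                       # I_i = t_{i+1} - t_i
print("first-order intervals equal ISIs:", np.allclose(isis, nth_order_intervals(spikes, 1)))
print("rho_1..5 =", np.round(serial_correlation(isis), 4))
```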

Spike-train statistics
Mean value of the spike train = spike rate: the probability to have a spike in [t, t + dt] is given by r dt.
For a stationary process this is equivalent to the time average
⟨x(t)⟩ = lim_{T→∞} (1/T) ∫_0^T dt Σ_i δ(t − t_i) = lim_{T→∞} N(T)/T = r

Spike-train statistics II: second-order statistics
Correlation function: k(τ) = ⟨x(t)x(t + τ)⟩ − ⟨x(t)⟩⟨x(t + τ)⟩.
The first term can be written as ⟨x(t)x(t + τ)⟩ = r δ(τ) + r p(τ|0), where p(τ|0) is the conditional probability (density) of having a spike at t = τ given that we observed another one at t = 0.
Power spectrum: S(f) = ∫ dτ e^{2πifτ} k(τ). It saturates at high frequencies at the rate, i.e. S(f → ∞) = r.
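
A rough numerical sketch of how such a spectrum can be estimated (bin the spike train with resolution dt, remove the mean, average periodograms over windows); the estimator and its parameters are illustrative choices, not part of the talk. For a Poisson train of rate r it should return an approximately flat spectrum at the value r.

```python
import numpy as np

def spike_train_spectrum(spike_times, t_max, dt=1e-3, n_win=100):
    """Estimate S(f) of a spike train by binning it with resolution dt
    (each spike becomes a pulse of height 1/dt), removing the mean, and
    averaging periodograms over n_win windows."""
    counts, _ = np.histogram(spike_times, bins=int(t_max / dt), range=(0.0, t_max))
    x = counts / dt                               # approximates sum_i delta(t - t_i)
    win = len(x) // n_win
    segs = x[:n_win * win].reshape(n_win, win)
    segs = segs - segs.mean(axis=1, keepdims=True)
    ft = np.fft.rfft(segs, axis=1) * dt           # approx. continuous Fourier transform
    S = np.mean(np.abs(ft)**2, axis=0) / (win * dt)
    return np.fft.rfftfreq(win, d=dt), S

# Poisson train of rate r: the spectrum should be flat at S(f) = r for f > 0.
rng = np.random.default_rng(1)
r, t_max = 5.0, 2000.0
spikes = np.cumsum(rng.exponential(1.0 / r, size=int(2 * r * t_max)))
spikes = spikes[spikes < t_max]
f, S = spike_train_spectrum(spikes, t_max)
band = (f > 0) & (f < 50.0)
print("estimated S for 0 < f < 50:", S[band].mean(), " expected:", r)
```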

Example
A process with reduced probability of short intervals.
[Figure: ISI density P(I), correlation function k(τ), and power spectrum S(f) for this process.]

Relation between ISI and spike-train statistics
Given the ISI density and the n-th order interval densities P_n(t), can we calculate the power spectrum (for f > 0)?
S(f) = ∫ dτ k(τ) e^{2πifτ}
     = r + 2r Re ∫_0^∞ dt p(t|0) e^{2πift}
     = r + 2r Re Σ_{n=1}^∞ ∫_0^∞ dt P_n(t) e^{2πift}
     = r [1 + Σ_{n=1}^∞ (P_n(f) + P_n*(f))]
For a renewal process P_n(f) = [P_1(f)]^n, hence
S(f) = r [1 + (1/(1 − P_1(f)) − 1) + (1/(1 − P_1*(f)) − 1)] = r (1 − |P_1(f)|²) / |1 − P_1(f)|²
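
As a quick sanity check of the renewal formula (my own sketch, using the convention P_1(f) = ∫ dT p(T) e^{2πifT}), plugging in the exponential ISI density of the Poisson process recovers the flat spectrum S(f) = r:

```python
import numpy as np

def renewal_spectrum(P1, r, f):
    """S(f) = r (1 - |P1(f)|^2) / |1 - P1(f)|^2 for a renewal process,
    where P1(f) is the Fourier transform of the ISI density."""
    P = P1(f)
    return r * (1.0 - np.abs(P)**2) / np.abs(1.0 - P)**2

r = 5.0
f = np.linspace(0.01, 20.0, 500)

# Exponential ISI density p(T) = r exp(-r T):  P1(f) = r / (r - 2*pi*i*f).
P1_exp = lambda f: r / (r - 2j * np.pi * f)
S = renewal_spectrum(P1_exp, r, f)
print("flat Poisson spectrum recovered:", np.allclose(S, r))
```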

The simplest process: Poissonian shot noise
Completely random occurrence of points, no correlations between spikes, hence a flat power spectrum:
k(τ) = r δ(τ),   S(f) = r
Exponential ISI density, and the ISIs are uncorrelated (a special renewal process):
P(I) = r exp[−rI],   ρ_k = 0 for k > 0
PDF of the spike count (number of events in [0, T]):
P_T(N) = (rT)^N exp[−rT] / N!
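
A small simulation sketch (parameters arbitrary) that builds Poisson trains from i.i.d. exponential ISIs and compares the empirical spike-count distribution in [0, T] with the formula P_T(N) = (rT)^N e^{−rT}/N!:

```python
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(5)
r, T, n_trials = 2.0, 3.0, 50_000

# Simulate many independent Poisson trains on [0, T] from exponential ISIs
# and count the spikes in each trial.
isis = rng.exponential(1.0 / r, size=(n_trials, int(5 * r * T) + 20))
counts = np.sum(np.cumsum(isis, axis=1) < T, axis=1)

print(" N   P_T(N) theory   simulated")
for N in range(10):
    p_theory = (r * T)**N * exp(-r * T) / factorial(N)
    print(f"{N:2d}   {p_theory:.4f}         {np.mean(counts == N):.4f}")
```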

Three ways of generating a Poisson process
1. Distribute N points independently and uniformly on a long interval [0, T₀]. For a time window [0, T] with T ≪ T₀, the resulting point process will be Poissonian.
2. Go through the interval [0, T] in steps of Δt ≪ 1/r. In each step draw an independent random number ξ_i uniformly distributed in [0, 1]. If ξ_i < r Δt, there is an event, otherwise not.
3. Draw a sequence of independent random numbers T_j according to the exponential PDF r exp[−rt]. The sequence of spike times is then given by t_i = Σ_{j=1}^i T_j.
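
The three recipes, sketched in numpy (rates, window lengths, and the interval T₀ are illustrative choices; all three should produce about rT spikes in [0, T]):

```python
import numpy as np
rng = np.random.default_rng(42)
r, T = 5.0, 100.0            # rate and window length

# 1) Distribute a fixed number of points uniformly on a much longer interval
#    [0, T0] and keep only those falling into the window [0, T] (T << T0).
T0 = 100 * T
pts = rng.uniform(0.0, T0, size=int(r * T0))
spikes_1 = np.sort(pts[pts < T])

# 2) Bernoulli approximation: step through [0, T] with dt << 1/r and place a
#    spike in a bin with probability r*dt.
dt = 1e-3
n_bins = int(T / dt)
spikes_2 = (np.nonzero(rng.uniform(size=n_bins) < r * dt)[0] + 0.5) * dt

# 3) Draw i.i.d. exponential ISIs and accumulate them: t_i = sum_{j<=i} T_j.
isis = rng.exponential(1.0 / r, size=int(2 * r * T))
spikes_3 = np.cumsum(isis)
spikes_3 = spikes_3[spikes_3 < T]

for s in (spikes_1, spikes_2, spikes_3):
    print(len(s), "spikes (expected about", int(r * T), ")")
```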

Is the superposition of many trains Poissonian?
Given a large number of independent non-Poissonian spike trains: Is their superposition a Poisson process? Is there a central limit theorem for point processes?
Important, for instance, for the problem of estimating the effective input of a neuron connected to ~10⁴ other neurons.

Apparently it is...
Classical proof by Khinchine (1960): If the total rate of X(t) = Σ_{j=1}^N x_j(t), which is r = Σ_{j=1}^N r_j, remains constant in the limit N → ∞, then X(t) approaches a Poisson process in this limit [additional assumptions on the x_j(t) exclude some pathological cases].
More recent claims:
"Similarly to the central limit theorem [for random variables] ..., the sum of independent renewal point processes tends to a Poisson process ..." (Shimokawa et al., PRE 1999)
"Since the sum of independent renewal processes tends to a Poisson process, the superposition of a large number of output spike trains can be approximated by a ... Poisson process ..." (Hohn & Burkitt, PRE 2001)

Numerical results seem to support it...
Numerics show that for N → ∞ we get an exponential ISI density and a flat power spectrum, i.e. a Poisson process.
[Figure: power spectrum of the summed spike train (Hohn & Burkitt, PRE 2001).]
Lower curve: spectrum of a single spike train; because of the refractory period it shows a dip at low frequencies.
Upper curve: spectrum of the superposition of many independent spike trains; apparently the dip starts to vanish and the spectrum becomes flat also at low frequencies.

My test model
Each single spike train is a renewal but non-Poissonian process with ISI density and power spectrum
p_α(T) = 4r² T exp[−2rT],   S_α(f) = r [1 − 2r² / (4r² + (πf)²)]
Consider the superposition
X(t) = (1/N) Σ_{n=1}^N x_n(t)
(the prefactor keeps the mean value of the spike train constant; it is of no relevance for the problem).
Is X(t) a Poisson spike train?
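
A simulation sketch of this test model (parameter values are my own): the ISI density p_α(T) = 4r²T e^{−2rT} is a Gamma distribution with shape 2 and scale 1/(2r), so single trains and their superposition are easy to generate. Since the 1/N prefactor only rescales the spike amplitudes, the pooled spike times carry all the point-process statistics; the coefficient of variation of the pooled ISIs rises from 1/√2 toward the Poisson value 1 as N grows, which is part of why the superposition looks Poissonian at the ISI level.

```python
import numpy as np

rng = np.random.default_rng(7)
r = 1.0          # rate of a single spike train
T = 5000.0       # simulation time

def erlang2_train(r, T, rng):
    """Renewal spike train with ISI density p(T) = 4 r^2 T exp(-2 r T)
    (sum of two exponentials with rate 2r, i.e. a Gamma(2, 1/(2r)) ISI)."""
    n = int(2 * r * T) + 100
    isis = rng.gamma(shape=2.0, scale=1.0 / (2.0 * r), size=n)
    t = np.cumsum(isis)
    return t[t < T]

def superposition(N, r, T, rng):
    """Pooled spike times of N independent Erlang-2 trains."""
    return np.sort(np.concatenate([erlang2_train(r, T, rng) for _ in range(N)]))

for N in (1, 2, 10, 100):
    pooled = superposition(N, r, T, rng)
    isis = np.diff(pooled)
    cv = isis.std() / isis.mean()
    print(f"N={N:4d}: CV of pooled ISIs = {cv:.3f} (Poisson would give 1)")
```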

ISI statistics: why Poissonian statistics can naively be concluded from them
[Figure: spike trains, ISI densities, and ISI correlation coefficients ρ_k of the superposition for increasing N.]
For large N the ISI density approaches a simple exponential, and the ISI correlations at any finite lag vanish ⇒ X(t) apparently approaches a renewal process with exponential ISI density, i.e. a Poisson process.

Spectral statistics: why X(t) is not a Poissonian spike train
Correlation function of the summed spike train:
K_X(τ) = ⟨X(t)X(t + τ)⟩ − ⟨X(t)⟩⟨X(t + τ)⟩
       = (1/N²) Σ_{n,l} [⟨x_n(t)x_l(t + τ)⟩ − ⟨x_n(t)⟩⟨x_l(t + τ)⟩]
       = (1/N²) Σ_n [⟨x_n(t)x_n(t + τ)⟩ − ⟨x_n(t)⟩⟨x_n(t + τ)⟩] + (1/N²) Σ_{n≠l} [⟨x_n(t)x_l(t + τ)⟩ − ⟨x_n(t)⟩⟨x_l(t + τ)⟩]
The second sum vanishes because the trains are independent. Hence
K_X(τ) = (1/N) K_x(τ)   ⇒   S_X(f) = (1/N) S_x(f)
[Figure: power spectra S_X(f) for increasing N; apart from the factor 1/N they all have the same non-flat shape.]
A similar recent finding: Cateau & Reyes, Phys. Rev. Lett. 96, 058101 (2006).
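
The scaling S_X(f) = S_x(f)/N (equivalently, the plain sum of the trains has spectrum N·S_x(f)) can be checked numerically; the sketch below (illustrative parameters and estimator, not from the talk) compares the estimated low-frequency spectra of a single Erlang-2 train and of the pooled train of N such trains with the analytic S_α(f).

```python
import numpy as np

rng = np.random.default_rng(3)
r, T, dt, n_win = 1.0, 10000.0, 5e-3, 100

def erlang2_train(r, T, rng):
    isis = rng.gamma(2.0, 1.0 / (2.0 * r), size=int(2 * r * T) + 100)
    t = np.cumsum(isis)
    return t[t < T]

def binned_spectrum(spikes, T, dt, n_win):
    counts, _ = np.histogram(spikes, bins=int(T / dt), range=(0.0, T))
    x = counts / dt
    L = len(x) // n_win
    segs = x[:n_win * L].reshape(n_win, L)
    segs = segs - segs.mean(axis=1, keepdims=True)
    S = np.mean(np.abs(np.fft.rfft(segs, axis=1) * dt)**2, axis=0) / (L * dt)
    return np.fft.rfftfreq(L, d=dt), S

f, S_single = binned_spectrum(erlang2_train(r, T, rng), T, dt, n_win)

N = 50
pooled = np.sort(np.concatenate([erlang2_train(r, T, rng) for _ in range(N)]))
_, S_pooled = binned_spectrum(pooled, T, dt, n_win)

# Analytic single-train spectrum S_alpha(f) from the test-model slide.
S_alpha = r * (1.0 - 2.0 * r**2 / (4.0 * r**2 + (np.pi * f)**2))
band = (f > 0.02) & (f < 0.2)        # low frequencies, well inside the dip
print("theory S_alpha   :", S_alpha[band].mean())
print("single train     :", S_single[band].mean())
print("pooled train / N :", (S_pooled[band] / N).mean())
print("if the pooled train were Poisson, pooled/N would be flat at r =", r)
```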

Result for the power spectrum by Hohn and Burkitt?
Reminder: H&B considered the output of many leaky integrate-and-fire neurons driven by white noise and used X(t) = Σ_n x_n(t) [not X(t) = (1/N) Σ_n x_n(t)]. Hence S_X(f) = N S_x(f).
[Figure: power spectra (in dB) of the single train, of the summed train, and of a curve shifted by 10 log₁₀ N dB.]
The shifted curve is in good agreement with the spectrum of the single spike train. The result by Hohn and Burkitt is a numerical artifact!

The paradox
On the one hand... the ISI statistics tell us that for large N we are dealing with a renewal process with exponential ISI density. A renewal process is completely determined by its ISI density, and thus, if its density agrees with that of a Poisson process, it is a Poisson process.
On the other hand... the power spectrum of X(t) is, for arbitrary N, exactly proportional to that of the single spike train; it is not flat for a non-Poissonian x(t), and thus X(t) is not a Poisson process.
???

Resolution of the paradox
X(t) is not a renewal process, even in the limit N → ∞, but a process with infinitesimally small correlations between the ISIs at infinitely many lags.
Look at the spectrum at zero frequency: S_α(0) = r/2, while for Poisson shot noise we would have S_poi(0) = r. At zero frequency we have
S(0) = r³ ⟨(I_i − ⟨I_i⟩)²⟩ [1 + 2 Σ_{k=1}^∞ ρ_k]
(for exponential ISI statistics, ⟨(I_i − ⟨I_i⟩)²⟩ = 1/r²).
The sum over the correlation coefficients makes a finite impact: Σ_{k=1}^∞ ρ_k → −1/4 for N → ∞.
[Figure: cumulative sums Σ_{k=1}^m ρ_k vs. m for different N, approaching −1/4.]
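
A sketch of this bookkeeping at zero frequency (parameters are my own): for the pooled train, each individual serial correlation coefficient ρ_k is tiny, but the cumulative sum over many lags is finite and, for large N, should come out at roughly −1/4, which is exactly what reduces S(0) from r to r/2.

```python
import numpy as np

rng = np.random.default_rng(11)
r, T, N = 1.0, 20000.0, 20

def erlang2_train(r, T, rng):
    isis = rng.gamma(2.0, 1.0 / (2.0 * r), size=int(2 * r * T) + 100)
    t = np.cumsum(isis)
    return t[t < T]

pooled = np.sort(np.concatenate([erlang2_train(r, T, rng) for _ in range(N)]))
I = np.diff(pooled)
dev = I - I.mean()
var = np.mean(dev**2)

# Cumulative sum of serial correlation coefficients sum_{k=1}^{m} rho_k.
max_lag = 300
rho = np.array([np.mean(dev[:-k] * dev[k:]) / var for k in range(1, max_lag + 1)])
print("sum of rho_k over", max_lag, "lags:", rho.sum())
print("largest individual |rho_k|      :", np.abs(rho).max())
print("limit stated in the talk for N -> infinity: -1/4")
```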

Sticking to Khinchine's exact condition
Dilute the single spike trains while adding more and more of them, such that the total rate is conserved:
X̂(t) = Σ_{n=1}^N x_n(t/N)
[Figure: power spectra of X̂ for increasing N; the spectrum becomes flat.]
This is plainly so because the spectrum of each diluted spike train tends to a flat spectrum at any fixed frequency (the interesting spectral features are shifted to lower and lower frequencies).
However, this is not relevant for the neural (and other) applications!

Khinchine's exact condition is not met for neurons!
The input rate of a single presynaptic neuron ≈ the output rate of the postsynaptic neuron: single-neuron rates are not diluted when the number of inputs grows.
The output rate is a functional of the superposed input spike trains.

Feed-forward network: transmission of transient stimuli
(Cateau & Reyes, Phys. Rev. Lett. 96, 058101 (2006))
[Figure: simulated and predicted behavior of a feedforward network.]
Left: simulation of the time-dependent firing rate in different layers; synchrony among the neurons increases while the activity progresses through the layers.
Right: theoretical predictions of the firing rates. Grey: using the Poisson approximation, the rate flattens out (no synchrony). Black: taking the spectral features of the superposed spike train into account yields increasing synchrony, as in the simulations.

Conclusions Be careful and cautious with limit theorems! Check all the models that are based on the Poisson assumption (diffusion approximation)