Bayesian Estimation of Stimulus Responses in Poisson Spike Trains

NOTE — Communicated by Kechen Zhang

Bayesian Estimation of Stimulus Responses in Poisson Spike Trains

Sidney R. Lehky
Cognitive Brain Mapping Laboratory, RIKEN Brain Science Institute, Saitama, Japan, and Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD 20892, U.S.A.

A Bayesian method is developed for estimating neural responses to stimuli, using likelihood functions incorporating the assumption that spike trains follow either pure Poisson statistics or Poisson statistics with a refractory period. The Bayesian and standard estimates of the mean and variance of responses are similar and asymptotically converge as the size of the data sample increases. However, the Bayesian estimate of the variance of the variance is much lower. This allows the Bayesian method to provide more precise interval estimates of responses. Sensitivity of the Bayesian method to the Poisson assumption was tested by conducting simulations perturbing the Poisson spike trains with noise. This did not affect Bayesian estimates of mean and variance to a significant degree, indicating that the Bayesian method is robust. The Bayesian estimates were less affected by the presence of noise than estimates provided by the standard method.

1 Introduction

The goal here is to use Bayesian methods to improve estimates of neural responses. The Bayesian analysis accomplishes this by incorporating additional information about spike train statistics into the analysis relative to standard methods, which simply calculate moments (e.g., mean, variance) of the data. In this analysis, we incorporate the assumption that spike counts are Poisson distributed. There have been previous applications of Bayesian methods to neural data (Brown et al., 1998; Martignon et al., 2000; Sanger, 1996; Zhang, Ginzburg, McNaughton, & Sejnowski, 1998). However, the focus of those studies was determining the most likely stimulus given a set of responses within a neural population.
Here we are not concerned with determining the most likely stimulus, but rather move the analysis to a lower level and infer the probability distribution of the response firing rate to one particular stimulus, given observations of spike counts within fixed time periods. The emphasis is on practical data analysis methods rather than theoretical issues in neural coding.

Neural Computation 16 (2004) © 2004 Massachusetts Institute of Technology

The Bayes formulation for the case at hand is

$$\mathrm{prob}(\lambda \mid n, T) = \frac{\mathrm{prob}(n \mid \lambda, T)\,\mathrm{prob}(\lambda, T)}{\mathrm{prob}(n, T)},\tag{1.1}$$

where $n$ is the observed spike count during time interval $T$ and $\lambda$ is the estimated spike rate. Omitting the normalization factor $\mathrm{prob}(n, T)$ in the denominator leaves the proportionality:

$$\mathrm{prob}(\lambda \mid n, T) \propto \mathrm{prob}(n \mid \lambda, T)\,\mathrm{prob}(\lambda, T)$$
$$\text{posterior} \propto \text{likelihood} \times \text{prior}.\tag{1.2}$$

2 The Likelihood Function

The experimental situation we envisage is that a particular stimulus is presented during multiple trials. Within each trial, there is a prestimulus period, when activity of the neuron is at spontaneous level, and a stimulus period. Spike counts from the two periods are used to form likelihood functions for spontaneous and stimulus firing rates, $\lambda_{\mathrm{spont}}$ and $\lambda_{\mathrm{stim}}$. From these, the likelihood function of the response $\lambda_{\mathrm{stim}} - \lambda_{\mathrm{spont}}$ is derived.

The Poisson probability density function (pdf) we assume gives the distribution of spike count $n$ during period $T$, given mean spike count $\lambda T$:

$$\mathrm{prob}(n \mid \lambda, T) = \frac{(\lambda T)^n e^{-\lambda T}}{n!}.\tag{2.1}$$

What we actually need for a Bayesian estimate, however, is the likelihood of $\lambda$, which involves keeping the same equation but now making $n$ a constant and $\lambda$ a variable, opposite to the situation in equation 2.1. This change in perspective converts equation 2.1 into a gamma-distribution-shaped likelihood function of $\lambda$, with shape parameter $n + 1$ and scale parameter $1/T$:

$$\mathrm{prob}(n \mid \lambda, T) = \mathrm{likelihood}(\lambda \mid n, T) = \frac{(\lambda T)^n}{\Gamma(n+1)}\, e^{-\lambda T}\tag{2.2a}$$
$$= C\, \lambda^n e^{-\lambda T}.\tag{2.2b}$$

(Multiplying this likelihood function by a constant factor $T$ would normalize it to a gamma pdf.) In equation 2.2b, all the multiplicative constants are collected into $C$. The precise value of $C$ is unimportant for likelihood calculations, as we are interested in relative rather than absolute values of the function.

Assuming prestimulus and stimulus spike rates are independent, the joint likelihood of $\lambda_{\mathrm{spont}}$ and $\lambda_{\mathrm{stim}}$ is formed by multiplying their individual likelihoods:

$$\mathrm{likelihood}_i(\lambda_{\mathrm{stim}}, \lambda_{\mathrm{spont}} \mid n^i_{\mathrm{stim}}, n^i_{\mathrm{spont}}, T_{\mathrm{stim}}, T_{\mathrm{spont}}) = C_i\, \lambda_{\mathrm{stim}}^{n^i_{\mathrm{stim}}}\, \lambda_{\mathrm{spont}}^{n^i_{\mathrm{spont}}}\, e^{-(\lambda_{\mathrm{stim}} T_{\mathrm{stim}} + \lambda_{\mathrm{spont}} T_{\mathrm{spont}})}\tag{2.3}$$
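As a concrete sketch of equations 2.1 and 2.2, the snippet below (with hypothetical values $n = 12$ spikes in $T = 0.5$ s) evaluates the gamma-shaped likelihood on a grid of firing rates: it peaks at $n/T$, and multiplying by $T$ normalizes it to unit area, as noted above.

```python
import math

def poisson_likelihood(lam, n, T):
    # equation 2.2a: (lam*T)^n e^(-lam*T) / n!, viewed as a function of lam
    return (lam * T) ** n * math.exp(-lam * T) / math.factorial(n)

n, T = 12, 0.5                                   # hypothetical spike count and epoch length (s)
grid = [0.01 * k for k in range(1, 10001)]       # firing rates 0.01 .. 100 Hz
vals = [poisson_likelihood(lam, n, T) for lam in grid]

mode = grid[vals.index(max(vals))]               # maximum-likelihood rate: n/T = 24 Hz
area = T * sum(vals) * 0.01                      # T * integral over lam -> 1 (gamma pdf)
```

The multiplicative constant $C$ drops out of any ratio of likelihood values, which is why only the shape of this curve matters in what follows.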

(for the $i$th stimulus presentation). All multiplicative constants are grouped into $C_i$, whose value again is not important. The joint likelihood for $N$ stimulus trials is then found by multiplying the likelihood functions for the individual trials:

$$\mathrm{likelihood}(\lambda_{\mathrm{stim}}, \lambda_{\mathrm{spont}} \mid \vec{n}_{\mathrm{stim}}, \vec{n}_{\mathrm{spont}}, T_{\mathrm{stim}}, T_{\mathrm{spont}}) = \prod_{i=1}^{N} \left[ C_i\, \lambda_{\mathrm{stim}}^{n^i_{\mathrm{stim}}}\, \lambda_{\mathrm{spont}}^{n^i_{\mathrm{spont}}}\, e^{-(\lambda_{\mathrm{stim}} T_{\mathrm{stim}} + \lambda_{\mathrm{spont}} T_{\mathrm{spont}})} \right].\tag{2.4}$$

3 Prior Probability Distribution

We define two prior distributions as examples. One possibility, the agnostic assumption, is that all response magnitudes are equally likely. This is represented by a two-dimensional uniform distribution,

$$\mathrm{priorprob}_{\mathrm{Ag}}(\lambda_{\mathrm{stim}}, \lambda_{\mathrm{spont}}) = \begin{cases} \dfrac{1}{a^2} & 0 \le \lambda_{\mathrm{spont}}, \lambda_{\mathrm{stim}} \le a \\ 0 & \text{otherwise} \end{cases}\tag{3.1}$$

with $a$ defining a biophysically plausible range of firing rates.

Another possibility, the skeptical assumption, is that there is no stimulus response. That is, $\lambda_{\mathrm{spont}}$ and $\lambda_{\mathrm{stim}}$ are on average identical. Given the long-term mean spontaneous spike rate $\bar{\lambda}_{\mathrm{spont}}$, this distribution, analogous to equation 2.3, is

$$\mathrm{priorprob}_{\mathrm{Sk}}(\lambda_{\mathrm{stim}}, \lambda_{\mathrm{spont}} \mid \bar{\lambda}_{\mathrm{spont}}, T_{\mathrm{stim}}, T_{\mathrm{spont}}) = C\, \lambda_{\mathrm{stim}}^{\bar{\lambda}_{\mathrm{spont}} T_{\mathrm{stim}}}\, \lambda_{\mathrm{spont}}^{\bar{\lambda}_{\mathrm{spont}} T_{\mathrm{spont}}}\, e^{-(\lambda_{\mathrm{stim}} T_{\mathrm{stim}} + \lambda_{\mathrm{spont}} T_{\mathrm{spont}})},\tag{3.2}$$

where $C$ is the normalizing constant. Besides these example prior distributions, others may be suggested by the particular aspects of an experiment.

4 Posterior Probability Distribution

The posterior probability of the firing rates is proportional to the prior probability (see equation 3.1 or 3.2) and the joint likelihood function (see equation 2.4):

$$\mathrm{postprob}(\lambda_{\mathrm{stim}}, \lambda_{\mathrm{spont}} \mid \vec{n}_{\mathrm{stim}}, \vec{n}_{\mathrm{spont}}, T_{\mathrm{stim}}, T_{\mathrm{spont}}) = C \left[ \mathrm{priorprob}(\lambda_{\mathrm{stim}}, \lambda_{\mathrm{spont}}) \times \mathrm{likelihood}(\lambda_{\mathrm{stim}}, \lambda_{\mathrm{spont}} \mid \vec{n}_{\mathrm{stim}}, \vec{n}_{\mathrm{spont}}, T_{\mathrm{stim}}, T_{\mathrm{spont}}) \right],\tag{4.1}$$

in which $C$ normalizes the area under $\mathrm{postprob}(\lambda_{\mathrm{stim}}, \lambda_{\mathrm{spont}})$.

What we require, however, rather than the joint pdf of $\lambda_{\mathrm{spont}}$ and $\lambda_{\mathrm{stim}}$, is the distribution of the response $\lambda_{\mathrm{resp}} = \lambda_{\mathrm{stim}} - \lambda_{\mathrm{spont}}$. This is accomplished by first changing variables from $\mathrm{prob}(\lambda_{\mathrm{stim}}, \lambda_{\mathrm{spont}})$ to $\mathrm{prob}(\lambda_{\mathrm{stim}} - \lambda_{\mathrm{spont}}, \lambda_{\mathrm{stim}} + \lambda_{\mathrm{spont}})$, which involves a rotation and dilation of the coordinate axes. Then $\mathrm{prob}(\lambda_{\mathrm{stim}} - \lambda_{\mathrm{spont}}, \lambda_{\mathrm{stim}} + \lambda_{\mathrm{spont}})$ is integrated along the $\lambda_{\mathrm{stim}} + \lambda_{\mathrm{spont}}$ axis to give the marginal distribution $\mathrm{prob}(\lambda_{\mathrm{stim}} - \lambda_{\mathrm{spont}})$. For convenience, we relabel the transformed variables as follows:

$$\lambda_{\mathrm{sum}} = \lambda_{\mathrm{stim}} + \lambda_{\mathrm{spont}}$$
$$\lambda_{\mathrm{dif}} = \lambda_{\mathrm{stim}} - \lambda_{\mathrm{spont}}.\tag{4.2}$$

Under the transformed variables, the likelihood function for a single trial (analogous to equation 2.3) becomes

$$\mathrm{likelihood}_i(\lambda_{\mathrm{sum}}, \lambda_{\mathrm{dif}} \mid n^i_{\mathrm{stim}}, n^i_{\mathrm{spont}}, T_{\mathrm{stim}}, T_{\mathrm{spont}}) = C_i\, (\lambda_{\mathrm{sum}} + \lambda_{\mathrm{dif}})^{n^i_{\mathrm{stim}}}\, (\lambda_{\mathrm{sum}} - \lambda_{\mathrm{dif}})^{n^i_{\mathrm{spont}}}\, e^{-\frac{1}{2}\left[(\lambda_{\mathrm{sum}} + \lambda_{\mathrm{dif}}) T_{\mathrm{stim}} + (\lambda_{\mathrm{sum}} - \lambda_{\mathrm{dif}}) T_{\mathrm{spont}}\right]}.\tag{4.3}$$

The joint likelihood over $N$ stimulus trials is then found by taking the product of individual trials, analogous to equation 2.4. The agnostic prior, equation 3.1, remains essentially unchanged under transformed coordinates because it is a constant. The transformed equation for the skeptical prior, equation 3.2, becomes

$$\mathrm{priorprob}_{\mathrm{Sk}}(\lambda_{\mathrm{sum}}, \lambda_{\mathrm{dif}} \mid \bar{\lambda}_{\mathrm{spont}}, T_{\mathrm{stim}}, T_{\mathrm{spont}}) = C\, (\lambda_{\mathrm{sum}} + \lambda_{\mathrm{dif}})^{\bar{\lambda}_{\mathrm{spont}} T_{\mathrm{stim}}}\, (\lambda_{\mathrm{sum}} - \lambda_{\mathrm{dif}})^{\bar{\lambda}_{\mathrm{spont}} T_{\mathrm{spont}}}\, e^{-\frac{1}{2}\left[(\lambda_{\mathrm{sum}} + \lambda_{\mathrm{dif}}) T_{\mathrm{stim}} + (\lambda_{\mathrm{sum}} - \lambda_{\mathrm{dif}}) T_{\mathrm{spont}}\right]}.\tag{4.4}$$

Given the transformed likelihood and prior probability functions, the posterior probability is

$$\mathrm{postprob}(\lambda_{\mathrm{sum}}, \lambda_{\mathrm{dif}} \mid \vec{n}_{\mathrm{stim}}, \vec{n}_{\mathrm{spont}}, T_{\mathrm{stim}}, T_{\mathrm{spont}}) = C \left[ \mathrm{priorprob}(\lambda_{\mathrm{sum}}, \lambda_{\mathrm{dif}}) \times \mathrm{likelihood}(\lambda_{\mathrm{sum}}, \lambda_{\mathrm{dif}} \mid \vec{n}_{\mathrm{stim}}, \vec{n}_{\mathrm{spont}}, T_{\mathrm{stim}}, T_{\mathrm{spont}}) \right],\tag{4.5}$$

where again $C$ normalizes the distribution. Integrating equation 4.5 with respect to $\lambda_{\mathrm{sum}}$ takes us from $\mathrm{postprob}(\lambda_{\mathrm{sum}}, \lambda_{\mathrm{dif}})$ to $\mathrm{postprob}(\lambda_{\mathrm{dif}})$:

$$\mathrm{postprob}(\lambda_{\mathrm{dif}} \mid \vec{n}_{\mathrm{stim}}, \vec{n}_{\mathrm{spont}}, T_{\mathrm{stim}}, T_{\mathrm{spont}}) = C \int_{-\infty}^{\infty} \mathrm{priorprob}(\lambda_{\mathrm{sum}}, \lambda_{\mathrm{dif}})\, \mathrm{likelihood}(\lambda_{\mathrm{sum}}, \lambda_{\mathrm{dif}} \mid \vec{n}_{\mathrm{stim}}, \vec{n}_{\mathrm{spont}}, T_{\mathrm{stim}}, T_{\mathrm{spont}})\, d\lambda_{\mathrm{sum}}.\tag{4.6}$$
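The whole chain — joint likelihood, uniform prior, marginalization onto the response — can be carried out numerically on a grid. The sketch below uses hypothetical spike counts; marginalizing over the sum axis is done implicitly by summing posterior mass at each value of the difference. With a uniform prior the posterior mean response should match $(\sum n_{\mathrm{stim}} + 1)/(NT_{\mathrm{stim}}) - (\sum n_{\mathrm{spont}} + 1)/(NT_{\mathrm{spont}})$, which for these counts is 15.5 Hz.

```python
import numpy as np

n_stim  = [14, 11, 9, 13]        # hypothetical stimulus-period spike counts
n_spont = [5, 3, 4, 4]           # hypothetical prestimulus spike counts
T_stim = T_spont = 0.5           # epoch durations (s)

lam = np.arange(0.25, 100.0, 0.5)                # firing-rate grid (Hz)
ls, lp = np.meshgrid(lam, lam, indexing="ij")    # lam_stim, lam_spont

# joint log-likelihood over all trials (equation 2.4), uniform prior
loglik = np.zeros_like(ls)
for a, b in zip(n_stim, n_spont):
    loglik += a * np.log(ls) + b * np.log(lp) - ls * T_stim - lp * T_spont

post = np.exp(loglik - loglik.max())             # joint posterior (equation 4.1)
post /= post.sum()

# marginalize onto lam_resp = lam_stim - lam_spont (equation 4.6)
dif = (ls - lp).ravel()
w = post.ravel()
mean_resp = float((dif * w).sum())               # posterior mean response, ~15.5 Hz
```

Working in log space before exponentiating avoids underflow when many trials are multiplied together.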

This is the final result we seek: the posterior probability distribution of the stimulus response.

5 Refractory Periods

Thus far, the analysis has been carried out for a pure Poisson process. A significant perturbation away from this ideal is caused by the existence of a refractory period. The refractory period acts to regularize a spike train, reducing the variance of spike counts within a fixed period, and therefore the variance of observed firing rates. Given mean interspike interval $\bar{t}_{\mathrm{ISI}}$ and refractory period $t_{\mathrm{ref}}$, the fractional variance relative to a pure Poisson process is

$$\nu \equiv \frac{\mathrm{Var}(\lambda)_{\mathrm{ref}}}{\mathrm{Var}(\lambda)_{\mathrm{pure}}} = \left( \frac{\bar{t}_{\mathrm{ISI}} - t_{\mathrm{ref}}}{\bar{t}_{\mathrm{ISI}}} \right)^2.\tag{5.1}$$

(By definition of $t_{\mathrm{ref}}$, $\bar{t}_{\mathrm{ISI}} < t_{\mathrm{ref}}$ is impossible.) Since $\bar{t}_{\mathrm{ISI}} = T/n$:

$$\nu = \left( 1 - \frac{n}{T}\, t_{\mathrm{ref}} \right)^2.\tag{5.2}$$

To take into account the refractory period, therefore, the variance of the likelihood function (as well as of nonuniform priors) should be reduced by the refractory period correction factor $\nu$. A modification to the likelihood function equation 2.2b that approximates that transform is

$$\mathrm{likelihood}(\lambda \mid n, T, \nu) = \begin{cases} C\, \lambda^{\frac{n+1}{\nu} - 1}\, e^{-\lambda T / \nu} & 0 < \nu \le 1 \\ \delta\!\left( \lambda - \dfrac{n+1}{T} \right) & \nu = 0, \end{cases}\tag{5.3}$$

where $\delta$ is the Dirac delta function. This approximation leads to small biases in the estimates of spike statistics. Because the purpose of this study is to quantify properties of the Bayesian method, an exact refractory period correction was carried out numerically in the simulations described below. Using a likelihood function and prior corrected for refractory periods, the rest of the analysis proceeds as before.

6 Mean, Variance, and Variance of Variance of Response Estimates

The equations that follow are derived for the situations of a pure Poisson process and a pure Poisson process plus refractory period. A uniform prior is assumed. In a later section, we present simulations that examine what happens when the pure Poisson process plus refractory period is further perturbed by adding noise.
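The size of the refractory correction in equations 5.2 and 5.3 is easy to check numerically. The values below ($n = 12$ spikes in $T = 0.5$ s, $t_{\mathrm{ref}} = 3$ ms) are hypothetical; the corrected likelihood is gamma-shaped with shape $(n+1)/\nu$ and rate $T/\nu$, which keeps the pure-Poisson mean $(n+1)/T$ while shrinking the variance by the factor $\nu$.

```python
n, T, t_ref = 12, 0.5, 0.003

nu = (1 - (n / T) * t_ref) ** 2      # equation 5.2; ~0.861 for these values

# equation 5.3 (case 0 < nu <= 1): gamma-shaped with shape (n+1)/nu, rate T/nu
shape, rate = (n + 1) / nu, T / nu
mean = shape / rate                  # = (n+1)/T, unchanged by the correction
var = shape / rate ** 2              # = nu*(n+1)/T^2, reduced by the factor nu
```

For a 3 ms refractory period at these rates the variance is reduced by roughly 14 percent, which is why the correction matters at cortical firing rates.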

Response statistics can be presented as either a function of observed spike counts $n$ or the spike density parameter $r$ (which is known in the case of simulations). The relation between these two variables is given by the formula for average spike count per trial:

$$\frac{1}{N} \sum_{i=1}^{N} n_i = rT.\tag{6.1}$$

6.1 Mean

6.1.1 Bayesian. Under the Bayesian analysis, if spike counts are Poisson distributed, then spike rates have a gamma-distribution-shaped likelihood function (see equation 2.2a). Given a set of $N$ spike trains with spike counts $n_i$, $i = 1, \ldots, N$, and duration $T$, the posterior probability formed by multiplying together a set of such likelihood functions with a uniform prior and the appropriate normalization factor gives a gamma distribution with shape parameter $\sum_{i=1}^{N} n_i + 1$ and scale parameter $1/(NT)$.

Given this posterior probability distribution, the optimal location estimate of $\lambda$ can be defined as either the mode or mean of the distribution, depending on the optimality criterion (Kay, 1993). Using the mean gives the minimum mean square error (MMSE) estimate, while using the mode gives the maximum a posteriori (MAP) estimate, which for a uniform prior is also the maximum likelihood (ML) estimate. The MMSE estimate leads to an expected spike rate of

$$E(\lambda)_{\mathrm{bay}} = \frac{\sum_{i=1}^{N} n_i + 1}{NT} = r + \frac{1}{NT},\tag{6.2}$$

while the MAP estimate is

$$\mathrm{mode}(\lambda)_{\mathrm{bay}} = \frac{\sum_{i=1}^{N} n_i}{NT} = r.\tag{6.3}$$

The trade-off between the two is lower bias or lower mean square error. The estimated value of the response $\lambda_{\mathrm{resp}} = \lambda_{\mathrm{stim}} - \lambda_{\mathrm{spont}}$ is then either

$$E(\lambda_{\mathrm{resp}})_{\mathrm{bay}} = \frac{\sum_{i=1}^{N} n^i_{\mathrm{stim}} + 1}{NT_{\mathrm{stim}}} - \frac{\sum_{i=1}^{N} n^i_{\mathrm{spont}} + 1}{NT_{\mathrm{spont}}} = r_{\mathrm{stim}} - r_{\mathrm{spont}} + \frac{1}{N}\left( \frac{1}{T_{\mathrm{stim}}} - \frac{1}{T_{\mathrm{spont}}} \right)\tag{6.4}$$

or

$$\mathrm{mode}(\lambda_{\mathrm{resp}})_{\mathrm{bay}} = \frac{\sum_{i=1}^{N} n^i_{\mathrm{stim}}}{NT_{\mathrm{stim}}} - \frac{\sum_{i=1}^{N} n^i_{\mathrm{spont}}}{NT_{\mathrm{spont}}} = r_{\mathrm{stim}} - r_{\mathrm{spont}}.\tag{6.5}$$

In the discussion below, we shall assume the MMSE estimate.

6.1.2 Standard. Under the standard analysis, the mean firing rate is

$$E(\lambda)_{\mathrm{std}} = \frac{\sum_{i=1}^{N} n_i}{NT} = r.\tag{6.6}$$

The expected value of $\lambda_{\mathrm{resp}}$ then becomes

$$E(\lambda_{\mathrm{resp}})_{\mathrm{std}} = \frac{\sum_{i=1}^{N} n^i_{\mathrm{stim}}}{NT_{\mathrm{stim}}} - \frac{\sum_{i=1}^{N} n^i_{\mathrm{spont}}}{NT_{\mathrm{spont}}} = r_{\mathrm{stim}} - r_{\mathrm{spont}}.\tag{6.7}$$

Thus, the Bayesian MMSE estimate and the standard estimate of the response differ (except for the special case $T_{\mathrm{stim}} = T_{\mathrm{spont}}$), with the two estimates asymptotically converging as the sample size $NT$ increases. The Bayesian MAP estimate is identical with the standard estimate.

6.2 Variance

6.2.1 Bayesian. Given a posterior probability that is a gamma distribution with shape parameter $\sum_{i=1}^{N} n_i + 1$ and scale parameter $1/(NT)$, the expected variance of the firing rate is

$$E(\mathrm{Var}(\lambda))_{\mathrm{bay}} = \frac{\sum_{i=1}^{N} n_i + 1}{(NT)^2} = \frac{r}{NT} + \frac{1}{(NT)^2}.\tag{6.8}$$

The variance of the response $\lambda_{\mathrm{resp}}$ is then

$$E(\mathrm{Var}(\lambda_{\mathrm{resp}}))_{\mathrm{bay}} = \frac{\sum_{i=1}^{N} n^i_{\mathrm{stim}} + 1}{(NT_{\mathrm{stim}})^2} + \frac{\sum_{i=1}^{N} n^i_{\mathrm{spont}} + 1}{(NT_{\mathrm{spont}})^2} = \frac{1}{N}\left( \frac{r_{\mathrm{stim}}}{T_{\mathrm{stim}}} + \frac{r_{\mathrm{spont}}}{T_{\mathrm{spont}}} \right) + \frac{1}{N^2}\left( \frac{1}{T^2_{\mathrm{stim}}} + \frac{1}{T^2_{\mathrm{spont}}} \right).\tag{6.9}$$
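The closed forms above are short enough to evaluate directly. The sketch below plugs hypothetical spike counts into equations 6.2, 6.6, and 6.9; because the epochs here are equal, the $+1$ terms in the MMSE estimate cancel in the subtraction and the Bayesian and standard response means coincide.

```python
n_stim  = [12, 10, 14, 12]   # hypothetical stimulus-period counts
n_spont = [4, 5, 3, 4]       # hypothetical prestimulus counts
N, T_stim, T_spont = 4, 0.5, 0.5

def mmse(counts, T):
    return (sum(counts) + 1) / (N * T)        # equation 6.2 (Bayesian MMSE)

def standard(counts, T):
    return sum(counts) / (N * T)              # equation 6.6 (standard, = MAP)

resp_bay = mmse(n_stim, T_stim) - mmse(n_spont, T_spont)          # 16.0 Hz
resp_std = standard(n_stim, T_stim) - standard(n_spont, T_spont)  # 16.0 Hz

var_bay = (sum(n_stim) + 1) / (N * T_stim) ** 2 \
        + (sum(n_spont) + 1) / (N * T_spont) ** 2                 # equation 6.9
```

With unequal epochs the two response means would instead differ by $(1/N)(1/T_{\mathrm{stim}} - 1/T_{\mathrm{spont}})$, per equation 6.4.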

The effect of the refractory period on expected variance can be approximated if we define, following equation 5.2, a mean refractory period correction factor:

$$\bar{\nu} \equiv \left( 1 - \frac{\sum_{i=1}^{N} n_i}{NT}\, t_{\mathrm{ref}} \right)^2 = (1 - r t_{\mathrm{ref}})^2.\tag{6.10}$$

Using this, the corrected equation 6.9 becomes

$$E(\mathrm{Var}(\lambda_{\mathrm{resp}}))_{\mathrm{bay}} = \frac{\bar{\nu}_{\mathrm{stim}}\left( \sum_{i=1}^{N} n^i_{\mathrm{stim}} + \bar{\nu}_{\mathrm{stim}} \right)}{(NT_{\mathrm{stim}})^2} + \frac{\bar{\nu}_{\mathrm{spont}}\left( \sum_{i=1}^{N} n^i_{\mathrm{spont}} + \bar{\nu}_{\mathrm{spont}} \right)}{(NT_{\mathrm{spont}})^2} = \frac{1}{N}\left( \frac{\bar{\nu}_{\mathrm{stim}} r_{\mathrm{stim}}}{T_{\mathrm{stim}}} + \frac{\bar{\nu}_{\mathrm{spont}} r_{\mathrm{spont}}}{T_{\mathrm{spont}}} \right) + \frac{1}{N^2}\left( \frac{\bar{\nu}^2_{\mathrm{stim}}}{T^2_{\mathrm{stim}}} + \frac{\bar{\nu}^2_{\mathrm{spont}}}{T^2_{\mathrm{spont}}} \right).\tag{6.11}$$

Although it can be convenient to use $\bar{\nu}$ on pooled data rather than doing full trial-by-trial calculations with different values of $\nu$ for each trial, this introduces biases in calculated refractory-corrected variances, as well as variances of variances.

6.2.2 Standard. Given spike counts $n$ that are Poisson distributed, $\mathrm{Var}(n) = \sum_{i=1}^{N} n_i / N$. We also have the mean firing rate $E(\lambda)$ as a function of $n$, equation 6.6. Combining these two relations gives the variance of $\lambda$ from its mean, leading to

$$E(\mathrm{Var}(\lambda))_{\mathrm{std}} = \frac{\sum_{i=1}^{N} n_i}{(NT)^2} = \frac{r}{NT}.\tag{6.12}$$

The response variance is then

$$E(\mathrm{Var}(\lambda_{\mathrm{resp}}))_{\mathrm{std}} = \frac{\sum_{i=1}^{N} n^i_{\mathrm{stim}}}{(NT_{\mathrm{stim}})^2} + \frac{\sum_{i=1}^{N} n^i_{\mathrm{spont}}}{(NT_{\mathrm{spont}})^2} = \frac{1}{N}\left( \frac{r_{\mathrm{stim}}}{T_{\mathrm{stim}}} + \frac{r_{\mathrm{spont}}}{T_{\mathrm{spont}}} \right).\tag{6.13}$$

Bayesian response variances (see equation 6.9) are larger than those calculated using the standard method (see equation 6.13), although they converge for large sample sizes. With correction for the refractory period:

$$E(\mathrm{Var}(\lambda_{\mathrm{resp}}))_{\mathrm{std}} = \frac{\bar{\nu}_{\mathrm{stim}} \sum_{i=1}^{N} n^i_{\mathrm{stim}}}{(NT_{\mathrm{stim}})^2} + \frac{\bar{\nu}_{\mathrm{spont}} \sum_{i=1}^{N} n^i_{\mathrm{spont}}}{(NT_{\mathrm{spont}})^2} = \frac{1}{N}\left( \frac{\bar{\nu}_{\mathrm{stim}} r_{\mathrm{stim}}}{T_{\mathrm{stim}}} + \frac{\bar{\nu}_{\mathrm{spont}} r_{\mathrm{spont}}}{T_{\mathrm{spont}}} \right).\tag{6.14}$$

6.3 Variance of the Variance. Variance of the variance is important because it determines the precision of an interval estimate of a parameter.

6.3.1 Bayesian. From equation 6.8, we have firing rate variance as a function of $n$. Again, since spike counts are Poisson distributed, $\mathrm{Var}(n) = \sum_{i=1}^{N} n_i / N$. Combining these two relations gives the variance of the variance:

$$\mathrm{Var}(\mathrm{Var}(\lambda))_{\mathrm{bay}} = \frac{\sum_{i=1}^{N} n_i}{(NT)^4} = \frac{r}{(NT)^3}.\tag{6.15}$$

The response variance of variance is

$$\mathrm{Var}(\mathrm{Var}(\lambda_{\mathrm{resp}}))_{\mathrm{bay}} = \frac{1}{N^4}\left( \frac{\sum_{i=1}^{N} n^i_{\mathrm{stim}}}{T^4_{\mathrm{stim}}} + \frac{\sum_{i=1}^{N} n^i_{\mathrm{spont}}}{T^4_{\mathrm{spont}}} \right) = \frac{1}{N^3}\left( \frac{r_{\mathrm{stim}}}{T^3_{\mathrm{stim}}} + \frac{r_{\mathrm{spont}}}{T^3_{\mathrm{spont}}} \right),\tag{6.16}$$

and with correction for the refractory period:

$$\mathrm{Var}(\mathrm{Var}(\lambda_{\mathrm{resp}}))_{\mathrm{bay}} = \frac{1}{N^4}\left( \frac{\bar{\nu}^2_{\mathrm{stim}} \sum_{i=1}^{N} n^i_{\mathrm{stim}}}{T^4_{\mathrm{stim}}} + \frac{\bar{\nu}^2_{\mathrm{spont}} \sum_{i=1}^{N} n^i_{\mathrm{spont}}}{T^4_{\mathrm{spont}}} \right) = \frac{1}{N^3}\left( \frac{\bar{\nu}^2_{\mathrm{stim}} r_{\mathrm{stim}}}{T^3_{\mathrm{stim}}} + \frac{\bar{\nu}^2_{\mathrm{spont}} r_{\mathrm{spont}}}{T^3_{\mathrm{spont}}} \right).\tag{6.17}$$

6.3.2 Standard. For a mean spike count per trial $\sum_{i=1}^{N} n_i / N$ not close to zero, the firing rate estimate is approximately normally distributed. Performing the subtraction $\lambda_{\mathrm{resp}} = \lambda_{\mathrm{stim}} - \lambda_{\mathrm{spont}}$ provides an even better approximation of normality. A normally distributed variable has variances that are

chi-square distributed with $N - 1$ degrees of freedom, and the variance of the variance is

$$\mathrm{Var}(\mathrm{Var}(\lambda))_{\mathrm{std}} = \frac{2\left( E(\mathrm{Var}(\lambda)) \right)^2}{N - 1}.\tag{6.18}$$

Substituting the value of $E(\mathrm{Var}(\lambda))$ in equation 6.12 gives

$$\mathrm{Var}(\mathrm{Var}(\lambda))_{\mathrm{std}} = \frac{2\left( \sum_{i=1}^{N} n_i \right)^2}{(N - 1)(NT)^4} = \frac{2 r^2}{(N - 1)(NT)^2}.\tag{6.19}$$

The response variance of variance is

$$\mathrm{Var}(\mathrm{Var}(\lambda_{\mathrm{resp}}))_{\mathrm{std}} = \frac{2}{(N - 1) N^4}\left( \frac{\sum_{i=1}^{N} n^i_{\mathrm{stim}}}{T^2_{\mathrm{stim}}} + \frac{\sum_{i=1}^{N} n^i_{\mathrm{spont}}}{T^2_{\mathrm{spont}}} \right)^2 = \frac{2}{(N - 1) N^2}\left( \frac{r_{\mathrm{stim}}}{T_{\mathrm{stim}}} + \frac{r_{\mathrm{spont}}}{T_{\mathrm{spont}}} \right)^2,\tag{6.20}$$

and corrected for the refractory period:

$$\mathrm{Var}(\mathrm{Var}(\lambda_{\mathrm{resp}}))_{\mathrm{std}} = \frac{2}{(N - 1) N^4}\left( \frac{\bar{\nu}_{\mathrm{stim}} \sum_{i=1}^{N} n^i_{\mathrm{stim}}}{T^2_{\mathrm{stim}}} + \frac{\bar{\nu}_{\mathrm{spont}} \sum_{i=1}^{N} n^i_{\mathrm{spont}}}{T^2_{\mathrm{spont}}} \right)^2 = \frac{2}{(N - 1) N^2}\left( \frac{\bar{\nu}_{\mathrm{stim}} r_{\mathrm{stim}}}{T_{\mathrm{stim}}} + \frac{\bar{\nu}_{\mathrm{spont}} r_{\mathrm{spont}}}{T_{\mathrm{spont}}} \right)^2.\tag{6.21}$$

Firing rate statistics calculated from the above equations for a pure Poisson process (without refractory period) are presented in Table 1, for parameters $r_{\mathrm{stim}} = 24$ Hz, $r_{\mathrm{spont}} = 8$ Hz, and $T_{\mathrm{stim}} = T_{\mathrm{spont}} = 0.5$ sec. Table 1 shows identical values for $E(\lambda_{\mathrm{resp}})$ under both methods, as expected for the special case $T_{\mathrm{stim}} = T_{\mathrm{spont}}$. Bayesian variances $E(\mathrm{Var}(\lambda_{\mathrm{resp}}))$ are slightly larger than standard estimates but converge as the number of trials increases. Finally, the Bayesian method gives much smaller values for the variance of variance $\mathrm{Var}(\mathrm{Var}(\lambda_{\mathrm{resp}}))$ than the standard analysis.

7 Estimating Neural Responses for Spike Trains with a Refractory Period

The analysis is now applied to simulated data shown in Figure 1. At the bottom is a raster plot of spike trains from 20 trials in a simulated experiment.
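The gap between the two variance-of-variance estimates can be seen by plugging the Table 1 parameters into the parameter forms of equations 6.16 and 6.20 (pure Poisson, no refractory correction); $N = 5$ trials is chosen here purely for illustration.

```python
def var_var_bayes(r_stim, r_spont, T_stim, T_spont, N):
    # equation 6.16, parameter form: (1/N^3)(r_stim/T_stim^3 + r_spont/T_spont^3)
    return (r_stim / T_stim**3 + r_spont / T_spont**3) / N**3

def var_var_std(r_stim, r_spont, T_stim, T_spont, N):
    # equation 6.20, parameter form: 2/((N-1)N^2) * (r_stim/T_stim + r_spont/T_spont)^2
    return 2.0 / ((N - 1) * N**2) * (r_stim / T_stim + r_spont / T_spont) ** 2

b = var_var_bayes(24, 8, 0.5, 0.5, 5)   # 2.048
s = var_var_std(24, 8, 0.5, 0.5, 5)     # 81.92
```

For these parameters the standard variance of the variance is about 40 times the Bayesian value, which is the quantitative basis for the tighter Bayesian interval estimates.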

Table 1: Stimulus Response Statistics for a Pure Poisson Process. (Columns: Trials; Standard and Bayesian values of $E(\lambda_{\mathrm{resp}})$, $E(\mathrm{Var}(\lambda_{\mathrm{resp}}))$, and $\mathrm{Var}(\mathrm{Var}(\lambda_{\mathrm{resp}}))$.)

Notes: These were calculated using parameters $r_{\mathrm{stim}} = 24$ Hz, $r_{\mathrm{spont}} = 8$ Hz, and $T_{\mathrm{stim}} = T_{\mathrm{spont}} = 0.5$ sec. Bayesian estimates are based on a uniform prior and use the MMSE method.

Figure 1: Simulated spike trains used to demonstrate the Bayesian analysis. At bottom is a raster plot of 20 trials; the top shows the peristimulus time histogram. Spike counts are Poisson distributed, but with a 3 ms refractory period included. The timescale has been shifted by the latency so that zero coincides with the onset of stimulus response.

The spikes are Poisson distributed but include a 3 ms refractory period. In each trial, there is a 0.5 sec prestimulus period with spontaneous activity of 8 Hz, followed by a 0.5 sec stimulus period in which activity rises to 24 Hz. Activity then returns to spontaneous. The timescale has been translated by the stimulus latency so that the stimulus period starts at zero.

The response distribution is found by starting with a prior distribution and successively multiplying in the likelihood function for each trial. For example, the first trial has 4 spikes in the prestimulus period and 20 spikes in the stimulus period. Substituting $n^1_{\mathrm{spont}} = 4$, $n^1_{\mathrm{stim}} = 20$, $T^1_{\mathrm{stim}} = 0.500$, and $T^1_{\mathrm{spont}} = 0.500$ into the refractory-period-corrected analog of equation 4.3 gives the likelihood$(\lambda_{\mathrm{sum}}, \lambda_{\mathrm{dif}})$ for that trial. Following incorporation of data from the final trial, the distribution $\mathrm{postprob}(\lambda_{\mathrm{sum}}, \lambda_{\mathrm{dif}})$ is integrated along $\lambda_{\mathrm{sum}}$ to give the response distribution, $\mathrm{postprob}(\lambda_{\mathrm{dif}})$. (For the discussion below, $\lambda_{\mathrm{dif}}$ shall be relabeled $\lambda_{\mathrm{resp}}$.)

Figure 2 illustrates the evolution of the response distribution as data from successive trials are included, starting from two prior distributions. Also shown are the conventional means and standard errors of the response. No standard error bar is shown for the Trial = 1 panel, because the conventional method does not allow calculating that statistic from a single trial. The first panel plots the priors themselves (0th trial).

We can obtain a better idea of the relation between the Bayesian and conventional estimates by comparing statistics from multiple replications of the experiment in Figure 1, determined for a uniform prior. Results of the simulations are given in Table 2, based on 10,000 experiment replications with $N$ stimulus trials per experiment, and with $r_{\mathrm{stim}} = 24$ Hz, $r_{\mathrm{spont}} = 8$ Hz, and $T_{\mathrm{stim}} = T_{\mathrm{spont}} = 0.5$ sec, where $r$ is the spike density parameter used to generate the spike trains. Table 2 shows $E(\lambda_{\mathrm{resp}})$, the mean response over 10,000 replications; $E(\mathrm{Var}(\lambda_{\mathrm{resp}}))$, the mean of the 10,000 response variances; and $\mathrm{Var}(\mathrm{Var}(\lambda_{\mathrm{resp}}))$, the variance of the 10,000 response variances. Under these simulations, $E(\lambda_{\mathrm{resp}})$ is identical under both methods, as expected for the special case $T_{\mathrm{stim}} = T_{\mathrm{spont}}$, and equal to $r_{\mathrm{stim}} - r_{\mathrm{spont}}$. Values of $\lambda_{\mathrm{resp}}$ are approximately normally distributed in both cases.

Table 2: Stimulus Response Statistics from Simulations Using Homogeneous Poisson Spike Trains with a 3 Msec Refractory Period Added. (Columns: Trials; Standard and Bayesian values of $E(\lambda_{\mathrm{resp}})$, $E(\mathrm{Var}(\lambda_{\mathrm{resp}}))$, and $\mathrm{Var}(\mathrm{Var}(\lambda_{\mathrm{resp}}))$.)

Note: Simulation parameters are as given in Table 1.
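Spike trains of the kind used in these simulations can be generated in a few lines. The sketch below is one simple construction — exponential interspike intervals with a dead time appended — and the paper's exact generator may differ; note that appending a dead time pushes the effective rate slightly below the nominal $r$.

```python
import random

def poisson_train(rate, duration, t_ref=0.003, rng=random):
    """Spike times from a Poisson process with a dead time t_ref after each spike."""
    t, spikes = 0.0, []
    while True:
        t += rng.expovariate(rate) + (t_ref if spikes else 0.0)
        if t >= duration:
            return spikes
        spikes.append(t)

rng = random.Random(0)
counts = [(len(poisson_train(8.0, 0.5, rng=rng)),
           len(poisson_train(24.0, 0.5, rng=rng))) for _ in range(1000)]

mean_spont = sum(c[0] for c in counts) / 1000 / 0.5   # effective rate, just under 8 Hz
mean_stim  = sum(c[1] for c in counts) / 1000 / 0.5   # effective rate, just under 24 Hz
nu_stim = (1 - mean_stim * 0.003) ** 2                # equation 5.2 correction, ~0.87
```

Feeding these per-trial counts into the grid posterior of section 4, with the refractory correction applied, reproduces the kind of analysis summarized in Table 2.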

Figure 2: Evolution of the Bayesian probability distribution of responses ($\lambda_{\mathrm{resp}} = \lambda_{\mathrm{stim}} - \lambda_{\mathrm{spont}}$) as data from increasing numbers of trials accumulate. The first panel (Trial = 0) shows two prior distributions: the uniform distribution (dotted line) and a distribution that assumes stimulus and spontaneous firing rates are identical (dashed line). Solid vertical and horizontal lines in the panels indicate means and standard errors calculated in the standard way. All curves have been normalized to a height of one.

Figure 3: Distributions of response variances for standard and Bayesian methods, derived from 25,000 replications of an experiment with five stimulus trials per experiment ($N = 5$). The probability density functions were estimated from the simulated data by kernel smoothing.

Although both $\mathrm{Var}(\lambda_{\mathrm{resp}})$ distributions have almost identical $E(\mathrm{Var}(\lambda_{\mathrm{resp}}))$ values (see Table 2), the Bayesian method produces a much smaller $\mathrm{Var}(\mathrm{Var}(\lambda_{\mathrm{resp}}))$. The distribution produced by the standard method is chi square with $N - 1 = 4$ degrees of freedom (see equation 7.1), while the Bayesian method leads to approximately normally distributed variances.

Moving to $E(\mathrm{Var}(\lambda_{\mathrm{resp}}))$, the variances for both methods are smaller than those for a pure Poisson process (comparing Table 2 with Table 1). This reflects the presence of a 3 msec refractory period in the simulated spike trains that served as data. Bayesian variances are slightly larger than the standard ones, with the values converging as the number of trials $N$ increases.

Although both methods produce almost identical values of $E(\mathrm{Var}(\lambda_{\mathrm{resp}}))$, we see in Table 2 that $\mathrm{Var}(\mathrm{Var}(\lambda_{\mathrm{resp}}))$ is substantially smaller for the Bayesian method (as was the case calculated for a pure Poisson process in Table 1). This relationship can also be seen in the plot of the distribution of $\mathrm{Var}(\lambda_{\mathrm{resp}})$ for $N = 5$ trials (see Figure 3). As $\lambda_{\mathrm{resp}}$ is normally distributed, under the standard analysis the distribution of $\mathrm{Var}(\lambda_{\mathrm{resp}})$ is chi-square distributed with $m = N - 1$ degrees of freedom:

$$\mathrm{prob}(\mathrm{Var}(\lambda) \mid m, \sigma) = \frac{(\mathrm{Var}(\lambda))^{\frac{m}{2}-1} \exp\!\left( -\frac{\mathrm{Var}(\lambda)}{2\sigma^2} \right)}{2^{\frac{m}{2}}\, \Gamma\!\left( \frac{m}{2} \right) \sigma^m}, \qquad E(\mathrm{Var}(\lambda)) = m\sigma^2, \qquad \mathrm{Var}(\mathrm{Var}(\lambda)) = 2m\sigma^4.\tag{7.1}$$
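The chi-square relations in equation 7.1 (equivalently equation 6.18) are easy to verify by simulation: for normally distributed estimates, the sample variance over $N$ trials has variance $2\sigma^4/(N-1)$. The values below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
N, sigma_true = 5, 2.0

# 100,000 replications of an "experiment" of N normally distributed estimates
samples = rng.normal(0.0, sigma_true, size=(100_000, N))
variances = samples.var(axis=1, ddof=1)   # unbiased sample variance per replication

e_var = variances.mean()                   # ~ sigma_true^2 = 4
var_var = variances.var()                  # ~ 2 * sigma_true^4 / (N - 1) = 8
```

Dividing each sample variance by $\sigma^2$ and multiplying by $N-1$ would recover a standard $\chi^2_{N-1}$ variable, which is the fit used for the standard-method curve in Figure 3.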

Combining the expressions for $E(\mathrm{Var}(\lambda))$ in equations 6.14 and 7.1, a calculated value of $\sigma$ can be derived:

$$\sigma = \sqrt{ \frac{\bar{\nu}_{\mathrm{stim}}\, r_{\mathrm{stim}}}{(N-1) N T_{\mathrm{stim}}} + \frac{\bar{\nu}_{\mathrm{spont}}\, r_{\mathrm{spont}}}{(N-1) N T_{\mathrm{spont}}} }.\tag{7.2}$$

For the $N = 5$ simulations shown in Figure 3, it can be determined that the distribution for the standard method does indeed follow a chi-square distribution, with a least-squares fit of $\sigma = 1.70$, compared to a calculated value of $\sigma = 1.68$. The value of $\mathrm{Var}(\mathrm{Var}(\lambda_{\mathrm{resp}}))$ observed in these simulations was 67.9, compared to 64.0 calculated from equation 7.1. The bias in variance of variance calculations arises because of the use of an average refractory period correction factor $\bar{\nu}$ rather than trial-by-trial values of $\nu$.

In contrast, the Bayesian method, despite also having a normally distributed response estimate, produces a $\mathrm{Var}(\lambda_{\mathrm{resp}})$ that is not chi square but approximately normally distributed. The reason is that under the Bayesian method, $\mathrm{Var}(\lambda_{\mathrm{resp}})$ is not calculated from the sample of $\lambda_{\mathrm{resp}}$ estimates, but rather from the width of the likelihood functions. For the $N = 5$ example, the observed $\mathrm{Var}(\mathrm{Var}(\lambda_{\mathrm{resp}}))$ in the simulations was 1.10, lower than the value of 1.60 calculated from equation 6.17, again showing a bias in the calculated estimate caused by doing refractory period corrections on pooled data rather than on a trial-by-trial basis.

8 Effects of Noise on Bayesian Estimates

The Bayesian analysis assumes that spike trains (either spontaneous or stimulus) form homogeneous Poisson processes, in which firing rates do not vary with time within each epoch. In reality, the data will deviate from this. Here we examine the robustness of the Bayesian statistics by testing how well the method works when inhomogeneous Poisson spike trains serve as inputs.
Two cases are considered: fast-fluctuation noise, in which spike rates vary rapidly relative to the duration of a single trial, and slow-fluctuation noise, in which spike rates are constant within a trial but vary randomly from trial to trial. In both cases, we retain the presence of a 3 msec refractory period.

Fast-fluctuation noise consisted of Poisson spike trains whose randomly varying instantaneous spike density parameter was described by low-pass filtered gaussian white noise. The white-noise-perturbed spike density was set with means $\bar{r}_{\mathrm{spont}} = 8$ Hz and $\bar{r}_{\mathrm{stim}} = 24$ Hz and standard deviations of $r(t)$ equal to $0.5\bar{r}$. The noise was passed through a Butterworth low-pass filter:

$$\frac{A_{\mathrm{out}}}{A_{\mathrm{in}}} = \frac{1}{\left[ 1 + (f/f_c)^{2n} \right]^{1/2}}, \qquad n = 4,\ f_c = 20\ \mathrm{Hz}.\tag{8.1}$$

Since spike rate fluctuations on average integrated to zero within each trial, they did not change the spike count but merely rearranged their timing,

making the spike train more bursty. Fano factors remained at 1.0 over the timescale of a complete trial, unchanged from homogeneous Poisson trains.

For slow-fluctuation noise, both spontaneous and stimulus spike rates were constant within a trial but changed randomly from trial to trial according to a log-normal distribution:

$$\mathrm{prob}(r \mid \mu, \sigma) = \begin{cases} \dfrac{1}{r \sigma \sqrt{2\pi}}\, e^{-\frac{(\ln r - \mu)^2}{2\sigma^2}} & r > 0 \\ 0 & r \le 0. \end{cases}\tag{8.2}$$

Parameters for the spontaneous firing rate were $\mu = 2.02$, $\sigma = 0.35$, and for the stimulus rate, $\mu = 3.15$, $\sigma = 0.22$. Those values satisfied two criteria. First, the mean spike densities were the same as used previously, $\bar{r}_{\mathrm{spont}} = 8$ and $\bar{r}_{\mathrm{stim}} = 24$, and second, Fano factors were approximately 2.0 in both cases.

Using noise-perturbed spike trains, stimulus response statistics were calculated as before. Results for the two noise conditions are given in Tables 3 and 4, to be compared with results for homogeneous Poisson trains in Table 2. Comparison of Tables 2 and 3 shows that fast-fluctuation noise has no effect on either Bayesian or standard estimates, not surprising given that on extended timescales, spike count is conserved under this noise. Slow-fluctuation noise produces no significant change in Bayesian estimates of $E(\mathrm{Var}(\lambda_{\mathrm{resp}}))$, as a comparison of Tables 2 and 4 shows. On the other hand, standard estimates of $E(\mathrm{Var}(\lambda_{\mathrm{resp}}))$ were increased substantially. Thus, under this noise condition, Bayesian variance estimates become more accurate than standard ones (closer to the noise-free estimates), in addition to being more precise (smaller variance of the variance).

In the slow-fluctuation noise case, we have spike trains that appear non-Poisson by one criterion, a Fano factor of two, yet the estimates of a Poisson-

Table 3: Response Statistics Using Inhomogeneous Poisson Spike Trains with a 3 Msec Refractory Period and Whose Firing Rates Are Perturbed with Fast-Fluctuation Noise. (Columns: Trials; Standard and Bayesian values of $E(\lambda_{\mathrm{resp}})$, $E(\mathrm{Var}(\lambda_{\mathrm{resp}}))$, and $\mathrm{Var}(\mathrm{Var}(\lambda_{\mathrm{resp}}))$.)

Notes: Fast-fluctuation noise varies rapidly relative to the duration of a stimulus trial, as defined in equation 8.1. Simulation parameters are as given in Table 1.

based model are little affected, as spike generation is still Poisson at a local timescale. The insensitivity of Bayesian estimates to either type of noise perturbation away from a homogeneous Poisson process indicates the robustness of the method.

Table 4: Response Statistics Using Inhomogeneous Poisson Spike Trains with a 3 Msec Refractory Period and Whose Firing Rates Are Perturbed with Slow-Fluctuation Noise. (Columns: Trials; Standard and Bayesian values of $E(\lambda_{\mathrm{resp}})$, $E(\mathrm{Var}(\lambda_{\mathrm{resp}}))$, and $\mathrm{Var}(\mathrm{Var}(\lambda_{\mathrm{resp}}))$.)

Notes: Slow-fluctuation noise is slow relative to the duration of a stimulus trial, as defined in equation 8.2. Simulation parameters are as given in Table 1.

As a final item on noise-perturbed spike trains, we apply the Bayesian method to actual data recorded from monkey striate cortex (see Table 5). In this case, $E(\lambda_{\mathrm{resp}})$ is slightly different for the two methods because $T_{\mathrm{stim}} \ne T_{\mathrm{spont}}$ (see equation 6.4). The pattern of $E(\mathrm{Var}(\lambda_{\mathrm{resp}}))$ values for the actual data appears to follow those of the simulations with slow-fluctuation noise added, in which Bayesian variances are substantially smaller than standard variances. We noted previously that under the slow-fluctuation noise condition, the smaller Bayesian variance estimates are more accurate than the standard estimates.

Table 5: Example of Bayesian Estimation Applied to Actual Data. (Columns: Trials; Standard and Bayesian values of $E(\lambda_{\mathrm{resp}})$, $E(\mathrm{Var}(\lambda_{\mathrm{resp}}))$, and $\mathrm{Var}(\mathrm{Var}(\lambda_{\mathrm{resp}}))$.)

Notes: The data consisted of 10 trials recorded from a unit in macaque striate cortex, presented with grating stimuli. The Fano factor was 1.7. Bayesian estimation was applied to 50 replications of a bootstrap resampling of the data, with 5 of the 10 trials chosen at random with replacement for each resampling.

9 Discussion

The Bayesian and standard methods provide similar estimates for the mean $E(\lambda_{\mathrm{resp}})$ and variance $E(\mathrm{Var}(\lambda_{\mathrm{resp}}))$ of the response firing rate for parameters typical of experimental conditions. The estimates of both methods converge

as the data sample size increases, by increasing either the number of trials or the duration of each trial, so that the Bayesian model is asymptotically unbiased. Given a uniform prior, the Bayesian estimate is a maximum likelihood estimate and therefore asymptotically efficient (given certain regularity conditions) (Kay, 1993). That is, the Bayesian $E(\mathrm{Var}(\lambda_{\mathrm{resp}}))$ estimate asymptotically approaches the Cramér-Rao bound. As the Bayesian estimate also asymptotically approaches the standard estimate, the implication is that the standard estimate is, at minimum, also asymptotically efficient.

The Bayesian method provides much smaller estimates of the variance of the variance $\mathrm{Var}(\mathrm{Var}(\lambda_{\mathrm{resp}}))$ than the standard method, despite having almost identical values of expected variance $E(\mathrm{Var}(\lambda_{\mathrm{resp}}))$. That is because the two methods compute variances using entirely different procedures. Under the standard method, the variance of the firing rate $\mathrm{Var}(\lambda)$ is computed directly from the sample of $\lambda$ itself by plugging those values into the standard formula for variance, without making assumptions about the distribution of $\lambda$. Under the Bayesian method, statistics are calculated indirectly from the data in a two-step process, by first fitting a model (the likelihood model) and then determining statistics from the model curve fit rather than directly from the data.

We can identify three benefits to using the Bayesian analysis relative to the standard analysis. The first is smaller values for the variance of the variance. This allows more precise interval estimates of a parameter, which in turn improves the precision of tests of significance. The second is reduced sensitivity of variance estimates to perturbations in the data caused by noise. This can be seen by comparing Tables 2 and 4. The final benefit is the ability to estimate response variance after only a single trial.

Underlying the Bayesian analysis was the assumption that spike trains have Poisson statistics. Sensitivity to this assumption was tested by perturbing spike trains away from Poisson statistics with two kinds of noise, whose fluctuations were fast or slow relative to the timescale of a single trial. Despite the noise perturbations, Bayesian estimates of response mean and variance were not significantly affected. This indicates that the Bayesian method is robust. Given the benefits of Bayesian analysis listed above and the robustness of the method, it appears to be a promising candidate for practical data analysis.

Acknowledgments

I thank Keiji Tanaka and Leslie Ungerleider for their support. Portions of this work were completed while I was visiting the Sloan-Schwartz Center for Theoretical Neurobiology at the Salk Institute.

References

Brown, E. N., Frank, L. M., Tang, D., Quirk, M. C., & Wilson, M. A. (1998). A statistical paradigm for neural spike train decoding applied to position prediction from ensemble firing patterns of rat hippocampal place cells. J. Neurosci., 18, 7411–7425.

Kay, S. M. (1993). Fundamentals of statistical signal processing: Estimation theory. Upper Saddle River, NJ: Prentice Hall.

Martignon, L., Deco, G., Laskey, K., Diamond, M., Freiwald, W., & Vaadia, E. (2000). Neural coding: Higher-order temporal patterns in the neurostatistics of cell assemblies. Neural Comput., 12.

Sanger, T. D. (1996). Probability density estimation for the interpretation of neural population codes. J. Neurophysiol., 76.

Zhang, K., Ginzburg, I., McNaughton, B. L., & Sejnowski, T. J. (1998). Interpreting neuronal population activity by reconstruction: Unified framework with application to hippocampal place cells. J. Neurophysiol., 79.

Received April 16, 2003; accepted December 3, 2003.


More information

What is the neural code? Sekuler lab, Brandeis

What is the neural code? Sekuler lab, Brandeis What is the neural code? Sekuler lab, Brandeis What is the neural code? What is the neural code? Alan Litke, UCSD What is the neural code? What is the neural code? What is the neural code? Encoding: how

More information

p(z)

p(z) Chapter Statistics. Introduction This lecture is a quick review of basic statistical concepts; probabilities, mean, variance, covariance, correlation, linear regression, probability density functions and

More information

Lossless Online Bayesian Bagging

Lossless Online Bayesian Bagging Lossless Online Bayesian Bagging Herbert K. H. Lee ISDS Duke University Box 90251 Durham, NC 27708 herbie@isds.duke.edu Merlise A. Clyde ISDS Duke University Box 90251 Durham, NC 27708 clyde@isds.duke.edu

More information

arxiv:physics/ v1 [physics.data-an] 7 Jun 2003

arxiv:physics/ v1 [physics.data-an] 7 Jun 2003 Entropy and information in neural spike trains: Progress on the sampling problem arxiv:physics/0306063v1 [physics.data-an] 7 Jun 2003 Ilya Nemenman, 1 William Bialek, 2 and Rob de Ruyter van Steveninck

More information

Why is the field of statistics still an active one?

Why is the field of statistics still an active one? Why is the field of statistics still an active one? It s obvious that one needs statistics: to describe experimental data in a compact way, to compare datasets, to ask whether data are consistent with

More information

1/12/2017. Computational neuroscience. Neurotechnology.

1/12/2017. Computational neuroscience. Neurotechnology. Computational neuroscience Neurotechnology https://devblogs.nvidia.com/parallelforall/deep-learning-nutshell-core-concepts/ 1 Neurotechnology http://www.lce.hut.fi/research/cogntech/neurophysiology Recording

More information

Large Sample Properties of Estimators in the Classical Linear Regression Model

Large Sample Properties of Estimators in the Classical Linear Regression Model Large Sample Properties of Estimators in the Classical Linear Regression Model 7 October 004 A. Statement of the classical linear regression model The classical linear regression model can be written in

More information