Statistical Modeling of Temporal Evolution in Neuronal Activity
Rob Kass


Slide 1. Statistical Modeling of Temporal Evolution in Neuronal Activity. Rob Kass, Department of Statistics and the Center for the Neural Basis of Cognition, Carnegie Mellon University.

Slide 2. Statistical collaborators: Sam Behseta, Can Cai, Cari Kaufman, Alex Rojas, Liuxia Wang, and Anthony Brockwell, Emery Brown, Satish Iyengar, Valérie Ventura. Neurophysiological collaborators: Tai-Sing Lee, Carl Olson, Andy Schwartz, Tony Reina, Peter Strick, Donna Hoffman, Natalie Picard.

Slide 3. REFERENCES
Brown, E.N., Barbieri, R., Ventura, V., Kass, R.E., and Frank, L.M. (2002) The time-rescaling theorem and its application to neuronal spike train data analysis. Neural Comput., 14.
DiMatteo, I., Genovese, C.R., and Kass, R.E. (2001) Bayesian curve-fitting with free-knot splines. Biometrika, 88.
Kass, R.E., and Ventura, V. (2001) A spike-train probability model. Neural Comput., 13.
Olson, C.R., Gettner, S.N., Ventura, V., Carta, R., and Kass, R.E. (2000) Neuronal activity in macaque supplementary eye field during planning of saccades in response to pattern and spatial cues. J. Neurophys., 84.
Ventura, V., Carta, R., Kass, R.E., Gettner, S.N., and Olson, C.R. (2001) Statistical analysis of temporal evolution in single-neuron firing rates. Biostatistics, 3: 1-20.

Slide 4. Also: a self-contained talk, Statistical Smoothing of Neuronal Data, is available on-line.

Slide 5. Stochastic Models for:
1. Instantaneous firing rate of single neurons;
2. Variation in firing rate across many neurons;
3. Decoding for movement prediction;
4. Within-trial firing rate: non-Poisson spiking;
5. Correlated spiking across pairs of neurons.

Slide 6. Statistical Motivation for Modeling. Probability models offer efficiency, flexibility, and formal inferences.

Slide 7. Statistical Motivation for Modeling. Probability models offer efficiency, flexibility, and formal inferences. Comment: probability models require assumptions, but so does every well-justified statistical method; of course, some make stronger assumptions than others.

Slide 8. Statistical Motivation for Modeling. Efficiency: toy problem; smoothing the PSTH; Bayesian decoding. Flexibility: smoothing with BARS (and MBARS); non-Poisson firing. Formal inferences: all of the methods discussed here.

Slide 9. Some Data. From Chris Baker, in Carl Olson's lab. Recorded from IT. Passive fixation task: visual stimulus (simple geometric figure, white on dark background) presented (in receptive field) for 500 ms; if fixation is maintained, a reward is given. Time = 0 is onset of the stimulus.

Slide 10. [Figure: spike raster and firing rate; axes: Time (ms), Firing rate, Trial number.]

Slide 11. Assume the number of spikes x in a given time interval (a, b) follows a Poisson distribution with mean μ: p(x|μ) = e^{−μ} μ^x / x!. Obvious estimator of μ: x̄, the mean number of spikes in (a, b) across observed trials. Alternative: s², the sample variance.

Slide 12. Compare x̄ to s² using Mean Squared Error (MSE): MSE(μ̂) = E[(μ̂ − μ)²]. Result: for n = 100 trials and μ = 10, s² has about 21 times larger MSE than x̄. E.g., one needs about 2100 trials using s² to get the accuracy of 100 trials using x̄.
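To make the toy problem concrete, here is a minimal simulation sketch (not from the talk) of the Slide 12 comparison; the Poisson counts, n = 100 trials, and μ = 10 match the slide, while everything else (seed, number of simulated experiments) is arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, n, n_sims = 10.0, 100, 20_000

# n_sims simulated experiments, each with n trials of Poisson spike counts.
counts = rng.poisson(mu, size=(n_sims, n))

# Two estimators of mu (for a Poisson distribution, mean = variance = mu):
xbar = counts.mean(axis=1)           # sample mean per experiment
s2 = counts.var(axis=1, ddof=1)      # sample variance per experiment

mse_xbar = np.mean((xbar - mu) ** 2)    # approx mu/n = 0.1
mse_s2 = np.mean((s2 - mu) ** 2)        # approx (mu + 2*mu**2)/n = 2.1
print(f"MSE ratio: {mse_s2 / mse_xbar:.1f}")  # close to the slide's factor of 21
```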

Slide 13. Maximum Likelihood Estimators. In the Poisson case, x̄ is the MLE. Fisher (1922, 1925; and subsequent theoreticians): quite generally, for large samples, the MLE has the minimal possible mean squared error. This applies to parametric models; formally, the number of parameters remains finite while the sample size grows indefinitely.

Slide 14. Bayes's Theorem: P(A|B) = P(B|A)P(A) / [P(B|A)P(A) + P(B|A^c)P(A^c)].

Slide 15. Bayes's Theorem: P(A|B) = P(B|A)P(A) / [P(B|A)P(A) + P(B|A^c)P(A^c)]. In density form: p(θ|data) = p(data|θ)p(θ) / ∫ p(data|θ)p(θ) dθ.

Slide 16. Likelihood function: L(θ) = p(data|θ). Bayes Estimators: 1. Often Bayes ≈ MLE; both are optimal for large n. 2. But sometimes prior information is also important. 3. Bayes's Theorem provides very powerful machinery, especially via Monte Carlo.
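As a small illustration of point 1 on Slide 16 (Bayes ≈ MLE for large n), here is a sketch using a conjugate Gamma prior for the Poisson mean; the prior parameters are hypothetical, chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
mu_true, n = 10.0, 100
counts = rng.poisson(mu_true, size=n)

# MLE for the Poisson mean is the sample mean.
mle = counts.mean()

# With a Gamma(alpha, beta) prior (shape alpha, rate beta), the posterior is
# Gamma(alpha + sum(x), beta + n); its mean is the Bayes estimate.
alpha, beta = 2.0, 0.2                 # hypothetical prior, mean alpha/beta = 10
bayes = (alpha + counts.sum()) / (beta + n)

print(f"MLE = {mle:.3f}, Bayes = {bayes:.3f}")  # nearly identical at n = 100
```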

Slide 17. Instantaneous Firing Rate. May be inherently interesting (e.g., SEF data).

Slide 18. [Figure: SEF data, panels (A) and (B); axes: Time (ms), Firing rate, Trial number.]

Slide 19. Instantaneous Firing Rate. May be inherently interesting (e.g., SEF data). May be necessary (or helpful) for subsequent calculations.

Slide 20. Probability density for a set of spikes s_1, ..., s_n: p(s_1, ..., s_n) = exp(−∫_0^T λ(t) dt) ∏_{k=1}^n λ(s_k). (In the non-Poisson case λ(t) is generalized.)
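Slide 20's density yields the log-likelihood used throughout; a minimal sketch, assuming the rate is given as a Python callable and approximating the integral with a midpoint Riemann sum:

```python
import numpy as np

def poisson_process_loglik(spike_times, intensity, T, n_grid=10_000):
    """Log of the Slide 20 density:
    log p(s_1, ..., s_n) = sum_k log lambda(s_k) - integral_0^T lambda(t) dt."""
    dt = T / n_grid
    mid = np.linspace(0.0, T, n_grid, endpoint=False) + 0.5 * dt  # midpoints
    integral = np.sum(intensity(mid)) * dt
    return np.sum(np.log(intensity(np.asarray(spike_times)))) - integral

# Hypothetical intensity (spikes/s) and spike times (s), for illustration only.
lam = lambda t: 20.0 + 10.0 * np.sin(2.0 * np.pi * t)
print(poisson_process_loglik([0.10, 0.35, 0.60, 0.62, 0.90], lam, T=1.0))
```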

Slide 21. In estimating instantaneous firing rate, smoothing is a good idea. Illustration: a small simulation study.

Slides 22-24. [Figures: simulation study illustration, built up over three slides.]

Slide 25. [Figure: unsmoothed firing-rate estimate vs. time, with its MISE shown.]

Slide 26. [Figure: firing-rate estimates vs. time; MISE = 4.68 for one estimate, with the smoothed estimate's MISE several times smaller.]

Slide 27. [Figure: simulation bands for the unsmoothed PSTH with 14 times more trials.]

Slide 28. For the SEF data above, it doesn't matter much how the smoothing is accomplished; e.g., Gaussian filtering is fine. Sometimes fixed-bandwidth methods are not adequate.
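A minimal sketch of the fixed-bandwidth smoothing mentioned on Slide 28: build the PSTH, then apply a Gaussian filter. The 1-ms bins and 20-ms bandwidth are arbitrary illustrative choices, not values from the talk:

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def smoothed_psth(trials, t_max_ms, bin_ms=1.0, sd_ms=20.0):
    """trials: list of per-trial spike-time arrays (ms).
    Returns bin centers (ms) and a Gaussian-smoothed rate (spikes/s)."""
    edges = np.arange(0.0, t_max_ms + bin_ms, bin_ms)
    counts = sum(np.histogram(s, bins=edges)[0] for s in trials)
    rate = counts / (len(trials) * bin_ms / 1000.0)      # PSTH, in Hz
    return 0.5 * (edges[:-1] + edges[1:]), gaussian_filter1d(rate, sd_ms / bin_ms)
```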

Slide 29. [Figure: firing rate vs. Time (ms).]

Slide 30. BARS (Bayesian Adaptive Regression Splines). Fits the firing-rate curve with cubic splines, finding optimal knot locations via Monte Carlo simulation. Seems to be the most powerful existing method of generalized curve-fitting (small MSE in various examples). In many contexts (firing rate, fMRI, EEG, EMG) BARS tracks the signal very well, effectively capturing the high-frequency changes in the signal while filtering out high-frequency noise.

Slide 31. [Figure: (a) original EEG and EOG vs. time (sec); (b) original EOG with fixed-knot (dashed) and free-knot (bold) splines; y-axis in volts.]

Slide 32. BARS (Bayesian Adaptive Regression Splines). Provides formal statistical inferences. See DiMatteo, Genovese, and Kass (2001, Biometrika); Kass and Wallstrom (2002, Statistical Science). Applies to the usual curve-fitting setting, and to generalized curve-fitting of intensity functions (in Poisson and non-Poisson settings).

Slide 33. Consider first the usual curve-fitting setting. [Figure: Example 2, f(x) vs. x.]

Slide 34. Usual curve-fitting framework with free-knot splines: Y_i = f(x_i) + ε_i, where f is (approximated by) a cubic spline with k knots at ξ_1, ..., ξ_k.

Slide 35. [Figure: knot placement ξ_1, ξ_2, ξ_3 on the interval (0, T).]

Slide 36. Y_i = f(x_i) + ε_i, with f(x) = Σ_{j=1}^{k+2} β_j b_j(x), where b_j(x) is the j-th spline basis function. If the knots (ξ, k) were pre-determined, this would be a linear regression problem. The hard (statistically interesting) problem is to determine (ξ, k).
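Slide 36's point is that with the knots fixed, fitting reduces to ordinary linear regression. A sketch under that assumption, using the truncated power basis (a different parameterization of the cubic-spline space than the slide's b_j, but spanning the same fits); the data and knot locations are hypothetical:

```python
import numpy as np

def cubic_spline_design(x, knots):
    """Truncated-power-basis design matrix for a cubic spline:
    columns 1, x, x^2, x^3, and (x - xi_j)_+^3 for each interior knot xi_j."""
    cols = [np.ones_like(x), x, x**2, x**3]
    cols += [np.clip(x - xi, 0.0, None) ** 3 for xi in knots]
    return np.column_stack(cols)

rng = np.random.default_rng(2)
x = np.sort(rng.uniform(0.0, 1.0, 200))
y = np.sin(6.0 * x) + rng.normal(0.0, 0.2, x.size)      # Y_i = f(x_i) + eps_i
X = cubic_spline_design(x, knots=[0.25, 0.50, 0.75])    # knots pre-determined
beta, *_ = np.linalg.lstsq(X, y, rcond=None)            # plain linear regression
fhat = X @ beta                                          # fitted curve
```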

Slide 37. BARS algorithm: first integrate out (β, σ), then apply Monte Carlo to find the knots. Get: (i) an estimated curve and (ii) its uncertainty.

Slide 38. Average MSE (with simulation SE), comparing methods on an example; simulation SEs: SARS (0.001), DMS (0.002), BARS (0.001). SARS: Zhou and Shen (J. Amer. Statist. Assoc., 2001). DMS: Denison, Mallick, and Smith (J. Royal Statist. Soc. B, 1998).

Slide 39. [Figure: fits from SARS, DMS, modified-DMS, and BARS.]

Slide 40. Generalization to the Firing-Rate Intensity Function. Straightforward (because of the approximation to integrals studied by Kass and Wasserman, 1995, J. Amer. Statist. Assoc.). Simulation study based on the IT neuron shown above: Gaussian filtering reduces MSE compared to the PSTH by a factor of 3, and BARS reduces it further by an additional factor of 3.

Slide 41. [Figure: firing rate vs. Time (ms).]

Slide 42. Assessing Population Variability. Want: variability of intensity functions and also of particular characteristics (e.g., location of maximal firing rate; peak-minus-trough difference). Standard principal components analysis (Optican and Richmond, J. Neurophys., 1987) is useful descriptively.

Slide 43. M1 Data from Peter Strick's Lab. Button-touch task in either learned, sequential order or random order. Standard analysis is useful. We extend BARS to MBARS to get automated smoothing; then we can also make formal statistical inferences.

Slide 44. [Figure: firing-rate curves, rates (Hz) vs. Time (ms), for neurons 1-30.]

Slide 45. [Figure: fitted firing-rate curves, rates (Hz) vs. Time (msec), for neurons 1-30.]

Slide 46. [Figure: first principal component (Prop. of Var = 0.63); second principal component (Prop. of Var = 0.88); third principal component.]

Slide 47. [Figure: principal component curves.]

Slide 48. [Figure: repeating mode vs. random mode, movement 2 to 3; top panels: muscle activity, EMG recording (Hz) vs. time (msec); bottom panels: neuron activity, rate (Hz) vs. time (msec).]

Slide 49. Assessing Population Variability. Issues: we want formal inferences, and we should keep in mind that intensity functions are estimated, from limited data, with uncertainty. Methodology: view λ(t) as a realization from a random process and estimate Cov(λ(u), λ(v)). Extend BARS to MBARS (M for Multiple).

Slide 50. Proportion-of-Variance Results from 30 neurons. First principal component: 0.697 ± 0.092. Compare standard PCA: 0.78.

Slide 51. Population Coding for Movement Prediction. M1 neurons are broadly directionally tuned. Each tuning curve may be characterized (reasonably well) by that cell's preferred direction D. By combining firing from hundreds of neurons, movement can be predicted reasonably well.

Slide 52. [Figure.]

Slide 53. [Figure: firing counts in 200 ms vs. direction of movement (0 to 2π radians), with cosine-regression and BARS fits; deviance values shown for each.]

Slide 54. The Population Vector Algorithm (PVA). The PVA, P = Σ_i w_i D_i, is simple and reasonably effective. P is the predicted movement direction, D_i is the i-th neuron's preferred direction, and w_i is a scaled version of the firing rate of the i-th neuron.
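A minimal sketch of the PVA as defined on Slide 54; the talk does not specify the scaling of w_i, so baseline-subtracted rates are used here as one common choice, and the 100 cosine-tuned neurons are simulated:

```python
import numpy as np

def population_vector(rates, preferred_dirs):
    """P = sum_i w_i * D_i, with w_i a baseline-subtracted firing rate
    (one common scaling; the slide leaves the scaling unspecified)."""
    w = rates - rates.mean()
    return (w[:, None] * preferred_dirs).sum(axis=0)

rng = np.random.default_rng(3)
theta = rng.uniform(0.0, 2.0 * np.pi, 100)
D = np.column_stack([np.cos(theta), np.sin(theta)])       # preferred directions
rates = rng.poisson(10.0 + 8.0 * (D @ [1.0, 0.0])).astype(float)  # tuned to +x
print(population_vector(rates, D))                        # points roughly along +x
```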

Slide 55. Population Vector Algorithm vs. Bayesian Decoding. The PVA, P = Σ_i w_i D_i, is simple and reasonably effective. The problem is to improve the prediction and to be able to use a small set of M1 neurons more reliably. Our Bayesian decoding scheme uses (i) a flexible probability model for the firing rate as a function of movement (velocity) and (ii) a prior that assumes movement at time t is similar to movement at time t − 1. (Similar to Brown et al. for hippocampus; our particle filter implements a generalized Kalman filter.) We performed a small simulation study, assuming 100 neurons with directional tuning curves similar to those from real data, to compare the PVA with Bayesian decoding.
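The talk does not spell out the decoder, so the following is only a sketch of a generic bootstrap particle filter in the spirit of Slide 55: a random-walk prior on 2D velocity and an assumed log-linear (cosine-tuning) Poisson observation model; the parameters b0, b1, and q are hypothetical:

```python
import numpy as np

def decode_particle_filter(counts, D, b0, b1, dt, n_particles=2000, q=0.1, seed=4):
    """counts: (T, n) spike counts per time bin; D: (n, 2) preferred directions.
    Assumed model: counts[t, i] ~ Poisson(dt * exp(b0 + b1 * D[i] @ v_t)),
    with prior v_t = v_{t-1} + Normal(0, q^2) per coordinate (random walk)."""
    rng = np.random.default_rng(seed)
    T = counts.shape[0]
    v = np.zeros((n_particles, 2))                     # particles for velocity
    est = np.zeros((T, 2))
    for t in range(T):
        v = v + rng.normal(0.0, q, size=v.shape)       # propagate the prior
        lam = dt * np.exp(b0 + b1 * v @ D.T)           # (particles, neurons)
        logw = (counts[t] * np.log(lam) - lam).sum(axis=1)  # Poisson log-lik
        w = np.exp(logw - logw.max())
        w /= w.sum()
        est[t] = w @ v                                 # posterior-mean velocity
        v = v[rng.choice(n_particles, n_particles, p=w)]    # resample
    return est
```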

Slide 56. [Figure.]

Slide 57. [Figure: X and Y components vs. time.]

Slide 58. [Figure: population vector reconstruction; X and Y components of velocity vs. time bin.]

Slide 59. [Figure: PV reconstruction with smoothing; X and Y components of velocity vs. time bin.]

Slide 60. [Figure: particle filter reconstruction; X and Y components of velocity vs. time bin.]

Slide 61. Population Vector Algorithm vs. Bayesian Decoding. In our simulation study, Bayesian decoding was 7 times more efficient than the plain PVA, and 2.6 times more efficient than the PVA with smoothing. It was also 2.3 times more efficient than OLE (not shown). Other advantages: Bayesian decoding can accommodate non-uniformly distributed preferred directions, variable time lags for behavior, non-cosine tuning functions, fine resolution in time, etc.

Slide 62. Peri-Stimulus Time Histogram (PSTH). [Figure.]

Slide 63. [Figure: spike raster and firing rate; axes: Time (ms), Firing rate, Trial number.]

Slide 64. Non-Poisson Neuron Firing. Data pooled across trials ⇒ Poisson; within trials, not Poisson. Probability of a spike in the interval (t, t + dt): Poisson, λ(t)dt; general, λ(t | s_1, s_2, ..., s_n)dt. Here s_1, s_2, ..., s_n are the spikes up to time t and λ(t | s_1, s_2, ..., s_n) is the conditional intensity of the process. In the Poisson case, the intensity does not depend on the past.

Slide 65. Non-Poisson Neuron Firing. Data pooled across trials ⇒ Poisson; within trials, not Poisson. Probability of a spike in the interval (t, t + dt): Poisson, λ(t)dt; general, λ(t | s_1, s_2, ..., s_n)dt, where s_1, s_2, ..., s_n are the spikes up to time t and λ(t | s_1, s_2, ..., s_n) is the conditional intensity of the process. In the Poisson case, the intensity does not depend on the past. Difficulty: λ(t | s_1, s_2, ..., s_n) is a function of n + 1 variables; we need some simplifying assumption.

Slide 66. Inhomogeneous Markov Interval (IMI) Processes. Assume λ(t | s_1, s_2, ..., s_n) = λ(t, t − s*(t)), where s*(t) is the last spike time preceding t.

Slide 67. Inhomogeneous Markov Interval (IMI) Processes. Assume λ(t | s_1, s_2, ..., s_n) = λ(t, t − s*(t)), where s*(t) is the last spike time preceding t. Special cases: Barbieri et al. (2001), Miller and Mark (1992). The likelihood is approximated as in the Poisson case (but now within trials), and splines may be used again to estimate this function of two variables.
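A sketch of evaluating the IMI conditional intensity of Slide 66; the two-argument function λ(t, u) here is hypothetical (a sinusoidal clock-time rate times a 5-ms recovery factor), standing in for the spline-estimated surface the talk describes:

```python
import numpy as np

def imi_intensity(t, spike_history, lam2):
    """lambda(t | history) = lam2(t, t - s*(t)), where s*(t) is the
    last spike time preceding t (Slide 66)."""
    past = [s for s in spike_history if s < t]
    s_star = max(past) if past else -np.inf
    return lam2(t, t - s_star)

def lam2(t, u):
    """Hypothetical IMI surface: clock-time rate times a recovery factor."""
    rate = 30.0 + 20.0 * np.sin(2.0 * np.pi * t)   # spikes/s, time in seconds
    recovery = 1.0 - np.exp(-u / 0.005)            # firing suppressed for ~5 ms
    return rate * recovery

print(imi_intensity(0.50, [0.10, 0.48, 0.499], lam2))  # low: a spike 1 ms earlier
```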

Slide 68. [Figure, panels (A)-(D): spikes/sec vs. Time (ms) and inter-spike time (ms) for Poisson and IMI models; panel (C) conditions on a previous spike at t = 50 ms.]

Slide 69. [Figure: Q-Q plot; empirical quantiles vs. IP and IMI model quantiles.]
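The Q-Q comparison on Slide 69 is the kind of goodness-of-fit check given by the time-rescaling theorem (Brown et al., 2002, cited on Slide 3): if the model's intensity is correct, the rescaled interspike intervals τ_k = ∫_{s_{k−1}}^{s_k} λ(t) dt are i.i.d. Exponential(1). A sketch, with a hypothetical constant-rate model for illustration:

```python
import numpy as np

def rescaled_intervals(spike_times, intensity, n_grid=1000):
    """tau_k = integral of lambda(t) dt over (s_{k-1}, s_k]; under a correct
    model the tau_k are i.i.d. Exp(1) (time-rescaling theorem)."""
    s = np.concatenate([[0.0], np.sort(np.asarray(spike_times, dtype=float))])
    taus = np.empty(len(s) - 1)
    for k in range(1, len(s)):
        a, b = s[k - 1], s[k]
        mid = np.linspace(a, b, n_grid, endpoint=False) + 0.5 * (b - a) / n_grid
        taus[k - 1] = np.sum(intensity(mid)) * (b - a) / n_grid
    return taus

taus = np.sort(rescaled_intervals([0.12, 0.30, 0.31, 0.55], lambda t: 40.0 + 0.0 * t))
exp_q = -np.log(1.0 - (np.arange(1, taus.size + 1) - 0.5) / taus.size)
print(taus, exp_q)   # plotting taus vs. exp_q gives the Q-Q comparison
```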

Slide 70. We have also used a version of BARS in IMI models.

Slide 71. PSTH. Counts the number of times a spike occurs in the bin at time t (then divides by the time interval and the number of trials to get a rate). Displays the time course of the firing rate.

Slide 72. Joint PSTH. Counts the number of times a spike occurs in the bin at time t for neuron 1 AND a spike occurs in the bin at time u for neuron 2 (then divides by the time interval and the number of trials to get a rate). Displays the time course of the joint firing rate, BUT it is hard to see the time course of the EXCESS firing rate above what would be expected from INDEPENDENT neurons.

Slide 73. [Figure: joint PSTH; Time (ms) vs. Time (ms).]

Slide 74. Probability of a spike at time t for neuron 1: λ_1(t)δt. Probability of a spike at time t for neuron 2: λ_2(t)δt. Probability of a simultaneous spike at time t: λ_12(t, t)(δt)². Independence: λ_12(t, t) = λ_1(t) λ_2(t).

Slide 75. Probability of a spike at time t for neuron 1: λ_1(t)δt; for neuron 2: λ_2(t)δt. Probability of a simultaneous spike at time t: λ_12(t, t)(δt)². Independence: λ_12(t, t) = λ_1(t) λ_2(t). Replace independence with the ratio of the joint firing rate to the firing rate under independence: ζ(t) = λ_12(t, t) / (λ_1(t) λ_2(t)). H_0 (independence): ζ(t) = 1 for all t.
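A sketch of one way to estimate ζ(t) from binned, binary spike matrices; the Gaussian smoothing of the marginal and joint firing probabilities is an assumption here (the talk smooths but does not give a formula on this slide):

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def zeta_hat(spikes1, spikes2, sd_bins=10.0):
    """spikes1, spikes2: binary arrays of shape (trials, bins).
    Returns an estimate of zeta(t) = lambda_12(t, t) / (lambda_1(t) lambda_2(t))."""
    p1 = gaussian_filter1d(spikes1.mean(axis=0), sd_bins)
    p2 = gaussian_filter1d(spikes2.mean(axis=0), sd_bins)
    p12 = gaussian_filter1d((spikes1 * spikes2).mean(axis=0), sd_bins)
    return p12 / np.maximum(p1 * p2, 1e-12)   # floor guards empty bins
```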

Slide 76. Test H_0: ζ(t) = 1 for all t, where λ_12(t, t) = ζ(t) λ_1(t) λ_2(t). (i) Estimate ζ(t); (ii) form confidence bands under H_0 using the bootstrap; (iii) assess whether the excursion of the estimated ζ(t) outside the bands may be due to chance, i.e., obtain a p-value.

Slide 77. [Figure: estimate of ζ(t) vs. Time (ms).]

Slide 78. One bootstrap sample. Suppose we have n trials of paired recordings.
1. Shuffle the trials of the neuron 1 data.
2. Shuffle the trials of the neuron 2 data.
3. Pair the shuffled trials to obtain n new trials.
4. Compute an estimate of ζ from the new data.
Repeat 1000 times.
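The shuffle steps on Slide 78 translate directly into code; a sketch reusing zeta_hat from the sketch above (shuffling each neuron's trials independently destroys within-trial association while preserving each neuron's own trial-to-trial structure):

```python
import numpy as np

def bootstrap_zeta_null(spikes1, spikes2, n_boot=1000, seed=0, **kw):
    """Trial-shuffling bootstrap under H0 (independence): permute the trial
    labels of each neuron separately, re-pair, and recompute zeta-hat."""
    rng = np.random.default_rng(seed)
    n_trials = spikes1.shape[0]
    null = [zeta_hat(spikes1[rng.permutation(n_trials)],
                     spikes2[rng.permutation(n_trials)], **kw)
            for _ in range(n_boot)]
    return np.array(null)                      # shape (n_boot, bins)

# Pointwise bands under H0, e.g. 2.5% and 97.5% quantiles per bin:
# lo, hi = np.quantile(bootstrap_zeta_null(spikes1, spikes2), [0.025, 0.975], axis=0)
```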

Slide 79. The Bootstrap: re-samples the data (in some modified form) repeatedly; uses the many data replications (here, trials) to capture most of the information about relevant unknown data distributions (here, spike train distributions); assumes the sampled units (here, trials) are statistically identical.

Slide 80. The Bootstrap: re-samples the data (in some modified form) repeatedly; uses the many data replications (here, trials) to capture most of the information about relevant unknown data distributions (here, spike train distributions); assumes the sampled units (here, trials) are statistically identical; and is not hard to mis-use!

Slide 81. Performance of the Procedure vs. a Bin-wise Method. We make a particular assumption about ζ(t); we generate data artificially; we compute the probability of correctly rejecting H_0, i.e., the power of the test.

Slide 82. Performance of the Procedure vs. a Bin-wise Method. We make a particular assumption about ζ(t); we generate data artificially; we compute the probability of correctly rejecting H_0, i.e., the power of the test. Bin-wise method: obtain a threshold and reject H_0 whenever the normalized correlation exceeds the threshold for two contiguous bins. (A fair comparison of power requires the probability of incorrectly rejecting H_0, i.e., the probability of a Type I error, to be the same for the two methods.)

Slide 83. [Figure: power vs. maximum percentage of excess firing (%), for the bootstrap and bin-wise methods; the bin-wise method needs 4 times more trials to achieve the same power.]

Slide 84. Additional Work on Correlated Spiking. Aim: distinguish within-trial association from excitability. Model for the r-th trial: λ_r(t) = λ(t) g_r(t). Two neurons can share excitability or have different excitabilities. We can incorporate IMI effects.

Slide 85. [Figure: firing rate vs. Time (ms); average counts vs. time lag.]

Slide 86. [Figure.]

Slide 87. Summary. Smoothing methods increase efficiency, and BARS smoothing can increase it further. The methods may be adapted for within-trial analyses, using IMI models. The inferential framework can be useful for population analysis. Bayesian decoding increases the efficiency of movement prediction. These ideas may be applied to understanding the association of pairs of neurons.

Slide 88. Current/Future Interests. Further testing, refinements, and software. Algorithms for neural prostheses. Evolving correlation among multiple neurons.
