Statistical Problems Related to Excitation Threshold and Reset Value of Membrane Potentials
Le Mans, March 18, 2009
Contents

- Biological Background and the Problem Definition
- MDE with respect to the Laplace transform
- Conditions and Results for MDE
- Simulated Data
- Real Data
Biological Background. Figure: The Neuron.
Membrane Potential (W. Kilb, Mainz). Figure: membrane potential [mV] against time [s]. This membrane potential was recorded in vitro from a neuron belonging to a cortical slice preparation from a 6-week-old mouse.
Usual Model (considered by Ditlevsen, Lansky, Genon-Catalot, Laredo, ...)

[Sketch: trajectory starting in x_0, successive crossings of the level S defining the inter-spike times τ_1, τ_2, ..., τ_n.]

Assume the process X between spikes follows an SDE

dX_t = β_θ(X_t) dt + σ_θ(X_t) dW_t.

Here x_0 and S are assumed to be known, and θ ∈ R^d is the parameter to be determined from observations of iid inter-spike times (level-crossing times) τ_i, i = 1, ..., n. Nonparametric methods were proposed by R. Höpfner (2006).
Estimation of Excitation Threshold and Reset Value. Figure: membrane potential [mV] against time [s]; both the threshold S and the reset value x_0 are unknown.
The Statistical Problem

[Sketch: trajectory starting in θ_1, successive crossings of the level θ_2 defining the inter-spike times τ_1, τ_2, ..., τ_n.]

Assume the process X between spikes follows a known SDE

dX_t = β(X_t) dt + σ(X_t) dW_t

(β(·) and σ(·) are known!). We observe iid inter-spike times

τ_i := inf{ t ≥ 0 : X_t^(θ_1) = θ_2 }, i = 1, ..., n.

Now θ_1 = x_0 and θ_2 = S are the parameters to be determined.
Assumptions & Strategy

- We consider the cases where X is a BMD, GBM, OU or CIR process.
- We use MLE and LAN theory whenever possible.
- In the other cases we use the MDE.
Brownian Motion with Drift

Let σ > 0, a ≥ 0 and (B_t)_{t≥0} be a standard Brownian motion. Define the BMD X = (X_t)_{t≥0} starting in X_0 = θ_1 by

X_t := θ_1 + a t + σ B_t.

Lebesgue density of L(τ), for t ∈ (0, ∞):

f_BMD(t) := (θ_2 − θ_1) / ( σ √(2π t³) ) · exp( − ((θ_2 − θ_1) − a t)² / (2 σ² t) ).

Note that a joint estimation of θ_1 and θ_2 is impossible, since the density depends on (θ_1, θ_2) only through the difference θ_2 − θ_1.
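The inverse-Gaussian form of this density is easy to sanity-check numerically. A minimal sketch (function name and parameter values are ours, chosen for illustration): since a ≥ 0 makes the crossing almost sure, the density should integrate to 1.

```python
import numpy as np

def fpt_density_bmd(t, theta1, theta2, a, sigma):
    """First-passage density of X_t = theta1 + a*t + sigma*B_t through theta2 > theta1."""
    d = theta2 - theta1
    return d / (sigma * np.sqrt(2 * np.pi * t**3)) * np.exp(-(d - a * t)**2 / (2 * sigma**2 * t))

# With positive drift the crossing is certain, so the total mass should be 1.
t = np.linspace(1e-6, 60.0, 600_000)
f = fpt_density_bmd(t, -53.0, -49.0, 1.0, 1.5)
mass = float(np.sum(f[:-1] + f[1:]) * (t[1] - t[0]) / 2)   # trapezoidal rule
print(round(mass, 3))  # -> 1.0
```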
What to do if we know neither x_0 = θ_1 nor S = θ_2?

[Sketch: besides the crossing times τ_1, ..., τ_n of the unknown level θ_2, the picture shows crossing times τ̂_1, ..., τ̂_n of a median level θ̂_1 between θ_1 and θ_2.]
Theorem for the Brownian Motion with Drift

If X is a BMD, the corresponding sequences of experiments for θ_1 and θ_2 are LAN. Further, the MLEs

θ̂_{1,n} := θ_2 − ( (a/2) τ̄_h^n + √( ((a/2) τ̄_h^n)² + σ² τ̄_h^n ) ),
θ̂_{2,n} := θ_1 + (a/2) τ̄_h^n + √( ((a/2) τ̄_h^n)² + σ² τ̄_h^n ),

where τ̄_h^n := ( (1/n) Σ_{i=1}^n 1/τ_i )^{−1} is the harmonic mean of the observations, are strongly consistent and LAM, and satisfy

√n ( θ̂_{i,n} − θ_i ) → N( 0, σ² (θ_2 − θ_1)² / (2σ² + a(θ_2 − θ_1)) ) in law under θ_i, i = 1, 2.

With exponential scaling, the case of GBM yields a corresponding result!
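The MLE above reduces to one line on the harmonic mean of the crossing times. A sketch with illustrative values (our naming; the inverse-Gaussian sampler stands in for true BMD crossing times, which follow exactly that law):

```python
import numpy as np

def mle_theta2(tau, theta1, a, sigma):
    """Explicit MLE of the threshold theta2 when the reset value theta1 is known."""
    tau_h = 1.0 / np.mean(1.0 / np.asarray(tau))   # harmonic mean of tau_1, ..., tau_n
    half = 0.5 * a * tau_h
    return theta1 + half + np.sqrt(half**2 + sigma**2 * tau_h)

# Crossing times are inverse-Gaussian with mean d/a and shape d^2/sigma^2, d = theta2 - theta1.
rng = np.random.default_rng(0)
d, a, sigma = 4.0, 1.0, 1.5
tau = rng.wald(d / a, d**2 / sigma**2, size=200_000)
print(round(mle_theta2(tau, -53.0, a, sigma), 1))  # estimate of the true theta2 = -49.0
```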
MDE with respect to the Laplace transform

MDE framework for the cases of OU and CIR:

- Observations: (τ_i)_{i∈N} iid inter-spike times of the OU or the CIR neuronal model. An explicit expression for the density of L(τ) is not known, but we know the Laplace transform!
- Parameter set: Θ ⊂ {θ ∈ R² : θ_1 < θ_2} bounded.
- Hilbert space: H := L²(R_+, B(R_+), µ), where µ has a piecewise continuous Lebesgue density with compact support.
- Reference values: the Laplace transforms L_θ(α) := E_θ[e^{−ατ}], θ ∈ Θ, continuously parameterized in H.
- Empirical value: the empirical Laplace transform L̂_n(α) := (1/n) Σ_{i=1}^n e^{−ατ_i}, α ≥ 0.
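The empirical Laplace transform is a one-liner; a sketch (our naming), checked against the exponential law, whose Laplace transform is 1/(1 + α) in the convention used above:

```python
import numpy as np

def empirical_laplace(tau, alpha):
    """Empirical Laplace transform  (1/n) * sum_i exp(-alpha * tau_i), vectorized in alpha."""
    tau = np.asarray(tau)[:, None]
    return np.exp(-np.atleast_1d(alpha)[None, :] * tau).mean(axis=0)

rng = np.random.default_rng(1)
tau = rng.exponential(1.0, size=100_000)     # LT of Exp(1) is 1/(1 + alpha)
alpha = np.array([0.5, 1.0, 2.0])
print(np.max(np.abs(empirical_laplace(tau, alpha) - 1.0 / (1.0 + alpha))) < 0.01)  # -> True
```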
Minimum Distance Estimator

The MDE for this experiment is defined by

θ*_n := arg inf_{ξ ∈ Θ} ‖ L̂_n − L_ξ ‖_H.

This definition is not unique (the minimizing point need not be unique).
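Over a finite parameter grid (as used in the simulations later), the arg-inf is a direct search. A sketch with hypothetical helper names, demonstrated on a one-dimensional toy family whose Laplace transforms are known in closed form:

```python
import numpy as np

def mde_grid(emp_lt, model_lt, grid, weights):
    """Minimum distance estimator: the grid point xi minimizing ||L_n - L_xi||_H,
    where H = L^2(mu) and mu is a finite sum of point masses with the given weights."""
    dists = [np.sum(weights * (emp_lt - model_lt(xi))**2) for xi in grid]
    return grid[int(np.argmin(dists))]

# Toy family: Exp(rate) waiting times, with L_rate(alpha) = rate / (rate + alpha).
rng = np.random.default_rng(2)
alpha = np.linspace(0.1, 5.0, 50)
tau = rng.exponential(1.0 / 2.0, size=50_000)                  # true rate 2
emp = np.exp(-alpha[None, :] * tau[:, None]).mean(axis=0)      # empirical Laplace transform
rate_hat = mde_grid(emp, lambda r: r / (r + alpha),
                    np.linspace(1.0, 3.0, 201), np.ones_like(alpha))
print(abs(rate_hat - 2.0) < 0.1)  # -> True
```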
Conditions

SLLN(θ): ‖ L_θ − L̂_n ‖_H → 0 P_θ-almost surely as n → ∞. This is obviously fulfilled for all θ ∈ Θ.

Identifiability I(θ): inf_{ξ ∈ Θ, |θ−ξ| ≥ δ} ‖ L_θ − L_ξ ‖_H > 0 for all δ > 0. If L_θ ≠ L_ξ for ξ ≠ θ, the properties of the Laplace transform ensure that I(θ) holds for all θ ∈ Θ.
Condition AN(θ)

Asymptotic normality AN(θ): define W^n, n ∈ N, by W^n_α := √n ( L̂_n(α) − L_θ(α) ); then there exists a Gaussian process W = W(θ) with covariance function K(·, ·) such that W^n → W in law in H as n → ∞.

If we define

K(α_1, α_2) := L_θ(α_1 + α_2) − L_θ(α_1) L_θ(α_2) = Cov_θ[ e^{−α_1 τ}, e^{−α_2 τ} ]

and use results from Cremers & Kadelka (1986), then AN(θ) holds for all θ ∈ Θ.
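The covariance function K can be checked by Monte Carlo for any law with a known Laplace transform. A sketch using Exp(1) times (an illustrative choice), where K(α_1, α_2) = 1/(1 + α_1 + α_2) − 1/((1 + α_1)(1 + α_2)):

```python
import numpy as np

rng = np.random.default_rng(3)
tau = rng.exponential(1.0, size=400_000)
a1, a2 = 1.0, 2.0

# Empirical covariance of e^{-a1*tau} and e^{-a2*tau} versus L(a1+a2) - L(a1)*L(a2).
emp_cov = np.cov(np.exp(-a1 * tau), np.exp(-a2 * tau))[0, 1]
theory = 1.0 / (1.0 + a1 + a2) - 1.0 / ((1.0 + a1) * (1.0 + a2))
print(abs(emp_cov - theory) < 0.005)  # -> True
```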
The Ornstein-Uhlenbeck Case

Ornstein-Uhlenbeck process: dX_t = (a − b X_t) dt + σ dB_t, X_0 = θ_1.

The level-crossing time τ := inf{ t ≥ 0 : X_t^(θ_1) = θ_2 } has the Laplace transform (Roy / Smith 1969)

L_θ(α) := E_θ[e^{−ατ}] = H_{−α/b}( (a/b − θ_1) √b / σ ) / H_{−α/b}( (a/b − θ_2) √b / σ ), α ≥ 0,

where H_ν is the Hermite function.
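The Hermite function of non-integer degree is not in every numerical library, but it has a standard representation through Kummer's confluent hypergeometric function (Lebedev, Special Functions, Sec. 10.2) that scipy can evaluate. A sketch (our function names; the sign conventions follow the formula above and should be checked against Roy/Smith), verified only against generic Laplace-transform properties:

```python
import numpy as np
from scipy.special import gamma, hyp1f1

def hermite_fn(nu, z):
    """Hermite function H_nu(z) for real degree nu, via Kummer's function 1F1."""
    return 2.0**nu * np.sqrt(np.pi) * (
        hyp1f1(-nu / 2.0, 0.5, z * z) / gamma((1.0 - nu) / 2.0)
        - 2.0 * z * hyp1f1((1.0 - nu) / 2.0, 1.5, z * z) / gamma(-nu / 2.0))

def ou_fpt_laplace(alpha, theta1, theta2, a, b, sigma):
    """Laplace transform of the crossing time of theta2 > theta1 for dX = (a - bX)dt + sigma dB."""
    y = lambda x: (a / b - x) * np.sqrt(b) / sigma
    return hermite_fn(-alpha / b, y(theta1)) / hermite_fn(-alpha / b, y(theta2))

# Basic Laplace-transform properties: L(0) = 1 and L is positive and decreasing.
vals = [ou_fpt_laplace(al, 0.0, 1.0, 0.0, 1.0, 1.0) for al in (0.0, 0.5, 1.0)]
print(abs(vals[0] - 1.0) < 1e-9 and 0.0 < vals[2] < vals[1] < 1.0)  # -> True
```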
The Cox-Ingersoll-Ross Case

Cox-Ingersoll-Ross process: dX_t = (a − b X_t) dt + σ √(X_t⁺) dB_t, X_0 = θ_1 ≥ 0.

The level-crossing time τ := inf{ t ≥ 0 : X_t^(θ_1) = θ_2 } has the Laplace transform (Göing-Jaeschke / Yor 1999, 2003)

L_θ(α) := E_θ[e^{−ατ}] = φ( α/b, 2a/σ²; 2bθ_1/σ² ) / φ( α/b, 2a/σ²; 2bθ_2/σ² ), α ≥ 0,

where φ is the confluent hypergeometric function.
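Here φ is Kummer's confluent hypergeometric function ₁F₁, available in scipy. A sketch of evaluating the ratio above (our naming; parameters purely illustrative), again checked only against generic Laplace-transform properties:

```python
from scipy.special import hyp1f1

def cir_fpt_laplace(alpha, theta1, theta2, a, b, sigma):
    """Laplace transform of the crossing time of theta2 > theta1 >= 0 for the CIR process."""
    arg = lambda x: 2.0 * b * x / sigma**2
    return (hyp1f1(alpha / b, 2.0 * a / sigma**2, arg(theta1))
            / hyp1f1(alpha / b, 2.0 * a / sigma**2, arg(theta2)))

# L(0) = 1 and L decreases in alpha, as any Laplace transform of a positive variable must.
vals = [cir_fpt_laplace(al, 0.5, 1.5, 1.0, 1.0, 0.5) for al in (0.0, 0.5, 1.0)]
print(abs(vals[0] - 1.0) < 1e-9 and 0.0 < vals[2] < vals[1] < 1.0)  # -> True
```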
Condition D(θ)

Differentiability D(θ): the function Θ ∋ ξ ↦ L_ξ ∈ H is Fréchet-differentiable at θ with derivative DL_θ = (D_1 L_θ, ..., D_d L_θ) (linearly independent components in H).

In the OU and CIR cases D(θ) holds for all θ ∈ Θ, where the derivative consists of ratios of the corresponding special functions.
Results for MDE (Millar 1984)

Let the conditions SLLN(θ), I(θ), D(θ) and AN(θ) hold for every θ ∈ Θ. Then every sequence of MDEs θ*_n, n ∈ N, for θ is strongly consistent and asymptotically normal:

√n ( θ*_n − θ ) → N(0, Σ_θ) in law under θ, where Σ_θ := Λ_θ^{−1} V_θ Λ_θ^{−1},

with the Gramian matrix of the derivatives Λ_θ := ( ⟨D_i L_θ, D_j L_θ⟩_H )_{1≤i,j≤d} and V_θ ∈ R^{d×d} defined by

(V_θ)_{i,j} := ∫_0^∞ ∫_0^∞ D_i L_θ(α_1) K(α_1, α_2) D_j L_θ(α_2) µ(dα_1) µ(dα_2).
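When µ is a finite sum of point masses (as in the simulations below), the H-inner products are weighted sums and Σ_θ is a small matrix computation. A sketch with our names (in practice DL and K would come from the model's special functions), checked on a deterministic toy case:

```python
import numpy as np

def mde_sandwich(DL, K, w):
    """Sigma = Lambda^{-1} V Lambda^{-1} for the MDE, with mu = sum_j w_j * delta_{alpha_j}.
    DL: (d, m) matrix of D_i L_theta(alpha_j);  K: (m, m) kernel K(alpha_j, alpha_k)."""
    DLw = DL * w                      # fold the weights of mu into the derivatives
    Lam = DLw @ DL.T                  # Gramian <D_i L, D_j L>_H
    V = DLw @ K @ DLw.T               # double sum against mu x mu
    Lam_inv = np.linalg.inv(Lam)
    return Lam_inv @ V @ Lam_inv

# Toy check (d = 1): constant derivative, white kernel, unit weights -> Sigma = 1/m.
m = 4
Sigma = mde_sandwich(np.ones((1, m)), np.eye(m), np.ones(m))
print(float(Sigma[0, 0]))  # -> 0.25
```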
Simulation of CIR ISIs

Simulation of ISIs of the CIR neuronal model

dX_t = ( a − b(X_t − c) ) dt + σ √(X_t − c) dB_t,

with x_0 = −52.7, S = −49.3, σ = 1.8, a = 93.1, b = 28.7, c = −53.7.

H := L²(R_+, B(R_+), µ), where µ := Σ_{j=1}^{100} δ_{α_j} and the α_j are suitably chosen points.

Θ_1 := {−53.7, −53.69, −53.68, ..., −50}, Θ_2 := {−51, −50.99, ..., −47}, Θ := (Θ_1 × Θ_2) ∩ {θ_1 < θ_2}.
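An Euler-Maruyama sketch of such a simulation loop (step size and seed are our choices, not from the talk; the max(·, 0) keeps the diffusion coefficient real near the boundary):

```python
import numpy as np

def simulate_cir_isis(n, x0, S, a, b, c, sigma, dt=1e-4, rng=None):
    """Simulate n inter-spike intervals: Euler scheme for
    dX = (a - b(X - c))dt + sigma*sqrt(X - c)dB, resetting to x0 after each crossing of S."""
    rng = np.random.default_rng() if rng is None else rng
    isis = []
    for _ in range(n):
        x, t = x0, 0.0
        while x < S:
            x += (a - b * (x - c)) * dt + sigma * np.sqrt(max(x - c, 0.0) * dt) * rng.standard_normal()
            t += dt
        isis.append(t)
    return np.array(isis)

isis = simulate_cir_isis(5, x0=-52.7, S=-49.3, a=93.1, b=28.7, c=-53.7, sigma=1.8,
                         dt=1e-3, rng=np.random.default_rng(4))
print(len(isis) == 5 and bool((isis > 0).all()))  # -> True
```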
Results of the MDE for θ_2 = S. Figure: trajectory of the MDE θ*_{2,n} against n. The dashed line is the true simulation parameter S = −49.3.
Results of the MDE for θ_1 = x_0. Figure: trajectory of the MDE θ*_{1,n} for n up to 10^5; only every hundredth step is plotted. The dashed line is the true simulation parameter x_0 = −52.7.
Asymptotic Variance of the MDE for θ_2. Figure: the asymptotic variance (Σ_θ)_{2,2} as a function of the true value θ ∈ Θ. The mean value of the stationary distribution of X is a/b + c.
Asymptotic Variance of the MDE for θ_1. Figure: the asymptotic variance (Σ_θ)_{1,1}, in logarithmic scale, as a function of the true value θ ∈ Θ; the stationary mean a/b + c is marked.
Asymptotic Correlation of the MDE Components. Figure: the asymptotic correlation coefficient between θ*_{1,n} and θ*_{2,n}; it changes sign near the stationary mean a/b + c.
A Real Data Set (W. Kilb, Mainz). Figure: this membrane potential includes 46 ISIs. After fixing the CIR model by a nonparametric method proposed by R. Höpfner (2006), the MDE finds θ* = (−53.74, −49.28) for x_0 and S, given by the dashed lines.
Fit of the Laplace Transform. Figure: the black rings are the values of the empirical LT L̂_n for n = 46 at the points α_j, j = 1, ..., 100, which are the mass points of µ. The red line is the estimated LT L_{θ*} for θ* = (−53.74, −49.28), which minimizes the distance ‖ L̂_n − L_θ ‖_H between these two functions in H.
Résumé

- In the cases BMD and GBM the MLE of θ_1 and θ_2 is explicit and its asymptotics are optimal (LAN, LAM).
- In the cases OU and CIR an MLE is not explicitly given, so we take the MDE, which is easier to handle because the Laplace transform is explicit.
- A joint estimation of θ_1 and θ_2 is possible, but many observations (τ_1, ..., τ_n) are needed to estimate θ_1. However, in applications θ_1 is a parameter of secondary importance.
- Further, the distance between the empirical and the estimated Laplace transform gives an indication whether the model reproduces the true spiking behavior.
Thank you for your attention! Questions?
References

- Cremers, Kadelka: On weak convergence of integral functionals of stochastic processes with applications to processes taking paths in L_p^E. Stoch. Proc. Appl. 21 (1986).
- A. Göing-Jaeschke, M. Yor: A Survey and Some Generalizations of Bessel Processes. Bernoulli 9 (2003).
- R. Höpfner: Mathematische Statistik 07/08. Lecture notes, hoepfner/material-mathstat.html
- R. Höpfner: On a set of data for the membrane potential in a neuron. Math. Biosci. 207 (2007).
- P. Millar: A General Approach to the Optimality of Minimum Distance Estimators. Trans. Amer. Math. Soc. 286, No. 1 (1984).
- Roy, Smith: Analysis of the Exponential Decay Model... Bull. Math. Biophys. 31 (1969).
More informationThe problem is to infer on the underlying probability distribution that gives rise to the data S.
Basic Problem of Statistical Inference Assume that we have a set of observations S = { x 1, x 2,..., x N }, xj R n. The problem is to infer on the underlying probability distribution that gives rise to
More informationParametric Techniques Lecture 3
Parametric Techniques Lecture 3 Jason Corso SUNY at Buffalo 22 January 2009 J. Corso (SUNY at Buffalo) Parametric Techniques Lecture 3 22 January 2009 1 / 39 Introduction In Lecture 2, we learned how to
More informationAsymptotic Nonequivalence of Nonparametric Experiments When the Smoothness Index is ½
University of Pennsylvania ScholarlyCommons Statistics Papers Wharton Faculty Research 1998 Asymptotic Nonequivalence of Nonparametric Experiments When the Smoothness Index is ½ Lawrence D. Brown University
More informationPROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS
PROBABILITY: LIMIT THEOREMS II, SPRING 15. HOMEWORK PROBLEMS PROF. YURI BAKHTIN Instructions. You are allowed to work on solutions in groups, but you are required to write up solutions on your own. Please
More informationAsymptotically Efficient Nonparametric Estimation of Nonlinear Spectral Functionals
Acta Applicandae Mathematicae 78: 145 154, 2003. 2003 Kluwer Academic Publishers. Printed in the Netherlands. 145 Asymptotically Efficient Nonparametric Estimation of Nonlinear Spectral Functionals M.
More information1 Glivenko-Cantelli type theorems
STA79 Lecture Spring Semester Glivenko-Cantelli type theorems Given i.i.d. observations X,..., X n with unknown distribution function F (t, consider the empirical (sample CDF ˆF n (t = I [Xi t]. n Then
More informationMaximum Likelihood Diffusive Source Localization Based on Binary Observations
Maximum Lielihood Diffusive Source Localization Based on Binary Observations Yoav Levinboo and an F. Wong Wireless Information Networing Group, University of Florida Gainesville, Florida 32611-6130, USA
More informationON THE CONVERGENCE OF FARIMA SEQUENCE TO FRACTIONAL GAUSSIAN NOISE. Joo-Mok Kim* 1. Introduction
JOURNAL OF THE CHUNGCHEONG MATHEMATICAL SOCIETY Volume 26, No. 2, May 2013 ON THE CONVERGENCE OF FARIMA SEQUENCE TO FRACTIONAL GAUSSIAN NOISE Joo-Mok Kim* Abstract. We consider fractional Gussian noise
More information1. Fisher Information
1. Fisher Information Let f(x θ) be a density function with the property that log f(x θ) is differentiable in θ throughout the open p-dimensional parameter set Θ R p ; then the score statistic (or score
More informationHypothesis testing for Stochastic PDEs. Igor Cialenco
Hypothesis testing for Stochastic PDEs Igor Cialenco Department of Applied Mathematics Illinois Institute of Technology igor@math.iit.edu Joint work with Liaosha Xu Research partially funded by NSF grants
More informationLocal vs. Nonlocal Diffusions A Tale of Two Laplacians
Local vs. Nonlocal Diffusions A Tale of Two Laplacians Jinqiao Duan Dept of Applied Mathematics Illinois Institute of Technology Chicago duan@iit.edu Outline 1 Einstein & Wiener: The Local diffusion 2
More informationBranching Processes II: Convergence of critical branching to Feller s CSB
Chapter 4 Branching Processes II: Convergence of critical branching to Feller s CSB Figure 4.1: Feller 4.1 Birth and Death Processes 4.1.1 Linear birth and death processes Branching processes can be studied
More informationOn a set of data for the membrane potential in a neuron
1 (revised version: to appear in Matematical Biosciences) On a set of data for te membrane potential in a neuron Reinard Höpfner (oepfner@matematik.uni-mainz.de) Institute of Matematics, University of
More informationLecture 4 Towards Deep Learning
Lecture 4 Towards Deep Learning (January 30, 2015) Mu Zhu University of Waterloo Deep Network Fields Institute, Toronto, Canada 2015 by Mu Zhu 2 Boltzmann Distribution probability distribution for a complex
More informationlaplace s method for ordinary differential equations
Physics 24 Spring 217 laplace s method for ordinary differential equations lecture notes, spring semester 217 http://www.phys.uconn.edu/ rozman/ourses/p24_17s/ Last modified: May 19, 217 It is possible
More informationControlled Diffusions and Hamilton-Jacobi Bellman Equations
Controlled Diffusions and Hamilton-Jacobi Bellman Equations Emo Todorov Applied Mathematics and Computer Science & Engineering University of Washington Winter 2014 Emo Todorov (UW) AMATH/CSE 579, Winter
More informationBayesian Regularization
Bayesian Regularization Aad van der Vaart Vrije Universiteit Amsterdam International Congress of Mathematicians Hyderabad, August 2010 Contents Introduction Abstract result Gaussian process priors Co-authors
More informationSuggested solutions to written exam Jan 17, 2012
LINKÖPINGS UNIVERSITET Institutionen för datavetenskap Statistik, ANd 73A36 THEORY OF STATISTICS, 6 CDTS Master s program in Statistics and Data Mining Fall semester Written exam Suggested solutions to
More informationLecture 5: Likelihood ratio tests, Neyman-Pearson detectors, ROC curves, and sufficient statistics. 1 Executive summary
ECE 830 Spring 207 Instructor: R. Willett Lecture 5: Likelihood ratio tests, Neyman-Pearson detectors, ROC curves, and sufficient statistics Executive summary In the last lecture we saw that the likelihood
More informationCOMPUTER-AIDED MODELING AND SIMULATION OF ELECTRICAL CIRCUITS WITH α-stable NOISE
APPLICATIONES MATHEMATICAE 23,1(1995), pp. 83 93 A. WERON(Wroc law) COMPUTER-AIDED MODELING AND SIMULATION OF ELECTRICAL CIRCUITS WITH α-stable NOISE Abstract. The aim of this paper is to demonstrate how
More informationPractice Exam 1. (A) (B) (C) (D) (E) You are given the following data on loss sizes:
Practice Exam 1 1. Losses for an insurance coverage have the following cumulative distribution function: F(0) = 0 F(1,000) = 0.2 F(5,000) = 0.4 F(10,000) = 0.9 F(100,000) = 1 with linear interpolation
More informationAsymptotic properties of maximum likelihood estimator for the growth rate for a jump-type CIR process
Asymptotic properties of maximum likelihood estimator for the growth rate for a jump-type CIR process Mátyás Barczy University of Debrecen, Hungary The 3 rd Workshop on Branching Processes and Related
More informationChapter 5 continued. Chapter 5 sections
Chapter 5 sections Discrete univariate distributions: 5.2 Bernoulli and Binomial distributions Just skim 5.3 Hypergeometric distributions 5.4 Poisson distributions Just skim 5.5 Negative Binomial distributions
More informationMinimax estimators of the coverage probability of the impermissible error for a location family
Minimax estimators of the coverage probability of the impermissible error for a location family by Miguel A. Arcones Binghamton University arcones@math.binghamton.edu Talk based on: Arcones, M. A. (2008).
More informationarxiv: v2 [math.pr] 15 Sep 2015
A copula-based method to build diffusion models with prescribed marginal and serial dependence Enrico Bibbona a, Laura Sacerdote a, Emiliano Torre b a Department of Mathematics G.Peano arxiv:1509.02319v2
More informationWalsh Diffusions. Andrey Sarantsev. March 27, University of California, Santa Barbara. Andrey Sarantsev University of Washington, Seattle 1 / 1
Walsh Diffusions Andrey Sarantsev University of California, Santa Barbara March 27, 2017 Andrey Sarantsev University of Washington, Seattle 1 / 1 Walsh Brownian Motion on R d Spinning measure µ: probability
More informationσ(a) = a N (x; 0, 1 2 ) dx. σ(a) = Φ(a) =
Until now we have always worked with likelihoods and prior distributions that were conjugate to each other, allowing the computation of the posterior distribution to be done in closed form. Unfortunately,
More informationMFM Practitioner Module: Quantitiative Risk Management. John Dodson. October 14, 2015
MFM Practitioner Module: Quantitiative Risk Management October 14, 2015 The n-block maxima 1 is a random variable defined as M n max (X 1,..., X n ) for i.i.d. random variables X i with distribution function
More informationDensity Estimation. Seungjin Choi
Density Estimation Seungjin Choi Department of Computer Science and Engineering Pohang University of Science and Technology 77 Cheongam-ro, Nam-gu, Pohang 37673, Korea seungjin@postech.ac.kr http://mlg.postech.ac.kr/
More informationA Concise Course on Stochastic Partial Differential Equations
A Concise Course on Stochastic Partial Differential Equations Michael Röckner Reference: C. Prevot, M. Röckner: Springer LN in Math. 1905, Berlin (2007) And see the references therein for the original
More informationOn the Goodness-of-Fit Tests for Some Continuous Time Processes
On the Goodness-of-Fit Tests for Some Continuous Time Processes Sergueï Dachian and Yury A. Kutoyants Laboratoire de Mathématiques, Université Blaise Pascal Laboratoire de Statistique et Processus, Université
More informationBrownian Motion on Manifold
Brownian Motion on Manifold QI FENG Purdue University feng71@purdue.edu August 31, 2014 QI FENG (Purdue University) Brownian Motion on Manifold August 31, 2014 1 / 26 Overview 1 Extrinsic construction
More informationCS Lecture 19. Exponential Families & Expectation Propagation
CS 6347 Lecture 19 Exponential Families & Expectation Propagation Discrete State Spaces We have been focusing on the case of MRFs over discrete state spaces Probability distributions over discrete spaces
More informationJoint Parameter Estimation of the Ornstein-Uhlenbeck SDE driven by Fractional Brownian Motion
Joint Parameter Estimation of the Ornstein-Uhlenbeck SDE driven by Fractional Brownian Motion Luis Barboza October 23, 2012 Department of Statistics, Purdue University () Probability Seminar 1 / 59 Introduction
More informationPart III. A Decision-Theoretic Approach and Bayesian testing
Part III A Decision-Theoretic Approach and Bayesian testing 1 Chapter 10 Bayesian Inference as a Decision Problem The decision-theoretic framework starts with the following situation. We would like to
More informationStatistics - Lecture One. Outline. Charlotte Wickham 1. Basic ideas about estimation
Statistics - Lecture One Charlotte Wickham wickham@stat.berkeley.edu http://www.stat.berkeley.edu/~wickham/ Outline 1. Basic ideas about estimation 2. Method of Moments 3. Maximum Likelihood 4. Confidence
More informationExercises. T 2T. e ita φ(t)dt.
Exercises. Set #. Construct an example of a sequence of probability measures P n on R which converge weakly to a probability measure P but so that the first moments m,n = xdp n do not converge to m = xdp.
More informationIntroduction to Machine Learning
Outline Introduction to Machine Learning Bayesian Classification Varun Chandola March 8, 017 1. {circular,large,light,smooth,thick}, malignant. {circular,large,light,irregular,thick}, malignant 3. {oval,large,dark,smooth,thin},
More information