Laplace Distribution Based Stochastic Models
Anastassia Baxevani
Department of Mathematics and Statistics, University of Cyprus
Results based on work of/with Podgórski, Wegener, Åberg, Bolin, Wallin
July 2, 2013
Outline
1 Introduction
2 Laplace distributions and their generalizations
3 Stochastic processes built on the Laplace distribution
4 Model fitting and estimation
Mathematical Models
In spatial statistics and geostatistics, Gaussian random fields are important tools for constructing models for data, mainly because of their simplicity: they are specified through their mean and covariance structures, which is exactly what is needed for prediction and estimation.
Drawback: they have difficulty accounting efficiently for asymmetries and unusually extreme values in the data. When data exhibit such characteristics, it is usual to consider transformed Gaussian models, with the choice of transformation driven by the particular application.
To provide a more universal model, there is growing interest in models featuring more general distributions. Laplace moving average models are formulated as convolutions of Laplace noise with a deterministic kernel. Theoretically, they constitute a rich class capable of modeling a variety of geometrical asymmetries in the records, while simultaneously accounting efficiently for occasional highly extreme events.
Asymmetric Laplace distribution
Characteristic function:
φ(t) = e^{itδ} / (1 + σ²t²/2 - iµt),
where δ is the location parameter (usually δ = 0), µ the shape parameter, and σ > 0 the scale parameter.
There is an analogue of the Central Limit Theorem with a random number of terms in the summation: if ν_p is a geometric random variable with mean 1/p (waiting time until failure), independent of the Y_i's, which have finite variances, then the random sum
S_p = a_p Σ_{i=1}^{ν_p} (Y_i + b_p)
converges in distribution to an asymmetric Laplace distribution: stability with respect to geometric summation.
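The geometric-summation stability above can be illustrated numerically. A minimal sketch (our own; the parameter p, the uniform summands and the normalization a_p = √p are arbitrary choices) checking that the random sum has the heavy-tailed Laplace shape, excess kurtosis near 3 rather than the Gaussian 0:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.01                        # geometric parameter; mean number of terms 1/p
n_samples = 20000

# nu_p ~ geometric with mean 1/p; Y_i centered uniforms; a_p = sqrt(p), b_p = 0
nu_p = rng.geometric(p, size=n_samples)
sums = np.array([np.sqrt(p) * rng.uniform(-1.0, 1.0, k).sum() for k in nu_p])

# a symmetric Laplace limit has excess kurtosis 3; a Gaussian limit would give 0
centered = sums - sums.mean()
excess_kurt = (centered**4).mean() / (centered**2).mean()**2 - 3.0
```

With p this small the sum is already close to its Laplace limit, in contrast to the classical CLT with a deterministic number of terms.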
Generalized skewed Laplace distribution
Generalized skewed Laplace distributions have ch.f.
φ(t) = e^{itδ} / (1 + σ²t²/2 - iµt)^ν.
[Figure: densities for τ = 1/ν. Left: the standard Gaussian (τ = 0 limit), the standard Laplace (τ = 1) and τ = 1/4, 3, with µ = 0. Right: the standard Gaussian and asymmetric Laplace densities.]
Generalized skewed Laplace distribution, cont.
Equivalent representations:
Gamma variance-mean mixture of a normal:
σ√W Z + µW + δ,
with Z standard normal and W a gamma variable with shape ν, independent of Z.
Difference of exponentials: for W₁, W₂ independent exponential,
κW₁ - (1/κ)W₂ + δ.
The equivalent representations give rise to different parameterizations! The choice should be based on the purpose, i.e. inference or theoretical properties.
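A quick numerical check (our own sketch, restricted to the standard symmetric case ν = 1, κ = 1, where both representations reduce to the standard Laplace law) that the two representations generate the same distribution:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
sigma, mu, delta, nu = 1.0, 0.0, 0.0, 1.0   # standard symmetric Laplace

# representation 1: gamma variance-mean mixture of a normal
W = rng.gamma(nu, 1.0, n)
Z = rng.standard_normal(n)
x1 = sigma * np.sqrt(W) * Z + mu * W + delta

# representation 2: scaled difference of two independent unit exponentials
# (the raw difference has variance 2, hence the sigma/sqrt(2) scaling)
x2 = (sigma / np.sqrt(2.0)) * (rng.exponential(1.0, n) - rng.exponential(1.0, n)) + delta

var1, var2 = x1.var(), x2.var()
q1, q2 = np.quantile(x1, 0.9), np.quantile(x2, 0.9)
```

Both samples should agree in variance (σ² = 1) and in their quantiles, up to Monte Carlo error.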
Multivariate extension
Gamma variance models:
Y = δ + Γµ + Γ^{1/2} Σ^{1/2} Z,
where Γ is a gamma variable with shape parameter ν and Z has a multivariate normal distribution.
Characteristic function:
φ(t) = e^{iδ't} (1 / (1 + (1/2) t'Σt - iµ't))^ν,  t ∈ R^d.
Density:
(2 exp(µ'Σ⁻¹(y - δ)) / ((2π)^{d/2} Γ(ν) |Σ|^{1/2})) (Q(y - δ)/C(Σ, µ))^{ν - d/2} K_{ν - d/2}(Q(y - δ) C(Σ, µ)),
where Q(x) = √(x'Σ⁻¹x) and C(Σ, µ) = √(2 + µ'Σ⁻¹µ).
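Sampling from the gamma variance model is immediate from the representation above. A sketch (our own; the dimension, µ, Σ and ν are arbitrary illustrative choices) that also checks the implied first two moments, E[Y] = δ + νµ and Cov(Y) = ν(Σ + µµ'), which follow from E[Γ] = Var(Γ) = ν:

```python
import numpy as np

rng = np.random.default_rng(2)
n, d = 50_000, 2
nu_shape = 1.5                               # gamma shape parameter nu
delta = np.zeros(d)
mu = np.array([0.5, -0.2])
Sigma = np.array([[1.0, 0.3],
                  [0.3, 0.5]])
L = np.linalg.cholesky(Sigma)                # a square root of Sigma

G = rng.gamma(nu_shape, 1.0, n)              # gamma mixing variable
Z = rng.standard_normal((n, d))
Y = delta + G[:, None] * mu + np.sqrt(G)[:, None] * (Z @ L.T)

mean_emp = Y.mean(axis=0)
cov_emp = np.cov(Y.T)
```

The single scalar Γ scales both the drift and the Gaussian part, which is what couples the components beyond Σ.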
Examples of the densities
[Contour plots of bivariate generalized Laplace densities.]
Outline 1 Introduction 2 Laplace distributions and their generalizations 3 Stochastic processes build upon Laplace distribution 4 Model fitting, estimation
Stochastic Laplace measure
A stochastic Laplace measure Λ with parameters δ ∈ R, µ ∈ R and σ > 0, controlled by a measure m, is a σ-additive function
Λ : (X, B, m) → L²(Ω, F, P),  Λ(A) ∼ GL(µ, σ, 1/m(A)),  A ∈ B,
with characteristic function
φ_{Λ(A)}(t) = e^{itδm(A)} / (1 + σ²t²/2 - iµt)^{m(A)}.
For disjoint A_i ∈ B, the Λ(A_i) are independent, and with probability one Λ(∪_{i=1}^∞ A_i) = Σ_{i=1}^∞ Λ(A_i).
Note: E(Λ(A)) = µ m(A), Var(Λ(A)) = (µ² + σ²) m(A).
Laplace motion
A Laplace motion L(t) with asymmetry parameter µ, space scale parameter σ and time scale parameter ν is defined by the following:
it starts from the origin, L(0) = 0;
it has independent and stationary increments;
L(t + ν) - L(t) ∼ GL(µ, σ, 1), i.e. it has an asymmetric Laplace distribution, centered in mean at zero, with parameters µ and σ.
The relation to the Laplace measure is through Λ(a, b] = L(b) - L(a).
Alternatively, Laplace motion can be defined through...
Let B(λ) denote standard Brownian motion, and Γ(λ) a standard gamma process, i.e. a Lévy motion whose increment over dλ has density
g(x) = (1/Γ(γ)) x^{γ-1} e^{-x},  x > 0,  with γ = dλ/ν.
Laplace motion is subordinated Brownian motion:
L(λ) = B(Γ(λ)).
Some properties are inherited from the distribution (stochastic self-similarity):
pL(λ) =_d L(NB_p(λ)),
where NB_p is a negative binomial Lévy motion.
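The subordination can be checked directly. A sketch (our own; the grid and parameters are illustrative choices) that simulates L(λ) = B(Γ(λ)) from gamma increments, and verifies on independent copies that Var L(t) = σ²t/ν:

```python
import numpy as np

rng = np.random.default_rng(3)
nu, sigma, t = 1.0, 1.0, 1.0

# one path on a fine grid: gamma time change, then conditionally Gaussian steps
n_steps, dt = 1000, 0.01
dG = rng.gamma(dt / nu, 1.0, n_steps)        # increments of Gamma(lambda)
path = np.cumsum(rng.normal(0.0, sigma * np.sqrt(dG)))

# many independent copies of L(t) = B(Gamma(t)); Var L(t) = sigma^2 * t / nu
G_t = rng.gamma(t / nu, 1.0, 50_000)
L_t = rng.normal(0.0, sigma * np.sqrt(G_t))
var_emp = L_t.var()
```

Conditionally on the gamma subordinator, each step is Gaussian with variance proportional to the time-change increment; unconditionally the steps are generalized Laplace.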
Moving averages through stochastic integration
Standard construction of stochastic integrals with deterministic kernels:
X(τ) = ∫_R f(τ - x) dΛ(x).
Characteristic function:
φ_X(t) = e^{itδ∫f dm} exp(-∫_R log(1 - iµtf(x) + σ²t²f²(x)/2) dm(x)).
The extension to the multivariate case is immediate.
Covariance:
r(τ) = ((σ² + µ²)/ν) ∫_R f(x - τ)f(x) dx.
Spectral density:
R(ω) = ((σ² + µ²)/ν) Ff(ω) Ff(-ω).
Simulation
To simulate the moving average process (d = 1):
Λ(t) = d·t + µΓ(t) + B(Γ(t)),  Γ(t) ∼ Γ(t/ν, 1),  B(t) ∼ BM(σ).
1 Pick an equally spaced grid: dt, 2dt, ..., n·dt.
2 Simulate n i.i.d. Γ(dt/ν, 1) variables and store them in G.
3 Simulate n i.i.d. N(0, σ²G_i) variables and store them in B.
4 Compute X = d ∫f(x)dx + µ(f ∗ G) + f ∗ B.
5 Remove the first and last m entries, where m is the length of f.
Very fast and efficient, but it loses some resolution where the jumps of the gamma process occur, because of the grid! For d > 1 we simulate GL increments.
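The five steps above can be sketched as follows (our own illustration; the Gaussian-shaped kernel, the grid and the parameter values are arbitrary choices, and the final check uses the stationary variance r(0) = ((σ² + µ²)/ν) ∫f² dx from the covariance formula):

```python
import numpy as np

rng = np.random.default_rng(5)
dt, n = 0.01, 200_000
nu, sigma, mu, drift = 1.0, 1.0, 0.5, 0.0

# kernel sampled on the same spacing dt (step 1)
x = np.arange(-3.0, 3.0, dt)
f = np.exp(-x**2)
m = len(f)

# steps 2-3: gamma increments G and conditionally Gaussian increments B
G = rng.gamma(dt / nu, 1.0, n)
B = rng.normal(0.0, sigma * np.sqrt(G))

# step 4: X = drift*int(f) + mu*(f * G) + f * B, via discrete convolutions
X = drift * f.sum() * dt + mu * np.convolve(f, G) + np.convolve(f, B)

# step 5: drop the first and last m entries (edge effects)
X = X[m:-m]

# sanity targets: mean (mu/nu) int f dx and variance ((sigma^2+mu^2)/nu) int f^2 dx
mean_target = mu / nu * f.sum() * dt
var_target = (sigma**2 + mu**2) / nu * (f**2).sum() * dt
```

The empirical mean and variance of the record agree with the targets only up to the Monte Carlo error of a strongly correlated series, so the tolerances below are wide.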
Symmetric spatial models - realizations
Asymmetric spatial models
Outline 1 Introduction 2 Laplace distributions and their generalizations 3 Stochastic processes build upon Laplace distribution 4 Model fitting, estimation
Fitting the model
The problem of fitting the model is twofold:
Fit the kernel.
Fit the Laplace noise.
Symmetric kernel case
A non-parametric approach is to estimate f by f̂(x) = F⁻¹(√Ŝ)(x), where Ŝ(ω) is an estimate of the spectrum.
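The idea can be sketched as follows (our own illustration, using the exact spectrum of a known kernel in place of an estimate Ŝ): since R(ω) is proportional to |Ff(ω)|² for a symmetric kernel, the inverse transform of √S recovers the kernel shape, up to scaling and a translation (the phase is lost):

```python
import numpy as np

dt = 0.05
x = np.arange(-10.0, 10.0, dt)
f = np.exp(-x**2 / 2.0)                  # "true" symmetric kernel

S = np.abs(np.fft.fft(f))**2             # spectrum, up to the factor (sigma^2+mu^2)/nu
f_rec = np.fft.ifft(np.sqrt(S)).real     # inverse transform of sqrt(S)

# the recovered kernel is circularly shifted; align the maxima before comparing
f_rec = np.roll(f_rec, np.argmax(f) - np.argmax(f_rec))
rel_err = np.max(np.abs(f_rec - f)) / np.max(f)
```

In practice Ŝ would come from a smoothed periodogram of the record, and the translation ambiguity is harmless because the moving average is stationary.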
Non-symmetric kernels: measures of asymmetry
There is no general non-parametric approach to the problem.
Various parametrized families of kernels are available.
Measures of asymmetry for random surfaces should involve moments of the joint distributions of the process and its derivative.
Estimation of the distributional and kernel parameters has to be performed by solving the resulting equations.
Laplace moving average trajectories
[Six sample trajectories of Laplace moving averages.]
Measuring tilting using Rice's formula
Let N(T, A) be the number of times the field X takes the value zero in [0, T] while having property A.
For ergodic stationary processes,
lim_{T→∞} N(T, A)/N(T) = E[1{X ∈ A} |Ẋ(0)| | X(0) = 0] / E[|Ẋ(0)| | X(0) = 0].
The right hand side represents the biased sampling distribution when sampling is made over the zero-level contour C = {τ : X(τ) = 0}.
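The biased-sampling interpretation can be illustrated on a smooth stationary Gaussian process (our own sketch; the filtered-noise construction and grid are arbitrary choices). For such a process with Var X = 1 and Var Ẋ = λ₂, the derivative at upcrossings of zero follows the size-biased Rayleigh law with mean √(πλ₂/2), larger than the mean of |Ẋ| over all time, √(2λ₂/π):

```python
import numpy as np

rng = np.random.default_rng(7)
dt, n = 0.01, 400_000

# smooth stationary Gaussian process as filtered white noise, unit variance
kx = np.arange(-3.0, 3.0, dt)
kern = np.exp(-kx**2)
kern /= np.sqrt((kern**2).sum() * dt)        # normalize so Var X = 1 (and lambda2 = 1 here)
noise = rng.normal(0.0, np.sqrt(dt), n + len(kern) - 1)
X = np.convolve(noise, kern, mode="valid")

# upcrossings of the zero level and the finite-difference derivative there
up = np.where((X[:-1] < 0.0) & (X[1:] >= 0.0))[0]
slopes = (X[up + 1] - X[up]) / dt

mean_slope = slopes.mean()                   # should be near sqrt(pi/2) ~ 1.25
```

The same counting scheme applied to a Laplace moving average gives the empirical-vs-theoretical histograms of the following slide.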
Tilting of trajectories
[Two sample trajectories with histograms of the empirical vs. theoretical distribution of the derivative at down/upcrossings of the level u = 0.]
Tilting of LMA with odd kernels
[Sample trajectory and scatterplot of level vs. derivative at down/upcrossings.]
f(x; α, β) = C sgn(x) |x|^{α-1} e^{-|x|/β},  α > 1/2, β > 0.
The marginal distribution of the LMA is always symmetric, but the trajectories may exhibit asymmetric features due to the asymmetry of the kernel.
Measures of tilting
Measures of tilting should involve derivatives of the LMA process. We propose:
ρ₃ = EẊ³ / E^{3/2}Ẋ², the skewness of the derivative process. If the process has a symmetric distribution, this measure fails to capture any asymmetries in the records!
ρ_{3,1} = E(Ẋ³X) / (E^{3/2}Ẋ² E^{1/2}X²)  or  r²_{3,1} = E(Ẋ³X) / E²X².
The Gamma kernel
f(x; τ, β) = (2^{τ-1/2} / (Γ(2τ-1)^{1/2} β^{τ-1/2})) x^{τ-1} e^{-x/β},  x > 0, τ > 1/2, β > 0;
for τ = 1 the exponential kernel, for τ → ∞ the Gaussian case.
r₃₁ = 3ν 4^{2(2-τ)} (τ-1) Γ(4τ-6) / (β⁴ Γ²(2τ-1)),  τ > 3/2.

τ     E(τ̂)      Std(τ̂)
2     2.245     0.549
4     4.1367    0.4636
10    11.628    4.364

Monte-Carlo estimation of τ, for ν = 1 and µ = 0: one hundred Monte-Carlo replicates of sample size 3, with step 0.1.
The Gamma kernel, cont.
[Scaled gamma kernels for τ = 2, 4, 10 with β = 1; tilting measures r₃₁ (top) and ρ₃₁ (bottom) as functions of τ.]
Distributional parameters: method of moments
Moments of any integer order are available for the LMA; the first four central moments are explicitly available and can be used to fit the four distributional parameters:
E X = µ ∫ f dm,
E(X - EX)² = (µ² + σ²) ∫ f² dm,
E(X - EX)³ = µ(2µ² + 3σ²) ∫ f³ dm,
E(X - EX)⁴ = 3(µ² + σ²)² (∫ f² dm)² + 3(2µ⁴ + 4µ²σ² + σ⁴) ∫ f⁴ dm.
The method of moments is not the most efficient way of estimation.
Empirical moments may fall outside the range permitted by the parameters of the distribution.
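A sketch of the forward step of the method (our own illustration; the kernel and parameter values are arbitrary, and taking dm = dx/ν is our assumption, consistent with the covariance formula r(0) = ((σ² + µ²)/ν) ∫f² dx): evaluate the four theoretical central moments above for given parameters. Matching them to empirical moments then yields the estimating equations.

```python
import numpy as np

dt = 0.001
x = np.arange(-3.0, 3.0, dt)
f = np.exp(-x**2)                      # illustrative kernel
mu, sigma, nu = 0.5, 1.0, 1.0

def int_f(k):
    """int f^k dm, taking dm = dx / nu."""
    return (f**k).sum() * dt / nu

m1 = mu * int_f(1)
m2 = (mu**2 + sigma**2) * int_f(2)
m3 = mu * (2*mu**2 + 3*sigma**2) * int_f(3)
m4 = 3*(mu**2 + sigma**2)**2 * int_f(2)**2 \
     + 3*(2*mu**4 + 4*mu**2*sigma**2 + sigma**4) * int_f(4)

skew = m3 / m2**1.5                    # positive whenever mu > 0
excess_kurt = m4 / m2**2 - 3.0         # positive: heavier tails than Gaussian
```

Note that the fourth-moment formula forces positive excess kurtosis for any kernel, which is one reason empirical moments of near-Gaussian data can fall outside the feasible range.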
MLE estimation
A system of moment equations accounting for both distributional and kernel parameters may be fairly complex.
More efficiently, one can attempt maximum likelihood estimation.
An explicit formula for the MLE cannot be obtained except in some very special cases, since the density of a generalized Laplace distribution is given through the Bessel function; for Laplace moving averages, only the characteristic functions can be written in an explicit fashion.
Numerical optimization has to be implemented.
MLE estimation through the EM algorithm
The representation of the GL distribution as σ√Γ Z + µΓ + δ is used, and the gamma variable Γ with parameter τ is treated as a missing value.
For the implementation of the algorithm see Podgórski and Wallin, EM algorithm for maximizing likelihood of linear models involving generalized Laplace distribution, in preparation.
MME vs. MLE: illustration
The MME estimates of the parameters (crosses) vs. the MLE ones (dots). The true parameters δ = 0.33, σ = 1.5, µ = 1.32, and τ = 0.25 are marked by the large star.
Highlights
Non-Gaussian stochastic fields are proposed that are suitable for modeling environmental data.
The models are introduced by means of integrals with respect to independently scattered stochastic measures that have generalized Laplace distributions.
The resulting stationary second order processes can, as opposed to their Gaussian counterparts, account for asymmetry and heavier tails.
Despite this greater flexibility, the discussed models still share many spectral properties with Gaussian processes, having the latter as a special case.
The models extend directly to random fields.
Spatio-temporal characteristics, including asymmetries in the records, can be studied by means of the generalized Rice formula.
Model fitting can be carried out using: a) the method of moments (simpler) or b) maximum likelihood (more accurate).
Markov random fields can be used to localize the fitting problem efficiently and provide a way to account for non-stationarity in the data.
List of related papers
Kotz, S., Kozubowski, T.J., Podgórski, K. (2001). The Laplace Distribution and Generalizations: A Revisit with Applications to Communications, Economics, Engineering and Finance. Boston: Birkhäuser.
Åberg, S. and Podgórski, K. (2011). A class of non-Gaussian second order random fields. Extremes.
Podgórski, K. and Wegener, J. (2011). Estimation for stochastic models driven by Laplace motion. Comm. Statist. Theory Methods.
Lindgren, F., Rue, H., and Lindström, J. (2011). An explicit link between Gaussian fields and Gaussian Markov random fields: the stochastic partial differential equation approach. J. Royal Stat. Soc., Series B.
Lindgren, G., Bolin, D., Lindgren, F. (2010). Non-traditional stochastic models for ocean waves: Lagrange models and nested SPDE models. European Phys. J.
Bolin, D. (2011). Non-Gaussian spatial models generated by stochastic partial differential equations.
Podgórski, K., Wallin, J. (2011). The EM-algorithm for models involving generalized Laplace distribution. In progress.
Baxevani, A., Podgórski, K. and Wegener, J. (2012). Vertical and horizontal asymmetries for moving averages driven by Laplace motion. In progress.
Baxevani, A. and Podgórski, K. (2012). Spectral representation of Laplace moving average process. In progress.
Final Slide
The French physicist Gabriel Lippmann wrote the following in a letter to Henri Poincaré:
"Tout le monde y croit cependant, car les expérimentateurs s'imaginent que c'est un théorème de mathématiques, et les mathématiciens que c'est un fait expérimental."
Everybody believes in the exponential law of errors: the experimenters, because they think it can be proved by mathematics; and the mathematicians, because they believe it has been established by observation.