Non-gaussian spatiotemporal modeling
Non-gaussian spatiotemporal modeling
Thaís C. O. da Fonseca, joint work with Prof. Mark F. J. Steel
Department of Statistics, University of Warwick
December 2008
Outline
1. Introduction: motivation
2. Spatiotemporal modeling: nonseparable models; heavy-tailed processes
3. Simulation results: Data 1, Data 2, Data 3
4. Temperature data: non-Gaussian spatiotemporal modeling; results
Motivation: spatiotemporal data
Due to the proliferation of data sets indexed in both space and time, spatiotemporal models have received increased attention in the literature. Motivating example: maximum temperature data from the Spanish Basque Country (67 stations).
Motivation: example
[Plot: maximum temperature against time index, 31 time points (July 2006).]
Motivation: the typical problem
Given: observations Z(s_i, t_j) at a finite number of locations s_i, i = 1, ..., I, and time points t_j, j = 1, ..., J.
Desired: the predictive distribution of the unknown value Z(s_0, t_0) at the space-time coordinate (s_0, t_0).
Focus: continuous space and continuous time, which allows prediction and interpolation at any location and any time:
Z(s, t), (s, t) ∈ D × T, where D ⊆ R^d, T ⊆ R.
Motivation: general modeling formulation
The uncertainty about the unobserved parts of the process can be expressed probabilistically by a random function in space and time, {Z(s, t); (s, t) ∈ D × T}. We need to specify a valid covariance structure for the process:
C(s_1, s_2; t_1, t_2) = Cov(Z(s_1, t_1), Z(s_2, t_2)).
Motivation: non-Gaussian spatiotemporal models
Building adequate models for these processes is not an easy task. With only one observation of the process, simplifying assumptions are needed:
- Stationarity: Cov(Z(s_1, t_1), Z(s_2, t_2)) = C(s_1 − s_2, t_1 − t_2)
- Isotropy: Cov(Z(s_1, t_1), Z(s_2, t_2)) = C(‖s_1 − s_2‖, |t_1 − t_2|)
- Separability: Cov(Z(s_1, t_1), Z(s_2, t_2)) = C_s(s_1, s_2) C_t(t_1, t_2)
- Gaussianity: the process has finite-dimensional Gaussian distributions.
Models based on Gaussianity will not perform well (poor predictions) if the data are contaminated by outliers, or if there are regions with larger observational variance.
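The separability assumption above has a concrete computational payoff: on a space-time grid, a separable covariance matrix is the Kronecker product of a purely spatial and a purely temporal matrix. A minimal numerical sketch (the covariance choices and parameter values are illustrative, not from the talk):

```python
import numpy as np

# Under separability, the covariance over a grid of 5 locations x 4 times
# is the Kronecker product of a spatial and a temporal covariance matrix.
rng = np.random.default_rng(0)
sites = rng.uniform(0.0, 1.0, size=(5, 2))   # 5 spatial locations in [0,1]^2
times = np.arange(4, dtype=float)            # 4 time points

def exp_corr(d, scale):
    """Exponential correlation as a function of distance d."""
    return np.exp(-d / scale)

dist_s = np.linalg.norm(sites[:, None, :] - sites[None, :, :], axis=-1)
Cs = exp_corr(dist_s, scale=0.5)                                   # spatial
Ct = exp_corr(np.abs(times[:, None] - times[None, :]), scale=2.0)  # temporal

# Full covariance with space-major ordering: the (i, k) location block
# equals Cs[i, k] * Ct, which is exactly what "separable" means.
C_full = np.kron(Cs, Ct)
print(C_full.shape)   # (20, 20)
```

This is why separability is so popular in practice: inverting C_full only requires inverting Cs and Ct separately, at cost O(I^3 + J^3) instead of O((IJ)^3).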
Motivation: example
Maximum temperature data, Spanish Basque Country. [Plots: empirical variance by station index and by time index.] For this reason, we consider spatiotemporal processes with heavy-tailed finite-dimensional distributions.
We will consider processes that are stationary, isotropic, nonseparable and non-Gaussian.
Nonseparable models: continuous mixture
Idea: a continuous mixture of separable covariance functions [Ma, 2002]. This takes advantage of the well-known theory developed for purely spatial and purely temporal processes.
Nonseparable model:
Z(s, t) = Z_1(s; U) Z_2(t; V),  (1)
where (U, V) is a bivariate random vector with correlation c. The unconditional covariance,
C(s, t) = ∫ C_1(s; u) C_2(t; v) dF(u, v),  (2)
is valid and nonseparable.
Nonseparable models
In particular, if C_1(s; u) = σ_1 exp{−γ_1(s) u} and C_2(t; v) = σ_2 exp{−γ_2(t) v}, with U = X_0 + X_1 and V = X_0 + X_2, where X_i has finite moment generating function M_i, then:
Proposition.
C(s, t) = σ² M_0(−(γ_1(s) + γ_2(t))) M_1(−γ_1(s)) M_2(−γ_2(t)),  (s, t) ∈ D × T,  (3)
where γ_1(s) and γ_2(t) are spatial and temporal variograms, for instance γ_1(s) = (‖s‖/a)^α and γ_2(t) = (|t|/b)^β. Notice that c = corr(U, V) measures separability, and c ∈ [0, 1]. See [Fonseca and Steel, 2008] for more details.
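As a sanity check on the proposition, the closed form can be compared with a Monte Carlo estimate of E[exp(−γ_1(s)U − γ_2(t)V)] when the X_i are Gamma distributed, since Ga(λ_i, 1) has MGF M_i(t) = (1 − t)^{−λ_i}. The parameter values below are illustrative, not from the talk:

```python
import numpy as np

# With X_i ~ Ga(lambda_i, 1), U = X0 + X1, V = X0 + X2, the proposition
# gives C proportional to (1+g1+g2)^-lam0 * (1+g1)^-lam1 * (1+g2)^-lam2.
# We check this against a direct Monte Carlo average of exp(-g1*U - g2*V).
rng = np.random.default_rng(1)
lam0, lam1, lam2 = 0.5, 1.0, 1.0     # assumed mixing parameters
g1, g2 = 0.3, 0.7                    # gamma_1(s), gamma_2(t) at some (s, t)

def closed_form(g1, g2):
    """M_0(-(g1+g2)) * M_1(-g1) * M_2(-g2) with Gamma MGFs."""
    return ((1 + g1 + g2) ** -lam0) * ((1 + g1) ** -lam1) * ((1 + g2) ** -lam2)

n = 500_000
x0 = rng.gamma(lam0, 1.0, n)
x1 = rng.gamma(lam1, 1.0, n)
x2 = rng.gamma(lam2, 1.0, n)
mc = np.exp(-g1 * (x0 + x1) - g2 * (x0 + x2)).mean()
print(closed_form(g1, g2), mc)   # the two estimates agree closely
```

The shared component X_0 is what couples U and V and hence makes the covariance nonseparable; setting λ_0 = 0 removes it and recovers a separable product.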
Heavy-tailed processes: mixing in space and time
We consider the process
Z(s, t) = Z̃_1(s; U) Z̃_2(t; V).  (4)
Mixing in space:
Z̃_1(s; U) = √(1 − τ²) Z_1(s; U)/√λ_1(s) + τ ε(s)/√h(s).  (5)
Mixing in time:
Z̃_2(t; V) = Z_2(t; V)/√λ_2(t).  (6)
Heavy-tailed processes: the process λ_1(s)
Mixing in space: Z̃_1(s; U) = √(1 − τ²) Z_1(s; U)/√λ_1(s) + τ ε(s)/√h(s).
λ_1(s) accounts for regions in space with larger observational variance. λ_1(s) needs to be correlated to induce mean-square continuity of Z̃_1(s; U); this is equivalent to E[λ_1^{−1/2}(s_i) λ_1^{−1/2}(s_i')] → E[λ_1^{−1}(s_i)] as s_i' → s_i.
This is satisfied by λ_1(s) = λ for all s (the Student-t process), but it does not account for regions with larger variance. It is also satisfied by the GLG process, where {ln λ_1(s); s ∈ D} is a Gaussian process with mean −ν/2 and covariance structure ν C_1(·) [Palacios and Steel, 2006].
Heavy-tailed processes: the process h(s)
Mixing in space: Z̃_1(s; U) = √(1 − τ²) Z_1(s; U)/√λ_1(s) + τ ε(s)/√h(s).
h(s) accounts for traditional outliers (different nugget effects). Outlier detection is handled jointly within the estimation procedure: the variables h_i = h(s_i), i = 1, ..., I, are treated as latent variables, and their posterior distributions indicate outlying observations (h_i close to 0). We consider either log(h_i) ~ N(−ν_h/2, ν_h) or h_i ~ Ga(1/ν_h, 1/ν_h).
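Both priors for h_i are centred so that E[h_i] = 1, leaving the nugget level τ² unchanged on average, while small draws of h_i inflate the nugget τ²/h_i for individual outlying observations. A small sketch with an assumed value of ν_h:

```python
import numpy as np

# The two mixing priors for h_i mentioned above (nu_h = 0.5 is an assumed
# value): lognormal log(h_i) ~ N(-nu_h/2, nu_h) and gamma
# h_i ~ Ga(1/nu_h, 1/nu_h). Both have E[h_i] = 1, so on average the
# nugget tau^2 / h_i equals tau^2; occasional small h_i absorb outliers.
rng = np.random.default_rng(2)
nu_h = 0.5
h_logn = np.exp(rng.normal(-nu_h / 2, np.sqrt(nu_h), 200_000))
h_gamma = rng.gamma(1 / nu_h, nu_h, 200_000)   # shape 1/nu_h, scale nu_h

print(h_logn.mean(), h_gamma.mean())  # both close to 1
```

Larger ν_h makes the distribution of h_i more dispersed, hence heavier tails for the observational error.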
Heavy-tailed processes: the process λ_2(t)
Mixing in time: Z̃_2(t; V) = Z_2(t; V)/√λ_2(t).
λ_2(t) accounts for periods in time with larger observational variance. This can be seen as a way to address volatility clustering, which is common in financial time series data. We consider the log-Gaussian process where {ln λ_2(t); t ∈ T} is a Gaussian process with mean −ν_2/2 and covariance structure ν_2 C_2(·).
Heavy-tailed processes: predictions
(λ_1i, h_i, λ_2j) are treated as latent variables and sampled in our MCMC sampler. Given (λ_1i, h_i, λ_2j), the process is Gaussian and we can predict at unobserved locations and time points. We compare predictive performance using proper scoring rules [Gneiting and Raftery, 2007]:
- Log predictive score: LPS(p, x) = −log p(x).
- Interval score: IS(q_1, q_2; x) = (q_2 − q_1) + (2/ξ)(q_1 − x) I(x < q_1) + (2/ξ)(x − q_2) I(x > q_2).
We use ξ = 0.05, resulting in a 95% credible interval.
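The interval score is straightforward to compute; a short sketch (ξ = 0.05 as in the talk, with hypothetical interval endpoints):

```python
import numpy as np

# Interval score for a central (1 - xi) predictive interval [q1, q2]:
# it rewards narrow intervals and penalizes observations falling outside.
def interval_score(q1, q2, x, xi=0.05):
    width = q2 - q1
    below = (2.0 / xi) * np.maximum(q1 - x, 0.0)   # penalty if x < q1
    above = (2.0 / xi) * np.maximum(x - q2, 0.0)   # penalty if x > q2
    return width + below + above

print(interval_score(-1.96, 1.96, 0.0))   # inside: just the width, 3.92
print(interval_score(-1.96, 1.96, 3.0))   # outside: width plus penalty, 45.52
```

Smaller scores are better for both IS and LPS, so a model can win either by giving sharper intervals or by being better calibrated.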
Data
This data set has I = 30 locations and J = 30 time points, generated from a Gaussian model with no nugget effect (τ² = 0). The covariance model is the nonseparable Cauchy model (X_i ~ Ga(λ_i, 1), i = 0, 1, 2) in space and time, with c = 0.5. We contaminated this data set with different kinds of outliers in order to assess the performance of the proposed models in each situation.
Spatial domain
[Plot: the simulated locations in the (s_1, s_2) plane.] The proposal for λ_1i, h_i, i = 1, ..., I, in the MCMC sampler is constructed by dividing the observations into blocks defined by position in the spatial domain.
Data 1: description and Bayes factors
One location was selected at random (location 7), and a random increment of Unif(1.0, 1.5) times the standard deviation was added to each observation at this location for the first 20 time points. The logarithm of the Bayes factor against the Gaussian model, using the Shifted-Gamma (λ = 0.98) estimator, was computed for the models: nugget, h (lognormal), h (gamma), λ_1, and λ_1 & h (lognormal). [Table values lost in transcription.]
Data 1: estimated correlation function at t_0 = 1
[Plots of correlation against distance in space: (a) Gaussian; (b) non-Gaussian with λ_1; (c) non-Gaussian with h and λ_1; (d) Gaussian on the uncontaminated data.]
Data 1: non-Gaussian model with λ_1
[Plots: (a) variance for each location; (b) median of σ²_i against distance from observation 7.]
Data 1: non-Gaussian model with h (lognormal)
[Plots: (a) variance for each location; (b) nugget τ²/h_i for each location.]
Data 1: non-Gaussian model with λ_1 and h
[Plots: (a) variance for each location; (b) nugget τ²/h_i for each location; (c) λ_1i, i = 1, ..., 30; (d) h_i, i = 1, ..., 30.]
Data 2: description and Bayes factors
A region was selected, and an increment of Unif(0.5, 1.5) times the standard deviation was added to each observation in it for the first 10 time points. [Plot: the selected region in the spatial domain.] The logarithm of the Bayes factor against the Gaussian model, using the Shifted-Gamma (λ = 0.98) estimator, was computed for the models: nugget, h (lognormal), h (gamma), λ_1, and λ_1 & h (lognormal). [Table values lost in transcription.]
Data 2: non-Gaussian model with λ_1 and h
[Plots: (a) variance for each location; (b) nugget τ²/h_i for each location; (c) λ_1i, i = 1, ..., 30; (d) h_i, i = 1, ..., 30.]
Data 2: description and Bayes factors (continued)
A region of the spatial domain was selected (locations 4, 16, 21 and 27), and the same random increment of Unif(0.5, 1.5) times the standard deviation was added to each observation for the first 10 time points. The logarithm of the Bayes factor against the Gaussian model, using the Shifted-Gamma (λ = 0.98) estimator, was computed for the models: nugget, h (lognormal), h (gamma), λ_1, and λ_1 & h (lognormal). [Table values lost in transcription.]
Data 3: description and Bayes factors
The observations at time points 11 to 15 were contaminated by adding a random increment of Unif(0.5, 1.5) times the standard deviation to each observation, at all spatial locations. The logarithm of the Bayes factor against the Gaussian model, using the Shifted-Gamma (λ = 0.98) estimator, was computed for the models: nugget, h (lognormal), λ_1, λ_2, λ_1 & λ_2, and λ_1 & λ_2 & h. [Table values lost in transcription.]
Data 3: non-Gaussian models
[Plots of the variance for each location: (a) model with lognormal h(s); (b) model with lognormal h(s) and λ_1(s).]
Data 3: non-Gaussian model with λ_1 and λ_2
[Plots: (a)-(b) variance over time at locations 1 and 29; (c) λ_1i, i = 1, ..., I; (d) λ_2j, j = 1, ..., J.]
Non-Gaussian spatiotemporal modeling: temperature data
[Maps: (a) Spain and France, marking Madrid and Paris; (b) the Basque Country (zoom).]
Non-Gaussian spatiotemporal modeling: model
Mean function: μ(s, t) = δ_0 + δ_1 s_1 + δ_2 s_2 + δ_3 h + δ_4 t + δ_5 t².
Cauchy covariance function (X_i ~ Ga(λ_i, 1)):
C(s, t) = (1 + (‖s‖/a)^α)^{−λ_1} (1 + (|t|/b)^β)^{−λ_2} (1 + (‖s‖/a)^α + (|t|/b)^β)^{−λ_0},
with λ_1 = λ_2 = 1 and c = λ_0/(1 + λ_0).
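This Cauchy covariance follows from the proposition with Gamma mixing variables, since the Ga(λ_i, 1) MGF is M_i(t) = (1 − t)^{−λ_i}. A small sketch evaluating it, with assumed parameter values (a, b, α, β and the λ_i all set to 1 purely for illustration):

```python
import numpy as np

# Cauchy-type space-time covariance from the talk's applied model;
# c = lam0 / (1 + lam0) controls how far the model is from separable.
def cauchy_cov(s_dist, t_lag, a=1.0, b=1.0, alpha=1.0, beta=1.0,
               lam0=1.0, lam1=1.0, lam2=1.0, sigma2=1.0):
    gs = (s_dist / a) ** alpha          # spatial variogram gamma_1(s)
    gt = (np.abs(t_lag) / b) ** beta    # temporal variogram gamma_2(t)
    return (sigma2 * ((1 + gs) ** -lam1) * ((1 + gt) ** -lam2)
            * ((1 + gs + gt) ** -lam0))

print(cauchy_cov(0.0, 0.0))  # variance at zero lag: 1.0
print(cauchy_cov(1.0, 2.0))  # decays with spatial distance and time lag
```

Setting λ_0 = 0 makes the last factor vanish and the covariance a pure product of a spatial and a temporal term, i.e. separable.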
Non-Gaussian spatiotemporal modeling: likelihood
In order to calculate the likelihood function we need to invert a matrix of dimension IJ × IJ (here 67 × 31 = 2077). We therefore approximate the likelihood using conditional distributions. We consider a partition of Z into subvectors Z_1, ..., Z_31, where Z_j = (Z(s_1, t_j), ..., Z(s_67, t_j)), and we define Z_(j) = (Z_{j−L+1}, ..., Z_j). Then
p(z | φ) ≈ p(z_1 | φ) ∏_{j=2}^{31} p(z_j | z_(j−1), φ).  (7)
This means the distribution of Z_j depends only on the observations in space for the previous L time points. In this application we used L = 5 to make the MCMC feasible.
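The blocked approximation in (7) can be illustrated on a toy Gaussian vector: each time block is conditioned only on the previous L blocks, using standard Gaussian conditioning formulas. The sizes and covariances below are assumed (far smaller than the 67 × 31 data); the temporal correlation is exponential, hence Markov in time, so the truncated conditioning is essentially exact in this example:

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(3)
I, J, L = 3, 6, 2                     # 3 "stations", 6 time points, lag L = 2
t = np.arange(J, dtype=float)
Ct = np.exp(-np.abs(t[:, None] - t[None, :]) / 2.0)   # Markov in time
Cs = np.eye(I) + 0.3 * (1.0 - np.eye(I))              # simple spatial corr.
Sigma = np.kron(Ct, Cs)               # time-major blocks of size I
z = rng.multivariate_normal(np.zeros(I * J), Sigma)

def block_loglik(z, Sigma, I, J, L):
    """log p(z_1) + sum over j of log p(z_j | z_{j-L}, ..., z_{j-1})."""
    ll = 0.0
    for j in range(J):
        lo = max(0, j - L)
        past = slice(lo * I, j * I)
        cur = slice(j * I, (j + 1) * I)
        if j == 0:
            mean, cov = np.zeros(I), Sigma[cur, cur]
        else:
            S_pp = Sigma[past, past]
            S_cp = Sigma[cur, past]
            mean = S_cp @ np.linalg.solve(S_pp, z[past])
            cov = Sigma[cur, cur] - S_cp @ np.linalg.solve(S_pp, S_cp.T)
            cov = (cov + cov.T) / 2.0          # symmetrize numerically
        ll += multivariate_normal(mean, cov).logpdf(z[cur])
    return ll

exact = multivariate_normal(np.zeros(I * J), Sigma).logpdf(z)
approx = block_loglik(z, Sigma, I, J, L)
print(exact, approx)   # the approximation matches the exact log-likelihood
```

Each step only inverts matrices of size at most L·I instead of one of size I·J, which is the point of the approximation in the real application.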
Results: Bayes factors
Table: the natural logarithm of the Bayes factor in favor of each model (h; λ_1; λ_1 & h; λ_2; λ_2 & h; λ_1 & λ_2; λ_1, h & λ_2) versus the Gaussian model, using the Shifted-Gamma (λ = 0.98) estimator for the predictive density of z. [Table values lost in transcription.]
Results: model with h and λ_2
[Plots: (a) σ²(1 − τ²)/λ_2; (b) σ²τ²/h.]
Results: predicted temperature at the out-of-sample stations
[Plots of predicted against observed temperature at stations 1-3: (a)-(c) Gaussian model; (d)-(f) model with λ_2 & h.]
Results: model comparison
For each model (Gaussian; h; λ_1; λ_1 & h; λ_2; λ_2 & h; λ_1 & λ_2; λ_1, h & λ_2) the table reports the average width of the 95% predictive interval, the interval score (IS), and the log predictive score (LPS). [Table values lost in transcription.]
References
- Fonseca, T. C. O. and Steel, M. F. J. (2008). A New Class of Nonseparable Spatiotemporal Models. CRiSM Working Paper.
- Gneiting, T. and Raftery, A. E. (2007). Strictly proper scoring rules, prediction, and estimation. JASA, 102.
- Ma, C. (2002). Spatio-temporal covariance functions generated by mixtures. Mathematical Geology, 34.
- Palacios, M. B. and Steel, M. F. J. (2006). Non-Gaussian Bayesian Geostatistical Modeling. JASA, 101.
More informationKarhunen-Loeve Expansion and Optimal Low-Rank Model for Spatial Processes
TTU, October 26, 2012 p. 1/3 Karhunen-Loeve Expansion and Optimal Low-Rank Model for Spatial Processes Hao Zhang Department of Statistics Department of Forestry and Natural Resources Purdue University
More informationStatistícal Methods for Spatial Data Analysis
Texts in Statistícal Science Statistícal Methods for Spatial Data Analysis V- Oliver Schabenberger Carol A. Gotway PCT CHAPMAN & K Contents Preface xv 1 Introduction 1 1.1 The Need for Spatial Analysis
More informationVCMC: Variational Consensus Monte Carlo
VCMC: Variational Consensus Monte Carlo Maxim Rabinovich, Elaine Angelino, Michael I. Jordan Berkeley Vision and Learning Center September 22, 2015 probabilistic models! sky fog bridge water grass object
More informationBayesian Inference for the Multivariate Normal
Bayesian Inference for the Multivariate Normal Will Penny Wellcome Trust Centre for Neuroimaging, University College, London WC1N 3BG, UK. November 28, 2014 Abstract Bayesian inference for the multivariate
More informationUsing Model Selection and Prior Specification to Improve Regime-switching Asset Simulations
Using Model Selection and Prior Specification to Improve Regime-switching Asset Simulations Brian M. Hartman, PhD ASA Assistant Professor of Actuarial Science University of Connecticut BYU Statistics Department
More informationGaussian kernel GARCH models
Gaussian kernel GARCH models Xibin (Bill) Zhang and Maxwell L. King Department of Econometrics and Business Statistics Faculty of Business and Economics 7 June 2013 Motivation A regression model is often
More informationHierarchical Modeling for Spatio-temporal Data
Hierarchical Modeling for Spatio-temporal Data Sudipto Banerjee 1 and Andrew O. Finley 2 1 Biostatistics, School of Public Health, University of Minnesota, Minneapolis, Minnesota, U.S.A. 2 Department of
More informationDependence. MFM Practitioner Module: Risk & Asset Allocation. John Dodson. September 11, Dependence. John Dodson. Outline.
MFM Practitioner Module: Risk & Asset Allocation September 11, 2013 Before we define dependence, it is useful to define Random variables X and Y are independent iff For all x, y. In particular, F (X,Y
More informationSTA 4273H: Sta-s-cal Machine Learning
STA 4273H: Sta-s-cal Machine Learning Russ Salakhutdinov Department of Computer Science! Department of Statistical Sciences! rsalakhu@cs.toronto.edu! h0p://www.cs.utoronto.ca/~rsalakhu/ Lecture 2 In our
More informationModeling daily precipitation in Space and Time
Space and Time SWGen - Hydro Berlin 20 September 2017 temporal - dependence Outline temporal - dependence temporal - dependence Stochastic Weather Generator Stochastic Weather Generator (SWG) is a stochastic
More informationx. Figure 1: Examples of univariate Gaussian pdfs N (x; µ, σ 2 ).
.8.6 µ =, σ = 1 µ = 1, σ = 1 / µ =, σ =.. 3 1 1 3 x Figure 1: Examples of univariate Gaussian pdfs N (x; µ, σ ). The Gaussian distribution Probably the most-important distribution in all of statistics
More informationA Framework for Daily Spatio-Temporal Stochastic Weather Simulation
A Framework for Daily Spatio-Temporal Stochastic Weather Simulation, Rick Katz, Balaji Rajagopalan Geophysical Statistics Project Institute for Mathematics Applied to Geosciences National Center for Atmospheric
More informationIncorporating unobserved heterogeneity in Weibull survival models: A Bayesian approach
Incorporating unobserved heterogeneity in Weibull survival models: A Bayesian approach Catalina A. Vallejos 1 Mark F.J. Steel 2 1 MRC Biostatistics Unit, EMBL-European Bioinformatics Institute. 2 Dept.
More informationReview: Probabilistic Matrix Factorization. Probabilistic Matrix Factorization (PMF)
Case Study 4: Collaborative Filtering Review: Probabilistic Matrix Factorization Machine Learning for Big Data CSE547/STAT548, University of Washington Emily Fox February 2 th, 214 Emily Fox 214 1 Probabilistic
More informationModeling and Interpolation of Non-Gaussian Spatial Data: A Comparative Study
Modeling and Interpolation of Non-Gaussian Spatial Data: A Comparative Study Gunter Spöck, Hannes Kazianka, Jürgen Pilz Department of Statistics, University of Klagenfurt, Austria hannes.kazianka@uni-klu.ac.at
More informationDefault Priors and Effcient Posterior Computation in Bayesian
Default Priors and Effcient Posterior Computation in Bayesian Factor Analysis January 16, 2010 Presented by Eric Wang, Duke University Background and Motivation A Brief Review of Parameter Expansion Literature
More informationNearest Neighbor Gaussian Processes for Large Spatial Data
Nearest Neighbor Gaussian Processes for Large Spatial Data Abhi Datta 1, Sudipto Banerjee 2 and Andrew O. Finley 3 July 31, 2017 1 Department of Biostatistics, Bloomberg School of Public Health, Johns
More informationStat 5101 Lecture Notes
Stat 5101 Lecture Notes Charles J. Geyer Copyright 1998, 1999, 2000, 2001 by Charles J. Geyer May 7, 2001 ii Stat 5101 (Geyer) Course Notes Contents 1 Random Variables and Change of Variables 1 1.1 Random
More informationThe Expectation Maximization or EM algorithm
The Expectation Maximization or EM algorithm Carl Edward Rasmussen November 15th, 2017 Carl Edward Rasmussen The EM algorithm November 15th, 2017 1 / 11 Contents notation, objective the lower bound functional,
More informationBayes Model Selection with Path Sampling: Factor Models
with Path Sampling: Factor Models Ritabrata Dutta and Jayanta K Ghosh Purdue University 07/02/11 Factor Models in Applications Factor Models in Applications Factor Models Factor Models and Factor analysis
More informationSubject CS1 Actuarial Statistics 1 Core Principles
Institute of Actuaries of India Subject CS1 Actuarial Statistics 1 Core Principles For 2019 Examinations Aim The aim of the Actuarial Statistics 1 subject is to provide a grounding in mathematical and
More informationAn Introduction to Spatial Statistics. Chunfeng Huang Department of Statistics, Indiana University
An Introduction to Spatial Statistics Chunfeng Huang Department of Statistics, Indiana University Microwave Sounding Unit (MSU) Anomalies (Monthly): 1979-2006. Iron Ore (Cressie, 1986) Raw percent data
More informationBayesian linear regression
Bayesian linear regression Linear regression is the basis of most statistical modeling. The model is Y i = X T i β + ε i, where Y i is the continuous response X i = (X i1,..., X ip ) T is the corresponding
More informationSummary STK 4150/9150
STK4150 - Intro 1 Summary STK 4150/9150 Odd Kolbjørnsen May 22 2017 Scope You are expected to know and be able to use basic concepts introduced in the book. You knowledge is expected to be larger than
More informationFusing point and areal level space-time data. data with application to wet deposition
Fusing point and areal level space-time data with application to wet deposition Alan Gelfand Duke University Joint work with Sujit Sahu and David Holland Chemical Deposition Combustion of fossil fuel produces
More informationIndex. Geostatistics for Environmental Scientists, 2nd Edition R. Webster and M. A. Oliver 2007 John Wiley & Sons, Ltd. ISBN:
Index Akaike information criterion (AIC) 105, 290 analysis of variance 35, 44, 127 132 angular transformation 22 anisotropy 59, 99 affine or geometric 59, 100 101 anisotropy ratio 101 exploring and displaying
More informationBayesian Methods for Machine Learning
Bayesian Methods for Machine Learning CS 584: Big Data Analytics Material adapted from Radford Neal s tutorial (http://ftp.cs.utoronto.ca/pub/radford/bayes-tut.pdf), Zoubin Ghahramni (http://hunch.net/~coms-4771/zoubin_ghahramani_bayesian_learning.pdf),
More informationEmpirical Bayesian Kriging
Empirical Bayesian Kriging Implemented in ArcGIS Geostatistical Analyst By Konstantin Krivoruchko, Senior Research Associate, Software Development Team, Esri Obtaining reliable environmental measurements
More informationSpatial-Temporal Modeling of Active Layer Thickness
Spatial-Temporal Modeling of Active Layer Thickness Qian Chen Advisor : Dr. Tatiyana Apanasovich Department of Statistics The George Washington University Abstract The objective of this study is to provide
More informationA spatio-temporal model for extreme precipitation simulated by a climate model
A spatio-temporal model for extreme precipitation simulated by a climate model Jonathan Jalbert Postdoctoral fellow at McGill University, Montréal Anne-Catherine Favre, Claude Bélisle and Jean-François
More informationSpatial Dynamic Factor Analysis
Spatial Dynamic Factor Analysis Esther Salazar Federal University of Rio de Janeiro Department of Statistical Methods Sixth Workshop on BAYESIAN INFERENCE IN STOCHASTIC PROCESSES Bressanone/Brixen, Italy
More informationBasic Sampling Methods
Basic Sampling Methods Sargur Srihari srihari@cedar.buffalo.edu 1 1. Motivation Topics Intractability in ML How sampling can help 2. Ancestral Sampling Using BNs 3. Transforming a Uniform Distribution
More informationMultivariate Geostatistics
Hans Wackernagel Multivariate Geostatistics An Introduction with Applications Third, completely revised edition with 117 Figures and 7 Tables Springer Contents 1 Introduction A From Statistics to Geostatistics
More informationSpatio-Temporal Models for Areal Data
Spatio-Temporal Models for Areal Data Juan C. Vivar (jcvivar@dme.ufrj.br) and Marco A. R. Ferreira (marco@im.ufrj.br) Departamento de Métodos Estatísticos - IM Universidade Federal do Rio de Janeiro (UFRJ)
More informationPoint-Referenced Data Models
Point-Referenced Data Models Jamie Monogan University of Georgia Spring 2013 Jamie Monogan (UGA) Point-Referenced Data Models Spring 2013 1 / 19 Objectives By the end of these meetings, participants should
More informationFundamentals of Data Assimila1on
014 GSI Community Tutorial NCAR Foothills Campus, Boulder, CO July 14-16, 014 Fundamentals of Data Assimila1on Milija Zupanski Cooperative Institute for Research in the Atmosphere Colorado State University
More informationGaussian Multiscale Spatio-temporal Models for Areal Data
Gaussian Multiscale Spatio-temporal Models for Areal Data (University of Missouri) Scott H. Holan (University of Missouri) Adelmo I. Bertolde (UFES) Outline Motivation Multiscale factorization The multiscale
More informationBayesian Modeling of Conditional Distributions
Bayesian Modeling of Conditional Distributions John Geweke University of Iowa Indiana University Department of Economics February 27, 2007 Outline Motivation Model description Methods of inference Earnings
More informationReview. DS GA 1002 Statistical and Mathematical Models. Carlos Fernandez-Granda
Review DS GA 1002 Statistical and Mathematical Models http://www.cims.nyu.edu/~cfgranda/pages/dsga1002_fall16 Carlos Fernandez-Granda Probability and statistics Probability: Framework for dealing with
More informationSpatial Statistics with Image Analysis. Lecture L02. Computer exercise 0 Daily Temperature. Lecture 2. Johan Lindström.
C Stochastic fields Covariance Spatial Statistics with Image Analysis Lecture 2 Johan Lindström November 4, 26 Lecture L2 Johan Lindström - johanl@maths.lth.se FMSN2/MASM2 L /2 C Stochastic fields Covariance
More informationWhat s for today. Random Fields Autocovariance Stationarity, Isotropy. c Mikyoung Jun (Texas A&M) stat647 Lecture 2 August 30, / 13
What s for today Random Fields Autocovariance Stationarity, Isotropy c Mikyoung Jun (Texas A&M) stat647 Lecture 2 August 30, 2012 1 / 13 Stochastic Process and Random Fields A stochastic process is a family
More informationTIME SERIES ANALYSIS. Forecasting and Control. Wiley. Fifth Edition GWILYM M. JENKINS GEORGE E. P. BOX GREGORY C. REINSEL GRETA M.
TIME SERIES ANALYSIS Forecasting and Control Fifth Edition GEORGE E. P. BOX GWILYM M. JENKINS GREGORY C. REINSEL GRETA M. LJUNG Wiley CONTENTS PREFACE TO THE FIFTH EDITION PREFACE TO THE FOURTH EDITION
More informationData Modeling & Analysis Techniques. Probability & Statistics. Manfred Huber
Data Modeling & Analysis Techniques Probability & Statistics Manfred Huber 2017 1 Probability and Statistics Probability and statistics are often used interchangeably but are different, related fields
More informationExpectation Propagation Algorithm
Expectation Propagation Algorithm 1 Shuang Wang School of Electrical and Computer Engineering University of Oklahoma, Tulsa, OK, 74135 Email: {shuangwang}@ou.edu This note contains three parts. First,
More informationLatent Variable Models for Binary Data. Suppose that for a given vector of explanatory variables x, the latent
Latent Variable Models for Binary Data Suppose that for a given vector of explanatory variables x, the latent variable, U, has a continuous cumulative distribution function F (u; x) and that the binary
More informationAnalysing geoadditive regression data: a mixed model approach
Analysing geoadditive regression data: a mixed model approach Institut für Statistik, Ludwig-Maximilians-Universität München Joint work with Ludwig Fahrmeir & Stefan Lang 25.11.2005 Spatio-temporal regression
More informationSpatial Statistics with Image Analysis. Outline. A Statistical Approach. Johan Lindström 1. Lund October 6, 2016
Spatial Statistics Spatial Examples More Spatial Statistics with Image Analysis Johan Lindström 1 1 Mathematical Statistics Centre for Mathematical Sciences Lund University Lund October 6, 2016 Johan Lindström
More informationThe Bayesian Choice. Christian P. Robert. From Decision-Theoretic Foundations to Computational Implementation. Second Edition.
Christian P. Robert The Bayesian Choice From Decision-Theoretic Foundations to Computational Implementation Second Edition With 23 Illustrations ^Springer" Contents Preface to the Second Edition Preface
More informationESTIMATING THE MEAN LEVEL OF FINE PARTICULATE MATTER: AN APPLICATION OF SPATIAL STATISTICS
ESTIMATING THE MEAN LEVEL OF FINE PARTICULATE MATTER: AN APPLICATION OF SPATIAL STATISTICS Richard L. Smith Department of Statistics and Operations Research University of North Carolina Chapel Hill, N.C.,
More informationHierarchical Bayesian Modeling of Multisite Daily Rainfall Occurrence
The First Henry Krumb Sustainable Engineering Symposium Hierarchical Bayesian Modeling of Multisite Daily Rainfall Occurrence Carlos Henrique Ribeiro Lima Prof. Upmanu Lall March 2009 Agenda 1) Motivation
More informationBayesian Regression (1/31/13)
STA613/CBB540: Statistical methods in computational biology Bayesian Regression (1/31/13) Lecturer: Barbara Engelhardt Scribe: Amanda Lea 1 Bayesian Paradigm Bayesian methods ask: given that I have observed
More informationComputer Vision Group Prof. Daniel Cremers. 10a. Markov Chain Monte Carlo
Group Prof. Daniel Cremers 10a. Markov Chain Monte Carlo Markov Chain Monte Carlo In high-dimensional spaces, rejection sampling and importance sampling are very inefficient An alternative is Markov Chain
More informationSTA 4273H: Statistical Machine Learning
STA 4273H: Statistical Machine Learning Russ Salakhutdinov Department of Computer Science! Department of Statistical Sciences! rsalakhu@cs.toronto.edu! h0p://www.cs.utoronto.ca/~rsalakhu/ Lecture 7 Approximate
More informationComputer Vision Group Prof. Daniel Cremers. 6. Mixture Models and Expectation-Maximization
Prof. Daniel Cremers 6. Mixture Models and Expectation-Maximization Motivation Often the introduction of latent (unobserved) random variables into a model can help to express complex (marginal) distributions
More informationThe Expectation-Maximization Algorithm
1/29 EM & Latent Variable Models Gaussian Mixture Models EM Theory The Expectation-Maximization Algorithm Mihaela van der Schaar Department of Engineering Science University of Oxford MLE for Latent Variable
More informationHierarchical Modeling for Multivariate Spatial Data
Hierarchical Modeling for Multivariate Spatial Data Sudipto Banerjee 1 and Andrew O. Finley 2 1 Biostatistics, School of Public Health, University of Minnesota, Minneapolis, Minnesota, U.S.A. 2 Department
More informationIs there still room for new developments in geostatistics?
Is there still room for new developments in geostatistics? Jean-Paul Chilès MINES ParisTech, Fontainebleau, France, IAMG 34th IGC, Brisbane, 8 August 2012 Matheron: books and monographs 1962-1963: Treatise
More informationHeriot-Watt University
Heriot-Watt University Heriot-Watt University Research Gateway Prediction of settlement delay in critical illness insurance claims by using the generalized beta of the second kind distribution Dodd, Erengul;
More informationSpatiotemporal Analysis of Solar Radiation for Sustainable Research in the Presence of Uncertain Measurements
Spatiotemporal Analysis of Solar Radiation for Sustainable Research in the Presence of Uncertain Measurements Alexander Kolovos SAS Institute, Inc. alexander.kolovos@sas.com Abstract. The study of incoming
More informationSpatio-Temporal Geostatistical Models, with an Application in Fish Stock
Spatio-Temporal Geostatistical Models, with an Application in Fish Stock Ioannis Elmatzoglou Submitted for the degree of Master in Statistics at Lancaster University, September 2006. Abstract Geostatistics
More informationNon-Parametric Bayes
Non-Parametric Bayes Mark Schmidt UBC Machine Learning Reading Group January 2016 Current Hot Topics in Machine Learning Bayesian learning includes: Gaussian processes. Approximate inference. Bayesian
More informationBo Li. National Center for Atmospheric Research. Based on joint work with:
Nonparametric Assessment of Properties of Space-time Covariance Functions and its Application in Paleoclimate Reconstruction Bo Li National Center for Atmospheric Research Based on joint work with: Marc
More informationBayesian Regression Linear and Logistic Regression
When we want more than point estimates Bayesian Regression Linear and Logistic Regression Nicole Beckage Ordinary Least Squares Regression and Lasso Regression return only point estimates But what if we
More informationPart 6: Multivariate Normal and Linear Models
Part 6: Multivariate Normal and Linear Models 1 Multiple measurements Up until now all of our statistical models have been univariate models models for a single measurement on each member of a sample of
More informationOn prediction and density estimation Peter McCullagh University of Chicago December 2004
On prediction and density estimation Peter McCullagh University of Chicago December 2004 Summary Having observed the initial segment of a random sequence, subsequent values may be predicted by calculating
More informationQuantile POD for Hit-Miss Data
Quantile POD for Hit-Miss Data Yew-Meng Koh a and William Q. Meeker a a Center for Nondestructive Evaluation, Department of Statistics, Iowa State niversity, Ames, Iowa 50010 Abstract. Probability of detection
More informationMotivation Scale Mixutres of Normals Finite Gaussian Mixtures Skew-Normal Models. Mixture Models. Econ 690. Purdue University
Econ 690 Purdue University In virtually all of the previous lectures, our models have made use of normality assumptions. From a computational point of view, the reason for this assumption is clear: combined
More informationIntroduction to Geostatistics
Introduction to Geostatistics Abhi Datta 1, Sudipto Banerjee 2 and Andrew O. Finley 3 July 31, 2017 1 Department of Biostatistics, Bloomberg School of Public Health, Johns Hopkins University, Baltimore,
More informationThe ProbForecastGOP Package
The ProbForecastGOP Package April 24, 2006 Title Probabilistic Weather Field Forecast using the GOP method Version 1.3 Author Yulia Gel, Adrian E. Raftery, Tilmann Gneiting, Veronica J. Berrocal Description
More informationBruno Sansó. Department of Applied Mathematics and Statistics University of California Santa Cruz bruno
Bruno Sansó Department of Applied Mathematics and Statistics University of California Santa Cruz http://www.ams.ucsc.edu/ bruno Climate Models Climate Models use the equations of motion to simulate changes
More informationCross-sectional space-time modeling using ARNN(p, n) processes
Cross-sectional space-time modeling using ARNN(p, n) processes W. Polasek K. Kakamu September, 006 Abstract We suggest a new class of cross-sectional space-time models based on local AR models and nearest
More informationBayesian Learning. HT2015: SC4 Statistical Data Mining and Machine Learning. Maximum Likelihood Principle. The Bayesian Learning Framework
HT5: SC4 Statistical Data Mining and Machine Learning Dino Sejdinovic Department of Statistics Oxford http://www.stats.ox.ac.uk/~sejdinov/sdmml.html Maximum Likelihood Principle A generative model for
More informationEstimation of non-stationary spatial covariance structure
Estimation of non-stationary spatial covariance structure DAVID J NOTT Department of Statistics, University of New South Wales, Sydney 2052, Australia djn@mathsunsweduau WILLIAM T M DUNSMUIR Division of
More informationPredictive spatio-temporal models for spatially sparse environmental data. Umeå University
Seminar p.1/28 Predictive spatio-temporal models for spatially sparse environmental data Xavier de Luna and Marc G. Genton xavier.deluna@stat.umu.se and genton@stat.ncsu.edu http://www.stat.umu.se/egna/xdl/index.html
More informationRobust Prediction of Large Spatio-Temporal Datasets
Robust Prediction of Large Spatio-Temporal Datasets Yang Chen Thesis submitted to the Faculty of the Virginia Polytechnic Institute and State University in partial fulfillment of the requirements for the
More informationOverlapping Astronomical Sources: Utilizing Spectral Information
Overlapping Astronomical Sources: Utilizing Spectral Information David Jones Advisor: Xiao-Li Meng Collaborators: Vinay Kashyap (CfA) and David van Dyk (Imperial College) CHASC Astrostatistics Group April
More informationBayesian SAE using Complex Survey Data Lecture 4A: Hierarchical Spatial Bayes Modeling
Bayesian SAE using Complex Survey Data Lecture 4A: Hierarchical Spatial Bayes Modeling Jon Wakefield Departments of Statistics and Biostatistics University of Washington 1 / 37 Lecture Content Motivation
More information