Spatial Statistics with Image Analysis. Lecture 8 (L08): Computer exercise 3. Johan Lindström. November 25, 2016
Spatial Statistics with Image Analysis
Lecture 8
Johan Lindström (johanl@maths.lth.se), FMSN20/MASM25
November 25, 2016

Outline
- Computer exercise 3
- Repetition
  - Bayesian hierarchical modelling
  - Parameter estimation
- Creating Q
  - CAR(1) models
  - Matérn?
- Spectral representation
  - Example fields
  - Stochastic partial differential equation
  - The finite element method
  - Boundary effects
- Non-gridded data
  - Triangulate
  - Basis functions
  - Solution
  - Observations

Computer exercise 3
Bayesian hierarchical modelling using GMRFs

- Data model, p(y | z, θ): describes how the observations arise, assuming a known latent field z.
- Latent model, p(z | θ): describes how the latent field behaves,
      z = Ax + Bβ,    x ∈ N(0, Q(θ)⁻¹).
- Parameters, p(θ): describes our, sometimes vague, prior knowledge of the parameters.

Inference

Given a Bayesian hierarchical model we are interested in
    θ̂ = argmax_θ p(θ | y) ∝ argmax_θ ∫ p(y | x, θ) p(x | θ) dx.
We note that conditional distributions provide the equality
    p(y | x, θ) p(x | θ) = p(y, x | θ) = p(x | y, θ) p(y | θ).
This gives
    p(θ | y) ∝ p(y | θ) p(θ) = [ p(y | x, θ) p(x | θ) / p(x | y, θ) ] p(θ)    for any x,
allowing us to avoid explicitly computing the integral.
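The "for any x" identity above is easy to check numerically. The sketch below uses a scalar conjugate Gaussian toy model (all variances and the observed value are made-up illustrations, not from the lecture) and evaluates p(y|x)p(x)/p(x|y) at several points x; the result is the same everywhere and equals the marginal p(y).

```python
import numpy as np
from scipy.stats import norm

# Toy 1-D check of p(y|theta) = p(y|x,theta) p(x|theta) / p(x|y,theta):
# y | x ~ N(x, s2_eps), x ~ N(0, s2_x), with theta = (s2_x, s2_eps) fixed.
s2_x, s2_eps = 2.0, 0.5
y = 1.3

# Posterior x | y ~ N(mu_post, s2_post) from standard Gaussian conjugacy.
s2_post = 1.0 / (1.0 / s2_x + 1.0 / s2_eps)
mu_post = s2_post * y / s2_eps

def marginal_via_identity(x):
    """Evaluate p(y|x) p(x) / p(x|y); by the identity this is p(y) for any x."""
    return (norm.pdf(y, loc=x, scale=np.sqrt(s2_eps))
            * norm.pdf(x, loc=0.0, scale=np.sqrt(s2_x))
            / norm.pdf(x, loc=mu_post, scale=np.sqrt(s2_post)))

vals = [marginal_via_identity(x) for x in (-1.0, 0.0, 2.5)]
direct = norm.pdf(y, loc=0.0, scale=np.sqrt(s2_x + s2_eps))  # marginal p(y)
```

The same cancellation is what makes the GMRF likelihood computable without ever evaluating the integral over the latent field.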
Parameter estimation: Gaussian observations

For the case with Gaussian observations of a GMRF,
    y | x, θ ∈ N(Ãx, Q_ε⁻¹),    x | θ ∈ N(0, Q⁻¹),
we have
    p(y | θ) ∝ ( |Q|^{1/2} |Q_ε|^{1/2} / |Q_{x|y}|^{1/2} )
               · exp( -½ [ μ_{x|y}ᵀ Q μ_{x|y} + (y - Ã μ_{x|y})ᵀ Q_ε (y - Ã μ_{x|y}) ] )
and
    E(x | y, θ) = μ_{x|y} = Q_{x|y}⁻¹ Ãᵀ Q_ε y,
    V(x | y, θ) = Q_{x|y}⁻¹,    Q_{x|y} = Q + Ãᵀ Q_ε Ã.

Parameter estimation: non-Gaussian observations

For the case with non-Gaussian observations of a GMRF,
    p(y | x) = ∏_{i=1}^n p(y_i | x_i),    x | θ ∈ N(0, Q⁻¹),
we use the same trick for the conditional distributions as before:
    p(θ | y) ∝ [ p(y | x, θ) p(x | θ) / p(x | y, θ) ] p(θ)    for any x.
However, for non-Gaussian data p(x | y, θ) does not have a closed-form solution.

Parameter estimation: Taylor expansion

Let f(x) = log p(y | x). A Taylor expansion of the log-posterior around a point x̃,
    log p(x | y, θ) ≈ xᵀ( ∇f(x̃) - H_f(x̃) x̃ ) - ½ xᵀ( Q - H_f(x̃) ) x + const,
gives an approximate Gaussian, x | y, θ ∈ N(μ_{x|y}, Q_{x|y}⁻¹), with
    μ_{x|y} = Q_{x|y}⁻¹ ( ∇f(x̃) - H_f(x̃) x̃ ),
    Q_{x|y} = Q - H_f(x̃).
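The Gaussian-observation likelihood above can be evaluated with sparse matrices. The sketch below is a minimal illustration, not the course code: it assumes a small tridiagonal precision Q, identity observation matrix Ã, and iid observation noise, and adds the (2π) normalising constant so the result can be compared against the marginal density of y directly.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import splu

# Sketch: log p(y|theta) and E(x|y,theta) for y|x ~ N(A x, Qeps^{-1}),
# x ~ N(0, Q^{-1}).  All matrices and sizes here are illustrative.
n = 50
Q = sp.diags([-1.0, 2.5, -1.0], [-1, 0, 1], shape=(n, n), format="csc")  # SPD precision
A = sp.eye(n, format="csc")                      # observe every node
Qeps = 4.0 * sp.eye(n, format="csc")             # observation precision
rng = np.random.default_rng(0)
y = rng.standard_normal(n)

Qxy = (Q + A.T @ Qeps @ A).tocsc()               # Q_{x|y} = Q + A' Qeps A
lu = splu(Qxy)
mu_xy = lu.solve(A.T @ (Qeps @ y))               # mu_{x|y} = Q_{x|y}^{-1} A' Qeps y

def logdet(M):
    # log-determinant via dense Cholesky; fine at this toy size
    return 2.0 * np.sum(np.log(np.diag(np.linalg.cholesky(M.toarray()))))

resid = y - A @ mu_xy
loglik = 0.5 * (logdet(Q) + logdet(Qeps) - logdet(Qxy)
                - mu_xy @ (Q @ mu_xy) - resid @ (Qeps @ resid)) \
         - 0.5 * n * np.log(2 * np.pi)
```

For large fields the dense Cholesky in `logdet` would be replaced by a sparse Cholesky, which is the whole point of the GMRF formulation.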
Parameter estimation: non-Gaussian observations, cont.

If the Gaussian approximation is performed at the mode of p(x | y, θ), then the
expectation of the Gaussian approximation coincides with the mode,
    argmax_x p(x | y, θ) = x̂ = E(x | y, θ),
and the approximate posterior simplifies to
    p(θ | y) ∝ p(y | x̂, θ) p(x̂ | θ) p(θ) / |Q_{x|y}|^{1/2}.

CAR(1) models

In order to use GMRFs instead of full covariance models, we need to construct
useful, sparse Q-matrices.

Conditional autoregressive (CAR) models: a mean-zero CAR(1) model is defined by
    x_i | {x_j : j ∈ N_i} ∈ N( (1 / (κ² + |N_i|)) ∑_{j ∈ N_i} x_j ,  1 / (τ (κ² + |N_i|)) ).

For observations on a regular grid the resulting local q-pattern (with pixel
x_i at the centre) is
              ·    -1     ·
    q = τ ·  -1   4+κ²   -1
              ·    -1     ·
The q-pattern can be divided into a component containing the parameter, κ², and
a 2nd-order finite-difference operator G (a discretization of -Δ):
                               ·   -1    ·
    q = τ ( κ² I + G ),   G = -1    4   -1
                               ·   -1    ·
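The grid precision τ(κ²I + G) assembles neatly from Kronecker products of 1-D second-difference matrices. The sketch below is an illustration with arbitrary grid size and parameter values; it checks that an interior row reproduces the local q-pattern above.

```python
import numpy as np
import scipy.sparse as sp

# Sketch: sparse CAR(1) precision Q = tau * (kappa^2 * I + G) on an m-by-m
# grid, with G the 5-point stencil.  m, kappa2, tau are arbitrary choices.
m, kappa2, tau = 20, 0.5, 1.0
D1 = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(m, m))   # 1-D second difference
I = sp.eye(m)
G = sp.kron(I, D1) + sp.kron(D1, I)                          # 2-D 5-point stencil
Q = tau * (kappa2 * sp.eye(m * m) + G)

# An interior row reproduces the q-pattern: 4 + kappa^2 on the diagonal,
# -1 for each of the four neighbours, zero elsewhere.
center = (m // 2) * m + m // 2
row = Q.toarray()[center]
```

The Kronecker construction generalises directly to anisotropic or rectangular grids by using different 1-D operators per direction.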
How to create Q? The Matérn covariance family

The covariance between two points at distance h is
    r_M(h) = σ² / (Γ(ν) 2^{ν-1}) · (κ‖h‖)^ν K_ν(κ‖h‖).
Fields with Matérn covariances are solutions to a stochastic partial
differential equation (SPDE),
    (κ² - Δ)^{α/2} x(s) = (1/√τ) W(s).
Here W(s) is white noise and Δ = ∑_i ∂²/∂s_i².

Spectral density

For a stochastic process in time, x(t), with stationary covariance function
r(t), the spectral density and covariance function form a Fourier-transform
pair:
    f(ω) = (1/2π) ∫ r(t) e^{-iωt} dt = F[r],
    r(t) = ∫ f(ω) e^{iωt} dω = F⁻¹[f].

Linear filters

Applying a linear filter with impulse response h(u) to a Gaussian process
results in a transformed Gaussian process,
    y(t) = ∫ h(t-u) x(u) du = ∫ h(u) x(t-u) du,
with spectral density
    f_y(ω) = |H(ω)|² f_x(ω),
where H(ω) is the transfer function
    H(ω) = ∫ h(u) e^{-iωu} du.
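The Matérn covariance is straightforward to evaluate numerically. The sketch below (parameter values are arbitrary illustrations) uses `scipy.special.kv` for the modified Bessel function K_ν and handles the h → 0 limit, where the covariance equals the variance σ²; for ν = 1/2 the formula reduces to the exponential covariance, which gives a convenient sanity check.

```python
import numpy as np
from scipy.special import kv, gamma

# Sketch: the Matern covariance
#   r(h) = sigma^2 / (Gamma(nu) 2^(nu-1)) * (kappa h)^nu * K_nu(kappa h).
def matern(h, sigma2=1.0, kappa=2.0, nu=1.0):
    h = np.asarray(h, dtype=float)
    r = np.empty_like(h)
    kh = kappa * h
    pos = kh > 0
    r[pos] = sigma2 / (gamma(nu) * 2 ** (nu - 1)) * kh[pos] ** nu * kv(nu, kh[pos])
    r[~pos] = sigma2              # limit as h -> 0 is the variance
    return r

h = np.linspace(0.0, 3.0, 7)
r = matern(h)
```

The smoothness parameter ν controls the behaviour at the origin (mean-square differentiability of the field), κ the spatial range.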
Linear filters: proofs

The covariance for the filtered process y(t) = ∫ h(t-u) x(u) du is, assuming
that all functions are nice,
    r_y(τ) = C[ y(t+τ), y(t) ]
           = C[ ∫ h(u) x(t+τ-u) du, ∫ h(v) x(t-v) dv ]
           = ∬ h(u) h(v) C[ x(t+τ-u), x(t-v) ] du dv
           = ∬ h(u) h(v) r_x(τ + v - u) du dv.

The resulting spectral density for y(t) is
    f_y(ω) = (1/2π) ∫ r_y(τ) e^{-iωτ} dτ
           = (1/2π) ∭ h(u) h(v) r_x(τ + v - u) e^{-iωτ} du dv dτ.
With the substitution τ' = τ + v - u we have
    f_y(ω) = (1/2π) ∭ h(u) h(v) r_x(τ') e^{-iω(τ' - v + u)} du dv dτ'
           = [ ∫ h(u) e^{-iωu} du ] [ ∫ h(v) e^{iωv} dv ] · (1/2π) ∫ r_x(τ') e^{-iωτ'} dτ'
           = H(ω) H(-ω) f_x(ω) = |H(ω)|² f_x(ω),
since H(-ω) is the complex conjugate of H(ω) for real-valued h.

Example: exponential smoothing

Assume an exponential smoothing filter in continuous time,
    h(u) = { β e^{-αu},  u ≥ 0,
           { 0,          u < 0,
with transfer function
    H(ω) = ∫ h(u) e^{-iωu} du = ∫₀^∞ β e^{-(α+iω)u} du = β / (α + iω).
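The closed-form transfer function can be checked by discretizing the integral H(ω) = ∫ h(u)e^{-iωu} du. The sketch below (α, β, and the test frequency are arbitrary) truncates the integral far into the exponential tail and compares against β/(α + iω).

```python
import numpy as np

# Sketch: numerical transfer function of h(u) = beta*exp(-alpha*u), u >= 0,
# versus the closed form H(omega) = beta / (alpha + i omega).
alpha, beta = 1.5, 2.0
du = 1e-4
u = np.arange(0.0, 20.0, du)          # truncate far into the tail
h = beta * np.exp(-alpha * u)

def H_numeric(omega):
    # Riemann-sum approximation of the Fourier integral
    return np.sum(h * np.exp(-1j * omega * u)) * du

omega = 0.7
H_closed = beta / (alpha + 1j * omega)
H_num = H_numeric(omega)
```

The squared modulus |H(ω)|² = β²/(α² + ω²) is exactly the factor that multiplies the input spectral density in the filtering theorem.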
Example: exponential smoothing, cont.

The filtered (smoothed) process
    y(t) = ∫ h(t-u) x(u) du = ∫_{-∞}^t β e^{-α(t-u)} x(u) du
has spectral density
    f_y(ω) = |H(ω)|² f_x(ω) = |β / (α + iω)|² f_x(ω) = β² / (α² + ω²) · f_x(ω).
If x(t) is white noise with
    r_x(t) = δ(t),    f_x(ω) = 1/2π,
then y(t) is a time-continuous version of an AR(1) process (the
Ornstein-Uhlenbeck process) with
    f_y(ω) = β² / (2π (α² + ω²)),    r_y(t) = (β² / 2α) e^{-α|t|}.

Spectral density in R^d

For a stochastic field x(s) in R^d with stationary covariance function r(h),
the spectral density is
    f(ω) = (1/(2π)^d) ∫_{R^d} r(h) e^{-i ω·h} dh,
    r(h) = ∫_{R^d} f(ω) e^{i ω·h} dω.

Spectral density: isotropy

For an isotropic stationary covariance function in R², a change to polar
coordinates gives the spectral density as
    f(ω) = (1/(2π)²) ∫₀^∞ r(h) [ ∫₀^{2π} e^{-iωh cos θ} dθ ] h dh
         = (1/2π) ∫₀^∞ r(h) h J₀(ωh) dh,
where ω = ‖ω‖, h = ‖h‖, and J₀ is a Bessel function of the first kind.
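The Ornstein-Uhlenbeck pair r(t) = (β²/2α)e^{-α|t|} and f(ω) = β²/(2π(α²+ω²)) can be verified by numerically Fourier-transforming the covariance. Parameter values and the frequency grid below are arbitrary illustrations.

```python
import numpy as np

# Sketch: verify the OU Fourier pair
#   r(t) = beta^2/(2 alpha) exp(-alpha|t|)  <->  f(w) = beta^2/(2 pi (alpha^2+w^2))
# by direct numerical integration of f(w) = (1/2pi) int r(t) exp(-i w t) dt.
alpha, beta = 1.2, 0.8
dt = 1e-3
t = np.arange(-30.0, 30.0, dt)        # truncate where r(t) is negligible
r = beta**2 / (2 * alpha) * np.exp(-alpha * np.abs(t))

def f_numeric(omega):
    return np.real(np.sum(r * np.exp(-1j * omega * t)) * dt) / (2 * np.pi)

omegas = np.array([0.0, 0.5, 2.0])
f_closed = beta**2 / (2 * np.pi * (alpha**2 + omegas**2))
f_num = np.array([f_numeric(w) for w in omegas])
```

The same α/ω² decay reappears, with ‖ω‖² in place of ω², in the Matérn spectral densities below.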
Spectral density: isotropy in R^d

For a stochastic field x(s) in R^d with isotropic stationary covariance
function r(h), the spectral density is
    f(ω) = (1/(2π)^{d/2}) ∫₀^∞ r(h) h^{d-1} J_{(d-2)/2}(ωh) / (ωh)^{(d-2)/2} dh,
    r(h) = (2π)^{d/2} ∫₀^∞ f(ω) ω^{d-1} J_{(d-2)/2}(ωh) / (ωh)^{(d-2)/2} dω,
where J_k is a Bessel function of the first kind.

Spectral density: Matérn

The Matérn covariance between two points at distance h,
    r_M(h) = σ² / (Γ(ν) 2^{ν-1}) · (κh)^ν K_ν(κh),
has spectral density in R²
    f_M(ω) = σ² / (2π Γ(ν) 2^{ν-1}) ∫₀^∞ (κh)^ν K_ν(κh) h J₀(ωh) dh
           = (ν σ² κ^{2ν} / π) · 1 / (κ² + ω²)^{ν+1}.

Matérn covariances from an SPDE

If we consider the SPDE as a linear filter we can write
    x(s) = (κ² - Δ)^{-α/2} (1/√τ) W(s)
with transfer function
    H(ω) = F[ (1/√τ) (κ² - Δ)^{-α/2} ] ∝ (1/√τ) · 1 / (κ² + ‖ω‖²)^{α/2},
since
    F[f'] = iω F[f]  ⇒  F[Δf] = -‖ω‖² F[f].
("Proportional to", since we are ignoring 2π-constants in the Fourier transform.)
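A useful consistency check on the R² Matérn spectral density is that it integrates to the variance, σ² = r(0) = ∫_{R²} f(ω) dω. The sketch below (parameter values are arbitrary) does the integral in polar coordinates with a simple trapezoidal rule.

```python
import numpy as np

# Sketch: check that f(w) = nu sigma^2 kappa^(2 nu) / (pi (kappa^2 + w^2)^(nu+1))
# integrates to sigma^2 over R^2, using dA = 2 pi w dw in polar coordinates.
sigma2, kappa, nu = 1.5, 2.0, 1.0
w = np.linspace(0.0, 200.0, 400001)   # radial frequency grid (truncated tail)
f = nu * sigma2 * kappa ** (2 * nu) / (np.pi * (kappa**2 + w**2) ** (nu + 1))

integrand = 2 * np.pi * w * f
total = float(np.sum((integrand[:-1] + integrand[1:]) / 2) * (w[1] - w[0]))
```

The truncation at ω = 200 drops only the O(ω⁻²) tail, so the total comes out slightly below σ².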
Matérn covariances from an SPDE, cont.

If the SPDE
    (κ² - Δ)^{α/2} x(s) = (1/√τ) W(s)
is driven by spatial white noise W(s), with f_W(ω) ∝ 1 and r_W(h) = δ(h), then
the resulting spectral density for x(s),
    f_x(ω) = |H(ω)|² f_W(ω) ∝ (1/τ) · 1 / (κ² + ‖ω‖²)^α,
is Matérn.

In R^d a field with Matérn covariance is given as the solution to the SPDE
    (κ² - Δ)^{α/2} x(s) = (1/√τ) W(s),
where:
- Δ = ∑_i ∂²/∂s_i² is the Laplacian (in R²: ∂²/∂s_x² + ∂²/∂s_y²).
- W(s) is spatial white noise.
- α = ν + d/2.
- Parameter link: σ² = Γ(ν) / ( τ Γ(α) κ^{2ν} (4π)^{d/2} ).

The finite element method

The following is a highly simplified solution sketch. For α = 2 and gridded
data we discretize the field x(s) to a vector x of values at the grid points,
and replace the differential operator (κ² - Δ) with a finite-difference matrix
(κ² I + G). The discretized SPDE becomes a SAR(1),
    (κ² I + G) x =_D (1/√τ) ε,
where ε is the discretized white noise W(s), and =_D denotes equality in
distribution.
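The discretized SAR(1) can be simulated by drawing discrete white noise ε and solving the sparse system once. The sketch below (grid size and parameters are arbitrary, grid spacing 1) also forms the implied SAR precision Q = τBᵀB, whose interior rows have the 13-point footprint of the squared 5-point stencil.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

# Sketch: simulate (kappa^2 I + G) x = eps / sqrt(tau) on an m-by-m grid.
m, kappa2, tau = 30, 1.0, 1.0
D1 = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(m, m))
G = sp.kron(sp.eye(m), D1) + sp.kron(D1, sp.eye(m))   # 5-point stencil (-Laplacian)
B = (kappa2 * sp.eye(m * m) + G).tocsc()

rng = np.random.default_rng(1)
eps = rng.standard_normal(m * m)
x = spsolve(B, eps / np.sqrt(tau))                    # one realisation of the field
field = x.reshape(m, m)

# Implied precision of x: Q = tau * B' B (the SAR(1) precision).
Q = tau * (B.T @ B)
```

For repeated simulation one would factor B once (e.g. sparse LU or Cholesky) and reuse the factorisation for each noise draw.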
Lattice on R², grid size h, τ = 1

Order α = 1 (ν = 0), a CAR(1):
    Q₁ = κ² C + G,    C = h² I,    with G the 5-point stencil
         ·   -1    ·
        -1    4   -1
         ·   -1    ·
Order α = 2 (ν = 1), a SAR(1):
    Q₂ = (κ² C + G) C⁻¹ (κ² C + G) = κ⁴ C + 2κ² G + G₂,    G₂ = G C⁻¹ G,
where G₂ is the 13-point biharmonic (Δ²) stencil
              ·   ·    1   ·   ·
              ·   2   -8   2   ·
    h⁻² ·     1  -8   20  -8   1
              ·   2   -8   2   ·
              ·   ·    1   ·   ·

Boundary effects

Three options for handling the boundary effects exist:
- Dirichlet boundary condition: x(u) = 0 on the boundary.
- Neumann boundary condition: x'(u) = 0 perpendicular to the boundary.
- Torus: by folding the image onto a torus we completely avoid edges,
  eliminating the need for boundary conditions. However, it introduces strange
  dependencies between opposite sides of the image.

The three options correspond to different edge corrections of the diagonal
q-element for a boundary pixel:
    q_Dirichlet = 4 + κ²,    q_Neumann = 3 + κ²,    q_Torus = 4 + κ²,
where for the torus the missing neighbour is wrapped to the opposite edge.
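The three boundary treatments are easiest to see in one dimension, where the interior second-difference diagonal is 2 and the edge correction is the 1-D analogue of the 4 + κ² versus 3 + κ² patterns above. The sketch below (size m is arbitrary) builds all three variants and checks the tell-tale row sums: Neumann and torus operators annihilate constants, Dirichlet does not.

```python
import numpy as np
import scipy.sparse as sp

# Sketch: 1-D second-difference operator under three boundary treatments.
m = 8
G_dirichlet = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(m, m)).toarray()

G_neumann = G_dirichlet.copy()
G_neumann[0, 0] = 1.0        # edge node loses one neighbour and its
G_neumann[-1, -1] = 1.0      # diagonal entry is reduced accordingly

G_torus = G_dirichlet.copy()
G_torus[0, -1] = -1.0        # wrap the missing neighbour around
G_torus[-1, 0] = -1.0
```

Row sums of zero mean the operator (and hence the resulting precision with κ = 0) treats constant fields as free, which is why the intrinsic limit κ² → 0 needs care under Neumann or torus conditions.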
Boundary effects, cont.

[Figure: fields and covariances under the Dirichlet, Neumann, and Torus
boundary conditions.]

Non-gridded data

Basic idea: construct a discrete approximation of the continuous field using
basis functions, {ψ_k}, and weights, {x_k},
    x(s) = ∑_k ψ_k(s) x_k.
Find the distribution of the x_k by solving
    (κ² - Δ)^{α/2} x(s) = W(s).

Non-gridded data: triangulate

[Figure: triangulation of the observation domain.]
What's a good basis?

Here we use a piecewise linear basis, i.e. a set of pyramids, one per node of
the triangulation.

Scalar product

We define the scalar product between two functions as
    ⟨ f(s), g(s) ⟩ = ∫_{R^d} f(s) g(s) ds.

Solving the SPDE

A stochastic weak solution to the SPDE is given by weights, {x_k}, such that
    [ ⟨ψ_i(s), (κ² - Δ)^{α/2} x(s)⟩ ]_{i=1,...,n} =_D [ ⟨ψ_i(s), W(s)⟩ ]_{i=1,...,n}.
Replacing x(s) with ∑_k ψ_k(s) x_k gives
    [ ⟨ψ_i(s), (κ² - Δ)^{α/2} ∑_k ψ_k(s) x_k⟩ ]_{i=1,...,n} =_D [ ⟨ψ_i(s), W(s)⟩ ]_{i=1,...,n}.
For α = 2 we have
    ( κ² [⟨ψ_i, ψ_k⟩] - [⟨ψ_i, Δψ_k⟩] ) x = ( κ² C + G ) x =_D [⟨ψ_i, W⟩] ∈ N(0, C).
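In one dimension the piecewise-linear ("hat") basis gives the classic closed-form FEM matrices, which makes the abstract scalar products above concrete. The sketch below (number of nodes and spacing are arbitrary; only interior rows are meaningful) builds the mass matrix C, the stiffness matrix G, and the lumped diagonal version of C used later for sparsity.

```python
import numpy as np

# Sketch: 1-D FEM matrices for hat functions on a uniform grid with spacing dx:
#   C_ik = <psi_i, psi_k>   (mass matrix: dx/6 * tridiag(1, 4, 1)),
#   G_ik = <psi_i', psi_k'> (stiffness matrix: (1/dx) * tridiag(-1, 2, -1)).
n, dx = 10, 0.5
main, off = np.full(n, 4 * dx / 6), np.full(n - 1, dx / 6)
C = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)        # exact mass matrix
G = (np.diag(np.full(n, 2.0)) + np.diag(np.full(n - 1, -1.0), 1)
     + np.diag(np.full(n - 1, -1.0), -1)) / dx                # stiffness matrix

# Mass lumping: replace C by the diagonal of row sums, C_ii = <psi_i, 1>.
C_lumped = np.diag(C.sum(axis=1))
```

On a triangulation the entries are computed per triangle instead of per interval, but the structure (sparse C and G, diagonal lumped C) is the same.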
Solution to the SPDE

The equality
    ( κ² [⟨ψ_i, ψ_k⟩] - [⟨ψ_i, Δψ_k⟩] ) x =_D [⟨ψ_i, W⟩]
can be written in matrix form as
    ( κ² C + G ) x ∈ N(0, C),
where the elements of G and C are given by
    G_ij = -⟨ψ_i(s), Δψ_j(s)⟩ = ⟨∇ψ_i(s), ∇ψ_j(s)⟩    and    C_ij = ⟨ψ_i(s), ψ_j(s)⟩.
To obtain a diagonal C-matrix, and hence a sparse precision, we use the finite
element (mass lumping) approximation:
    C_ii ≈ ⟨ψ_i(s), 1⟩ = ∫ ψ_i(s) ds    and    C_ij ≈ 0 if i ≠ j.

A weak solution to the SPDE
    (κ² - Δ) x(s) = W(s)
is given by
    x(s) = ∑_k ψ_k(s) w_k,    where    (κ² C + G) w ∈ N(0, C).
The precision of the weights w follows from
    V(w) = (κ² C + G)⁻¹ C (κ² C + G)⁻¹ = Q₂⁻¹,
and, for general α,
    Q₁ = κ² C + G,
    Q₂ = (κ² C + G) C⁻¹ (κ² C + G),
    Q_α = (κ² C + G) C⁻¹ Q_{α-2} C⁻¹ (κ² C + G),    α = 3, 4, 5, ...

Observations: the A-matrix

The field is created as a weighted sum of basis functions,
    x(s) = ∑_{k=1}^N ψ_k(s) x_k.
The locations of the basis functions do not need to match the observation
locations. Observations:
    y(s_i) = x(s_i) + ε_i = ∑_k ψ_k(s_i) x_k + ε_i.
We introduce a sparse matrix A with rows
    A_i = [ ψ₁(s_i)  ...  ψ_N(s_i) ]
linking the field to the observations.
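For a 1-D hat basis the rows of A are just linear-interpolation weights: each observation location falls in one interval, so each row has at most two nonzeros that sum to one. The sketch below is an illustration with made-up node and observation locations.

```python
import numpy as np

# Sketch: the observation matrix A for a 1-D piecewise-linear (hat) basis.
# Row i holds [psi_1(s_i), ..., psi_N(s_i)].
nodes = np.linspace(0.0, 1.0, 6)             # basis-function centres
obs = np.array([0.13, 0.5, 0.87])            # observation locations (made up)

def hat_basis_row(s, nodes):
    """Evaluate all hat functions at location s (linear interpolation weights)."""
    row = np.zeros(len(nodes))
    k = np.searchsorted(nodes, s) - 1        # interval [nodes[k], nodes[k+1]]
    k = np.clip(k, 0, len(nodes) - 2)
    w = (s - nodes[k]) / (nodes[k + 1] - nodes[k])
    row[k], row[k + 1] = 1.0 - w, w
    return row

A = np.vstack([hat_basis_row(s, nodes) for s in obs])
x_weights = np.sin(np.pi * nodes)            # some field weights, for illustration
field_at_obs = A @ x_weights                 # x(s_i) = sum_k psi_k(s_i) x_k
```

On a triangulation each row instead has up to three nonzeros, the barycentric coordinates of s_i in its triangle; A stays extremely sparse either way.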
Gaussian Processes Le Song Machine Learning II: Advanced Topics CSE 8803ML, Spring 01 Pictorial view of embedding distribution Transform the entire distribution to expected features Feature space Feature
More informationProbabilistic Graphical Models
Probabilistic Graphical Models Brown University CSCI 2950-P, Spring 2013 Prof. Erik Sudderth Lecture 12: Gaussian Belief Propagation, State Space Models and Kalman Filters Guest Kalman Filter Lecture by
More informationTHE UNIVERSITY OF WESTERN ONTARIO. Applied Mathematics 375a Instructor: Matt Davison. Final Examination December 14, :00 12:00 a.m.
THE UNIVERSITY OF WESTERN ONTARIO London Ontario Applied Mathematics 375a Instructor: Matt Davison Final Examination December 4, 22 9: 2: a.m. 3 HOURS Name: Stu. #: Notes: ) There are 8 question worth
More informationBasics of Point-Referenced Data Models
Basics of Point-Referenced Data Models Basic tool is a spatial process, {Y (s), s D}, where D R r Chapter 2: Basics of Point-Referenced Data Models p. 1/45 Basics of Point-Referenced Data Models Basic
More informationLecture 34. Fourier Transforms
Lecture 34 Fourier Transforms In this section, we introduce the Fourier transform, a method of analyzing the frequency content of functions that are no longer τ-periodic, but which are defined over the
More informationFinite element approximation of the stochastic heat equation with additive noise
p. 1/32 Finite element approximation of the stochastic heat equation with additive noise Stig Larsson p. 2/32 Outline Stochastic heat equation with additive noise du u dt = dw, x D, t > u =, x D, t > u()
More informationConvolution. Define a mathematical operation on discrete-time signals called convolution, represented by *. Given two discrete-time signals x 1, x 2,
Filters Filters So far: Sound signals, connection to Fourier Series, Introduction to Fourier Series and Transforms, Introduction to the FFT Today Filters Filters: Keep part of the signal we are interested
More informationThe stochastic heat equation with a fractional-colored noise: existence of solution
The stochastic heat equation with a fractional-colored noise: existence of solution Raluca Balan (Ottawa) Ciprian Tudor (Paris 1) June 11-12, 27 aluca Balan (Ottawa), Ciprian Tudor (Paris 1) Stochastic
More informationTherefore the new Fourier coefficients are. Module 2 : Signals in Frequency Domain Problem Set 2. Problem 1
Module 2 : Signals in Frequency Domain Problem Set 2 Problem 1 Let be a periodic signal with fundamental period T and Fourier series coefficients. Derive the Fourier series coefficients of each of the
More informationOn Gaussian Process Models for High-Dimensional Geostatistical Datasets
On Gaussian Process Models for High-Dimensional Geostatistical Datasets Sudipto Banerjee Joint work with Abhirup Datta, Andrew O. Finley and Alan E. Gelfand University of California, Los Angeles, USA May
More informationANALOG AND DIGITAL SIGNAL PROCESSING CHAPTER 3 : LINEAR SYSTEM RESPONSE (GENERAL CASE)
3. Linear System Response (general case) 3. INTRODUCTION In chapter 2, we determined that : a) If the system is linear (or operate in a linear domain) b) If the input signal can be assumed as periodic
More informationLecture 1 January 5, 2016
MATH 262/CME 372: Applied Fourier Analysis and Winter 26 Elements of Modern Signal Processing Lecture January 5, 26 Prof. Emmanuel Candes Scribe: Carlos A. Sing-Long; Edited by E. Candes & E. Bates Outline
More informationBayesian Inference for DSGE Models. Lawrence J. Christiano
Bayesian Inference for DSGE Models Lawrence J. Christiano Outline State space-observer form. convenient for model estimation and many other things. Bayesian inference Bayes rule. Monte Carlo integation.
More informationBetter Simulation Metamodeling: The Why, What and How of Stochastic Kriging
Better Simulation Metamodeling: The Why, What and How of Stochastic Kriging Jeremy Staum Collaborators: Bruce Ankenman, Barry Nelson Evren Baysal, Ming Liu, Wei Xie supported by the NSF under Grant No.
More informationSymmetry and Separability In Spatial-Temporal Processes
Symmetry and Separability In Spatial-Temporal Processes Man Sik Park, Montserrat Fuentes Symmetry and Separability In Spatial-Temporal Processes 1 Motivation In general, environmental data have very complex
More informationCSC411 Fall 2018 Homework 5
Homework 5 Deadline: Wednesday, Nov. 4, at :59pm. Submission: You need to submit two files:. Your solutions to Questions and 2 as a PDF file, hw5_writeup.pdf, through MarkUs. (If you submit answers to
More information10 Transfer Matrix Models
MIT EECS 6.241 (FALL 26) LECTURE NOTES BY A. MEGRETSKI 1 Transfer Matrix Models So far, transfer matrices were introduced for finite order state space LTI models, in which case they serve as an important
More informationShort-time expansions for close-to-the-money options under a Lévy jump model with stochastic volatility
Short-time expansions for close-to-the-money options under a Lévy jump model with stochastic volatility José Enrique Figueroa-López 1 1 Department of Statistics Purdue University Statistics, Jump Processes,
More informationA new covariance function for spatio-temporal data analysis with application to atmospheric pollution and sensor networking
A new covariance function for spatio-temporal data analysis with application to atmospheric pollution and sensor networking György Terdik and Subba Rao Tata UofD, HU & UofM, UK January 30, 2015 Laboratoire
More informationEL1820 Modeling of Dynamical Systems
EL1820 Modeling of Dynamical Systems Lecture 9 - Parameter estimation in linear models Model structures Parameter estimation via prediction error minimization Properties of the estimate: bias and variance
More informationR-INLA. Sam Clifford Su-Yun Kang Jeff Hsieh. 30 August Bayesian Research and Analysis Group 1 / 14
1 / 14 R-INLA Sam Clifford Su-Yun Kang Jeff Hsieh Bayesian Research and Analysis Group 30 August 2012 What is R-INLA? R package for Bayesian computation Integrated Nested Laplace Approximation MCMC free
More informationFundamentals of the Discrete Fourier Transform
Seminar presentation at the Politecnico di Milano, Como, November 12, 2012 Fundamentals of the Discrete Fourier Transform Michael G. Sideris sideris@ucalgary.ca Department of Geomatics Engineering University
More informationRiemann Manifold Methods in Bayesian Statistics
Ricardo Ehlers ehlers@icmc.usp.br Applied Maths and Stats University of São Paulo, Brazil Working Group in Statistical Learning University College Dublin September 2015 Bayesian inference is based on Bayes
More information