Maximum likelihood SAR tomography based on the polarimetric multi-baseline RVoG model:
Optimal estimation of a covariance matrix structured as the sum of two Kronecker products

L. Ferro-Famil (1,2), S. Tebaldini (3)
(1) University of Rennes 1, IETR lab., France
(2) University of Tromsø, Physics and Technology Dpt., Norway
(3) Politecnico di Milano, Italy
Laurent.Ferro-Famil@univ-rennes.fr
L. Ferro-Famil, S. Tebaldini: MB-RVoG ML SKP-2, PolInSAR 2013, Frascati
Outline

- PolTomSAR using the MB-RVoG model and Kronecker products
- Maximum likelihood estimation of SKP-2 covariance matrices
- Performance of the proposed ML estimator
Single-pol MB-RVoG model: SB case

Uncorrelated ground and volume responses: y_i(l) = s_gi(l) + s_vi(l), I_i = I_g + I_v

Homogeneous media: E(s_x1 s_x2*) = I_x γ_x, with γ_x = ∫ f_x(z) exp(j k_z z) dz

Global InSAR response: y(l) = [s_1(l), s_2(l)]^T

R_SB = I_g R_g + I_v R_v, with R_x = [1 γ_x ; γ_x* 1]
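The SB model above can be sketched numerically; a minimal example with illustrative intensities and coherences (the values are assumptions, not estimates from data):

```python
import numpy as np

# Sketch of R_SB = I_g R_g + I_v R_v with illustrative intensities and
# coherences (gamma_g = 1 as in the RVoG ground assumption).
I_g, I_v = 2.0, 1.0                    # ground / volume intensities
gamma_g = 1.0                          # ground coherence
gamma_v = 0.7 * np.exp(0.5j)           # volume coherence (complex)

def R_x(gamma):
    """2x2 InSAR structure matrix with unit diagonal."""
    return np.array([[1.0, gamma], [np.conj(gamma), 1.0]])

R_SB = I_g * R_x(gamma_g) + I_v * R_x(gamma_v)
assert np.allclose(R_SB, R_SB.conj().T)            # Hermitian
assert np.allclose(np.diag(R_SB).real, I_g + I_v)  # total intensity I_g + I_v
```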
Single-pol MB-RVoG model: MB case

Point-like scatterer MB response: s(l) = A_c(l) a(z), with steering vector
a(z) = [1, exp(j k_z2 z), ..., exp(j k_zNs z)]^T

Global MB-InSAR response: y(l) = s_g(l) + s_v(l)

R_MB = I_g R_g + I_v R_v, with R_x the N_s × N_s matrix with unit diagonal and off-diagonal entries γ_x(k_zi)
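A short sketch of the MB structure, assuming a uniform volume between 0 and h_v so that γ_v has a closed form; all numeric values (wavenumbers, heights, intensities) are illustrative assumptions:

```python
import numpy as np

# Sketch with illustrative values: MB steering vector a(z) and
# R_MB = I_g R_g + I_v R_v for a uniform volume of height h_v.
kz = np.array([0.0, 0.05, 0.10, 0.15])   # vertical wavenumbers (rad/m)
N_s = kz.size

def a(z):
    """Steering vector a(z) = [1, exp(j kz2 z), ..., exp(j kzNs z)]^T."""
    return np.exp(1j * kz * z)

def gamma_uniform_volume(dkz, h_v):
    """gamma_v(kz) = (1/h_v) * int_0^{h_v} exp(j kz z) dz (uniform f_v)."""
    dkz = np.asarray(dkz, dtype=float)
    out = np.ones(dkz.shape, dtype=complex)
    nz = dkz != 0
    out[nz] = (np.exp(1j * dkz[nz] * h_v) - 1.0) / (1j * dkz[nz] * h_v)
    return out

h_v, I_g, I_v = 20.0, 2.0, 1.0
dk = kz[:, None] - kz[None, :]            # pairwise wavenumber differences
R_g = np.ones((N_s, N_s), dtype=complex)  # ground: gamma_g = 1 everywhere
R_v = gamma_uniform_volume(dk, h_v)       # unit diagonal by construction
R_MB = I_g * R_g + I_v * R_v

assert np.isclose(a(12.0)[0], 1.0)                 # reference antenna
assert np.allclose(R_MB, R_MB.conj().T)            # Hermitian
assert np.all(np.linalg.eigvalsh(R_MB) > -1e-12)   # positive semidefinite
```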
Full-pol MB-RVoG model

Point-like scatterer MB response: y(l) = A_c(l) [k_hh a(z) ; k_hv a(z) ; k_vv a(z)] = A_c(l) k ⊗ a(z)

Uncorrelated polarimetric-spatial (P-S) processes: E(y_i y_i†) = I_i C_i ⊗ R_i

Global MB-PolInSAR response: y(l) = y_g(l) + y_v(l), with
R_PS = C_g ⊗ R_g + C_v ⊗ R_v

SB case (N_s = 2):
R_SP = R_g ⊗ C_g + R_v ⊗ C_v = [C_g + C_v, γ_g C_g + γ_v C_v ; γ_g* C_g + γ_v* C_v, C_g + C_v] = [C, C_12 ; C_12†, C]

γ(w) = (w† C_12 w) / (w† C w) = (γ_v + μ(w) γ_g) / (1 + μ(w))

MB-RVoG: exactly modeled by an SKP-2 structured covariance matrix
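The SKP-2 structure can be assembled directly with Kronecker products; a sketch in which random Hermitian positive-definite matrices stand in for the true polarimetric and interferometric signatures:

```python
import numpy as np

# Sketch: assemble R_PS = kron(C_g, R_g) + kron(C_v, R_v); random Hermitian
# positive-definite matrices stand in for the true ground/volume factors.
rng = np.random.default_rng(0)

def random_hpd(n):
    """Random Hermitian positive-definite matrix (synthetic stand-in)."""
    M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return M @ M.conj().T + n * np.eye(n)

N_p, N_s = 3, 4                               # polarimetric / spatial sizes
C_g, C_v = random_hpd(N_p), random_hpd(N_p)   # polarimetric signatures
R_g, R_v = random_hpd(N_s), random_hpd(N_s)   # interferometric structures

R_PS = np.kron(C_g, R_g) + np.kron(C_v, R_v)  # SKP-2 structured covariance
assert R_PS.shape == (N_p * N_s, N_p * N_s)
assert np.all(np.linalg.eigvalsh(R_PS) > 0)   # Hermitian positive definite
```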
MB-RVoG parameter estimation

SB case: coherence line estimation. Ground-volume (GV) separation requires strong assumptions: γ_g = 1, rank(C_g) < 3, f_v(z) known.

MB case: SVD-based SKP-2 decomposition. GV separation admits several solutions; accuracy increases with N_s.

See also: physical interpretation of MB-RVoG, JD & SKP decompositions, B. El Hajj-Chehade, L. Ferro-Famil, poster session Wednesday afternoon.
Estimation problem statement and objectives

L independent observations {y(l)}, l = 1..L. Estimate the RVoG parameters or the SKP-2 covariance matrix.

SB case:
- Polarimetric sampling of γ(w_i), LS line estimate [C & P][Flynn & Tabb]
- LS estimation (matrix formatting) [LFF et al.][L-M 2012]
- ML estimation [Flynn et al. 2002] and CRLB [Roueff et al. 2011]

MB case:
- LS estimation using joint diagonalization [Ferro-Famil et al. 2010]
- SVD-based SKP-2 decomposition [Tebaldini et al. 2009, 2010, 2012]

LS vs ML estimation: LS returns the estimate that best fits the observations, ML the estimate most likely to have produced them. LS is generally fast, often has analytical solutions and requires no pdf assumption. ML: if a UMV estimator exists, it is the ML estimator; it can be complex and time consuming.
ML estimation of SKP-2 covariance matrices: objectives

Find R̂_PS^ML = arg max f({y(l)}, l = 1..L | R_PS) under the SKP-2 constraint.

(Wishart) concentrated criterion:
J(R_PS) = tr(R_PS⁻¹ R̂_yy) + log|R_PS|, with R̂_yy = (1/L) Σ_l y(l) y(l)†

Estimation objective: determine R̂_PS^ML = arg min J(R_PS) with R_PS = C_g ⊗ R_g + C_v ⊗ R_v. Find a fast and reliable technique.

Case of a single KP: no analytical solution; fast and reliable iterative alternating (flip-flop) optimization [Lu 2004]; asymptotic analytical expression: weighted LS solution [Werner 2008].
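A minimal sketch of the concentrated Wishart criterion on simulated data; over an unstructured covariance its minimizer is the sample matrix R̂_yy itself, which the last assertion checks (sizes and the true covariance are illustrative assumptions):

```python
import numpy as np

# Sketch: sample covariance and the concentrated (Wishart) ML criterion
# J(R) = tr(R^{-1} R_hat) + log|R|, on simulated circular complex Gaussian
# data; n, L and R_true are illustrative assumptions.
rng = np.random.default_rng(1)
n, L = 6, 200

A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
R_true = A @ A.conj().T + n * np.eye(n)

# draw y(l) ~ CN(0, R_true) for l = 1..L
C = np.linalg.cholesky(R_true)
y = C @ (rng.standard_normal((n, L)) + 1j * rng.standard_normal((n, L))) / np.sqrt(2)

R_hat = (y @ y.conj().T) / L           # sample covariance R_hat_yy

def J(R, R_hat):
    """Negative log-likelihood criterion, up to additive constants."""
    return (np.real(np.trace(np.linalg.solve(R, R_hat)))
            + np.linalg.slogdet(R)[1])

# over unstructured R, J is minimized by the sample covariance itself
assert J(R_hat, R_hat) <= J(R_true, R_hat) + 1e-9
```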
ML estimation of SKP-2 covariance matrices: SKP-2 case

In general A ⊗ B + C ⊗ D ≠ E ⊗ F: no existing solution.

Exact joint diagonalization of two matrices: two Hermitian matrices A, B > 0 can be exactly jointly diagonalized [Yeredor 2005]: ∃ W such that W A W† = D_A and W B W† = D_B; in general W W† ≠ I.

In the SKP-2 case, R_PS = C_g ⊗ R_g + C_v ⊗ R_v:
∃ W = W_C ⊗ W_R such that W R_PS W† = D_Cg ⊗ D_Rg + D_Cv ⊗ D_Rv = D

ML criterion minimization over D: inserting D and W into J(R_PS),
J(D, W) = tr(D⁻¹ W R̂_yy W†) + log|D| − log|W W†|
Minimum obtained for D̂ = diag(W R̂_yy W†).

Resulting concentrated ML criterion:
J(W) = log|diag(W R̂_yy W†)| − log|W W†|
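The exact joint diagonalization of two Hermitian positive-definite matrices can be obtained from a generalized eigendecomposition; a sketch using scipy.linalg.eigh (this is one standard construction, not necessarily the authors' implementation, and the matrices are random stand-ins):

```python
import numpy as np
from scipy.linalg import eigh

# Sketch: exact joint diagonalization of two Hermitian positive-definite
# matrices via the generalized eigenproblem A v = lambda B v.
rng = np.random.default_rng(2)

def random_hpd(n):
    """Random Hermitian positive-definite matrix (synthetic stand-in)."""
    M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return M @ M.conj().T + n * np.eye(n)

n = 4
A, B = random_hpd(n), random_hpd(n)

w, V = eigh(A, B)            # eigh normalizes V so that V^H B V = I
W = V.conj().T               # joint diagonalizer

D_A = W @ A @ W.conj().T     # = diag(w)
D_B = W @ B @ W.conj().T     # = identity

assert np.allclose(D_A, np.diag(w), atol=1e-8)
assert np.allclose(D_B, np.eye(n), atol=1e-8)
assert not np.allclose(W @ W.conj().T, np.eye(n))   # W not unitary in general
```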
ML estimation of SKP-2 covariance matrices

ML criterion minimization over W: formulation

Explicit ML criterion: J(W_C, W_R) = log|diag(D̃)| − log|W W†|, with D̃ = W R̂_yy W† and W = W_C ⊗ W_R.

Conditional criterion, known spatial diagonalizer:
J(W_C | W_R) = Σ_{i=1..N_s} log|diag(W_C D̃_Ci W_C†)| − log|W_C W_C†|

Conditional criterion, known polarimetric diagonalizer:
J(W_R | W_C) = Σ_{i=1..N_p} log|diag(W_R D̃_Ri W_R†)| − log|W_R W_R†|

ML criterion minimization over W: joint diagonalization. J(W_C | W_R) or J(W_R | W_C) is efficiently minimized using a modified Pham's technique [Pham 2001]: iterative constrained joint diagonalization of {D̃_Ci} or {D̃_Ri}. Convergence is proven and generally very fast (about 4 iterations).
ML estimation of SKP-2 covariance matrices: algorithm

Step 0, initialization: compute R̂_yy = (1/L) Σ_l y(l) y(l)†; set W_C(0), W_R(0), n = 0.

Step 1, diagonalization: compute D̃(n+1) = W(n) R̂_yy W†(n); n = n + 1.

Step 2, minimization using Pham's technique:
W_C(n) = arg min J(W_C | W_R(n−1)), W_R(n) = arg min J(W_R | W_C(n−1)).

Step 3, convergence test: if not converged, go to Step 1.

Step 4, identification: D̂ = diag(Ŵ R̂_yy Ŵ†), R̂_PS^ML = Ŵ⁻¹ D̂ Ŵ⁻†. From R̂_PS^ML, identify Ĉ_g, R̂_g, Ĉ_v, R̂_v.
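The identification step can be sketched with the SVD-based SKP decomposition mentioned earlier: after a rearrangement of the blocks (in the style of Van Loan and Pitsianis), an SKP-2 matrix becomes a rank-2 matrix whose two dominant singular triplets yield the two Kronecker terms. The individual ground/volume factors remain determined only up to the GV-separation ambiguities; sizes and data below are illustrative:

```python
import numpy as np

# Sketch (illustrative sizes/data): identify the two Kronecker terms of an
# SKP-2 matrix via block rearrangement + SVD (Van Loan & Pitsianis style).
rng = np.random.default_rng(3)

def random_hpd(n):
    """Random Hermitian positive-definite matrix (synthetic stand-in)."""
    M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return M @ M.conj().T + n * np.eye(n)

def rearrange(R, N_p, N_s):
    """Map R (N_p N_s x N_p N_s) to R_t (N_p^2 x N_s^2) such that
    kron(C, Rs) becomes the rank-1 outer product vec(C) vec(Rs)^T."""
    blocks = R.reshape(N_p, N_s, N_p, N_s)
    return blocks.transpose(0, 2, 1, 3).reshape(N_p * N_p, N_s * N_s)

N_p, N_s = 3, 4
C_g, C_v = random_hpd(N_p), random_hpd(N_p)
R_g, R_v = random_hpd(N_s), random_hpd(N_s)
R_PS = np.kron(C_g, R_g) + np.kron(C_v, R_v)      # SKP-2 matrix

R_t = rearrange(R_PS, N_p, N_s)
U, s, Vh = np.linalg.svd(R_t)
assert s[2] < 1e-8 * s[0]        # exactly two KP terms -> rank 2

# rebuild the two Kronecker terms from the two dominant singular triplets;
# individual ground/volume factors stay ambiguous (GV separation)
terms = [np.kron(U[:, k].reshape(N_p, N_p),
                 (s[k] * Vh[k]).reshape(N_s, N_s)) for k in range(2)]
assert np.allclose(terms[0] + terms[1], R_PS)
```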
Global relative rms error: rmse = E(‖R_PS − R̂_PS‖_F²) / ‖R_PS‖_F², with R̂_PS ∈ {R̂_yy (sample matrix), R̂_PS^LS, R̂_PS^ML}. The ML estimate brings a significant improvement, in general.
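This relative error can be estimated by Monte Carlo; a sketch for the unstructured sample estimate R̂_yy only (matrix size, number of looks and number of trials are assumptions):

```python
import numpy as np

# Monte Carlo sketch of rmse = E||R - R_hat||_F^2 / ||R||_F^2 for the
# unstructured sample covariance; n, L and trials are illustrative.
rng = np.random.default_rng(4)
n, L, trials = 4, 64, 200

A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
R = A @ A.conj().T + n * np.eye(n)       # "true" covariance (synthetic)
C = np.linalg.cholesky(R)

errs = []
for _ in range(trials):
    y = C @ (rng.standard_normal((n, L)) + 1j * rng.standard_normal((n, L))) / np.sqrt(2)
    R_hat = (y @ y.conj().T) / L                     # sample covariance
    errs.append(np.linalg.norm(R - R_hat, 'fro') ** 2
                / np.linalg.norm(R, 'fro') ** 2)

rmse = float(np.mean(errs))              # shrinks roughly as 1/L
assert 0.0 < rmse < 0.5
```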
Coherence locations in the complex plane: N_p = 3, N_s = 5; ground: rank(C_g) = 2; volume: full rank, minimum phase.
Coherence-wise rms error: rmse(γ_ij) = E(|γ_ij − γ̂_ij|²). Equal performance for low-rank (and low-DoF) R_g, well-chosen solution.
Polarimetric rms error: rmse_i = E(‖C_PS,i − Ĉ_PS,i‖_F²) / ‖C_PS,i‖_F². Large LS error over low-rank C_g.
MB-RVoG model validity vs estimation: number of unfeasible experiments, out of 100.
Tomography at P band of a TROPISAR profile (ONERA/SETHI): N_p = 3, N_s = 6, L = 16 < N_s N_p.
LS & ML SKP-2 tomography: N_p = 3, N_s = 6, L = 16 < N_s N_p.
Conclusion

Numerical ML estimation of the MB-RVoG model:
- rigorous and fast
- adapted to Pol-InSAR and TomoSAR
- could also be used in medical imaging, image processing...

Performance:
- fewer looks needed
- significant variance reduction with respect to LS and unformatted LS
- better estimation of polarimetric features, not only spatial ones
More informationSystem 1 (last lecture) : limited to rigidly structured shapes. System 2 : recognition of a class of varying shapes. Need to:
System 2 : Modelling & Recognising Modelling and Recognising Classes of Classes of Shapes Shape : PDM & PCA All the same shape? System 1 (last lecture) : limited to rigidly structured shapes System 2 :
More informationECONOMETRIC THEORY. MODULE XVII Lecture - 43 Simultaneous Equations Models
ECONOMETRIC THEORY MODULE XVII Lecture - 43 Simultaneous Equations Models Dr. Shalabh Department of Mathematics and Statistics Indian Institute of Technology Kanpur 2 Estimation of parameters To estimate
More informationSystem Identification, Lecture 4
System Identification, Lecture 4 Kristiaan Pelckmans (IT/UU, 2338) Course code: 1RT880, Report code: 61800 - Spring 2012 F, FRI Uppsala University, Information Technology 30 Januari 2012 SI-2012 K. Pelckmans
More informationDS-GA 1002 Lecture notes 12 Fall Linear regression
DS-GA Lecture notes 1 Fall 16 1 Linear models Linear regression In statistics, regression consists of learning a function relating a certain quantity of interest y, the response or dependent variable,
More informationLinear Algebra Section 2.6 : LU Decomposition Section 2.7 : Permutations and transposes Wednesday, February 13th Math 301 Week #4
Linear Algebra Section. : LU Decomposition Section. : Permutations and transposes Wednesday, February 1th Math 01 Week # 1 The LU Decomposition We learned last time that we can factor a invertible matrix
More informationStat 579: Generalized Linear Models and Extensions
Stat 579: Generalized Linear Models and Extensions Linear Mixed Models for Longitudinal Data Yan Lu April, 2018, week 15 1 / 38 Data structure t1 t2 tn i 1st subject y 11 y 12 y 1n1 Experimental 2nd subject
More informationEfficient Solvers for Stochastic Finite Element Saddle Point Problems
Efficient Solvers for Stochastic Finite Element Saddle Point Problems Catherine E. Powell c.powell@manchester.ac.uk School of Mathematics University of Manchester, UK Efficient Solvers for Stochastic Finite
More informationA Spectral Regularization Framework for Multi-Task Structure Learning
A Spectral Regularization Framework for Multi-Task Structure Learning Massimiliano Pontil Department of Computer Science University College London (Joint work with A. Argyriou, T. Evgeniou, C.A. Micchelli,
More informationSystem Identification, Lecture 4
System Identification, Lecture 4 Kristiaan Pelckmans (IT/UU, 2338) Course code: 1RT880, Report code: 61800 - Spring 2016 F, FRI Uppsala University, Information Technology 13 April 2016 SI-2016 K. Pelckmans
More informationNew meeting times for Econ 210D Mondays 8:00-9:20 a.m. in Econ 300 Wednesdays 11:00-12:20 in Econ 300
New meeting times for Econ 210D Mondays 8:00-9:20 a.m. in Econ 300 Wednesdays 11:00-12:20 in Econ 300 1 Identification using nonrecursive structure, long-run restrictions and heteroskedasticity 2 General
More informationStatistics 910, #5 1. Regression Methods
Statistics 910, #5 1 Overview Regression Methods 1. Idea: effects of dependence 2. Examples of estimation (in R) 3. Review of regression 4. Comparisons and relative efficiencies Idea Decomposition Well-known
More informationMultivariate Random Variable
Multivariate Random Variable Author: Author: Andrés Hincapié and Linyi Cao This Version: August 7, 2016 Multivariate Random Variable 3 Now we consider models with more than one r.v. These are called multivariate
More informationNotes on the framework of Ando and Zhang (2005) 1 Beyond learning good functions: learning good spaces
Notes on the framework of Ando and Zhang (2005 Karl Stratos 1 Beyond learning good functions: learning good spaces 1.1 A single binary classification problem Let X denote the problem domain. Suppose we
More informationEconometrics II - EXAM Answer each question in separate sheets in three hours
Econometrics II - EXAM Answer each question in separate sheets in three hours. Let u and u be jointly Gaussian and independent of z in all the equations. a Investigate the identification of the following
More informationDOA Estimation of Quasi-Stationary Signals Using a Partly-Calibrated Uniform Linear Array with Fewer Sensors than Sources
Progress In Electromagnetics Research M, Vol. 63, 185 193, 218 DOA Estimation of Quasi-Stationary Signals Using a Partly-Calibrated Uniform Linear Array with Fewer Sensors than Sources Kai-Chieh Hsu and
More informationReceived Signal, Interference and Noise
Optimum Combining Maximum ratio combining (MRC) maximizes the output signal-to-noise ratio (SNR) and is the optimal combining method in a maximum likelihood sense for channels where the additive impairment
More informationECE531 Lecture 2b: Bayesian Hypothesis Testing
ECE531 Lecture 2b: Bayesian Hypothesis Testing D. Richard Brown III Worcester Polytechnic Institute 29-January-2009 Worcester Polytechnic Institute D. Richard Brown III 29-January-2009 1 / 39 Minimizing
More informationR = µ + Bf Arbitrage Pricing Model, APM
4.2 Arbitrage Pricing Model, APM Empirical evidence indicates that the CAPM beta does not completely explain the cross section of expected asset returns. This suggests that additional factors may be required.
More informationPOLARIMETRIC and interferometric synthetic aperture
IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, VOL. 45, NO. 11, NOVEMBER 2007 3599 Polarimetric and Interferometric SAR Image Partition Into Statistically Homogeneous Regions Based on the Minimization
More informationIntroduction: the Abruzzo earthquake The network and the processing strategies. displacements estimation at earthquake epoch. horizontal displacements
The Abruzzo earthquake: temporal and spatial analysis of the first geodetic results L. Biagi, S. Caldera, D. Dominici, F. Sansò Politecnico di Milano Università degli studi de L Aquila Outline Introduction:
More information