10. Multi-objective least squares


1  L. Vandenberghe, ECE133A (Winter 2018)

10. Multi-objective least squares

- multi-objective least squares
- regularized data fitting
- control
- estimation and inversion

2  Multi-objective least squares

we have several objectives

    J_1 = \|A_1 x - b_1\|^2, \quad \ldots, \quad J_k = \|A_k x - b_k\|^2

- A_i is an m_i × n matrix, b_i is an m_i-vector
- we seek one x that makes all k objectives small
- usually there is a trade-off: no single x minimizes all k objectives simultaneously

Weighted least squares formulation: find x that minimizes

    \lambda_1 \|A_1 x - b_1\|^2 + \cdots + \lambda_k \|A_k x - b_k\|^2

- the coefficients \lambda_1, \ldots, \lambda_k are positive weights
- the weights \lambda_i express the relative importance of the different objectives
- without loss of generality, we can choose \lambda_1 = 1

3  Solution of weighted least squares

weighted least squares is equivalent to a standard least squares problem:

    minimize  \left\| \begin{bmatrix} \sqrt{\lambda_1} A_1 \\ \sqrt{\lambda_2} A_2 \\ \vdots \\ \sqrt{\lambda_k} A_k \end{bmatrix} x - \begin{bmatrix} \sqrt{\lambda_1} b_1 \\ \sqrt{\lambda_2} b_2 \\ \vdots \\ \sqrt{\lambda_k} b_k \end{bmatrix} \right\|^2

- the solution is unique if the stacked matrix has linearly independent columns
- each matrix A_i may itself have linearly dependent columns (or be a wide matrix)
- if the stacked matrix has linearly independent columns, the solution is

    \hat{x} = \left( \lambda_1 A_1^T A_1 + \cdots + \lambda_k A_k^T A_k \right)^{-1} \left( \lambda_1 A_1^T b_1 + \cdots + \lambda_k A_k^T b_k \right)
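As a concrete illustration (not from the slides), here is a minimal NumPy sketch of the stacking construction; the matrices, vectors, and weights are made-up placeholders:

    import numpy as np

    def weighted_lstsq(As, bs, lams):
        # stack sqrt(lambda_i) * A_i and sqrt(lambda_i) * b_i into one problem
        A = np.vstack([np.sqrt(l) * Ai for l, Ai in zip(lams, As)])
        b = np.concatenate([np.sqrt(l) * bi for l, bi in zip(lams, bs)])
        # standard least squares solve of the stacked problem
        x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
        return x_hat

    # two-objective example with random 10 x 5 data, as on the next slide
    rng = np.random.default_rng(0)
    A1, b1 = rng.standard_normal((10, 5)), rng.standard_normal(10)
    A2, b2 = rng.standard_normal((10, 5)), rng.standard_normal(10)
    x_hat = weighted_lstsq([A1, A2], [b1, b2], [1.0, 0.5])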

4  Example with two objectives

A_1 and A_2 are 10 × 5; minimize

    \|A_1 x - b_1\|^2 + \lambda \|A_2 x - b_2\|^2

[figure: components \hat{x}_1(\lambda), \ldots, \hat{x}_5(\lambda) versus \lambda]

the plot shows the weighted least squares solution \hat{x}(\lambda) as a function of the weight \lambda

5  Example with two objectives

    minimize  \|A_1 x - b_1\|^2 + \lambda \|A_2 x - b_2\|^2

[left figure: J_1(\lambda) and J_2(\lambda) versus \lambda; right figure: J_2(\lambda) versus J_1(\lambda), with several values of \lambda marked, including \lambda = 1]

- the left figure shows J_1(\lambda) = \|A_1 \hat{x}(\lambda) - b_1\|^2 and J_2(\lambda) = \|A_2 \hat{x}(\lambda) - b_2\|^2
- the right figure shows the optimal trade-off curve of J_2(\lambda) versus J_1(\lambda)

6  Outline

- multi-objective least squares
- regularized data fitting
- control
- estimation and inversion

7  Motivation

consider the linear-in-parameters model

    \hat{f}(x) = \theta_1 f_1(x) + \cdots + \theta_p f_p(x)

- we assume f_1(x) is the constant function 1
- we fit the model \hat{f}(x) to examples (x^{(1)}, y^{(1)}), \ldots, (x^{(N)}, y^{(N)})
- a large coefficient \theta_i makes the model more sensitive to changes in f_i(x)
- keeping \theta_2, \ldots, \theta_p small helps avoid over-fitting

this leads to two objectives:

    J_1(\theta) = \sum_{k=1}^{N} \left( \hat{f}(x^{(k)}) - y^{(k)} \right)^2,  \qquad  J_2(\theta) = \sum_{j=2}^{p} \theta_j^2

the primary objective J_1(\theta) is the sum of squares of the prediction errors

8  Weighted least squares formulation

    minimize  J_1(\theta) + \lambda J_2(\theta) = \sum_{k=1}^{N} \left( \hat{f}(x^{(k)}) - y^{(k)} \right)^2 + \lambda \sum_{j=2}^{p} \theta_j^2

\lambda is a positive regularization parameter

equivalent to the least squares problem

    minimize  \left\| \begin{bmatrix} A_1 \\ \sqrt{\lambda} A_2 \end{bmatrix} \theta - \begin{bmatrix} y_d \\ 0 \end{bmatrix} \right\|^2

with y_d = (y^{(1)}, \ldots, y^{(N)}) and

    A_1 = \begin{bmatrix} 1 & f_2(x^{(1)}) & \cdots & f_p(x^{(1)}) \\ 1 & f_2(x^{(2)}) & \cdots & f_p(x^{(2)}) \\ \vdots & \vdots & & \vdots \\ 1 & f_2(x^{(N)}) & \cdots & f_p(x^{(N)}) \end{bmatrix},  \qquad  A_2 = \begin{bmatrix} 0 & I \end{bmatrix}

(A_2 is (p-1) × p, so A_2 \theta = (\theta_2, \ldots, \theta_p))

- the stacked matrix has linearly independent columns (for positive \lambda)
- the value of \lambda can be chosen by out-of-sample validation or cross-validation
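A short sketch of this stacked regularized fit (assumptions: NumPy, a basis matrix A1 whose first column is all ones, and the selector A2 = [0 I] described above; regularized_fit is my name for it):

    import numpy as np

    def regularized_fit(A1, y, lam):
        # minimize ||A1 theta - y||^2 + lam * (theta_2^2 + ... + theta_p^2)
        N, p = A1.shape
        A2 = np.hstack([np.zeros((p - 1, 1)), np.eye(p - 1)])  # drops theta_1
        A = np.vstack([A1, np.sqrt(lam) * A2])
        b = np.concatenate([y, np.zeros(p - 1)])
        theta, *_ = np.linalg.lstsq(A, b, rcond=None)
        return theta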

9  Example

[figure: true signal (solid line) with 10 training points and 20 test points]

- the solid line is the signal used to generate synthetic (simulated) data
- 10 blue points are used as training set; 20 red points are used as test set
- we fit a model with five parameters \theta_1, \ldots, \theta_5:

    \hat{f}(x) = \theta_1 + \sum_{k=1}^{4} \theta_{k+1} \cos(\omega_k x + \phi_k)

  (with given \omega_k, \phi_k)
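To make the model concrete, a sketch of the corresponding basis matrix A_1 (the \omega_k and \phi_k values in the usage comment are hypothetical; in the example they are given):

    import numpy as np

    def feature_matrix(xs, omegas, phis):
        # columns: the constant 1 and cos(omega_k * x + phi_k), k = 1, ..., 4
        cols = [np.ones(len(xs))] + [np.cos(w * xs + p) for w, p in zip(omegas, phis)]
        return np.column_stack(cols)

    # e.g. feature_matrix(np.array(x_train), [1.0, 2.0, 3.0, 4.0], [0.1, 0.2, 0.3, 0.4])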

10  Result of regularized least squares fit

[left figure: train and test RMS error versus \lambda; right figure: coefficients \theta_1, \ldots, \theta_5 versus \lambda]

- the minimum test RMS error is for \lambda around 0.08
- increasing \lambda shrinks the coefficients \theta_2, \ldots, \theta_5
- dashed lines show the coefficients used to generate the data
- for \lambda near 0.8, the estimated coefficients are close to these true values

11  Outline

- multi-objective least squares
- regularized data fitting
- control
- estimation and inversion

12  Control

    y = Ax + b

- x is an n-vector of actions or inputs
- y is an m-vector of results or outputs
- the relation between inputs and outputs is a known affine function
- the goal is to choose the inputs x to optimize different objectives on x and y

13  Optimal input design

Linear dynamical system

    y(t) = h_0 u(t) + h_1 u(t-1) + h_2 u(t-2) + \cdots + h_t u(0)

- the output y(t) and input u(t) are scalar
- we assume the input u(t) is zero for t < 0
- the coefficients h_0, h_1, \ldots are the impulse response coefficients
- the output is the convolution of the input with the impulse response

Optimal input design

- the optimization variable is the input sequence x = (u(0), u(1), \ldots, u(N))
- the goal is to track a desired output using a small and slowly varying input
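A quick numerical check of the convolution view (NumPy; the impulse response and input values here are made up):

    import numpy as np

    h = np.array([1.0, 0.5, 0.25])        # impulse response h_0, h_1, h_2 (zero after)
    u = np.array([1.0, -1.0, 2.0, 0.0])   # input u(0), ..., u(3)
    y = np.convolve(h, u)[:len(u)]        # y(t) = h_0 u(t) + h_1 u(t-1) + ..., t = 0, ..., 3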

14  Input design objectives

    minimize  J_t(x) + \lambda_v J_v(x) + \lambda_m J_m(x)

track the desired output y_{\text{des}} over the interval [0, N]:

    J_t(x) = \sum_{t=0}^{N} \left( y(t) - y_{\text{des}}(t) \right)^2

use a small and slowly varying input signal:

    J_m(x) = \sum_{t=0}^{N} u(t)^2,  \qquad  J_v(x) = \sum_{t=0}^{N-1} \left( u(t+1) - u(t) \right)^2

15  Tracking error

    J_t(x) = \sum_{t=0}^{N} \left( y(t) - y_{\text{des}}(t) \right)^2 = \|A_t x - b_t\|^2

with

    A_t = \begin{bmatrix} h_0 & 0 & \cdots & 0 & 0 \\ h_1 & h_0 & \cdots & 0 & 0 \\ h_2 & h_1 & \cdots & 0 & 0 \\ \vdots & \vdots & & \vdots & \vdots \\ h_{N-1} & h_{N-2} & \cdots & h_0 & 0 \\ h_N & h_{N-1} & \cdots & h_1 & h_0 \end{bmatrix},  \qquad  b_t = \begin{bmatrix} y_{\text{des}}(0) \\ y_{\text{des}}(1) \\ y_{\text{des}}(2) \\ \vdots \\ y_{\text{des}}(N-1) \\ y_{\text{des}}(N) \end{bmatrix}
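A sketch of building A_t from a given impulse response, using SciPy's toeplitz (the function name tracking_matrix is mine):

    import numpy as np
    from scipy.linalg import toeplitz

    def tracking_matrix(h, N):
        # first column is (h_0, h_1, ..., h_N), zero-padded if h is shorter
        col = np.zeros(N + 1)
        m = min(len(h), N + 1)
        col[:m] = h[:m]
        row = np.zeros(N + 1)  # first row is (h_0, 0, ..., 0): lower triangular
        row[0] = h[0]
        return toeplitz(col, row)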

16  Input variation and magnitude

Input variation

    J_v(x) = \sum_{t=0}^{N-1} \left( u(t+1) - u(t) \right)^2 = \|Dx\|^2

with D the N × (N+1) difference matrix

    D = \begin{bmatrix} -1 & 1 & 0 & \cdots & 0 & 0 \\ 0 & -1 & 1 & \cdots & 0 & 0 \\ \vdots & & \ddots & \ddots & & \vdots \\ 0 & 0 & 0 & \cdots & -1 & 1 \end{bmatrix}

Input magnitude

    J_m(x) = \sum_{t=0}^{N} u(t)^2 = \|x\|^2
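Putting the three objectives together, a minimal stacked solve (NumPy; A_t and b_t as on the previous slide, e.g. from the tracking_matrix sketch above; input_design is my name for it):

    import numpy as np

    def input_design(A_t, b_t, lam_v, lam_m):
        # x = (u(0), ..., u(N)), so A_t has N + 1 columns
        N = A_t.shape[1] - 1
        D = np.zeros((N, N + 1))  # difference matrix: (Dx)_t = u(t+1) - u(t)
        idx = np.arange(N)
        D[idx, idx] = -1.0
        D[idx, idx + 1] = 1.0
        A = np.vstack([A_t, np.sqrt(lam_v) * D, np.sqrt(lam_m) * np.eye(N + 1)])
        b = np.concatenate([b_t, np.zeros(N), np.zeros(N + 1)])
        x, *_ = np.linalg.lstsq(A, b, rcond=None)
        return x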

17  Example

[figures: input u(t) and output y(t) for \lambda_v = 0 with small \lambda_m, and for a larger \lambda_v]

18  Outline

- multi-objective least squares
- regularized data fitting
- control
- estimation and inversion

19  Estimation

Linear measurement model

    y = Ax + v

- the n-vector x contains parameters that we want to estimate
- the m-vector v is the unknown measurement error or noise
- the m-vector y contains the measurements
- the m × n matrix A relates the measurements and the parameters

Least squares estimate: use as estimate of x the solution \hat{x} of

    minimize  \|Ax - y\|^2

20  Regularized estimation

add other terms to \|Ax - y\|^2 to include information about the parameters

Example: Tikhonov regularization

    minimize  \|Ax - y\|^2 + \lambda \|x\|^2

- the goal is to make Ax - y small with a small x
- equivalent to solving (A^T A + \lambda I) x = A^T y
- the solution is unique (if \lambda > 0) even when A has linearly dependent columns
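A minimal sketch of the normal-equations form (NumPy; assumes \lambda > 0 so the system matrix is positive definite and the solve is well posed):

    import numpy as np

    def tikhonov(A, y, lam):
        # solve (A^T A + lam * I) x = A^T y
        n = A.shape[1]
        return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ y)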

21  Signal denoising

the observed signal is the n-vector

    y = x + v

- x is the unknown signal
- v is noise

[figure: observed signal y_k versus k]

Least squares denoising

    minimize  \|x - y\|^2 + \lambda \sum_{i=1}^{n-1} (x_{i+1} - x_i)^2

the goal is to find a slowly varying signal \hat{x}, close to the observed signal y

22  Matrix formulation

    minimize  \left\| \begin{bmatrix} I \\ \sqrt{\lambda} D \end{bmatrix} x - \begin{bmatrix} y \\ 0 \end{bmatrix} \right\|^2

D is the (n-1) × n finite difference matrix

    D = \begin{bmatrix} -1 & 1 & 0 & \cdots & 0 & 0 \\ 0 & -1 & 1 & \cdots & 0 & 0 \\ \vdots & & \ddots & \ddots & & \vdots \\ 0 & 0 & 0 & \cdots & -1 & 1 \end{bmatrix}

equivalent to the linear equation (I + \lambda D^T D) x = y
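Since I + \lambda D^T D is tridiagonal, a sparse solve is natural; a sketch with SciPy sparse (denoise is my name for it):

    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import spsolve

    def denoise(y, lam):
        n = len(y)
        # (n-1) x n difference matrix with rows (..., -1, 1, ...)
        D = sp.diags([-np.ones(n - 1), np.ones(n - 1)], [0, 1], shape=(n - 1, n))
        A = sp.identity(n, format="csc") + lam * (D.T @ D)
        return spsolve(A.tocsc(), y)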

23  Trade-off

the two objectives \|\hat{x}(\lambda) - y\| and \|D\hat{x}(\lambda)\| for varying \lambda

[left figure: \|\hat{x}(\lambda) - y\| and \|D\hat{x}(\lambda)\| versus \lambda; right figure: trade-off curve of \|D\hat{x}(\lambda)\| versus \|\hat{x}(\lambda) - y\|, with several values of \lambda marked, including \lambda = 10^2]

24  Three solutions

[figure: denoised signal \hat{x}(\lambda)_k for three values of \lambda, including \lambda = 10^2 and \lambda = 10^5]

- \hat{x}(\lambda) \to y for \lambda \to 0
- \hat{x}(\lambda) \to \mathbf{avg}(y)\mathbf{1} for \lambda \to \infty
- \lambda \approx 10^2 is a good compromise

25  Image deblurring

    y = Ax + v

- x is the unknown image, y is the observed image
- A is the (known) blurring matrix, v is (unknown) noise
- the images are M × N, stored as MN-vectors

[figure: blurred, noisy image y (left) and deblurred image \hat{x} (right)]

26  Least squares deblurring

    minimize  \|Ax - y\|^2 + \lambda \left( \|D_v x\|^2 + \|D_h x\|^2 \right)

- the first term is the data fidelity term: it ensures A\hat{x} \approx y
- the second term penalizes differences between values at neighboring pixels:

    \|D_h x\|^2 + \|D_v x\|^2 = \sum_{i=1}^{M} \sum_{j=1}^{N-1} (X_{i,j+1} - X_{ij})^2 + \sum_{i=1}^{M-1} \sum_{j=1}^{N} (X_{i+1,j} - X_{ij})^2

if X is the M × N image stored in the MN-vector x
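A sketch of the stacked sparse solve for this problem (SciPy's lsqr; A, Dv, Dh are assumed given as sparse matrices, e.g. built as on the next slide; deblur is my name for it):

    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import lsqr

    def deblur(A, Dv, Dh, y, lam):
        # minimize ||A x - y||^2 + lam * (||Dv x||^2 + ||Dh x||^2) by stacking
        K = sp.vstack([A, np.sqrt(lam) * Dv, np.sqrt(lam) * Dh]).tocsr()
        rhs = np.concatenate([y, np.zeros(Dv.shape[0] + Dh.shape[0])])
        return lsqr(K, rhs)[0]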

27  Differencing operations in matrix notation

suppose x is the M × N image X, stored column-wise as the MN-vector

    x = (X_{1:M,1}, X_{1:M,2}, \ldots, X_{1:M,N})

horizontal differencing: (N-1) × N block matrix with M × M blocks

    D_h = \begin{bmatrix} -I & I & 0 & \cdots & 0 & 0 \\ 0 & -I & I & \cdots & 0 & 0 \\ \vdots & & \ddots & \ddots & & \vdots \\ 0 & 0 & 0 & \cdots & -I & I \end{bmatrix}

vertical differencing: N × N block matrix with (M-1) × M blocks

    D_v = \begin{bmatrix} D & 0 & \cdots & 0 \\ 0 & D & \cdots & 0 \\ \vdots & & \ddots & \vdots \\ 0 & 0 & \cdots & D \end{bmatrix},  \qquad  D = \begin{bmatrix} -1 & 1 & 0 & \cdots & 0 & 0 \\ 0 & -1 & 1 & \cdots & 0 & 0 \\ \vdots & & \ddots & \ddots & & \vdots \\ 0 & 0 & 0 & \cdots & -1 & 1 \end{bmatrix}
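For column-major storage both operators are Kronecker products; a sketch with SciPy sparse (diff_operators is my name for it):

    import numpy as np
    import scipy.sparse as sp

    def diff_operators(M, N):
        def d(n):  # (n-1) x n first-difference matrix with rows (..., -1, 1, ...)
            return sp.diags([-np.ones(n - 1), np.ones(n - 1)], [0, 1], shape=(n - 1, n))
        Dh = sp.kron(d(N), sp.identity(M))  # differences between neighboring columns
        Dv = sp.kron(sp.identity(N), d(M))  # differences within each column
        return Dh, Dv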

28  Deblurred images

[figure: deblurred images \hat{x} for four increasing values of \lambda, including \lambda = 10^{-6}, \lambda = 10^{-4}, and \lambda = 10^{-2}]

29  Tomography

    y = Ax + v

- x represents the values of some quantity in a region of interest with n voxels (pixels)
- y represents measurements of integrals along lines through the region:

    y_i = \sum_{j=1}^{n} A_{ij} x_j + v_i

- A_{ij} is the length of the intersection of the line in measurement i with voxel j

[figure: a region of voxels x_1, x_2, \ldots crossed by the line for measurement i]

30  Tomographic reconstruction

    minimize  \|Ax - y\|^2 + \lambda \left( \|D_v x\|^2 + \|D_h x\|^2 \right)

D_v and D_h are defined as in the image deblurring example

Example

[figure: left, the 4000 measurement lines (100 points, 40 lines per point); right, the object placed in the square region at the center of the left picture]

the region of interest is divided into pixels

31  Regularized least squares reconstruction

[figure: reconstructions for \lambda = 10^{-2}, \lambda = 10^{-1}, \lambda = 1, \lambda = 5, \lambda = 10, \lambda = 100]
