Linear Inverse Problems
Ajinkya Kadu, Utrecht University, The Netherlands
February 26, 2018
Outline
- Introduction
- Least-squares
- Reconstruction Methods
- Examples
- Summary
What are inverse problems?
Inverse problems determine the cause of an observed effect.
- Forward problem: cause (parameters) → effect (data), i.e., x → F(x)
- Inverse problem: effect → cause, i.e., F(x) → x
Properties of Inverse Problems
Forward problems are typically well-posed, while inverse problems are not!
Well-posedness in terms of the Hadamard conditions:
- A solution exists for all input data.
- If a solution exists, it is unique.
- The solution depends continuously on the input data.
If any of these conditions is violated, the problem is called ill-posed.
Linear Inverse Problem
F(x) is a linear functional: problems of the form y = Ax, where we are given A ∈ R^{M×N}, we observe y ∈ R^M, and want to find (or estimate) x ∈ R^N.
- One of the most fundamental concepts in all of engineering, science, and applied mathematics!
- Two areas of interest: Supervised Learning and Computational Imaging.
Supervised Learning
Estimate a function f : R^D → R from observations of its samples:
  f(t_m) ≈ y_m,  m = 1, ..., M.
The problem is not well-posed (many f are possible), so we need to define a set of functions F from which to choose f.
Supervised Learning Example: Linear Regression
F contains the set of all linear functionals on R^D. A linear function f : R^D → R obeys
  f(αt_1 + βt_2) = α f(t_1) + β f(t_2)  for all α, β ∈ R and t_1, t_2 ∈ R^D.
Every linear functional on R^D is uniquely represented by a vector x_f ∈ R^D (Riesz Representation Theorem, which holds in any Hilbert space):
  f(t) = ⟨t, x_f⟩.
Supervised Learning Example: Linear Regression
Given (t_m, y_m), find x such that y_m = ⟨t_m, x⟩. In matrix form, y = Ax, where A has the sample points as its rows:
  A = [t_1^T; t_2^T; ...; t_M^T],   y = [y_1; y_2; ...; y_M].
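As an illustrative sketch (not from the slides), the regression system y = Ax can be built and solved with numpy; the sample points and the true x below are synthetic placeholders.

```python
import numpy as np

# synthetic data: M samples of a linear function f(t) = <t, x_true> plus noise
rng = np.random.default_rng(0)
M, D = 100, 3
x_true = np.array([1.0, -2.0, 0.5])
T = rng.standard_normal((M, D))        # rows are the sample points t_m^T
y = T @ x_true + 0.01 * rng.standard_normal(M)

# A is simply the matrix with t_m^T as rows; solve y = Ax in the LS sense
A = T
x_hat, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.allclose(x_hat, x_true, atol=0.1))  # True: recovers x up to the noise
```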
Supervised Learning Example: Non-Linear Regression using a basis
F is spanned by basis functions ψ_1, ..., ψ_N:
  f(t) = ∑_{n=1}^N x_n ψ_n(t).
Again, the fitting problem can be rewritten as y = Ax with A[m, n] = ψ_n(t_m):
  [y_1; y_2; ...; y_M] = [ψ_1(t_1) ψ_2(t_1) ... ψ_N(t_1); ψ_1(t_2) ψ_2(t_2) ... ψ_N(t_2); ...; ψ_1(t_M) ψ_2(t_M) ... ψ_N(t_M)] [x_1; x_2; ...; x_N].
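A similar sketch for the basis-expansion version, assuming a monomial basis ψ_n(t) = t^{n-1} with scalar inputs (D = 1); the basis choice is purely illustrative.

```python
import numpy as np

# fit f(t) = sum_n x_n psi_n(t) with monomial basis psi_n(t) = t**(n-1), D = 1
rng = np.random.default_rng(1)
M, N = 50, 4
t = np.linspace(-1, 1, M)
y = 0.5 - t + 2.0 * t**3 + 0.05 * rng.standard_normal(M)

# A[m, n] = psi_n(t_m): a Vandermonde matrix for the monomial basis
A = np.vander(t, N, increasing=True)
x_hat, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.round(x_hat, 2))  # approximately [0.5, -1.0, 0.0, 2.0]
```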
Computational Imaging
Recover a function f that represents some physical structure indexed by location.
- Similar to the regression problem: discretize the problem by representing f in a basis.
- Unlike the regression problem: we do not observe f directly, but more general linear functionals of it.
Computational Imaging Example: Range profiling using deconvolution
Send a pulse out (of electromagnetic or acoustic energy) and listen to the echo.
Applications: radar imaging, underwater acoustic imaging, seismic imaging, medical imaging, channel equalization in wireless communications, image deblurring.
Computational Imaging Example: Range profiling using deconvolution
Send a pulse p(t) out, and receive back a signal y(t):
  y(t) = ∫ f(s) p(t - s) ds.
Assuming f(t) is time-limited, let {ψ_n} be a basis for L^2([0, T]) and write
  f(t) = ∑_{n=1}^N x_n ψ_n(t).
This leads to
  y(t) = ∑_{n=1}^N x_n ∫ ψ_n(s) p(t - s) ds.
Computational Imaging Example: Range profiling using deconvolution
We only observe a finite set of samples of y(t):
  y_m := y(t_m) = ∑_{n=1}^N x_n ∫ ψ_n(s) p(t_m - s) ds = ∑_n A[m, n] x_n,
where
  A[m, n] = ∫ ψ_n(s) p(t_m - s) ds = ⟨p_m, ψ_n⟩,  with p_m(s) = p(t_m - s).
We can write the deconvolution problem as y = Ax, and a solution can be synthesized using
  f̂(t) = ∑_{n=1}^N x̂_n ψ_n(t).
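A minimal numerical sketch of the discretized deconvolution problem, assuming a pixel (shifted-impulse) basis, a Gaussian pulse, and a synthetic sparse reflectivity; all three choices are illustrative, not the ones used in the talk.

```python
import numpy as np

# discretize on a pixel basis: A becomes a (Toeplitz) convolution matrix
rng = np.random.default_rng(2)
N = 64
p = np.exp(-0.5 * (np.arange(-8, 9) / 2.0) ** 2)     # a Gaussian pulse (assumed)
x = np.zeros(N); x[[10, 30, 45]] = [1.0, -0.5, 0.8]  # sparse reflectivity (assumed)

# A[m, n] = p(t_m - s_n): column n is the pulse shifted to position n
A = np.column_stack([np.convolve(np.eye(N)[:, n], p, mode="same") for n in range(N)])
y = A @ x + 0.01 * rng.standard_normal(N)            # observed echo samples
print(A.shape, y.shape)                              # (64, 64) (64,)
```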
Computational Imaging Example: Tomographic reconstruction
Tomography: learn about the interior of an object while only taking measurements on the exterior. The measurements are line integrals (the Radon transform):
  R[f] = ∫_L f(s, t) dℓ.
Computational Imaging Example: Tomographic reconstruction
Expanding f in a basis,
  f(s, t) = ∑_{γ ∈ Γ} x_γ ψ_γ(s, t)  ⇒  y_m = R_{r_m, θ_m}[f(s, t)].
The resulting problem is a linear inverse problem:
  y = Ax,   A[m, n] = R_{r_m, θ_m}[ψ_n].
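A very crude sketch of assembling y = Ax for tomography, assuming a pixel basis and only axis-aligned (horizontal and vertical) rays; a real scanner measures many angles, so this only illustrates how the matrix is built, not a realistic geometry.

```python
import numpy as np

# crude sketch: pixel basis, projections along rows and columns only
n = 16
f = np.zeros((n, n)); f[4:12, 6:10] = 1.0   # simple synthetic phantom
x = f.ravel()                               # stack pixels into the vector x

rows = np.kron(np.eye(n), np.ones((1, n)))  # horizontal line sums
cols = np.kron(np.ones((1, n)), np.eye(n))  # vertical line sums
A = np.vstack([rows, cols])                 # A[m, n] = length of ray m inside pixel n
y = A @ x                                   # measured line integrals
print(A.shape, y.shape)                     # (32, 256) (32,)
```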
Least-Squares formulation
LS framework: find an x that minimizes the length of the residual r = y - Ax, i.e., solve the optimization problem
  minimize_{x ∈ R^N} ||y - Ax||_2^2.
If A is written using its singular value decomposition,
  A = U Σ V^T,  U ∈ R^{M×R}, Σ ∈ R^{R×R}, V ∈ R^{N×R},
then the solution to the least-squares problem is
  x_ls = V Σ^{-1} U^T y.
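A small numpy sketch of the SVD formula above, checked against numpy's built-in least-squares solver; the matrix and data are random placeholders.

```python
import numpy as np

rng = np.random.default_rng(3)
M, N = 30, 10
A = rng.standard_normal((M, N))
y = rng.standard_normal(M)

# thin SVD; here A has full rank, so all singular values are kept
U, s, Vt = np.linalg.svd(A, full_matrices=False)
x_ls = Vt.T @ ((U.T @ y) / s)          # x_ls = V Sigma^{-1} U^T y

# agrees with numpy's least-squares solver
x_ref, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.allclose(x_ls, x_ref))        # True
```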
Least-squares solution
  x_ls = V Σ^{-1} U^T y
When y = Ax has
- an exact solution: it must be x_ls.
- no exact solution: x_ls is a solution of the least-squares problem.
- infinitely many solutions: x_ls is the one with the smallest norm.
The solution can be written in compact form: x_ls = A† y.
A† (= V Σ^{-1} U^T) is called the pseudo-inverse!
Pseudo-inverse
- A is square and invertible: A† = A^{-1}
- A has full column rank: A† = (A^T A)^{-1} A^T
- A has full row rank: A† = A^T (A A^T)^{-1}
- Otherwise (for a low-rank matrix): A† = V Σ^{-1} U^T
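A quick numerical check of the full-column-rank and full-row-rank identities above using numpy's pinv; the matrices are random placeholders.

```python
import numpy as np

rng = np.random.default_rng(4)

# full column rank (tall): pinv(A) == (A^T A)^{-1} A^T
A = rng.standard_normal((8, 3))
print(np.allclose(np.linalg.pinv(A), np.linalg.inv(A.T @ A) @ A.T))  # True

# full row rank (wide): pinv(B) == B^T (B B^T)^{-1}
B = rng.standard_normal((3, 8))
print(np.allclose(np.linalg.pinv(B), B.T @ np.linalg.inv(B @ B.T)))  # True
```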
Stable Reconstructions
Two important methods:
- Truncated SVD
- Tikhonov regularization
Truncated SVD
The solution of
  minimize_{x ∈ R^N} ||y - Ax||_2^2
is given by
  x = V Σ^{-1} U^T y = ∑_{r=1}^R (u_r^T y / σ_r) v_r = ∑_{r=1}^R (u_r^T (y_true + δ) / σ_r) v_r.
If σ_r ≈ 0, the noise δ is amplified and corrupts the solution.
Truncated SVD: throw away the contributions with σ_r < ε. Assuming σ_1, ..., σ_K > ε,
  x_tsvd = ∑_{r=1}^K (u_r^T y / σ_r) v_r.
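A sketch comparing TSVD with the naive pseudo-inverse on a synthetic ill-conditioned problem; the decay rate 0.5^n, the noise level, and the threshold ε = 0.1 are all arbitrary choices for illustration.

```python
import numpy as np

def tsvd_solve(A, y, eps):
    """Truncated-SVD solution: drop all contributions with sigma_r < eps."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    K = int(np.sum(s > eps))                     # keep sigma_1, ..., sigma_K
    return Vt[:K].T @ ((U[:, :K].T @ y) / s[:K])

rng = np.random.default_rng(5)
A = rng.standard_normal((50, 20)) * (0.5 ** np.arange(20))  # rapidly decaying spectrum
x_true = rng.standard_normal(20)
y = A @ x_true + 0.01 * rng.standard_normal(50)

x_naive = np.linalg.pinv(A) @ y                         # tiny sigma_r amplify the noise
print(np.linalg.norm(x_naive - x_true))                 # noise-dominated, large error
print(np.linalg.norm(tsvd_solve(A, y, 0.1) - x_true))   # stabilized, much smaller error
```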
Tikhonov regularization
  minimize_{x ∈ R^N} ||y - Ax||_2^2 + λ ||x||_2^2
The solution is obtained by setting the gradient to zero:
  x_tik = (A^T A + λI)^{-1} A^T y.
Tikhonov reconstruction in SVD form:
  x_tik = ∑_{r=1}^R (σ_r / (σ_r^2 + λ)) (u_r^T y) v_r.
Generalized Tikhonov regularization:
  minimize_{x ∈ R^N} ||y - Ax||_2^2 + λ ||Lx||_2^2  ⇒  x_gen-tik = (A^T A + λ L^T L)^{-1} A^T y.
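A sketch of the Tikhonov formula above via the normal equations; the matrix, noise level, and λ values are illustrative assumptions.

```python
import numpy as np

def tikhonov_solve(A, y, lam):
    """x_tik = (A^T A + lam I)^{-1} A^T y, via a solve instead of an explicit inverse."""
    N = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(N), A.T @ y)

rng = np.random.default_rng(6)
A = rng.standard_normal((40, 40)) * (0.7 ** np.arange(40))  # ill-conditioned matrix
x_true = rng.standard_normal(40)
y = A @ x_true + 0.01 * rng.standard_normal(40)

for lam in [0.0, 1e-4, 1e-2]:
    err = np.linalg.norm(tikhonov_solve(A, y, lam) - x_true)
    print(f"lambda = {lam:g}: error = {err:.3f}")  # a small positive lambda stabilizes
```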
Two types of methods:
- Iterative methods specifically for linear problems: Krylov subspace methods {Arnoldi, Lanczos, conjugate gradients (CG, BiCG, NLCG, etc.), GMRES, IDR, ...}
- Convex optimization: iterate x_{k+1} = x_k + α_k s_k, where
  - s_k is a descent direction (gradient descent: s_k = -∇f; Newton's method: s_k = -H(f)^{-1} ∇f)
  - α_k is a step size, chosen by a line search method
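A minimal gradient-descent sketch for the least-squares objective, using a fixed step size α = 1/||A||_2^2 instead of a line search; this simplification is the classical Landweber iteration, and the problem data are synthetic.

```python
import numpy as np

# gradient descent on f(x) = ||y - Ax||_2^2; the gradient is -2 A^T (y - Ax)
rng = np.random.default_rng(7)
A = rng.standard_normal((30, 10))
x_true = rng.standard_normal(10)
y = A @ x_true

x = np.zeros(10)
alpha = 1.0 / np.linalg.norm(A, 2) ** 2     # fixed step from the spectral norm
for k in range(500):
    x = x + alpha * A.T @ (y - A @ x)       # s_k = -grad f (up to a factor of 2)
print(np.linalg.norm(x - x_true))           # close to 0
```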
Sparse reconstruction
  minimize_{x ∈ R^N} ||y - Ax||_2^2 + λ ||x||_1
- ℓ_2 regularization induces smoothness.
- ℓ_1 regularization induces sparsity: the ℓ_1 ball is very "pointy" in high dimensions, so minimizers tend to land on its corners, where many entries are exactly zero.
- The above problem is also known as the LASSO in statistics.
- Widely used in statistics, machine learning, and signal processing.
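A minimal sketch of one standard solver for this problem, iterative soft-thresholding (ISTA); ISTA is a well-known method but not necessarily the one behind the slides' experiments, and the data below are synthetic.

```python
import numpy as np

def soft_threshold(z, tau):
    """Proximal map of tau * ||.||_1: shrinks each entry toward zero."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def ista(A, y, lam, iters=500):
    """Iterative soft-thresholding for min ||y - Ax||_2^2 + lam ||x||_1."""
    x = np.zeros(A.shape[1])
    alpha = 1.0 / np.linalg.norm(A, 2) ** 2      # step size from the spectral norm
    for _ in range(iters):
        x = soft_threshold(x + alpha * A.T @ (y - A @ x), alpha * lam / 2)
    return x

rng = np.random.default_rng(8)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100); x_true[[5, 37, 80]] = [2.0, -1.5, 1.0]  # sparse signal
y = A @ x_true + 0.01 * rng.standard_normal(40)
x_hat = ista(A, y, lam=0.1)
print(np.flatnonzero(np.abs(x_hat) > 0.1))  # should recover the support {5, 37, 80}
```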
Image denoising
  minimize_{x ∈ R^N} ||y - Ix||_2^2
[Figures: true image, noisy image, Tikhonov-regularized reconstruction]
Image deblurring
  minimize_{x ∈ R^N} ||y - Dx||_2^2
[Figures: true image, blurred image, sparse-regularized reconstruction]
X-Ray Tomography
  minimize_{x ∈ R^N} ||y - Wx||_2^2
[Figures: true image, projection data]
X-Ray Tomography
[Figures: true image, LSQR, Tikhonov-regularized, and sparse-regularized reconstructions]
Summary
- Inverse problems are an active field of research and arise in many applications, including computational imaging, machine learning, and remote sensing.
- Least-squares is a popular choice for inversion.
- Stable reconstructions are important, hence the regularization.
- Sparse reconstruction methods have gained popularity in the last two decades.
Thank you!
If you are interested in the topic, join us on the journey!
Current Team Members: Tristan van Leeuwen, Sarah Gaaf, Nick Luiken, Ajinkya Kadu
Opportunities:
- Undergraduate/Graduate Thesis
- Summer Research Project