Nonlinear Analysis with Frames. Part III: Algorithms


Radu Balan
Department of Mathematics, AMSC, CSCAMM and NWC, University of Maryland, College Park, MD
July 28-30, 2015
Modern Harmonic Analysis and Applications Summer Graduate Program, University of Maryland, College Park, MD 20742

Thanks to our sponsors: This material is based upon work supported by the National Science Foundation under Grants No. DMS-1413249, DMS-1501640. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

Table of Contents
1. Problem Formulation
2. PhaseLift
3. IRLS

Problem Formulation: The Phase Retrieval Problem

Let $H = \mathbb{C}^n$. Consider the quotient space $\hat{H} = \mathbb{C}^n / T^1$, with classes induced by the relation $x \sim y$ if there is a real $\varphi$ with $x = e^{i\varphi} y$. Given a frame $\mathcal{F} = \{f_1, \ldots, f_m\} \subset \mathbb{C}^n$, define the nonlinear maps
\[
\alpha : \hat{H} \to \mathbb{R}^m, \quad \alpha(x) = \left( |\langle x, f_k \rangle| \right)_{1 \le k \le m},
\]
\[
\beta : \hat{H} \to \mathbb{R}^m, \quad \beta(x) = \left( |\langle x, f_k \rangle|^2 \right)_{1 \le k \le m}.
\]
The frame is said to be phase retrievable (or to give phase retrieval) if $\alpha$ (equivalently, $\beta$) is injective.

The general phase retrieval problem, a.k.a. phaseless reconstruction: decide when a given frame is phase retrievable and, if so, find an algorithm that recovers $x$ from $y = \alpha(x)$ (or from $y = \beta(x)$) up to a global phase factor.
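
As a concrete illustration of the maps $\alpha$ and $\beta$, here is a minimal sketch; the frame size, the random Gaussian frame, and all variable names are illustrative choices, not part of the lecture:

```python
import numpy as np

# Minimal sketch: evaluate alpha(x) and beta(x) for a random complex Gaussian frame.
rng = np.random.default_rng(0)
n, m = 4, 10
f = (rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))) / np.sqrt(2)  # rows are f_k
x = rng.standard_normal(n) + 1j * rng.standard_normal(n)

alpha = np.abs(f.conj() @ x)        # alpha(x)_k = |<x, f_k>| = |f_k^* x|
beta = alpha ** 2                   # beta(x)_k  = |<x, f_k>|^2

# Both maps are invariant to a global phase, so they are well defined on the quotient H^:
assert np.allclose(beta, np.abs(f.conj() @ (np.exp(0.7j) * x)) ** 2)
```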

Problem Formulation: Algorithms

Our problem today: assume $\mathcal{F}$ is phase retrievable; we want reconstruction algorithms.

- Recursive projections: Gerchberg-Saxton.
- Matrix estimation: PhaseLift (Candès, Strohmer, Voroninski '12; Candès-Li).
- Signal estimation: Iterative Regularized Least Squares (IRLS), Wirtinger Flow (Candès '14).
- Algorithms for special frames: reconstruction via polarization (Alexeev, Bandeira, Fickus, Mixon; Bodmann, Hammen), Fourier transform (Lim & co. MIT; Bates '82; Bal '09; PhaseLift with masking; 4n-4 by Bodmann, Hammen), shift-invariant frames (Iwen, Viswanathan, Wang), high redundancy (BBCE '09).
- Algorithms for special signals: sparse signals (e.g., Iwen, Viswanathan, Wang).

PhaseLift: The Idea

Consider the noiseless case $y = \beta(x)$, and let $\mathcal{A}$ denote the linear map $\mathcal{A}(X) = \left( \langle X f_k, f_k \rangle \right)_{1 \le k \le m}$, so that $\mathcal{A}(xx^*) = \beta(x)$. The main idea is embodied in the following feasibility problem:
\[
\text{find } X \quad \text{subject to} \quad X = X^* \succeq 0, \;\; \mathcal{A}(X) = y, \;\; \operatorname{rank}(X) = 1. \tag{Feas}
\]
Alternatively, since there is a unique rank-one matrix satisfying these constraints:
\[
\min \operatorname{rank}(X) \quad \text{subject to} \quad X = X^* \succeq 0, \;\; \mathcal{A}(X) = y. \tag{L0}
\]
Were it not for the objective $\operatorname{rank}(X)$, the optimization problem would be convex.

PhaseLift: The Idea, cont'd

IDEA: Replace $\operatorname{rank}(X)$ by $\operatorname{trace}(X)$, as in the matrix completion problem. Once a solution $X$ is found, the vector $x$ is easily obtained from the rank-one factorization $X = xx^*$.

PhaseLift: The Algorithm
\[
\min \operatorname{trace}(X) \quad \text{subject to} \quad \mathcal{A}(X) = y, \;\; X = X^* \succeq 0, \tag{PhaseLift}
\]
which is a convex optimization problem (a semidefinite program, SDP).

Theorem (Candès-Li 2014). Assume each vector $f_k$ is drawn independently from $\mathcal{N}(0, I_n/2) + i\,\mathcal{N}(0, I_n/2)$, or each vector is drawn independently from the uniform distribution on the complex sphere of radius $\sqrt{n}$. Then there are universal constants $c_0, c_1, \gamma > 0$ so that, for $m \ge c_0 n$, for every $x \in \mathbb{C}^n$ the problem (PhaseLift) has the same solution as (L0) with probability at least $1 - c_1 e^{-\gamma n}$.

Hand & Demanet (2013) showed that (PhaseLift) is in essence a feasibility problem: the trace objective can be dropped.
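
Below is a minimal sketch of the (PhaseLift) semidefinite program in the noiseless case, written with NumPy and CVXPY (assumed available); the problem sizes, the Gaussian frame, and all variable names are illustrative and not taken from the lecture.

```python
import numpy as np
import cvxpy as cp

# Minimal PhaseLift sketch (noiseless case): lift x to X = x x^* and solve an SDP.
rng = np.random.default_rng(0)
n, m = 8, 48
f = (rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))) / np.sqrt(2)  # rows are f_k
x_true = rng.standard_normal(n) + 1j * rng.standard_normal(n)
y = np.abs(f.conj() @ x_true) ** 2                     # y = beta(x) = (|<x, f_k>|^2)_k

X = cp.Variable((n, n), hermitian=True)                # lifted variable
constraints = [X >> 0]
constraints += [cp.real(cp.trace(np.outer(f[k], f[k].conj()) @ X)) == y[k]   # A(X)_k = f_k^* X f_k
                for k in range(m)]
prob = cp.Problem(cp.Minimize(cp.real(cp.trace(X))), constraints)
prob.solve()

# Recover x (up to a global phase) from the principal eigenpair of the solution.
w, V = np.linalg.eigh(X.value)
x_hat = np.sqrt(max(w[-1], 0.0)) * V[:, -1]
```

By the theorem above, with enough random measurements the solution coincides with $xx^*$ with high probability, so the principal eigenpair recovers $x$ up to a global phase.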

PhaseLift: Performance Bounds

Consider the measurement model in the presence of noise, $y = \beta(x) + \nu$. Modify the optimization problem:
\[
\min_{X = X^* \succeq 0} \; \|\mathcal{A}(X) - y\|_1. \tag{PL2}
\]
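
A sketch of the robust variant (PL2), again with CVXPY; the function name and argument conventions are illustrative choices.

```python
import numpy as np
import cvxpy as cp

def phaselift_l1(f, y):
    """Sketch of (PL2): minimize ||A(X) - y||_1 over Hermitian X >= 0.

    f is an (m, n) array whose rows are the frame vectors f_k; y holds noisy measurements.
    """
    m, n = f.shape
    X = cp.Variable((n, n), hermitian=True)
    resid = cp.hstack([cp.real(cp.trace(np.outer(f[k], f[k].conj()) @ X)) - y[k]
                       for k in range(m)])
    prob = cp.Problem(cp.Minimize(cp.norm1(resid)), [X >> 0])
    prob.solve()
    return X.value
```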

PhaseLift: Performance Bounds, cont'd

The modified PhaseLift algorithm is robust to noise:

Theorem (Candès-Li 2014). Consider the same stochastic process for the random frame $\mathcal{F}$. There is a universal constant $C_0 > 0$ so that for all $x \in \mathbb{C}^n$ the solution $X$ to (PL2) obeys
\[
\|X - xx^*\|_2 \le C_0 \, \frac{\|\nu\|_1}{m}.
\]
For the Gaussian model this holds with the same probability as in the noiseless case, whereas the probability of failure is exponentially small in $n$ in the uniform model. The principal eigenvector $x_0$ of $X$ (normalized by the square root of the principal eigenvalue) obeys
\[
D_2(x_0, x) \le C_0 \, \min\!\left( \|x\|_2, \; \frac{\|\nu\|_1}{m \, \|x\|_2} \right).
\]
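
The normalization in the theorem and the quotient metric $D_2(x, y) = \min_{\varphi} \|x - e^{i\varphi} y\|_2$ are easy to spell out in code; the following sketch (with illustrative names) shows both.

```python
import numpy as np

def principal_vector(X):
    """Principal eigenvector of X scaled by the square root of the principal eigenvalue."""
    w, V = np.linalg.eigh(X)                    # eigenvalues in ascending order
    return np.sqrt(max(w[-1], 0.0)) * V[:, -1]

def d2(x, y):
    """Natural metric on C^n / T^1: min over phi of ||x - e^{i phi} y||_2."""
    inner = np.vdot(y, x)                       # <x, y>; the optimal phase aligns y with x
    phase = inner / abs(inner) if abs(inner) > 0 else 1.0
    return np.linalg.norm(x - phase * y)

# Example use (X_hat from a PhaseLift solve, x_true the ground truth):
# x0 = principal_vector(X_hat); err = d2(x0, x_true)
```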

Iterative Regularized Least-Squares: The Idea

Consider the measurement process
\[
y_k = |\langle x, f_k \rangle|^2 + \nu_k, \quad 1 \le k \le m.
\]
The least-squares criterion
\[
\min_{x \in \mathbb{C}^n} \; \sum_{k=1}^m \left( |\langle x, f_k \rangle|^2 - y_k \right)^2
\]
can be understood as the maximum likelihood estimator (MLE) when the noise vector $\nu \in \mathbb{R}^m$ is normally distributed with zero mean and covariance $\sigma^2 I_m$. However, the optimization problem is not convex and has many local minima.

Iterative Regularized Least-Squares: The Idea, cont'd

Consider the following optimization criterion:
\[
J(u, v; \lambda, \mu) = \sum_{k=1}^m \left| \tfrac{1}{2} \left( \langle u, f_k \rangle \langle f_k, v \rangle + \langle v, f_k \rangle \langle f_k, u \rangle \right) - y_k \right|^2 + \lambda \|u\|_2^2 + \mu \|u - v\|_2^2 + \lambda \|v\|_2^2.
\]
The Iterative Regularized Least-Squares (IRLS) algorithm is based on the alternating minimization
\[
x_{t+1} = \operatorname*{argmin}_u \, J(u, x_t; \lambda_t, \mu_t).
\]
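
A direct transcription of the criterion $J$ into code, as a sketch; the array conventions mirror the earlier snippets and are my own choices.

```python
import numpy as np

def J(u, v, f, y, lam, mu):
    """Evaluate J(u, v; lambda, mu) for a frame f (rows f_k) and measurements y."""
    zu = f.conj() @ u                                  # <u, f_k> = f_k^* u
    zv = f.conj() @ v                                  # <v, f_k> = f_k^* v
    data = 0.5 * (zu * zv.conj() + zv * zu.conj())     # (1/2)(<u,f_k><f_k,v> + <v,f_k><f_k,u>)
    fit = np.sum((data.real - y) ** 2)
    return (fit + lam * np.linalg.norm(u) ** 2
                + mu * np.linalg.norm(u - v) ** 2
                + lam * np.linalg.norm(v) ** 2)
```

Note that $J(x, x; 0, 0)$ reduces to the nonconvex least-squares misfit from the previous slide.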

Iterative Regularized Least-Squares: The Algorithm (Initialization)

Step 1. Initialization. Compute the principal eigenvector of $R_y = \sum_{k=1}^m y_k f_k f_k^*$ using, e.g., the power method. Let $(e_1, a_1)$ be the principal eigenpair, with $e_1 \in \mathbb{C}^n$ and $a_1 \in \mathbb{R}$. If $a_1 \le 0$ then set $x = 0$ and exit. Otherwise initialize:
\[
x^{(0)} = \sqrt{\frac{(1 - \rho)\, a_1}{\sum_{k=1}^m |\langle e_1, f_k \rangle|^4}} \; e_1, \tag{3.1}
\]
\[
\lambda_0 = \rho\, a_1, \tag{3.2}
\]
\[
\mu_0 = \rho\, a_1, \tag{3.3}
\]
\[
t = 0. \tag{3.4}
\]
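
A sketch of Step 1 with a simple power iteration; the value of $\rho$, the iteration count, and the seed are illustrative choices.

```python
import numpy as np

def irls_init(f, y, rho=0.01, n_iter=200, seed=0):
    """Sketch of Step 1: principal eigenpair (e_1, a_1) of R_y = sum_k y_k f_k f_k^*,
    then the scaling (3.1)-(3.4). Returns x^(0) and a_1 (lambda_0 = mu_0 = rho * a_1)."""
    m, n = f.shape
    Ry = (f.T * y) @ f.conj()                          # R_y = sum_k y_k f_k f_k^*
    rng = np.random.default_rng(seed)
    e = rng.standard_normal(n) + 1j * rng.standard_normal(n)
    for _ in range(n_iter):                            # power iteration
        e = Ry @ e
        e /= np.linalg.norm(e)
    a1 = np.real(np.vdot(e, Ry @ e))                   # Rayleigh quotient ~ principal eigenvalue
    if a1 <= 0:
        return np.zeros(n, dtype=complex), a1
    scale = np.sqrt((1.0 - rho) * a1 / np.sum(np.abs(f.conj() @ e) ** 4))
    return scale * e, a1
```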

Iterative Regularized Least-Squares: The Algorithm (Iterations)

Step 2. Iteration. Perform:
2.1 Solve the least-squares problem $x_{t+1} = \operatorname*{argmin}_u J(u, x_t; \lambda_t, \mu_t)$ using the conjugate gradient method.
2.2 Update: $\lambda_{t+1} = \gamma \lambda_t$, $\mu_{t+1} = \max(\gamma \mu_t, \mu_{\min})$, $t = t + 1$.
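
For fixed $v = x_t$, the data term of $J$ is real-linear in $u$, so step 2.1 is an ordinary regularized least-squares problem in the real and imaginary parts of $u$. The sketch below solves the normal equations directly for clarity rather than by conjugate gradient as in the slide; all names are illustrative.

```python
import numpy as np

def irls_inner_step(f, y, v, lam, mu):
    """Sketch of x_{t+1} = argmin_u J(u, v; lam, mu) for fixed v.

    Writing u = a + i b, the data term (1/2)(<u,f_k><f_k,v> + <v,f_k><f_k,u>) equals
    Re(c_k^* u) with c_k = <v, f_k> f_k, so the subproblem is linear least squares in (a, b).
    """
    m, n = f.shape
    c = (f.conj() @ v)[:, None] * f            # c_k = (f_k^* v) f_k
    M = np.hstack([c.real, c.imag])            # Re(c_k^* u) = Re(c_k).a + Im(c_k).b
    zv = np.concatenate([v.real, v.imag])
    # Normal equations of ||M z - y||^2 + lam ||z||^2 + mu ||z - zv||^2:
    G = M.T @ M + (lam + mu) * np.eye(2 * n)
    rhs = M.T @ y + mu * zv
    z = np.linalg.solve(G, rhs)
    return z[:n] + 1j * z[n:]
```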

Iterative Regularized Least-Squares: The Algorithm (Stopping)

Step 3. Stopping. Repeat Step 2 until:
- the error criterion is achieved: $J(x_t, x_t; 0, 0) < \varepsilon$; or
- the desired signal-to-noise ratio is reached: $\|x_t\|_2^2 / J(x_t, x_t; 0, 0) > \mathrm{snr}$; or
- the maximum number of iterations is reached: $t > T$.

The final estimate can be $x_T$, or the best estimate obtained along the iteration path: $x_{\mathrm{est}} = x_{t_0}$, where $t_0 = \operatorname*{argmin}_t J(x_t, x_t; 0, 0)$.
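
Putting Steps 1-3 together, here is a sketch of the full loop, reusing irls_init and irls_inner_step from the snippets above; the default parameter values are illustrative, not the ones recommended in the lecture.

```python
import numpy as np

def irls(f, y, rho=0.01, gamma=0.95, mu_min=1e-6, eps=1e-8, snr=1e6, T=1000):
    """Sketch of the full IRLS loop with the three stopping rules from Step 3."""
    x, a1 = irls_init(f, y, rho)
    if a1 <= 0:
        return x
    lam = mu = rho * a1
    best_x, best_J = x, np.inf
    for t in range(T):
        x = irls_inner_step(f, y, x, lam, mu)
        J0 = np.sum((np.abs(f.conj() @ x) ** 2 - y) ** 2)      # J(x_t, x_t; 0, 0)
        if J0 < best_J:                                         # track the best iterate
            best_x, best_J = x, J0
        if J0 < eps or np.linalg.norm(x) ** 2 > snr * J0:       # error / SNR criteria
            break
        lam, mu = gamma * lam, max(gamma * mu, mu_min)
    return best_x
```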

Iterative Regularized Least-Squares: Performance Bounds

Theorem (B. 2013). Fix $0 \neq z_0 \in \mathbb{C}^n$. Assume the frame $\mathcal{F}$ is such that $\ker \mathcal{A} \cap \mathcal{S}^{2,1} = \{0\}$. Then there is a constant $A_3 > 0$ that depends on $\mathcal{F}$ so that for every $x \in \Omega_{z_0}$ and every noise $\nu$ producing $y = \beta(x) + \nu$, if there are $u, v \in \mathbb{C}^n$ with $J(u, v; \lambda, \mu) < J(x, x; \lambda, \mu)$, then
\[
\left\| [u, v] - xx^* \right\|_1 \le \frac{4\lambda}{A_3} + \frac{2\|\nu\|_2}{\sqrt{A_3}}, \tag{3.5}
\]
where $[u, v] = \tfrac{1}{2}(uv^* + vu^*)$ denotes the symmetric outer product. Moreover, let $[u, v] = a_+ e_+ e_+^* + a_- e_- e_-^*$ be its spectral factorization, with $a_+ \ge 0 \ge a_-$ and $\|e_+\| = \|e_-\| = 1$. Set $\hat{x} = \sqrt{a_+}\, e_+$. Then
\[
D_2(\hat{x}, x)^2 \le \frac{4\lambda}{A_3} + \frac{2\|\nu\|_2}{\sqrt{A_3}} + \frac{\|\nu\|_2^2}{4\mu} + \frac{\lambda \|x\|_2^2}{2\mu}. \tag{3.6}
\]

Thank you! Questions?

References

B. Alexeev, A.S. Bandeira, M. Fickus, D.G. Mixon, Phase retrieval with polarization, SIAM J. Imaging Sci., 7(1):35-66, 2014.

R. Balan, B. Bodmann, P. Casazza, D. Edidin, Painless reconstruction from magnitudes of frame coefficients, J. Fourier Anal. Appl., 15(4):488-501, 2009.

R. Balan, Reconstruction of signals from magnitudes of redundant representations: the complex case, Found. Comput. Math., 2015; arXiv:1304.1839, http://dx.doi.org/10.1007/s10208-015-9261-0.

B. Bodmann, N. Hammen, Algorithms and error bounds for noisy phase retrieval with low-redundancy frames, arXiv:1412.6678, Dec. 2014.

E.J. Candès, T. Strohmer, V. Voroninski, PhaseLift: exact and stable signal recovery from magnitude measurements via convex programming, Comm. Pure Appl. Math., 66(8):1241-1274, 2013.

E.J. Candès, X. Li, Solving quadratic equations via PhaseLift when there are about as many equations as unknowns, Found. Comput. Math., 14(5):1017-1026, 2014.

E.J. Candès, X. Li, M. Soltanolkotabi, Phase retrieval from coded diffraction patterns, arXiv:1310.3240, 2013.

E.J. Candès, X. Li, M. Soltanolkotabi, Phase retrieval via Wirtinger flow: theory and algorithms, IEEE Trans. Inform. Theory, 61(4):1985-2007, 2015.

R.W. Gerchberg, A practical algorithm for the determination of phase from image and diffraction plane pictures, Optik, 35:237, 1972.

M. Iwen, A. Viswanathan, Y. Wang, Robust sparse phase retrieval made easy, 2014.

J.R. Fienup, Phase retrieval algorithms: a comparison, Applied Optics, 21(15):2758-2769, 1982.