Gauge optimization and duality
Gauge optimization and duality
Junfeng Yang, Department of Mathematics, Nanjing University
Joint work with Shiqian Ma, CUHK
September 2015
Outline
- Introduction
- Duality: Lagrange duality, Fenchel duality, Gauge duality
Gauge function

Definition (gauge function). A function f : ℝⁿ → ℝ ∪ {+∞} is called a gauge function if
1. f is convex: for all x, y ∈ ℝⁿ and α ∈ (0, 1), f(αx + (1−α)y) ≤ αf(x) + (1−α)f(y);
2. f is positively homogeneous: for all x ∈ ℝⁿ and λ > 0, f(λx) = λf(x);
3. f is nonnegative: f(x) ≥ 0 for all x;
4. f vanishes at the origin: f(0) = 0.

Norms and semi-norms are particular gauge functions.
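The four defining properties can be checked numerically for a concrete gauge. A minimal sketch, using the ℓ₁ norm (the prototypical gauge) and randomly sampled points:

```python
import random

def l1(x):
    """The l1 norm, a prototypical gauge function."""
    return sum(abs(xi) for xi in x)

random.seed(0)
for _ in range(1000):
    x = [random.uniform(-1, 1) for _ in range(3)]
    y = [random.uniform(-1, 1) for _ in range(3)]
    lam = random.uniform(0.1, 10.0)
    alpha = random.uniform(0.0, 1.0)
    # positive homogeneity: f(lam * x) = lam * f(x)
    assert abs(l1([lam * xi for xi in x]) - lam * l1(x)) < 1e-9
    # convexity: f(alpha*x + (1-alpha)*y) <= alpha*f(x) + (1-alpha)*f(y)
    z = [alpha * xi + (1 - alpha) * yi for xi, yi in zip(x, y)]
    assert l1(z) <= alpha * l1(x) + (1 - alpha) * l1(y) + 1e-9
# nonnegativity is immediate from abs(); the gauge vanishes at the origin:
assert l1([0.0, 0.0, 0.0]) == 0.0
```

Random sampling of course only probes the properties; for the ℓ₁ norm they hold exactly by the triangle inequality and absolute homogeneity.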
The problem

We are interested in the following gauge optimization problem:
    min_x { κ(x) : ρ(Ax − b) ≤ ε },
where κ and ρ are gauge functions with ρ⁻¹(0) = {0}, and A, b and ε > 0 are given data.
Sparsity
Sparse solutions

Problem: find a sparse solution x,
    min_x { ‖x‖₀ : Ax = b },
or find x such that the residual is sparse,
    min_x ‖Ax − b‖₀.
Drawback: combinatorial optimization, difficult.
Sparse representation

- Represent a signal y as a superposition of elementary signal atoms: y = Σᵢ φᵢxᵢ, or y = Φx;
- the representation is unique if the basis is orthonormal;
- overcomplete dictionaries offer more flexibility, but the representation is not unique;
- a "good" transform leads to fast decay in the coefficients.
Sparse representation (figure slides; illustrations omitted in transcription)
LASSO/BP/BPDN

The LASSO model (Tibshirani, 1996):
    min_x { ‖Ax − b‖₂ : ‖x‖₁ ≤ k }.
The BP/BPDN model (Donoho et al., 1998; Candès et al., 2006):
    min_x { ‖x‖₁ : ‖Ax − b‖₂ ≤ ε }.
Joint sparsity (Malioutov-Cetin-Willsky, 2003):
    min_X { Σⱼ ‖X(:, j)‖₂ : ‖AX − B‖₂ ≤ ε }.
The last two are gauge optimization problems.
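The constrained BPDN model above is closely related to the penalized form min_x ½‖Ax − b‖² + λ‖x‖₁, which a simple proximal-gradient (ISTA) loop can solve. A minimal sketch with made-up data A, b, λ (not from the talk), showing that the recovered solution is sparse:

```python
# ISTA sketch for the penalized l1 least-squares problem
#   min_x 0.5*||A x - b||_2^2 + lam*||x||_1
# A, b, lam are illustrative, made-up data.
A = [[1.0, 0.2], [0.1, 1.0], [0.3, 0.1]]
b = [1.0, 0.1, 0.2]
lam = 0.3

def matvec(M, v):
    return [sum(mi * vi for mi, vi in zip(row, v)) for row in M]

def grad(x):
    # gradient of the smooth part: A^T (A x - b)
    r = [ai - bi for ai, bi in zip(matvec(A, x), b)]
    return [sum(A[i][j] * r[i] for i in range(len(A))) for j in range(len(x))]

def soft(v, t):
    # soft-thresholding: the proximal operator of t*||.||_1, componentwise
    return [max(abs(vi) - t, 0.0) * (1.0 if vi >= 0 else -1.0) for vi in v]

x = [0.0, 0.0]
step = 0.5  # safe: the largest eigenvalue of A^T A is about 1.41 here
for _ in range(500):
    g = grad(x)
    x = soft([xi - step * gi for xi, gi in zip(x, g)], step * lam)

print(x)  # converges to (0.7, 0.0): the second coefficient is exactly zero
```

The thresholding step is what produces exact zeros, illustrating why ℓ₁ models promote sparsity.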
Matrix completion

Seek a matrix X ∈ ℝ^{m×n} with some desired structure (e.g., low rank) that matches certain observations, possibly noisy:
    min_X { κ(X) : ‖A(X) − b‖ ≤ ε },
where A is a linear mapping (e.g., observation of certain entries of X). Setting κ to the nuclear norm, i.e., the sum of all singular values, promotes low-rank solutions. Applications in recommender systems, e.g., Netflix, Amazon. (Recht et al., 2010)
Phase retrieval

In phase retrieval, one recovers the phase information of a signal, e.g., an image, from magnitude-only measurements (Candès et al., 2012; Waldspurger, 2015). Applications include X-ray crystallography, used to image the molecular structure of a crystal (Harrison, 1993).
- Unknown signal x ∈ ℂⁿ;
- measurements are given by b_k = |⟨x, a_k⟩|², k = 1, ..., m, for some vectors a_k that encode the waveforms used to illuminate the signal.
PhaseLift

These measurements can be understood as
    b_k = ⟨xx*, a_k a_k*⟩ = ⟨X, A_k⟩
of the lifted signal X := xx*, where A_k := a_k a_k* is the k-th lifted rank-1 measurement matrix. The PhaseLift convex model
    min_{X ∈ Sⁿ} { ⟨I, X⟩ + ι_{⪰0}(X) : ‖A(X) − b‖ ≤ ε }
also falls into the class of gauge optimization problems.
Image reconstruction

In image reconstruction, one recovers an unknown image from a set of linear measurements b = Ax + noise. The total variation model (Rudin-Osher-Fatemi, 1992):
    min_x { TV(x) : ‖Ax − b‖ ≤ ε },
where TV(x) := Σᵢ ‖Dᵢx‖ and A is
- the identity (denoising),
- a blurring operator (deconvolution),
- a partial convolution/wavelet operator (inpainting),
- a partial Fourier operator (MRI).
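The intuition behind TV regularization is that piecewise-constant signals have small total variation while oscillatory (noisy) signals have large total variation. A minimal 1-D sketch (the slide's TV is the 2-D image version with local difference operators Dᵢ; this 1-D analogue is an assumption for illustration):

```python
def tv(x):
    # anisotropic 1-D discrete total variation: sum of |x[i+1] - x[i]|
    return sum(abs(x[i + 1] - x[i]) for i in range(len(x) - 1))

piecewise = [0.0] * 5 + [1.0] * 5   # one unit jump
oscillating = [0.0, 1.0] * 5        # nine unit jumps

print(tv(piecewise), tv(oscillating))  # 1.0 vs 9.0
```

Minimizing TV subject to a data-fit constraint therefore favors cartoon-like reconstructions with sharp edges but few oscillations.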
Ill-posed problems

In deconvolution/deblurring, A is a convolution matrix (square and invertible). The least squares estimate (LSE) of x is
    x_LSE = arg min_x ‖Ax − f‖.
Figure: blurry and noisy observation f (left); direct inverse x_LSE (right).
Ill-posed problems

In MRI, the observed data satisfy f = PFx + noise, where
- F is the 2D Fourier matrix;
- P is a projection matrix.
Figure: k-space sampling (9.36%) (left); back projection (right).
RGB image recovery

Model:
    min_x { MTV(x) : ‖Kx − f‖ ≤ ε }.
Figure: cross-channel blur, Gaussian noise; input SNR 6.25dB; recovered SNR 17.53dB, CPU 2.78s, 10 iterations.
Deconvolution with impulsive noise

Model:
    min_x { MTV(x) : ‖Kx − f‖₁ ≤ ε }.
Figure: cross-channel blur; salt & pepper noise, 60% corruption; recovered SNR 11.74dB, CPU 18.06s, 35 iterations.
Wavelet inpainting

Model:
    min_x { TV(x) : ‖PWx − f‖ ≤ ε }.
Figure: back projection from 50% of the data vs. ADM reconstruction (SNR 29.1dB, CPU 46.2s).
MRI reconstruction

Model:
    min_x { TV(x) + τ‖Wx‖₁ : ‖PFx − f‖ ≤ ε }.
Figure: original (left); k-space samples (9.36%, white noise 10⁻³) (middle); recovered by ADM (relative error 0.48%, CPU 3s) (right).
Recovery from incomplete data

Model:
    min_x { TV(x) : ‖PKx − f‖ ≤ ε }.
Figure: Lena; Gaussian blur, 15×15; white noise 10⁻³; sampling ratios 5%, 3%, 1%; SNRs 13.32, 11.98, 6.14dB; CPU 43, 46, 57s.
Robust linear regression

Given a set of feature vectors aᵢ ∈ ℝⁿ and outcomes bᵢ ∈ ℝ, i = 1, ..., m, find weights x that predict the outcomes accurately, i.e., Ax ≈ b. The robust linear regression problem
    min_x ‖Ax − b‖₁
is less sensitive to outliers. Equivalent gauge problem:
    min_{x,y} { ‖y‖₁ : Ax + y = b }.
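The robustness claim can be seen on a toy 1-D instance (all aᵢ = 1, so the fit is a single location estimate; the data below are made up, with one outlier). The gauge reformulation simply sets y = b − Ax, so minimizing ‖y‖₁ over the constraint Ax + y = b is the same ℓ₁ fit. A brute-force grid search comparing the ℓ₁ and ℓ₂ fits:

```python
# Made-up 1-D data with one gross outlier (10.0)
b = [1.0, 1.1, 0.9, 10.0]

grid = [i / 100.0 for i in range(-500, 1500)]
# l1 fit: min_x sum_i |x - b_i|  (equivalently min {||y||_1 : x + y_i = b_i})
x_l1 = min(grid, key=lambda x: sum(abs(x - bi) for bi in b))
# l2 fit: min_x sum_i (x - b_i)^2
x_l2 = min(grid, key=lambda x: sum((x - bi) ** 2 for bi in b))

print(x_l1, x_l2)  # l1 stays near the inliers (~1.0); l2 is pulled to the mean 3.25
```

The ℓ₁ estimate lands in the median interval of the data, while the least-squares estimate is dragged toward the outlier.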
Semidefinite programming

The semidefinite programming (SDP) problem
    min_X { ⟨C, X⟩ : A(X) = b, X ⪰ 0 }
can also be reformulated as gauge optimization. To see this, choose a dual feasible point y so that Ĉ := C − A*y ⪰ 0. Then this SDP is equivalent to the gauge optimization problem
    min_X { ⟨Ĉ, X⟩ + ι_{⪰0}(X) : A(X) = b }.
Outline
- Introduction
- Duality: Lagrange duality, Fenchel duality, Gauge duality
Lagrange duality

Given a general nonlinear programming problem, referred to as the primal problem, there exists another closely related nonlinear programming problem, called the Lagrange dual problem. Under suitable regularity conditions, such as convexity and/or constraint qualifications, the primal and dual problems have equal optimal objective values.
Lagrange duality

Let f : ℝⁿ → ℝ, h : ℝⁿ → ℝᵐ, g : ℝⁿ → ℝ^q and X ⊆ ℝⁿ. The nonlinear constrained optimization problem:
    p* := inf_x { f(x) : h(x) = 0, g(x) ≤ 0, x ∈ X }.
The Lagrange function:
    L(x, λ, μ) = f(x) + ⟨λ, h(x)⟩ + ⟨μ, g(x)⟩.
The Lagrange dual problem:
    d* := sup_{λ,μ} { g(λ, μ) : μ ≥ 0 },
where g(λ, μ) := inf_x { L(x, λ, μ) : x ∈ X } is the dual objective.
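These definitions can be made concrete on a one-variable toy problem (made up for illustration): min x² subject to 1 − x ≤ 0, whose optimum is p* = 1 at x = 1. The Lagrangian L(x, μ) = x² + μ(1 − x) is minimized at x = μ/2, giving the dual objective g(μ) = μ − μ²/4:

```python
# Toy problem: min x^2 subject to 1 - x <= 0, so p* = 1 at x = 1.
# Dual objective: g(mu) = inf_x { x^2 + mu*(1 - x) } = mu - mu^2 / 4.
def g(mu):
    return mu - mu ** 2 / 4.0

p_star = 1.0
# weak duality: g(mu) <= p* for every mu >= 0
assert all(g(mu / 10.0) <= p_star + 1e-12 for mu in range(0, 1000))
# strong duality (Slater's condition holds, e.g. x = 2 is strictly feasible):
# the dual optimum mu* = 2 closes the gap
assert abs(g(2.0) - p_star) < 1e-12
```

Note that g is concave (a downward parabola), illustrating that the dual is always a concave maximization problem.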
Lagrange duality

- The vectors λ and μ are the Lagrange multipliers.
- The multipliers corresponding to inequality constraints are restricted to be nonnegative; those corresponding to equality constraints are unrestricted.
- The dual problem is a concave maximization problem.
- Weak duality: d* ≤ p*.
- Strong duality, d* = p*, is in general not true. For convex problems (X convex, f and g convex, h affine), if Slater's condition is satisfied, then the duality gap vanishes.
- Different Lagrange dual problems can be derived depending on different formulations of the problem.
Example

Consider the ℓ₁-regularized least squares problem
    min_x ‖x‖₁ + (1/2)‖Ax − b‖².
Two equivalent formulations:
    min_{x,z} { ‖x‖₁ + (1/2)‖Az − b‖² : x = z },
    min_{x,z} { ‖x‖₁ + (1/2)‖z − b‖² : Ax = z }.
Their respective Lagrange dual problems:
    max_{‖y‖∞ ≤ 1} min_z ⟨y, z⟩ + (1/2)‖Az − b‖²,
    max_{‖Aᵀy‖∞ ≤ 1} ⟨b, y⟩ − (1/2)‖y‖².
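The second dual can be checked on a tiny made-up instance with A = I and b = (2, 0.5). Then the primal is solved in closed form by soft-thresholding b at level 1, and the dual maximizer is the projection of b onto the unit ∞-norm ball:

```python
# Made-up 2-D instance with A = I, b = (2, 0.5):
#   primal: min_x ||x||_1 + 0.5*||x - b||^2   (solved by soft-thresholding b at 1)
#   dual:   max_y <b, y> - 0.5*||y||^2  subject to ||y||_inf <= 1
b = [2.0, 0.5]

x_star = [max(abs(bi) - 1.0, 0.0) * (1.0 if bi >= 0 else -1.0) for bi in b]
p_star = (sum(abs(xi) for xi in x_star)
          + 0.5 * sum((xi - bi) ** 2 for xi, bi in zip(x_star, b)))

y_star = [max(-1.0, min(1.0, bi)) for bi in b]  # projection of b onto the box
d_star = (sum(bi * yi for bi, yi in zip(b, y_star))
          - 0.5 * sum(yi ** 2 for yi in y_star))

print(p_star, d_star)  # both 1.625: zero duality gap on this instance
```

Here x* = (1, 0) and y* = (1, 0.5), and p* = d* = 1.625, illustrating that strong duality holds for this convex problem.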
Fenchel duality

Let f : ℝⁿ → ℝ ∪ {+∞}. Consider the unconstrained optimization problem
    min_x f(x).
Any constraints must be incorporated into the objective function via the indicator function. The key ingredient of Fenchel duality is that any closed convex function is the pointwise supremum of the set of affine functions that minorize it. This leads to the Fenchel conjugate function:
    f*(y) = sup_x { ⟨y, x⟩ − f(x) }, ∀y.
Fenchel duality

Assume that φ(x, y) is a perturbation function of f(x), i.e., φ(x, 0) = f(x) for all x, and define
    h(y) := inf_x φ(x, y), ∀y.
Then minimizing f is equivalent to evaluating h(0). One can show that
    −inf_y φ*(0, y) = h**(0) ≤ h(0) = inf_x φ(x, 0).
The problem of evaluating h**(0) is called the Fenchel dual problem. Strong duality, h**(0) = h(0), holds if and only if
    inf_x sup_y { h(x) − ⟨x, y⟩ } = sup_y inf_x { h(x) − ⟨x, y⟩ }.
Fenchel duality

Theorem (Fenchel duality theorem). Let f : ℝⁿ → ℝ ∪ {+∞} be closed proper convex and let g : ℝⁿ → ℝ ∪ {−∞} be closed proper concave. Then, under regularity conditions, it holds that
    inf_x { f(x) − g(x) } = sup_y { g*(y) − f*(y) },
where f* is the convex conjugate of f and g* is the concave conjugate of g, i.e.,
    f*(y) := sup_x { ⟨y, x⟩ − f(x) },
    g*(y) := inf_x { ⟨y, x⟩ − g(x) }.
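The conjugate can be approximated numerically by taking the supremum over a grid. A minimal sketch verifying the textbook identity that f(x) = x² has conjugate f*(y) = y²/4 (the supremum of yx − x² is attained at x = y/2):

```python
def conjugate(f, y, grid):
    # numerical Fenchel conjugate: f*(y) = sup_x { y*x - f(x) }, over a grid
    return max(y * x - f(x) for x in grid)

grid = [i / 100.0 for i in range(-1000, 1001)]  # x in [-10, 10], step 0.01
f = lambda x: x * x

for y in [-3.0, -1.0, 0.0, 0.5, 2.0]:
    # analytic conjugate of x^2 is y^2 / 4
    assert abs(conjugate(f, y, grid) - y * y / 4.0) < 1e-3
```

The grid must be wide enough to contain the maximizer x = y/2; for conjugates that are +∞ somewhere (e.g. f linear), the grid supremum grows with the grid instead of converging.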
Geometric interpretation

(Figure slides: the dual objective is traced out as the slope of a supporting affine minorant varies; sampled readouts [slope, dual obj] = [−2.05, ], [−1.60, −73.00], [−1.10, −38.62], [−0.60, −10.50], [−0.10, 11.38], [1.40, 39.50], [1.90, 36.38], [2.40, 27.00].)
Gauge optimization

Definition (gauge optimization). Minimizing a closed gauge function κ over a closed convex set C is called gauge optimization:
    min_x { κ(x) : x ∈ C }.   (GP)

Definition (feasible point).
- A point x ∈ C is called a feasible point of (GP).
- If x ∈ C and κ(x) = +∞, then x is essentially infeasible.
- If x ∈ C and κ(x) < +∞, then x is strongly feasible.
Similar definitions apply to the gauge dual problem defined below.
Gauge optimization

Definition (antipolar set). The antipolar of a closed convex set C is
    C′ = { y ∈ ℝⁿ : ⟨x, y⟩ ≥ 1 for all x ∈ C }.

Definition (polar function). The polar function of a gauge κ is
    κ°(y) = sup_x { ⟨x, y⟩ : κ(x) ≤ 1 }.
In particular, the polar function of a norm is its dual norm.

Definition (gauge dual). The nonlinear gauge dual problem of (GP) is
    min_y { κ°(y) : y ∈ C′ }.   (GD)
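As a concrete instance of the polar, the ℓ₁ norm has polar (dual norm) ℓ∞. Since the ℓ₁ unit ball is the convex hull of the vertices ±eᵢ, the supremum of the linear function ⟨·, y⟩ over it is attained at a vertex, so κ°(y) = maxᵢ |yᵢ|. A minimal sketch:

```python
def l1_polar(y):
    # Polar of the l1 norm: sup { <x, y> : ||x||_1 <= 1 }.
    # The l1 ball is the convex hull of the vertices +-e_i, so the sup of a
    # linear function over it is attained at a vertex: the answer is max_i |y_i|.
    best = float("-inf")
    for i in range(len(y)):
        for s in (1.0, -1.0):
            best = max(best, s * y[i])  # <s * e_i, y> = s * y_i
    return best

y = [0.3, -2.0, 1.1]
print(l1_polar(y))  # 2.0, the l-infinity norm of y
```

The same vertex argument explains why the polar of any norm reduces to a maximization over the extreme points of its unit ball.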
Gauge duality

Theorem (weak duality, Freund 1987). Let p* and d* be, respectively, the optimal values of (GP) and (GD).
1. If x and y are, respectively, strongly feasible for (GP) and (GD), then κ(x)κ°(y) ≥ 1, and hence p* d* ≥ 1.
2. If p* = 0, then d* = +∞, i.e., (GD) is essentially infeasible.
3. If d* = 0, then p* = +∞, i.e., (GP) is essentially infeasible.
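The multiplicative relation p* d* ≥ 1 (with equality under the regularity conditions below) can be verified on a small made-up instance: κ = ‖·‖₁ and C = { x : ⟨a, x⟩ = 1 }. Then p* = 1/‖a‖∞ (put all weight on a largest-magnitude coordinate of a), the antipolar is C′ = { t·a : t ≥ 1 }, κ° = ‖·‖∞, and d* = ‖a‖∞ (take t = 1):

```python
# Made-up gauge program: kappa = ||.||_1, C = { x : <a, x> = 1 }.
#   GP: p* = min { ||x||_1 : <a, x> = 1 } = 1 / ||a||_inf
#   GD: d* = min { ||y||_inf : y = t*a, t >= 1 } = ||a||_inf
a = [0.5, -2.0, 1.0]
a_inf = max(abs(ai) for ai in a)

p_star = 1.0 / a_inf      # attained at x = sign(a_j) * e_j / |a_j|, j = argmax |a_j|
d_star = a_inf            # attained at y = a (t = 1)
assert abs(p_star * d_star - 1.0) < 1e-12   # strong gauge duality: p* d* = 1

# weak duality kappa(x) * kappa°(y) >= 1 for a strongly feasible pair:
x = [0.0, -0.5, 0.0]      # <a, x> = 1, so x is feasible; ||x||_1 = 0.5 = p*
y = [1.0, -4.0, 2.0]      # y = 2a lies in C'; ||y||_inf = 4
assert 0.5 * 4.0 >= 1.0
```

Note the contrast with Lagrange duality: the gauge primal and dual values multiply to 1 rather than subtracting to a zero gap.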
Gauge duality

Theorem (strong duality, Freund 1987). Let p* and d* be, respectively, the optimal values of (GP) and (GD). If (ri C) ∩ (ri dom κ) ≠ ∅ and (ri C′) ∩ (ri dom κ°) ≠ ∅, then p* d* = 1, and each program attains its optimum.
Gauge duality

The Fenchel dual of (GP) is given by
    d*_f = max_y { −ι*_C(−y) : κ°(y) ≤ 1 }.   (FD)

Theorem (weak duality, Friedlander et al., 2014). Suppose that dom κ ∩ C ≠ ∅. Then p* ≥ d*_f = 1/d*. Furthermore,
1. if y* solves (FD), then y*/d*_f solves (GD);
2. if y* solves (GD) and d* > 0, then y*/d* solves (FD).
Gauge duality

Theorem (strong duality, Friedlander et al., 2014).
1. Suppose that dom κ ∩ C ≠ ∅ and (ri dom κ) ∩ (ri C) ≠ ∅. Then p* d* = 1 and (GD) attains its optimal value.
2. Suppose that (ri dom κ) ∩ (ri C) ≠ ∅ and (ri dom κ°) ∩ (ri C′) ≠ ∅. Then p* d* = 1 and both (GP) and (GD) attain their optimal values.
Gauge duality

Consider the gauge optimization problem
    min_x { κ(x) : ρ(Ax − b) ≤ ε }.
Lagrange dual:
    max_y { bᵀy − ερ°(y) : κ°(Aᵀy) ≤ 1 }.
Gauge dual:
    min_y { κ°(Aᵀy) : bᵀy − ερ°(y) ≥ 1 }.
In general, problems with simpler constraints and a complex objective are preferred over those with a simpler objective yet complex constraints.
Acknowledgements

Some pictures are taken from Michael P. Friedlander's SIAM Optimization talk in 2011.
More informationLecture 8. Strong Duality Results. September 22, 2008
Strong Duality Results September 22, 2008 Outline Lecture 8 Slater Condition and its Variations Convex Objective with Linear Inequality Constraints Quadratic Objective over Quadratic Constraints Representation
More informationIntroduction to Mathematical Programming IE406. Lecture 10. Dr. Ted Ralphs
Introduction to Mathematical Programming IE406 Lecture 10 Dr. Ted Ralphs IE406 Lecture 10 1 Reading for This Lecture Bertsimas 4.1-4.3 IE406 Lecture 10 2 Duality Theory: Motivation Consider the following
More informationCourse Notes for EE227C (Spring 2018): Convex Optimization and Approximation
Course Notes for EE227C (Spring 2018): Convex Optimization and Approximation Instructor: Moritz Hardt Email: hardt+ee227c@berkeley.edu Graduate Instructor: Max Simchowitz Email: msimchow+ee227c@berkeley.edu
More informationSPARSE SIGNAL RESTORATION. 1. Introduction
SPARSE SIGNAL RESTORATION IVAN W. SELESNICK 1. Introduction These notes describe an approach for the restoration of degraded signals using sparsity. This approach, which has become quite popular, is useful
More informationSupport Vector Machines
Support Vector Machines Support vector machines (SVMs) are one of the central concepts in all of machine learning. They are simply a combination of two ideas: linear classification via maximum (or optimal
More informationConstrained Optimization and Lagrangian Duality
CIS 520: Machine Learning Oct 02, 2017 Constrained Optimization and Lagrangian Duality Lecturer: Shivani Agarwal Disclaimer: These notes are designed to be a supplement to the lecture. They may or may
More information4. Convex optimization problems
Convex Optimization Boyd & Vandenberghe 4. Convex optimization problems optimization problem in standard form convex optimization problems quasiconvex optimization linear optimization quadratic optimization
More informationTutorial on Convex Optimization: Part II
Tutorial on Convex Optimization: Part II Dr. Khaled Ardah Communications Research Laboratory TU Ilmenau Dec. 18, 2018 Outline Convex Optimization Review Lagrangian Duality Applications Optimal Power Allocation
More informationLecture: Convex Optimization Problems
1/36 Lecture: Convex Optimization Problems http://bicmr.pku.edu.cn/~wenzw/opt-2015-fall.html Acknowledgement: this slides is based on Prof. Lieven Vandenberghe s lecture notes Introduction 2/36 optimization
More informationA Brief Review on Convex Optimization
A Brief Review on Convex Optimization 1 Convex set S R n is convex if x,y S, λ,µ 0, λ+µ = 1 λx+µy S geometrically: x,y S line segment through x,y S examples (one convex, two nonconvex sets): A Brief Review
More informationA New Look at First Order Methods Lifting the Lipschitz Gradient Continuity Restriction
A New Look at First Order Methods Lifting the Lipschitz Gradient Continuity Restriction Marc Teboulle School of Mathematical Sciences Tel Aviv University Joint work with H. Bauschke and J. Bolte Optimization
More informationLecture 3: Lagrangian duality and algorithms for the Lagrangian dual problem
Lecture 3: Lagrangian duality and algorithms for the Lagrangian dual problem Michael Patriksson 0-0 The Relaxation Theorem 1 Problem: find f := infimum f(x), x subject to x S, (1a) (1b) where f : R n R
More informationMachine Learning And Applications: Supervised Learning-SVM
Machine Learning And Applications: Supervised Learning-SVM Raphaël Bournhonesque École Normale Supérieure de Lyon, Lyon, France raphael.bournhonesque@ens-lyon.fr 1 Supervised vs unsupervised learning Machine
More informationMotivation. Lecture 2 Topics from Optimization and Duality. network utility maximization (NUM) problem:
CDS270 Maryam Fazel Lecture 2 Topics from Optimization and Duality Motivation network utility maximization (NUM) problem: consider a network with S sources (users), each sending one flow at rate x s, through
More informationLinear and Combinatorial Optimization
Linear and Combinatorial Optimization The dual of an LP-problem. Connections between primal and dual. Duality theorems and complementary slack. Philipp Birken (Ctr. for the Math. Sc.) Lecture 3: Duality
More informationEE 367 / CS 448I Computational Imaging and Display Notes: Image Deconvolution (lecture 6)
EE 367 / CS 448I Computational Imaging and Display Notes: Image Deconvolution (lecture 6) Gordon Wetzstein gordon.wetzstein@stanford.edu This document serves as a supplement to the material discussed in
More informationProvable Alternating Minimization Methods for Non-convex Optimization
Provable Alternating Minimization Methods for Non-convex Optimization Prateek Jain Microsoft Research, India Joint work with Praneeth Netrapalli, Sujay Sanghavi, Alekh Agarwal, Animashree Anandkumar, Rashish
More informationDuality Uses and Correspondences. Ryan Tibshirani Convex Optimization
Duality Uses and Correspondences Ryan Tibshirani Conve Optimization 10-725 Recall that for the problem Last time: KKT conditions subject to f() h i () 0, i = 1,... m l j () = 0, j = 1,... r the KKT conditions
More information