Shiqian Ma, MAT258A: Numerical Optimization, Chapter 4: Subgradient


Subgradients
- definition
- subgradient calculus
- duality and optimality conditions
Basic inequality
recall the basic inequality for a convex differentiable f:
    f(y) ≥ f(x) + ∇f(x)ᵀ(y − x)
- the first-order approximation of f at x is a global lower bound
- ∇f(x) defines a nonvertical supporting hyperplane to epi f at (x, f(x)):
    (∇f(x), −1)ᵀ((y, t) − (x, f(x))) ≤ 0 for all (y, t) ∈ epi f
what if f is not differentiable?
Subgradient
g is a subgradient of a convex function f at x ∈ dom f if
    f(y) ≥ f(x) + gᵀ(y − x) for all y ∈ dom f
properties:
- f(x) + gᵀ(y − x) is a global lower bound on f(y)
- g defines a nonvertical supporting hyperplane to epi f at (x, f(x)):
    (g, −1)ᵀ((y, t) − (x, f(x))) ≤ 0 for all (y, t) ∈ epi f
- if f is convex and differentiable, then ∇f(x) is a subgradient of f at x
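The defining inequality can be checked numerically on a grid of trial points. A minimal sketch in Python with NumPy (not part of the course material): for f(x) = |x|, any g ∈ [−1, 1] satisfies the inequality at x = 0, while a value outside that interval violates it.

```python
import numpy as np

def is_subgradient(f, g, x, trial_points, tol=1e-12):
    """Check the subgradient inequality f(y) >= f(x) + g*(y - x) at sampled points."""
    return all(f(y) >= f(x) + g * (y - x) - tol for y in trial_points)

ys = np.linspace(-5.0, 5.0, 1001)

# any g in [-1, 1] is a subgradient of f(x) = |x| at x = 0 ...
assert is_subgradient(abs, 0.5, 0.0, ys)
assert is_subgradient(abs, -1.0, 0.0, ys)
# ... while g = 1.5 violates the inequality at points y > 0
assert not is_subgradient(abs, 1.5, 0.0, ys)
```

Sampling only certifies a violation, never a proof; here the grid is enough to expose the bad choice g = 1.5.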
Applications
- algorithms for nondifferentiable convex optimization
- unconstrained optimality: x* minimizes f(x) iff 0 ∈ ∂f(x*)
- KKT conditions with nondifferentiable functions
Example
f(x) = max{f₁(x), f₂(x)}, with f₁, f₂ convex and differentiable
- at a point x₀ with f₁(x₀) = f₂(x₀), the subgradients form the line segment [∇f₁(x₀), ∇f₂(x₀)]
- if f₁(x̂) > f₂(x̂), the unique subgradient of f at x̂ is ∇f₁(x̂)
- if f₁(x̂) < f₂(x̂), the unique subgradient of f at x̂ is ∇f₂(x̂)
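The example translates directly into code: evaluate both functions and return the gradient of whichever is larger. A sketch with illustrative choices f₁(x) = x² and f₂(x) = −x (these particular functions are not from the slides):

```python
# Subgradient of f(x) = max(f1(x), f2(x)) with differentiable f1, f2:
# return the gradient of an active (maximal) function at x.
# f1(x) = x^2 and f2(x) = -x are illustrative choices.
def subgrad_max(x):
    f1, g1 = x * x, 2.0 * x   # f1 and its gradient
    f2, g2 = -x, -1.0         # f2 and its gradient
    return g1 if f1 >= f2 else g2

assert subgrad_max(2.0) == 4.0     # f1 is active at x = 2 (4 > -2)
assert subgrad_max(-0.5) == -1.0   # f2 is active at x = -0.5 (0.5 > 0.25)
```

At a tie the code arbitrarily returns ∇f₁, which is one valid subgradient; any point of the segment [∇f₁, ∇f₂] would do.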
Subdifferential
the subdifferential ∂f(x) of f at x is the set of all subgradients:
    ∂f(x) = {g | gᵀ(y − x) ≤ f(y) − f(x) for all y ∈ dom f}
properties:
- ∂f(x) is a closed convex set (possibly empty); this follows from the definition, since ∂f(x) is an intersection of halfspaces
- if x ∈ int dom f, then ∂f(x) is nonempty and bounded (see the next two pages for the proof)
proof: we show that ∂f(x) is nonempty when x ∈ int dom f
- (x, f(x)) is in the boundary of the convex set epi f
- therefore there exists a supporting hyperplane to epi f at (x, f(x)): there is (a, b) ≠ 0 with
    (a, b)ᵀ((y, t) − (x, f(x))) ≤ 0 for all (y, t) ∈ epi f
- b > 0 gives a contradiction as t → ∞
- b = 0 gives a contradiction for y = x + εa with small ε > 0
- therefore b < 0, and g = a/|b| is a subgradient of f at x
proof: ∂f(x) is bounded when x ∈ int dom f
- for small r > 0, define a set of 2n points
    B = {x ± r e_k | k = 1, ..., n} ⊂ dom f
  and define M = max_{y ∈ B} f(y) < ∞
- for every nonzero g ∈ ∂f(x), there is a point y ∈ B with
    f(y) ≥ f(x) + gᵀ(y − x) = f(x) + r‖g‖_∞
  (choose an index k with |g_k| = ‖g‖_∞, and take y = x + r sign(g_k) e_k)
- therefore ∂f(x) is bounded:
    sup_{g ∈ ∂f(x)} ‖g‖_∞ ≤ (M − f(x)) / r
Examples
absolute value f(x) = |x|:
    ∂f(x) = {1} if x > 0, [−1, 1] if x = 0, {−1} if x < 0
Euclidean norm f(x) = ‖x‖₂:
    ∂f(x) = {x/‖x‖₂} if x ≠ 0, ∂f(x) = {g | ‖g‖₂ ≤ 1} if x = 0
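Both subdifferentials above admit a one-line selection of a subgradient; a sketch (choosing g = 0 at x = 0 in each case, which lies in the stated sets):

```python
import numpy as np

def subgrad_abs(x):
    """An element of the subdifferential of |x|: sign(x), which is 0 (inside [-1, 1]) at x = 0."""
    return float(np.sign(x))

def subgrad_norm2(x):
    """An element of the subdifferential of ||x||_2: x/||x||_2, or 0 (norm <= 1) at x = 0."""
    nrm = np.linalg.norm(x)
    return x / nrm if nrm > 0 else np.zeros_like(x)

assert subgrad_abs(-3.0) == -1.0 and subgrad_abs(0.0) == 0.0
g = subgrad_norm2(np.array([3.0, 4.0]))
assert np.allclose(g, [0.6, 0.8]) and np.isclose(np.linalg.norm(g), 1.0)
```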
Monotonicity
the subdifferential of a convex function is a monotone operator:
    (u − v)ᵀ(x − y) ≥ 0 for all x, y, u ∈ ∂f(x), v ∈ ∂f(y)
proof: by definition,
    f(y) ≥ f(x) + uᵀ(y − x), f(x) ≥ f(y) + vᵀ(x − y)
combining the two inequalities shows monotonicity
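Monotonicity can be spot-checked numerically; a sketch for f(x) = |x|, taking the subgradient choice u = sign(x) ∈ ∂f(x) and v = sign(y) ∈ ∂f(y):

```python
import numpy as np

# Spot-check (u - v)*(x - y) >= 0 for f(x) = |x| with u = sign(x), v = sign(y).
rng = np.random.default_rng(0)
for _ in range(1000):
    x, y = rng.normal(size=2)
    u, v = np.sign(x), np.sign(y)
    assert (u - v) * (x - y) >= 0
```

The product is zero whenever x and y share a sign, and strictly positive when the signs differ, matching the monotonicity claim.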
Examples of nonsubdifferentiable functions
the following functions are not subdifferentiable at x = 0:
- f : R → R with dom f = R₊, f(0) = 1 and f(x) = 0 for x > 0
- f : R → R with dom f = R₊, f(x) = −√x
the only supporting hyperplane to epi f at (0, f(0)) is vertical
Subgradients and sublevel sets
if g is a subgradient of f at x, then
    f(y) ≤ f(x) ⟹ gᵀ(y − x) ≤ 0
nonzero subgradients at x define supporting hyperplanes to the sublevel set {y | f(y) ≤ f(x)}
Subgradient calculus
weak subgradient calculus: rules for finding one subgradient
- sufficient for most nondifferentiable convex optimization algorithms
- if you can evaluate f(x), you can usually compute a subgradient
strong subgradient calculus: rules for finding ∂f(x) (all subgradients)
- some algorithms, optimality conditions, etc., need the entire subdifferential
- can be quite complicated
we will assume that x ∈ int dom f
Basic rules
differentiable functions: ∂f(x) = {∇f(x)} if f is differentiable at x
nonnegative combination: if h(x) = α₁f₁(x) + α₂f₂(x) with α₁, α₂ ≥ 0, then
    ∂h(x) = α₁∂f₁(x) + α₂∂f₂(x)
(r.h.s. is addition of sets)
affine transformation of variables: if h(x) = f(Ax + b), then
    ∂h(x) = Aᵀ∂f(Ax + b)
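The affine rule composes with any known subdifferential. A sketch for the hypothetical instance h(x) = ‖Ax + b‖₁ (the data below is illustrative, not from the slides): a subgradient is Aᵀs, where s = sign(Ax + b) is a subgradient of ‖·‖₁ at Ax + b.

```python
import numpy as np

# Affine rule applied to h(x) = ||Ax + b||_1 (illustrative data):
# a subgradient of h at x is A^T s, with s = sign(Ax + b).
A = np.array([[1.0, 2.0],
              [0.0, -1.0]])
b = np.array([-1.0, 3.0])
x = np.array([2.0, 1.0])

s = np.sign(A @ x + b)   # A@x + b = [3, 2], so s = [1, 1]
g = A.T @ s
assert np.allclose(g, [1.0, 1.0])
```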
Pointwise maximum
f(x) = max{f₁(x), ..., f_m(x)}
define I(x) = {i | f_i(x) = f(x)}, the set of active functions at x
weak result: to compute a subgradient at x, choose any k ∈ I(x), and any subgradient of f_k at x
strong result:
    ∂f(x) = conv ∪_{i ∈ I(x)} ∂f_i(x)
- the convex hull of the union of subdifferentials of active functions at x
- if the f_i are differentiable, ∂f(x) = conv{∇f_i(x) | i ∈ I(x)}
Example: piecewise-linear function
f(x) = max_{i=1,...,m} (a_iᵀx + b_i)
the subdifferential at x is a polyhedron:
    ∂f(x) = conv{a_i | i ∈ I(x)}, with I(x) = {i | a_iᵀx + b_i = f(x)}
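The weak rule for this example is a one-liner: return the coefficient row of any active piece. A sketch with illustrative data (the matrices below are not from the slides):

```python
import numpy as np

def subgrad_pwl(A, b, x):
    """Weak rule for f(x) = max_i (a_i^T x + b_i): return a_k for a maximizing index k."""
    return A[np.argmax(A @ x + b)]

# illustrative data
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [-1.0, -1.0]])
b = np.array([0.0, 0.0, 1.0])

g = subgrad_pwl(A, b, np.array([2.0, 0.5]))
assert np.allclose(g, [1.0, 0.0])   # piece values are 2.0, 0.5, -1.5; piece 1 is active
```

When several pieces are tied, `argmax` picks one of them, which is one vertex of the polyhedron conv{a_i | i ∈ I(x)}.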
Example: ℓ₁-norm
f(x) = ‖x‖₁ = max_{s ∈ {−1,1}ⁿ} sᵀx
the subdifferential is a product of intervals:
    ∂f(x) = J₁ × ... × J_n, with J_k = [−1, 1] if x_k = 0, {1} if x_k > 0, {−1} if x_k < 0
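Selecting one point of this product of intervals gives the usual sign-vector subgradient; a sketch:

```python
import numpy as np

def subgrad_l1(x):
    """One element of the product-of-intervals subdifferential of ||x||_1:
    sign(x_k) where x_k != 0, and 0 (inside [-1, 1]) where x_k = 0."""
    return np.sign(x)

x = np.array([3.0, -2.0, 0.0])
g = subgrad_l1(x)
assert np.allclose(g, [1.0, -1.0, 0.0])
assert np.dot(g, x) == np.sum(np.abs(x))   # this choice attains g^T x = ||x||_1
```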
Pointwise supremum
f(x) = sup_{α ∈ A} f_α(x), with f_α(x) convex in x for every α
weak result: to find a subgradient at x̂,
- find any β for which f(x̂) = f_β(x̂) (assuming the supremum is attained)
- choose any g ∈ ∂f_β(x̂)
(partial) strong result: define I(x) = {α ∈ A | f_α(x) = f(x)}; then
    conv ∪_{α ∈ I(x)} ∂f_α(x) ⊆ ∂f(x)
equality requires extra conditions (e.g., A compact, f_α continuous in α)
Exercise: maximum eigenvalue
problem: explain how to find a subgradient of
    f(x) = λ_max(A(x)) = sup_{‖y‖₂=1} yᵀA(x)y
where A(x) = A₀ + x₁A₁ + ... + x_nA_n with symmetric coefficients A_i
solution: to find a subgradient at x̂,
- choose any unit eigenvector y with eigenvalue λ_max(A(x̂))
- the gradient of yᵀA(x)y at x̂ is a subgradient of f:
    (yᵀA₁y, ..., yᵀA_ny) ∈ ∂f(x̂)
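The recipe maps directly onto `numpy.linalg.eigh`, which returns eigenvalues of a symmetric matrix in ascending order with unit eigenvectors as columns. A sketch (the diagonal test matrices are illustrative):

```python
import numpy as np

def subgrad_lambda_max(A0, As, x):
    """Subgradient of f(x) = lambda_max(A0 + sum_i x_i A_i):
    g_i = y^T A_i y for a unit eigenvector y of the largest eigenvalue."""
    M = A0 + sum(xi * Ai for xi, Ai in zip(x, As))
    _, V = np.linalg.eigh(M)       # eigenvalues ascending for symmetric M
    y = V[:, -1]                   # unit eigenvector for lambda_max
    return np.array([y @ Ai @ y for Ai in As])

# illustrative diagonal data: A(x) = diag(x1, x2)
A0 = np.zeros((2, 2))
As = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]
g = subgrad_lambda_max(A0, As, [2.0, 1.0])
assert np.allclose(g, [1.0, 0.0])  # here lambda_max = x1, so the subgradient is (1, 0)
```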
Minimization
f(x) = inf_y h(x, y), with h jointly convex in (x, y)
weak result: to find a subgradient at x̂,
- find ŷ that minimizes h(x̂, y) (assuming the minimum is attained)
- find a subgradient (g, 0) ∈ ∂h(x̂, ŷ)
proof: for all x, y,
    h(x, y) ≥ h(x̂, ŷ) + gᵀ(x − x̂) + 0ᵀ(y − ŷ) = f(x̂) + gᵀ(x − x̂)
therefore
    f(x) = inf_y h(x, y) ≥ f(x̂) + gᵀ(x − x̂)
Exercise: Euclidean distance to a convex set
problem: explain how to find a subgradient of
    f(x) = inf_{y ∈ C} ‖x − y‖₂
where C is a closed convex set
solution: to find a subgradient at x̂,
- if f(x̂) = 0 (that is, x̂ ∈ C), take g = 0
- if f(x̂) > 0, find the projection ŷ = P(x̂) of x̂ onto C, and take
    g = (x̂ − ŷ)/‖x̂ − ŷ‖₂ = (x̂ − P(x̂))/‖x̂ − P(x̂)‖₂
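Given a projector onto C, the solution above is a few lines. A sketch using the unit Euclidean ball as an illustrative C, since its projection has a closed form:

```python
import numpy as np

def subgrad_dist(x, project):
    """Subgradient of dist(x, C) = inf_{y in C} ||x - y||_2, given a projector onto C."""
    p = project(x)
    d = np.linalg.norm(x - p)
    return np.zeros_like(x) if d == 0 else (x - p) / d

# illustrative C: the unit Euclidean ball
def proj_ball(x):
    nrm = np.linalg.norm(x)
    return x if nrm <= 1 else x / nrm

assert np.allclose(subgrad_dist(np.array([3.0, 4.0]), proj_ball), [0.6, 0.8])
assert np.allclose(subgrad_dist(np.array([0.2, 0.1]), proj_ball), 0.0)  # inside C: g = 0
```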
Composition
f(x) = h(f₁(x), ..., f_k(x)), with h convex nondecreasing and f_i convex
weak result: to find a subgradient at x̂,
- find z ∈ ∂h(f₁(x̂), ..., f_k(x̂)) and g_i ∈ ∂f_i(x̂)
- then g = z₁g₁ + ... + z_kg_k ∈ ∂f(x̂)
this reduces to the standard chain rule for differentiable h, f_i
proof:
    f(x) ≥ h(f₁(x̂) + g₁ᵀ(x − x̂), ..., f_k(x̂) + g_kᵀ(x − x̂))
         ≥ h(f₁(x̂), ..., f_k(x̂)) + zᵀ(g₁ᵀ(x − x̂), ..., g_kᵀ(x − x̂))
         = f(x̂) + gᵀ(x − x̂)
Optimal value function
define h(u, v) as the optimal value of the convex problem
    min f₀(x) s.t. f_i(x) ≤ u_i, i = 1, ..., m; Ax = b + v
(the functions f_i are convex; the optimization variable is x)
weak result: suppose h(û, v̂) is finite and strong duality holds with the dual
    max_{λ ≥ 0, ν} inf_x ( f₀(x) + Σ_i λ_i(f_i(x) − û_i) + νᵀ(Ax − b − v̂) )
if λ̂, ν̂ are optimal dual variables (for the r.h.s. û, v̂), then (−λ̂, −ν̂) ∈ ∂h(û, v̂)
proof: by weak duality for the problem with r.h.s. u, v:
    h(u, v) ≥ inf_x ( f₀(x) + Σ_i λ̂_i(f_i(x) − u_i) + ν̂ᵀ(Ax − b − v) )
            = inf_x ( f₀(x) + Σ_i λ̂_i(f_i(x) − û_i) + ν̂ᵀ(Ax − b − v̂) ) − λ̂ᵀ(u − û) − ν̂ᵀ(v − v̂)
            = h(û, v̂) − λ̂ᵀ(u − û) − ν̂ᵀ(v − v̂)
Expectation
f(x) = E h(x, u), with u random and h convex in x for every u
weak result: to find a subgradient at x̂,
- choose a function u ↦ g(u) with g(u) ∈ ∂_x h(x̂, u)
- then g = E_u g(u) ∈ ∂f(x̂)
proof: by convexity of h and the definition of g(u),
    f(x) = E h(x, u) ≥ E( h(x̂, u) + g(u)ᵀ(x − x̂) ) = f(x̂) + gᵀ(x − x̂)
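When the expectation has no closed form, the rule suggests a Monte Carlo estimate of the subgradient. A sketch for the illustrative case h(x, u) = |x − u| with u ~ N(0, 1) (this particular h is not from the slides): g(u) = sign(x − u) ∈ ∂_x|x − u|, so the sample mean of sign(x − u) approximates the subgradient E_u sign(x − u) of f.

```python
import numpy as np

# Monte Carlo subgradient estimate for f(x) = E|x - u|, u ~ N(0, 1) (illustrative).
rng = np.random.default_rng(0)
u = rng.normal(size=200_000)

g_hat = np.mean(np.sign(0.0 - u))   # the exact value at x = 0 is 0 by symmetry
assert abs(g_hat) < 0.02
```

The estimate is itself random; it converges to the true subgradient at the usual 1/√N Monte Carlo rate.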
Duality and Optimality Conditions
Optimality conditions: unconstrained
x* minimizes f(x) if and only if 0 ∈ ∂f(x*)
proof: by definition,
    f(y) ≥ f(x*) + 0ᵀ(y − x*) for all y ⟺ 0 ∈ ∂f(x*)
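The condition 0 ∈ ∂f(x*) can certify optimality of a nondifferentiable point. A sketch for the illustrative function f(x) = |x| + ½(x − a)² (not from the slides): ∂f(0) = [−1, 1] + (0 − a) = [−1 − a, 1 − a], so x* = 0 is the minimizer exactly when |a| ≤ 1.

```python
import numpy as np

# For f(x) = |x| + 0.5*(x - a)^2 (illustrative), 0 ∈ ∂f(0) = [-1 - a, 1 - a] iff |a| <= 1.
def zero_is_minimizer(a):
    return -1.0 - a <= 0.0 <= 1.0 - a

for a, expected in [(0.5, True), (1.0, True), (2.0, False)]:
    assert zero_is_minimizer(a) == expected
    # numeric cross-check on a fine grid
    xs = np.linspace(-3.0, 3.0, 6001)
    x_star = xs[np.argmin(np.abs(xs) + 0.5 * (xs - a) ** 2)]
    assert (abs(x_star) < 1e-3) == expected
```

This is exactly the calculation behind soft-thresholding: the kink at 0 absorbs any pull of magnitude at most 1.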
Example: piecewise-linear minimization
f(x) = max_{i=1,...,m} (a_iᵀx + b_i)
optimality condition:
    0 ∈ conv{a_i | i ∈ I(x*)}, where I(x) = {i | a_iᵀx + b_i = f(x)}
in other words, x* is optimal if and only if there is a λ with
    λ ≥ 0, eᵀλ = 1, Σ_{i=1}^m λ_i a_i = 0, λ_i = 0 for i ∉ I(x*)
these are the optimality conditions for the equivalent linear program
    min t s.t. Ax + b ≤ te
and its dual
    max bᵀλ s.t. Aᵀλ = 0, λ ≥ 0, eᵀλ = 1
Optimality conditions: constrained
    min f₀(x) s.t. f_i(x) ≤ 0, i = 1, ..., m
from Lagrange duality: if strong duality holds, then x*, λ* are primal, dual optimal iff
1. x* is primal feasible
2. λ* ≥ 0
3. λ*_i f_i(x*) = 0 for i = 1, ..., m
4. x* is a minimizer of L(x, λ*) = f₀(x) + Σ_{i=1}^m λ*_i f_i(x)
Karush-Kuhn-Tucker conditions (if dom f_i = Rⁿ)
conditions 1, 2, 3, and
    0 ∈ ∂_x L(x*, λ*) = ∂f₀(x*) + Σ_{i=1}^m λ*_i ∂f_i(x*)
this generalizes the condition
    0 = ∇f₀(x*) + Σ_{i=1}^m λ*_i ∇f_i(x*)
for differentiable f_i
More informationLagrangian Duality and Convex Optimization
Lagrangian Duality and Convex Optimization David Rosenberg New York University February 11, 2015 David Rosenberg (New York University) DSGA 1003 February 11, 2015 1 / 24 Introduction Why Convex Optimization?
More informationConstrained Optimization
1 / 22 Constrained Optimization ME598/494 Lecture Max Yi Ren Department of Mechanical Engineering, Arizona State University March 30, 2015 2 / 22 1. Equality constraints only 1.1 Reduced gradient 1.2 Lagrange
More informationSECTION C: CONTINUOUS OPTIMISATION LECTURE 11: THE METHOD OF LAGRANGE MULTIPLIERS
SECTION C: CONTINUOUS OPTIMISATION LECTURE : THE METHOD OF LAGRANGE MULTIPLIERS HONOUR SCHOOL OF MATHEMATICS OXFORD UNIVERSITY HILARY TERM 005 DR RAPHAEL HAUSER. Examples. In this lecture we will take
More information1. f(β) 0 (that is, β is a feasible point for the constraints)
xvi 2. The lasso for linear models 2.10 Bibliographic notes Appendix Convex optimization with constraints In this Appendix we present an overview of convex optimization concepts that are particularly useful
More informationConvex Analysis Background
Convex Analysis Background John C. Duchi Stanford University Park City Mathematics Institute 206 Abstract In this set of notes, we will outline several standard facts from convex analysis, the study of
More informationEquality constrained minimization
Chapter 10 Equality constrained minimization 10.1 Equality constrained minimization problems In this chapter we describe methods for solving a convex optimization problem with equality constraints, minimize
More informationCHAPTER 2: CONVEX SETS AND CONCAVE FUNCTIONS. W. Erwin Diewert January 31, 2008.
1 ECONOMICS 594: LECTURE NOTES CHAPTER 2: CONVEX SETS AND CONCAVE FUNCTIONS W. Erwin Diewert January 31, 2008. 1. Introduction Many economic problems have the following structure: (i) a linear function
More informationExtreme points of compact convex sets
Extreme points of compact convex sets In this chapter, we are going to show that compact convex sets are determined by a proper subset, the set of its extreme points. Let us start with the main definition.
More informationBASICS OF CONVEX ANALYSIS
BASICS OF CONVEX ANALYSIS MARKUS GRASMAIR 1. Main Definitions We start with providing the central definitions of convex functions and convex sets. Definition 1. A function f : R n R + } is called convex,
More informationSome Properties of the Augmented Lagrangian in Cone Constrained Optimization
MATHEMATICS OF OPERATIONS RESEARCH Vol. 29, No. 3, August 2004, pp. 479 491 issn 0364765X eissn 15265471 04 2903 0479 informs doi 10.1287/moor.1040.0103 2004 INFORMS Some Properties of the Augmented
More informationConvex Optimization in Communications and Signal Processing
Convex Optimization in Communications and Signal Processing Prof. Dr.Ing. Wolfgang Gerstacker 1 University of ErlangenNürnberg Institute for Digital Communications National Technical University of Ukraine,
More informationLagrange Relaxation and Duality
Lagrange Relaxation and Duality As we have already known, constrained optimization problems are harder to solve than unconstrained problems. By relaxation we can solve a more difficult problem by a simpler
More informationCONVEX OPTIMIZATION, DUALITY, AND THEIR APPLICATION TO SUPPORT VECTOR MACHINES. Contents 1. Introduction 1 2. Convex Sets
CONVEX OPTIMIZATION, DUALITY, AND THEIR APPLICATION TO SUPPORT VECTOR MACHINES DANIEL HENDRYCKS Abstract. This paper develops the fundamentals of convex optimization and applies them to Support Vector
More informationGeneralization to inequality constrained problem. Maximize
Lecture 11. 26 September 2006 Review of Lecture #10: Second order optimality conditions necessary condition, sufficient condition. If the necessary condition is violated the point cannot be a local minimum
More information