Lecture 2: Convex functions
f : R^n → R is convex if dom f is convex and for all x, y ∈ dom f, θ ∈ [0, 1],

    f(θx + (1−θ)y) ≤ θ f(x) + (1−θ) f(y)

f is concave if −f is convex

[figures: a convex, a concave, and a neither convex nor concave function of x]

examples (on R):
- f(x) = x^2 is convex
- f(x) = log x is concave (dom f = R_++)
- f(x) = 1/x is convex (dom f = R_++)
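The defining inequality is easy to spot-check numerically. A minimal sketch (the function choices are illustrative, not from the slides): sample random chords and verify the inequality for the convex f(x) = x², and confirm that it fails for the concave f(x) = −x².

```python
import random

# Spot-check the convexity inequality
#   f(θx + (1-θ)y) <= θ f(x) + (1-θ) f(y)
# for f(x) = x^2 (convex) and f(x) = -x^2 (concave, so it should fail).

def satisfies_convexity_ineq(f, x, y, theta, tol=1e-12):
    return f(theta * x + (1 - theta) * y) <= theta * f(x) + (1 - theta) * f(y) + tol

random.seed(0)
trials = [(random.uniform(-5, 5), random.uniform(-5, 5), random.random())
          for _ in range(1000)]

square_ok = all(satisfies_convexity_ineq(lambda t: t * t, x, y, th)
                for x, y, th in trials)
neg_square_ok = all(satisfies_convexity_ineq(lambda t: -t * t, x, y, th)
                    for x, y, th in trials)

print(square_ok)      # True: x^2 passes on every sampled chord
print(neg_square_ok)  # False: -x^2 violates the inequality
```

Such a random test can only refute convexity, never prove it, but it is a useful sanity check.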
Extended-valued extensions

for f convex, it's convenient to define the extension

    f̃(x) = { f(x)   x ∈ dom f
            { +∞     x ∉ dom f

the inequality

    f̃(θx + (1−θ)y) ≤ θ f̃(x) + (1−θ) f̃(y)

holds for all x, y ∈ R^n, 0 ≤ θ ≤ 1 (as an inequality in R ∪ {+∞})

we'll use the same symbol for f and its extension, i.e., we'll implicitly assume convex functions are extended
Epigraph & sublevel sets

the epigraph of a function f is

    epi f = {(x, t) | x ∈ dom f, f(x) ≤ t}

f is a convex function ⟺ epi f is a convex set

the (α-)sublevel set of f is

    C(α) = {x ∈ dom f | f(x) ≤ α}

f convex ⟹ all sublevel sets are convex (the converse is false)
Differentiable convex functions

gradient of f : R^n → R:

    ∇f(x) = [∂f/∂x₁  ∂f/∂x₂  ⋯  ∂f/∂xₙ]ᵀ   (evaluated at x)

first-order Taylor approximation at x₀:

    f(x) ≈ f(x₀) + ∇f(x₀)ᵀ(x − x₀)

first-order condition: for f differentiable,

    f is convex ⟺ for all x, x₀ ∈ dom f,  f(x) ≥ f(x₀) + ∇f(x₀)ᵀ(x − x₀)

i.e., the first-order approximation is a global underestimator
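The global-underestimator property can be checked directly. A small sketch (the choice f(x) = eˣ is illustrative): at random base points x₀, the tangent line f(x₀) + f′(x₀)(x − x₀) should never exceed f(x).

```python
import math
import random

# First-order condition: for convex f, the tangent at x0 is a global
# underestimator, f(x) >= f(x0) + f'(x0)(x - x0).
# Check for f(x) = exp(x), whose derivative is also exp(x).

random.seed(1)
ok = True
for _ in range(1000):
    x0 = random.uniform(-3, 3)
    x = random.uniform(-3, 3)
    tangent = math.exp(x0) + math.exp(x0) * (x - x0)
    ok &= math.exp(x) >= tangent - 1e-12
print(ok)  # True
```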
epigraph interpretation: for all (x, t) ∈ epi f,

    (∇f(x₀), −1)ᵀ ((x, t) − (x₀, f(x₀))) ≤ 0,

i.e., (∇f(x₀), −1) defines a supporting hyperplane to epi f at (x₀, f(x₀))

[figure: epi f with the supporting hyperplane whose normal is (∇f(x₀), −1)]
Hessian of a twice differentiable function:

    ∇²f(x)ᵢⱼ = ∂²f/∂xᵢ∂xⱼ   (evaluated at x)

second-order Taylor series expansion around x₀:

    f(x) ≈ f(x₀) + ∇f(x₀)ᵀ(x − x₀) + (1/2)(x − x₀)ᵀ∇²f(x₀)(x − x₀)

second-order condition: for f twice differentiable,

    f is convex ⟺ for all x ∈ dom f, ∇²f(x) ⪰ 0
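The second-order condition can be probed without forming the Hessian: along any line x₀ + t h, the second difference g(t+d) − 2g(t) + g(t−d) ≈ d² hᵀ∇²f(x)h must be nonnegative. A sketch for the convex log-sum-exp function f(x₁, x₂) = log(e^{x₁} + e^{x₂}) (the function choice is illustrative):

```python
import math
import random

# Along any line, a convex twice-differentiable f has nonnegative
# second derivative, so the centered second difference
#   g(d) - 2 g(0) + g(-d) ≈ d^2 h^T ∇²f(x) h
# should be >= 0 (up to floating-point error).

def f(x1, x2):
    return math.log(math.exp(x1) + math.exp(x2))

random.seed(2)
d = 1e-3
ok = True
for _ in range(500):
    x = [random.uniform(-2, 2) for _ in range(2)]
    h = [random.uniform(-1, 1) for _ in range(2)]
    g = lambda t: f(x[0] + t * h[0], x[1] + t * h[1])
    ok &= g(d) - 2 * g(0.0) + g(-d) >= -1e-9
print(ok)  # True
```

The tolerance absorbs roundoff; directions along (1, 1), where log-sum-exp is flat, give a second difference of essentially zero.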
Simple examples

- linear and affine functions are convex and concave
- quadratic function f(x) = xᵀPx + 2qᵀx + r (P = Pᵀ): convex ⟺ P ⪰ 0; concave ⟺ P ⪯ 0
- any norm is convex

examples on R:
- x^α is convex on R_++ for α ≥ 1 or α ≤ 0; concave for 0 ≤ α ≤ 1
- log x is concave on R_++; x log x is convex on R_+
- e^{αx} is convex
- |x|, max(0, x), max(0, −x) are convex
- log ∫_{−∞}^{x} e^{−t²} dt is concave
Elementary properties

- a function is convex iff it is convex on all lines: f convex ⟺ f(x₀ + th) convex in t for all x₀, h
- a positive multiple of a convex function is convex: f convex, α ≥ 0 ⟹ αf convex
- the sum of convex functions is convex: f₁, f₂ convex ⟹ f₁ + f₂ convex
- extends to infinite sums and integrals: g(x, y) convex in x ⟹ ∫ g(x, y) dy convex in x
- pointwise maximum: f₁, f₂ convex ⟹ max{f₁(x), f₂(x)} convex (corresponds to intersection of epigraphs)
- pointwise supremum: f_α convex for each α ∈ A ⟹ sup_{α∈A} f_α convex
- affine transformation of domain: f convex ⟹ f(Ax + b) convex
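The pointwise-maximum rule is easy to illustrate. A sketch with two simple convex pieces (the choice f₁(x) = (x−1)², f₂(x) = (x+1)² is illustrative): their pointwise maximum should satisfy the chord inequality everywhere, even though it is not differentiable at x = 0.

```python
import random

# Pointwise maximum preserves convexity: check the chord inequality for
# f(x) = max(f1(x), f2(x)) with f1(x) = (x-1)^2 and f2(x) = (x+1)^2.

f1 = lambda x: (x - 1) ** 2
f2 = lambda x: (x + 1) ** 2
f = lambda x: max(f1(x), f2(x))

random.seed(3)
ok = all(
    f(th * x + (1 - th) * y) <= th * f(x) + (1 - th) * f(y) + 1e-12
    for x, y, th in ((random.uniform(-4, 4), random.uniform(-4, 4),
                      random.random()) for _ in range(1000))
)
print(ok)  # True
```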
More examples

- piecewise-linear functions: f(x) = maxᵢ {aᵢᵀx + bᵢ} is convex in x (epi f is a polyhedron)
- maximum distance to any set, sup_{s∈S} ‖x − s‖, is convex in x
- f(x) = x₍₁₎ + x₍₂₎ + x₍₃₎ is convex on R^n (x₍ᵢ₎ is the ith largest xⱼ)
- f(x) = (∏ᵢ xᵢ)^{1/n} is concave on R^n_+
- f(x) = ∑ᵢ₌₁ᵐ log(bᵢ − aᵢᵀx)⁻¹ is convex (dom f = {x | aᵢᵀx < bᵢ, i = 1, …, m})
- least-squares cost as a function of the weights, f(w) = inf_x ∑ᵢ wᵢ(aᵢᵀx − bᵢ)², is concave in w
Convex functions of matrices

- Tr AᵀX = ∑ᵢⱼ Aᵢⱼ Xᵢⱼ is linear in X on R^{n×n}
- log det X⁻¹ is convex on {X ∈ S^n | X ≻ 0}

  proof: let λᵢ be the eigenvalues of X₀^{−1/2} H X₀^{−1/2}; then

      f(t) = log det(X₀ + tH)⁻¹
           = log det X₀⁻¹ + log det(I + t X₀^{−1/2} H X₀^{−1/2})⁻¹
           = log det X₀⁻¹ − ∑ᵢ log(1 + tλᵢ)

  is a convex function of t

- (det X)^{1/n} is concave on {X ∈ S^n | X ≻ 0}
- λ_max(X) is convex on S^n; proof: λ_max(X) = sup_{‖y‖₂=1} yᵀXy
- ‖X‖₂ = σ₁(X) = (λ_max(XᵀX))^{1/2} is convex on R^{m×n}; proof: ‖X‖₂ = sup_{‖u‖₂=1} ‖Xu‖₂
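The convexity of t ↦ log det(X₀ + tH)⁻¹ can be checked numerically in the 2×2 case, where the determinant is available in closed form. A sketch (the random construction of X₀ ≻ 0 is an assumption for the test, not from the slides):

```python
import math
import random

# Check that f(t) = log det(X0 + tH)^{-1} = -log det(X0 + tH) is convex
# in t for fixed X0 ≻ 0 and symmetric H (2x2), via nonnegative centered
# second differences.

def det2(a, b, c):  # determinant of the symmetric matrix [[a, b], [b, c]]
    return a * c - b * b

random.seed(7)
ok = True
for _ in range(200):
    # X0 = M^T M + I is guaranteed positive definite
    m = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
    a0 = m[0][0] ** 2 + m[1][0] ** 2 + 1.0
    b0 = m[0][0] * m[0][1] + m[1][0] * m[1][1]
    c0 = m[0][1] ** 2 + m[1][1] ** 2 + 1.0
    ha, hb, hc = (random.uniform(-1, 1) for _ in range(3))

    def f(t):
        return -math.log(det2(a0 + t * ha, b0 + t * hb, c0 + t * hc))

    d = 1e-3
    ok &= f(d) - 2 * f(0.0) + f(-d) >= -1e-9
print(ok)  # True
```

Since the minimum eigenvalue of X₀ is at least 1, the small step d keeps X₀ + tH positive definite throughout.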
Minimizing over some variables

if h(x, y) is convex in (x, y), then

    f(x) = inf_y h(x, y)

is convex in x; corresponds to projection of the epigraph, (x, y, t) ↦ (x, t)
examples

- if S ⊆ R^n is convex, then the (minimum) distance to S, dist(x, S) = inf_{s∈S} ‖x − s‖, is convex in x
- if g is convex, then f(y) = inf{g(x) | Ax = y} is convex in y

  proof (assume A ∈ R^{m×n} has rank m): find B s.t. R(B) = N(A); then Ax = y iff x = Aᵀ(AAᵀ)⁻¹y + Bz for some z, and hence

      f(y) = inf_z g(Aᵀ(AAᵀ)⁻¹y + Bz)
Composition

one-dimensional case: f(x) = h(g(x)) (g : R^n → R, h : R → R) is convex if

- g convex; h convex, nondecreasing
- g concave; h convex, nonincreasing

proof (differentiable functions, x ∈ R): f″ = h″(g′)² + g″h′

examples:
- f(x) = exp g(x) is convex if g is convex
- f(x) = 1/g(x) is convex if g is concave and positive
- f(x) = g(x)^p, p ≥ 1, is convex if g is convex and positive
- f(x) = −∑ᵢ log(−fᵢ(x)) is convex on {x | fᵢ(x) < 0} if the fᵢ are convex
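The first composition rule (h convex nondecreasing, g convex ⟹ h∘g convex) can be checked on a concrete pair. A sketch with h = exp and g(x) = |x| (illustrative choices; f(x) = e^{|x|} is then convex even though it is not differentiable at 0):

```python
import math
import random

# Composition rule: h = exp (convex, nondecreasing) with g(x) = |x|
# (convex) gives f(x) = exp(|x|), which should satisfy the chord
# inequality on every sampled segment.

f = lambda x: math.exp(abs(x))

random.seed(4)
ok = all(
    f(th * x + (1 - th) * y) <= th * f(x) + (1 - th) * f(y) + 1e-9
    for x, y, th in ((random.uniform(-3, 3), random.uniform(-3, 3),
                      random.random()) for _ in range(1000))
)
print(ok)  # True
```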
Composition, k-dimensional case

f(x) = h(g₁(x), …, g_k(x)) with h : R^k → R, gᵢ : R^n → R is convex if

- h convex, nondecreasing in each argument; gᵢ convex
- h convex, nonincreasing in each argument; gᵢ concave
- etc.

proof (differentiable functions, n = 1):

    f″ = ∇hᵀ g″ + (g′)ᵀ ∇²h g′,   where g′ = (g₁′, …, g_k′), g″ = (g₁″, …, g_k″)

examples:
- f(x) = maxᵢ gᵢ(x) is convex if each gᵢ is
- f(x) = log ∑ᵢ exp gᵢ(x) is convex if each gᵢ is
Jensen's inequality

for f : R^n → R convex:

- two points: θ₁ + θ₂ = 1, θᵢ ≥ 0 ⟹ f(θ₁x₁ + θ₂x₂) ≤ θ₁f(x₁) + θ₂f(x₂)
- more than two points: ∑ᵢ θᵢ = 1, θᵢ ≥ 0 ⟹ f(∑ᵢ θᵢxᵢ) ≤ ∑ᵢ θᵢf(xᵢ)
- continuous version: p(x) ≥ 0, ∫ p(x) dx = 1 ⟹ f(∫ x p(x) dx) ≤ ∫ f(x) p(x) dx
- most general form: for any probability distribution on x, f(E x) ≤ E f(x)

these are all called Jensen's inequality
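The most general form f(E x) ≤ E f(x) is easy to see by Monte Carlo. A sketch for f(x) = x² with x uniform on [0, 1] (illustrative choices): f(E x) = 1/4 while E f(x) = 1/3, so the inequality is strict here.

```python
import random

# Jensen's inequality f(E x) <= E f(x), estimated by Monte Carlo for
# f(x) = x^2 and x ~ Uniform[0, 1]:
#   f(E x) = (1/2)^2 = 0.25,  E f(x) = E x^2 = 1/3.

random.seed(5)
samples = [random.random() for _ in range(200_000)]
mean_x = sum(samples) / len(samples)
mean_fx = sum(s * s for s in samples) / len(samples)

print(mean_x ** 2 < mean_fx)  # True: ≈ 0.25 vs ≈ 0.333
```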
interpretation of Jensen's inequality: (zero-mean) randomization or dithering increases the average value of a convex function

many (some people claim most) inequalities can be derived from Jensen's inequality

example: arithmetic-geometric mean inequality, a, b ≥ 0 ⟹ √(ab) ≤ (a + b)/2

proof: f(x) = log x is concave on {x | x > 0}, so for a, b > 0,

    (1/2)(log a + log b) ≤ log((a + b)/2)
Conjugate functions

the conjugate function of f : R^n → R is

    f*(y) = sup_{x ∈ dom f} (yᵀx − f(x))

f* is convex (even if f isn't); f* will be useful later
Examples

- f(x) = −log x (dom f = {x | x > 0}):

      f*(y) = sup_{x>0} (xy + log x)
            = { −1 − log(−y)   if y < 0
              { +∞             otherwise

- f(x) = xᵀPx (P ≻ 0):

      f*(y) = sup_x (yᵀx − xᵀPx) = (1/4) yᵀP⁻¹y
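The quadratic example can be verified numerically in one dimension. A sketch, assuming P = 1, so f(x) = x² and the formula above gives f*(y) = y²/4; the sup is approximated on a fine grid (the grid bounds are an assumption chosen to contain the maximizer x* = y/2):

```python
# Approximate f*(y) = sup_x (y x - f(x)) on a grid and compare with the
# closed form y^2 / 4 for f(x) = x^2 (the P = 1 quadratic example).

def conjugate_on_grid(f, y, lo=-50.0, hi=50.0, n=200_001):
    step = (hi - lo) / (n - 1)
    return max(y * (lo + i * step) - f(lo + i * step) for i in range(n))

for y in (-3.0, -1.0, 0.0, 2.0, 5.0):
    approx = conjugate_on_grid(lambda x: x * x, y)
    exact = 0.25 * y * y
    assert abs(approx - exact) < 1e-3
print("grid conjugate matches y^2/4")
```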
Quasiconvex functions

f : R^n → R is quasiconvex if every sublevel set

    S_α = {x ∈ dom f | f(x) ≤ α}

is convex

- quasiconvex functions can have locally flat regions
- f is quasiconcave if −f is quasiconvex, i.e., the superlevel sets {x | f(x) ≥ α} are convex
- a function that is both quasiconvex and quasiconcave is called quasilinear
- f convex (concave) ⟹ f quasiconvex (quasiconcave)
Examples

- f(x) = √|x| is quasiconvex on R
- f(x) = log x is quasilinear on R_++
- the linear-fractional function f(x) = (aᵀx + b)/(cᵀx + d) is quasilinear on the halfspace cᵀx + d > 0
- f(x) = ‖x − a‖₂ / ‖x − b‖₂ is quasiconvex on the halfspace {x | ‖x − a‖₂ ≤ ‖x − b‖₂}
- f(a) = degree(a₀ + a₁t + ⋯ + a_k t^k) is quasiconvex on R^{k+1} (its sublevel sets are subspaces)
Properties

- f is quasiconvex if and only if it is quasiconvex on lines, i.e., f(x₀ + th) is quasiconvex in t for all x₀, h
- modified Jensen's inequality: f is quasiconvex iff for all x, y ∈ dom f, θ ∈ [0, 1],

      f(θx + (1−θ)y) ≤ max{f(x), f(y)}
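The modified Jensen inequality gives a direct numerical test. A sketch (the test functions are illustrative): √|x| is quasiconvex on R, so the inequality holds on every chord, while sin x is not quasiconvex (its sublevel sets on R are unions of intervals), so random chords should expose violations.

```python
import math
import random

# Modified Jensen test: f is quasiconvex iff
#   f(θx + (1-θ)y) <= max{f(x), f(y)}.

def modified_jensen_holds(f, trials):
    return all(f(th * x + (1 - th) * y) <= max(f(x), f(y)) + 1e-12
               for x, y, th in trials)

random.seed(6)
trials = [(random.uniform(-6, 6), random.uniform(-6, 6), random.random())
          for _ in range(2000)]

print(modified_jensen_holds(lambda t: math.sqrt(abs(t)), trials))  # True
print(modified_jensen_holds(math.sin, trials))                     # False
```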
- for f differentiable, f quasiconvex ⟺ for all x, y ∈ dom f:

      f(y) ≤ f(x) ⟹ (y − x)ᵀ∇f(x) ≤ 0

  [figure: nested sublevel sets S_{α₁} ⊆ S_{α₂} ⊆ S_{α₃}, α₁ < α₂ < α₃, with ∇f(x) normal to the boundary at x]

- positive multiples: f quasiconvex, α ≥ 0 ⟹ αf quasiconvex
- pointwise maximum: f₁, f₂ quasiconvex ⟹ max{f₁, f₂} quasiconvex (extends to the supremum over an arbitrary set)
- affine transformation of domain: f quasiconvex ⟹ f(Ax + b) quasiconvex
- linear-fractional transformation of domain: f quasiconvex ⟹ f((Ax + b)/(cᵀx + d)) quasiconvex on cᵀx + d > 0
- composition with a monotone increasing function: f quasiconvex, g monotone increasing ⟹ g(f(x)) quasiconvex
- sums of quasiconvex functions are not quasiconvex in general
- f quasiconvex in (x, y) ⟹ g(x) = inf_y f(x, y) quasiconvex in x
Nested sets characterization

f quasiconvex ⟹ the sublevel sets S_α are convex and nested, i.e., α₁ ≤ α₂ ⟹ S_{α₁} ⊆ S_{α₂}

converse: if T_α is a nested family of convex sets, then f(x) = inf{α | x ∈ T_α} is quasiconvex

engineering interpretation: the T_α are specs, tighter for smaller α
Examples

FIR filter: H(ω) = a₀ + ∑_{k=1}^{N} a_k cos kω

[figure: |H(ω)/H(0)| in dB (0 dB to −50 dB) versus ω ∈ [0, π], with the 3 dB point at ω = f(a)]

the 3 dB bandwidth

    f(a) = inf{ω > 0 | 20 log₁₀(|H(ω)|/|H(0)|) ≤ −3.0}

is a quasiconcave function on {a ∈ R^{N+1} | H(0) > 0}

why? for H(0) > 0,

    f(a) ≥ ω₀ ⟺ H(ω) > H(0)/√2 for 0 ≤ ω < ω₀

… an (infinite) intersection of halfspaces
electron-beam lithography

- E ⊆ [0, 1] × [0, 1]: desired exposure region
- Eᶜ = [0, 1] × [0, 1] \ E: desired non-exposure region
- I(p): e-beam intensity at position p ∈ [0, 1] × [0, 1]:

      I(p) = ∑ᵢ xᵢ g(p − pᵢ),  i = 1, …, N

- xᵢ: intensity of the electron beam directed at pixel i
- g(p): given (point-spread) function
pattern transition width

define φ(x) as the minimum α s.t.

    I(p) ≥ 0.9 for dist(p, Eᶜ) ≥ α
    I(p) ≤ 0.1 for dist(p, E) ≥ α

[figure: transition region of width 2φ(x) around the boundary between E and Eᶜ]

φ(x) is quasiconvex
Log-concave functions

f : R^n → R_+ is log-concave (log-convex) if log f is concave (convex)

log-convex ⟹ convex; concave ⟹ log-concave

examples:
- normal density, f(x) = e^{−(1/2)(x − x₀)ᵀΣ⁻¹(x − x₀)}
- erfc, f(x) = (2/√π) ∫_x^∞ e^{−t²} dt
- indicator function of a convex set C:

      I_C(x) = { 1   x ∈ C
               { 0   x ∉ C
Properties

- the sum of log-concave functions is not always log-concave (but the sum of log-convex functions is log-convex)
- products: f, g log-concave ⟹ fg log-concave (immediate)
- integrals: f(x, y) log-concave in (x, y) ⟹ ∫ f(x, y) dy log-concave (not easy to show!)
- convolutions: f, g log-concave ⟹ ∫ f(x − y)g(y) dy log-concave (immediate from the properties above)
Log-concave probability densities

many common probability density functions are log-concave

- normal (Σ ≻ 0):

      f(x) = (1/√((2π)^n det Σ)) e^{−(1/2)(x − x̄)ᵀΣ⁻¹(x − x̄)}

- exponential (λᵢ > 0):

      f(x) = (∏ᵢ₌₁ⁿ λᵢ) e^{−(λ₁x₁ + ⋯ + λₙxₙ)},  x ∈ R^n_+

- uniform distribution on a convex (bounded) set C:

      f(x) = { 1/α   x ∈ C
             { 0     x ∉ C

  where α is the Lebesgue measure of C (i.e., length, area, volume, …)
Example: manufacturing yield

x_manu = x + v

- x ∈ R^n: nominal value of design parameters
- v ∈ R^n: manufacturing errors; zero-mean random variable
- S ⊆ R^n: specs, i.e., acceptable values of x_manu

the yield Y(x) = Prob(x + v ∈ S) is log-concave if

- S is a convex set
- the probability density of v is log-concave

[figure: yield contours from 10% to 80%]
example

S = {x ∈ R² | x₁ ≤ 1, x₂ ≤ 1}; v₁, v₂ independent, normal with σ = 1

    yield(x) = Prob(x + v ∈ S)
             = (1/2π) (∫_{−∞}^{1−x₁} e^{−t²/2} dt) (∫_{−∞}^{1−x₂} e^{−t²/2} dt)

[figure: yield contours 10%, 30%, 50%, 70%, 90%, 95%, 99% in the (x₁, x₂) plane, together with the spec set S]
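This yield can be evaluated in closed form with the standard normal CDF, since Prob(xᵢ + vᵢ ≤ 1) = Φ(1 − xᵢ). A sketch using the stdlib error function (the sampled chord used to illustrate log-concavity is an arbitrary choice):

```python
import math

# Yield for S = {x : x1 <= 1, x2 <= 1}, v1, v2 independent N(0, 1):
#   yield(x) = Φ(1 - x1) Φ(1 - x2),  Φ(z) = (1 + erf(z/√2)) / 2.

def phi(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def yield_prob(x1, x2):
    return phi(1.0 - x1) * phi(1.0 - x2)

# At the nominal point x = (0, 0): Φ(1)^2 ≈ 0.8413^2 ≈ 0.7079
print(round(yield_prob(0.0, 0.0), 4))

# log-concavity along one chord: log Y at the midpoint dominates the
# average of the endpoint log-yields
a, b = (0.0, 0.5), (-1.0, -0.5)
mid = ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)
print(math.log(yield_prob(*mid)) >=
      0.5 * math.log(yield_prob(*a)) + 0.5 * math.log(yield_prob(*b)))  # True
```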
example (continued): max yield vs. cost

manufacturing cost c = x₁ + 2x₂; the maximum yield for a given cost is

    Y_opt(c) = sup_{x₁ + 2x₂ = c, x₁, x₂ ≥ 0} Y(x)

Y_opt is log-concave, since

    log Y_opt(c) = sup_{x₁ + 2x₂ = c, x₁, x₂ ≥ 0} log Y(x₁, x₂)

is a partial maximization of the concave function log Y over a convex set

[figure: Y_opt(c), from 0.01% to 100% on a log scale, versus cost c]
K-convexity

a convex cone K ⊆ R^m induces a generalized inequality ⪯_K

f : R^n → R^m is K-convex if

    0 ≤ θ ≤ 1 ⟹ f(θx + (1−θ)y) ⪯_K θf(x) + (1−θ)f(y)

example: K is the PSD cone (called matrix convexity); f(X) = X² is K-convex on S^m

let's show that for θ ∈ [0, 1],

    (θX + (1−θ)Y)² ⪯ θX² + (1−θ)Y²     (1)

for any u ∈ R^m, uᵀX²u = ‖Xu‖₂² is a (quadratic) convex function of X, so

    uᵀ(θX + (1−θ)Y)²u ≤ θ uᵀX²u + (1−θ) uᵀY²u

which implies (1)
Solution to EE 67 Mid-erm Exam, Fall 207 November 2, 207 EE 67 Solution to Mid-erm Exam - Page 2 of 2 November 2, 207 (4 points) Convex sets (a) (2 points) Consider the set { } a R k p(0) =, p(t) for t
More informationIn English, this means that if we travel on a straight line between any two points in C, then we never leave C.
Convex sets In this section, we will be introduced to some of the mathematical fundamentals of convex sets. In order to motivate some of the definitions, we will look at the closest point problem from
More informationg 2 (x) (1/3)M 1 = (1/3)(2/3)M.
COMPACTNESS If C R n is closed and bounded, then by B-W it is sequentially compact: any sequence of points in C has a subsequence converging to a point in C Conversely, any sequentially compact C R n is
More informationAnalysis and Linear Algebra. Lectures 1-3 on the mathematical tools that will be used in C103
Analysis and Linear Algebra Lectures 1-3 on the mathematical tools that will be used in C103 Set Notation A, B sets AcB union A1B intersection A\B the set of objects in A that are not in B N. Empty set
More informationfor all subintervals I J. If the same is true for the dyadic subintervals I D J only, we will write ϕ BMO d (J). In fact, the following is true
3 ohn Nirenberg inequality, Part I A function ϕ L () belongs to the space BMO() if sup ϕ(s) ϕ I I I < for all subintervals I If the same is true for the dyadic subintervals I D only, we will write ϕ BMO
More informationLecture 8 Plus properties, merit functions and gap functions. September 28, 2008
Lecture 8 Plus properties, merit functions and gap functions September 28, 2008 Outline Plus-properties and F-uniqueness Equation reformulations of VI/CPs Merit functions Gap merit functions FP-I book:
More informationGeneral Notation. Exercises and Problems
Exercises and Problems The text contains both Exercises and Problems. The exercises are incorporated into the development of the theory in each section. Additional Problems appear at the end of most sections.
More informationLecture 8: Basic convex analysis
Lecture 8: Basic convex analysis 1 Convex sets Both convex sets and functions have general importance in economic theory, not only in optimization. Given two points x; y 2 R n and 2 [0; 1]; the weighted
More informationConvex Optimization for Signal Processing and Communications: From Fundamentals to Applications
Convex Optimization for Signal Processing and Communications: From Fundamentals to Applications Chong-Yung Chi Institute of Communications Engineering & Department of Electrical Engineering National Tsing
More informationLecture 4: Linear and quadratic problems
Lecture 4: Linear and quadratic problems linear programming examples and applications linear fractional programming quadratic optimization problems (quadratically constrained) quadratic programming second-order
More informationDescent methods. min x. f(x)
Gradient Descent Descent methods min x f(x) 5 / 34 Descent methods min x f(x) x k x k+1... x f(x ) = 0 5 / 34 Gradient methods Unconstrained optimization min f(x) x R n. 6 / 34 Gradient methods Unconstrained
More informationEE514A Information Theory I Fall 2013
EE514A Information Theory I Fall 2013 K. Mohan, Prof. J. Bilmes University of Washington, Seattle Department of Electrical Engineering Fall Quarter, 2013 http://j.ee.washington.edu/~bilmes/classes/ee514a_fall_2013/
More informationsubject to (x 2)(x 4) u,
Exercises Basic definitions 5.1 A simple example. Consider the optimization problem with variable x R. minimize x 2 + 1 subject to (x 2)(x 4) 0, (a) Analysis of primal problem. Give the feasible set, the
More informationConvex Optimization M2
Convex Optimization M2 Lecture 3 A. d Aspremont. Convex Optimization M2. 1/49 Duality A. d Aspremont. Convex Optimization M2. 2/49 DMs DM par email: dm.daspremont@gmail.com A. d Aspremont. Convex Optimization
More informationEE Applications of Convex Optimization in Signal Processing and Communications Dr. Andre Tkacenko, JPL Third Term
EE 150 - Applications of Convex Optimization in Signal Processing and Communications Dr. Andre Tkacenko JPL Third Term 2011-2012 Due on Thursday May 3 in class. Homework Set #4 1. (10 points) (Adapted
More information2. Linear algebra. matrices and vectors. linear equations. range and nullspace of matrices. function of vectors, gradient and Hessian
FE661 - Statistical Methods for Financial Engineering 2. Linear algebra Jitkomut Songsiri matrices and vectors linear equations range and nullspace of matrices function of vectors, gradient and Hessian
More information2 (Bonus). Let A X consist of points (x, y) such that either x or y is a rational number. Is A measurable? What is its Lebesgue measure?
MA 645-4A (Real Analysis), Dr. Chernov Homework assignment 1 (Due 9/5). Prove that every countable set A is measurable and µ(a) = 0. 2 (Bonus). Let A consist of points (x, y) such that either x or y is
More information5. Duality. Lagrangian
5. Duality Convex Optimization Boyd & Vandenberghe Lagrange dual problem weak and strong duality geometric interpretation optimality conditions perturbation and sensitivity analysis examples generalized
More informationAppendix PRELIMINARIES 1. THEOREMS OF ALTERNATIVES FOR SYSTEMS OF LINEAR CONSTRAINTS
Appendix PRELIMINARIES 1. THEOREMS OF ALTERNATIVES FOR SYSTEMS OF LINEAR CONSTRAINTS Here we consider systems of linear constraints, consisting of equations or inequalities or both. A feasible solution
More informationMinimizing Cubic and Homogeneous Polynomials over Integers in the Plane
Minimizing Cubic and Homogeneous Polynomials over Integers in the Plane Alberto Del Pia Department of Industrial and Systems Engineering & Wisconsin Institutes for Discovery, University of Wisconsin-Madison
More information2) Let X be a compact space. Prove that the space C(X) of continuous real-valued functions is a complete metric space.
University of Bergen General Functional Analysis Problems with solutions 6 ) Prove that is unique in any normed space. Solution of ) Let us suppose that there are 2 zeros and 2. Then = + 2 = 2 + = 2. 2)
More informationPower series solutions for 2nd order linear ODE s (not necessarily with constant coefficients) a n z n. n=0
Lecture 22 Power series solutions for 2nd order linear ODE s (not necessarily with constant coefficients) Recall a few facts about power series: a n z n This series in z is centered at z 0. Here z can
More information1 Introduction to Optimization
Unconstrained Convex Optimization 2 1 Introduction to Optimization Given a general optimization problem of te form min x f(x) (1.1) were f : R n R. Sometimes te problem as constraints (we are only interested
More informationMultivariable Calculus
2 Multivariable Calculus 2.1 Limits and Continuity Problem 2.1.1 (Fa94) Let the function f : R n R n satisfy the following two conditions: (i) f (K ) is compact whenever K is a compact subset of R n. (ii)
More informationd(x n, x) d(x n, x nk ) + d(x nk, x) where we chose any fixed k > N
Problem 1. Let f : A R R have the property that for every x A, there exists ɛ > 0 such that f(t) > ɛ if t (x ɛ, x + ɛ) A. If the set A is compact, prove there exists c > 0 such that f(x) > c for all x
More information