Practice Exam 1: Continuous Optimisation


Test exam: Continuous Optimisation 2015. Monday 14th December 2015.

1. Let $f : \mathbb{R}^m \to \mathbb{R}$ be a convex function and let $A \in \mathbb{R}^{m \times n}$, $b \in \mathbb{R}^m$ be given.

(a) Show that the function $g(x) := f(Ax + b)$ is a convex function of $x$ on $\mathbb{R}^n$. [3 points]

(b) Suppose that $f$ is strictly convex. Show that then $g(x) := f(Ax + b)$ is strictly convex if and only if $A$ has (full) rank $n$. Hint: Recall that $f$ is strictly convex if for any $y_1 \neq y_2$, $0 < \lambda < 1$ it holds: $f(\lambda y_1 + (1-\lambda) y_2) < \lambda f(y_1) + (1-\lambda) f(y_2)$. [4 points]

Solution.

(a) For $x, y \in \mathbb{R}^n$, $\lambda \in [0, 1]$ we find:
$$g(\lambda x + (1-\lambda) y) = f(A(\lambda x + (1-\lambda) y) + b) = f(\lambda Ax + (1-\lambda) Ay + \lambda b + (1-\lambda) b) = f(\lambda (Ax+b) + (1-\lambda)(Ay+b)) \leq \lambda f(Ax+b) + (1-\lambda) f(Ay+b) = \lambda g(x) + (1-\lambda) g(y),$$
where the inequality holds because $f$ is convex.

(b) $\operatorname{rank}(A) = n$ implies: $x \neq y \Rightarrow Ax \neq Ay$. As in (a), for $x \neq y$, $\lambda \in (0, 1)$ we obtain:
$$g(\lambda x + (1-\lambda) y) = f(\lambda (Ax+b) + (1-\lambda)(Ay+b)) < \lambda f(Ax+b) + (1-\lambda) f(Ay+b) = \lambda g(x) + (1-\lambda) g(y),$$
using that $f$ is strictly convex and $Ax + b \neq Ay + b$.

Assume $\operatorname{rank}(A) < n$. Then there exist $x \neq y$ with $Ax = Ay$, and for any $\lambda \in (0, 1)$ we obtain:
$$g(\lambda x + (1-\lambda) y) = f(\lambda (Ax+b) + (1-\lambda)(Ay+b)) = f(Ax+b) = g(x) = g(y) = \lambda g(x) + (1-\lambda) g(y).$$
So $g$ is not strictly convex.
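As a numerical illustration (not a proof, and not part of the exam), the inequalities above can be checked on sampled points. The sketch below assumes numpy and uses $f(u) = u^T u$ as a strictly convex test function; these choices are ours.

```python
# Sanity check of Question 1: for strictly convex f, g(x) = f(Ax + b)
# satisfies the convexity inequality, and strictness fails along the
# null space of a rank-deficient A.
import numpy as np

rng = np.random.default_rng(0)

def f(u):
    return np.dot(u, u)  # strictly convex on R^m

def g(x, A, b):
    return f(A @ x + b)

m, n = 4, 3
b = rng.standard_normal(m)

# Full-rank A: strict inequality for x != y.
A_full = rng.standard_normal((m, n))
x, y = rng.standard_normal(n), rng.standard_normal(n)
lam = 0.3
lhs = g(lam * x + (1 - lam) * y, A_full, b)
rhs = lam * g(x, A_full, b) + (1 - lam) * g(y, A_full, b)
print(lhs < rhs)  # True (strict inequality)

# Rank-deficient A: pick x != y with Ax = Ay; the inequality is tight,
# so g is not strictly convex.
A_def = np.outer(rng.standard_normal(m), rng.standard_normal(n))  # rank 1
_, _, Vt = np.linalg.svd(A_def)
null_dir = Vt[-1]            # a direction in the null space of A_def
x2, y2 = x, x + null_dir     # A_def @ x2 == A_def @ y2 (up to rounding)
lhs = g(lam * x2 + (1 - lam) * y2, A_def, b)
rhs = lam * g(x2, A_def, b) + (1 - lam) * g(y2, A_def, b)
print(np.isclose(lhs, rhs))  # True (equality, no strictness)
```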

2. For given $S \subseteq \mathbb{R}^n$ we define the convex hull $\operatorname{conv}(S)$ by
$$\operatorname{conv}(S) = \left\{ x = \sum_{i=1}^m \lambda_i x_i : m \in \mathbb{N},\ \sum_{i=1}^m \lambda_i = 1,\ x_i \in S,\ \lambda_i \geq 0\ \forall i \right\}.$$
Show that $\operatorname{conv}(S)$ is the smallest convex set containing $S$:

(a) Show that the set $\operatorname{conv}(S)$ is convex with $S \subseteq \operatorname{conv}(S)$. [3 points]

(b) Show that for any convex set $C$ containing $S$ we must have $\operatorname{conv}(S) \subseteq C$. (Hint: You may use without proof any Lemma/Theorem/Corollary from the course, except for Theorem 1.9.) [3 points]

Solution.

(a) For all $x \in S$, letting $m = 1$ and $\lambda_1 = 1$ we see that $x \in \operatorname{conv}(S)$, and thus $S \subseteq \operatorname{conv}(S)$.

To prove convexity we consider arbitrary $x, y \in \operatorname{conv}(S)$, $\lambda \in [0, 1]$. Then there exist $p, q \in \mathbb{N}$ and $x_1, \ldots, x_p \in S$ and $y_1, \ldots, y_q \in S$ and $\theta \in \mathbb{R}^p_+$ and $\psi \in \mathbb{R}^q_+$ such that
$$x = \sum_{i=1}^p \theta_i x_i, \quad \sum_{i=1}^p \theta_i = 1, \quad y = \sum_{i=1}^q \psi_i y_i, \quad \sum_{i=1}^q \psi_i = 1.$$
We then have
$$\lambda x + (1-\lambda) y = \sum_{i=1}^p \lambda \theta_i x_i + \sum_{i=1}^q (1-\lambda) \psi_i y_i,$$
with $\lambda \theta_i \geq 0$ for all $i = 1, \ldots, p$, $(1-\lambda) \psi_i \geq 0$ for all $i = 1, \ldots, q$, and
$$\sum_{i=1}^p \lambda \theta_i + \sum_{i=1}^q (1-\lambda) \psi_i = \lambda \sum_{i=1}^p \theta_i + (1-\lambda) \sum_{i=1}^q \psi_i = \lambda + (1-\lambda) = 1.$$
Therefore $\lambda x + (1-\lambda) y \in \operatorname{conv}(S)$, and as $x, y \in \operatorname{conv}(S)$, $\lambda \in [0, 1]$ were arbitrary, this implies that $\operatorname{conv}(S)$ is a convex set.

(b) Consider an arbitrary convex set $C \subseteq \mathbb{R}^n$ such that $S \subseteq C$. For $m \in \mathbb{N}$ let
$$S^m = \left\{ x = \sum_{i=1}^m \lambda_i x_i : \sum_{i=1}^m \lambda_i = 1,\ x_i \in S,\ \lambda_i > 0\ \forall i \right\}.$$
We then have $\operatorname{conv}(S) = \bigcup_{m=1}^\infty S^m$. We shall show by induction that $S^m \subseteq C$ for all $m$, and thus $\operatorname{conv}(S) = \bigcup_{m=1}^\infty S^m \subseteq C$.

For $m = 1$ we have $S^1 = S \subseteq C$. Now suppose for the sake of induction that $S^m \subseteq C$ and consider an arbitrary $x \in S^{m+1}$. We have $x = \sum_{i=1}^{m+1} \lambda_i x_i$ for some $x_i \in S$, $\lambda_i > 0$ for all $i$ and $\sum_{i=1}^{m+1} \lambda_i = 1$. Letting $\theta \in \mathbb{R}^m_+$ such that $\theta_i = \lambda_i / (1 - \lambda_{m+1})$ and letting $y = \sum_{i=1}^m \theta_i x_i$, we have $\sum_{i=1}^m \theta_i = 1$ and thus $y \in S^m \subseteq C$ by the induction hypothesis. We then have $y \in C$ and $x_{m+1} \in S \subseteq C$ and $\lambda_{m+1} \in [0, 1]$ and $x = (1 - \lambda_{m+1}) y + \lambda_{m+1} x_{m+1}$. As $C$ is a convex set, this implies that $x \in C$.
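For a finite set $S$, membership of a point in $\operatorname{conv}(S)$ is exactly the feasibility of the weights $\lambda_i$ in the definition, which is a linear program. A minimal sketch, assuming scipy is available; the helper name in_conv_hull and the test points are illustrative.

```python
# Test x in conv(S) for finite S by LP feasibility in the weights lambda.
import numpy as np
from scipy.optimize import linprog

def in_conv_hull(S, x):
    """S: (m, n) array of points; x: (n,) query point."""
    m, n = S.shape
    # Find lambda >= 0 with S^T lambda = x and sum(lambda) = 1.
    A_eq = np.vstack([S.T, np.ones((1, m))])
    b_eq = np.append(x, 1.0)
    res = linprog(c=np.zeros(m), A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * m)
    return res.success

S = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])  # a triangle
print(in_conv_hull(S, np.array([0.2, 0.2])))  # True: inside
print(in_conv_hull(S, np.array([1.0, 1.0])))  # False: outside
```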

3. Consider with $0 \neq c \in \mathbb{R}^n$ the program:
$$(P) \quad \min_{x \in \mathbb{R}^n} c^T x \quad \text{s.t.} \quad x^T x \leq 1.$$

(a) Show that $x^* = -\frac{c}{\|c\|}$ is the minimizer of (P) with minimum value $v(P) = -\|c\|$. (Here $\|x\|$ means the Euclidean norm.) [2 points]

(b) Compute the solution $y^*$ of the Lagrangian dual (D) of (P). Show in this way that for the optimal values strong duality holds, i.e., $v(D) = v(P)$. [4 points]

Solution.

(a) Either show this by a sketch, or as follows (using the Cauchy-Schwarz inequality): $\|x\| \leq 1$ implies
$$c^T x \geq -\|c\| \|x\| \geq -\|c\|,$$
and $c^T x = -\|c\|$ holds iff $x = -\frac{c}{\|c\|}$. So $x^* = -\frac{c}{\|c\|}$ is the minimizer with $v(P) = c^T \left(-\frac{c}{\|c\|}\right) = -\|c\|$. (Alternatively find $x^*$ by solving the KKT conditions.)

(b) The dual (D) is given by
$$(D) \quad \max_{y \geq 0} \psi(y), \quad \text{where} \quad \psi(y) := \min_{x \in \mathbb{R}^n} L(x, y),$$
with Lagrangian function $L(x, y) = c^T x + y (x^T x - 1)$. We find:

For $y = 0$: $\psi(0) = -\infty$.

For $y > 0$: the minimizer $x$ of $L(\cdot, y)$ satisfies $\nabla_x L(x, y) = c + 2yx = 0$, i.e. $x = -\frac{c}{2y}$. So (filling in)
$$\psi(y) = -\frac{1}{2y} c^T c + \frac{1}{4y} c^T c - y = -\frac{1}{4y} c^T c - y.$$
The Lagrangian dual problem is thus $\max_{y > 0} \left\{ -\frac{1}{4y} c^T c - y \right\}$. To find an (unconstrained) maximizer of $\psi(y)$ for $y > 0$ we solve
$$0 = \psi'(y) = \frac{1}{4y^2} c^T c - 1,$$
with solution $y^* = \frac{1}{2} \|c\|$. So $v(D) = \psi(y^*) = -\|c\| = v(P)$.
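The dual function and its maximiser can be checked numerically. The sketch below assumes numpy; the vector $c = (3, -4)$ and the grid are arbitrary illustrative choices.

```python
# Numerical check of Question 3: psi(y) = -c^T c/(4y) - y attains its
# maximum -||c|| at y = ||c||/2, matching the primal value of
# min c^T x over the unit ball.
import numpy as np

c = np.array([3.0, -4.0])          # ||c|| = 5
norm_c = np.linalg.norm(c)

def psi(y):
    return -np.dot(c, c) / (4 * y) - y

ys = np.linspace(1e-3, 10, 100001)
i = np.argmax(psi(ys))
print(ys[i], psi(ys[i]))           # approx. 2.5 and -5.0
print(norm_c / 2, -norm_c)         # exact maximiser and optimal value
```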

4. Consider the problem (in connection with the design of a cylindrical can with height $h$, radius $r$ and volume at least $2\pi$ such that the total surface area is minimal):
$$(P) \quad \min f(h, r) := 2\pi (r^2 + rh) \quad \text{s.t.} \quad \pi r^2 h \geq 2\pi \quad (\text{and } h > 0,\ r > 0).$$

(a) Compute a (the) solution $(\bar h, \bar r)$ of the KKT conditions of (P). Show that (P) is not a convex optimization problem. [4 points]

(b) Show that the solution $(\bar h, \bar r)$ in (a) is a local minimizer. Why is it the unique global solution? Hint: Use the sufficient optimality conditions. [3 points]

Solution.

(a) We first note that the functions $f(h, r) = 2\pi (r^2 + rh)$ and $g(h, r) := -\pi r^2 h + 2\pi$ are not convex (for $h > 0$). For the objective function $f$, e.g., this follows by:
$$\nabla f = 2\pi \begin{pmatrix} r \\ 2r + h \end{pmatrix}, \qquad \nabla^2 f = 2\pi \begin{pmatrix} 0 & 1 \\ 1 & 2 \end{pmatrix}.$$
Since $\det \begin{pmatrix} 0 & 1 \\ 1 & 2 \end{pmatrix} = -1 < 0$, the Hessian $\nabla^2 f$ is not positive semidefinite; by the second-order characterisation of convexity, $f$ is thus not a convex function and we do not have a convex optimisation problem.

We now consider the KKT conditions: $\nabla f = -\mu \nabla g$, $g \leq 0$, $\mu \geq 0$, $\mu g = 0$. So consider:
$$2\pi \begin{pmatrix} r \\ 2r + h \end{pmatrix} = \mu \pi \begin{pmatrix} r^2 \\ 2rh \end{pmatrix}. \qquad (*)$$

Case $\mu = 0$: leads to $2\pi (r, 2r+h)^T = 0$ with solution $(h, r) = (0, 0)$, which is not feasible.

Case $\mu > 0$, and thus $g = 0$, or equivalently $\pi r^2 h = 2\pi$: the two equations in $(*)$ lead to $\mu = 2/r$ and then $2(2r + h) = \frac{2}{r} \cdot 2rh$, i.e. $h = 2r$. By using the (active) constraint we find $\pi r^2 h = 2\pi r^3 = 2\pi$ with solution $r = 1$. So the unique KKT solution is given by $(\bar h, \bar r) = (2, 1)$, $\mu = 2$.

(b) (We apply the second-order sufficient conditions to the nonconvex program (P).) So we will show, for the set $C = \{ d \in \mathbb{R}^2 : \nabla f^T d \leq 0,\ \nabla g^T d \leq 0 \}$:
$$d^T \nabla^2_{h,r} L(\bar h, \bar r, \mu)\, d > 0 \quad \forall d \in C \setminus \{0\}. \qquad (**)$$
We compute
$$\nabla f(\bar h, \bar r) = 2\pi \begin{pmatrix} 1 \\ 4 \end{pmatrix}, \quad \nabla g(\bar h, \bar r) = -\pi \begin{pmatrix} 1 \\ 4 \end{pmatrix}, \quad \nabla^2 L(\bar h, \bar r, \mu) = 2\pi \begin{pmatrix} 0 & 1 \\ 1 & 2 \end{pmatrix} + 2 \cdot (-2\pi) \begin{pmatrix} 0 & 1 \\ 1 & 2 \end{pmatrix} = -2\pi \begin{pmatrix} 0 & 1 \\ 1 & 2 \end{pmatrix},$$
and
$$C = \{ d \in \mathbb{R}^2 : \nabla f(\bar h, \bar r)^T d \leq 0,\ \nabla g(\bar h, \bar r)^T d \leq 0 \} = \{ d \in \mathbb{R}^2 : d_1 + 4 d_2 \leq 0,\ d_1 + 4 d_2 \geq 0 \} = \left\{ \lambda \begin{pmatrix} 4 \\ -1 \end{pmatrix} : \lambda \in \mathbb{R} \right\}.$$
For $d = \lambda (4, -1)^T \neq 0$ (i.e., $\lambda \neq 0$) we obtain (see $(**)$):
$$d^T \nabla^2 L(\bar h, \bar r, \mu)\, d = \lambda^2 \, (4, -1) \, (-2\pi) \begin{pmatrix} 0 & 1 \\ 1 & 2 \end{pmatrix} \begin{pmatrix} 4 \\ -1 \end{pmatrix} = 12 \lambda^2 \pi > 0 \quad \forall \lambda \neq 0.$$
So $(\bar h, \bar r) = (2, 1)$ is a local minimizer. It is the unique (global) minimizer since the point is the only KKT point. Note that since the linear independence constraint qualification holds ($\nabla g = -\pi (r^2, 2rh)^T \neq 0$ for $r, h > 0$), any local minimizer must be a KKT point. Also note that $f(h, r) \to \infty$ on the feasible set as $\|(h, r)\| \to \infty$, so a global minimizer exists. (To show the latter fact is technically involved and was not expected to be done.)
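The KKT point can be cross-checked with a numerical solver. A minimal sketch assuming scipy; the starting point and the positivity bounds are illustrative choices of ours.

```python
# Numerical cross-check of Question 4: minimising the surface area
# 2*pi*(r^2 + r*h) subject to pi*r^2*h >= 2*pi recovers (h, r) = (2, 1).
import numpy as np
from scipy.optimize import minimize

def area(v):
    h, r = v
    return 2 * np.pi * (r**2 + r * h)

# Inequality constraint in scipy's convention: fun(v) >= 0.
cons = [{"type": "ineq",
         "fun": lambda v: np.pi * v[1]**2 * v[0] - 2 * np.pi}]
res = minimize(area, x0=[1.0, 1.0], constraints=cons,
               bounds=[(1e-6, None), (1e-6, None)])
print(res.x)    # approx. [2.0, 1.0]
print(res.fun)  # approx. 6*pi = 18.85
```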

5. Consider the closed set
$$K = \{ x \in \mathbb{R}^2 : x_1 + 2 x_2 \geq 0 \text{ and } 3 x_1 + x_2 \geq 0 \}.$$

(a) Prove that $K$ is a proper cone. [You may assume closure.] [5 points]

(b) Find the dual cone to $K$. [1 point]

Solution.

(a) In order for a set to be a proper cone it must be a closed, convex, pointed, full-dimensional cone. We will assume closure and prove the rest.

Convex cone: Consider arbitrary $x, y \in K$ and $\lambda_1, \lambda_2 > 0$. From Theorem 7.2, if we can show that $\lambda_1 x + \lambda_2 y \in K$ then we are done. We have
$$x_1 + 2 x_2 \geq 0, \quad 3 x_1 + x_2 \geq 0, \quad \lambda_1 > 0, \qquad y_1 + 2 y_2 \geq 0, \quad 3 y_1 + y_2 \geq 0, \quad \lambda_2 > 0.$$
This implies that
$$(\lambda_1 x + \lambda_2 y)_1 + 2 (\lambda_1 x + \lambda_2 y)_2 = \lambda_1 (x_1 + 2 x_2) + \lambda_2 (y_1 + 2 y_2) \geq 0,$$
$$3 (\lambda_1 x + \lambda_2 y)_1 + (\lambda_1 x + \lambda_2 y)_2 = \lambda_1 (3 x_1 + x_2) + \lambda_2 (3 y_1 + y_2) \geq 0.$$
Therefore $\lambda_1 x + \lambda_2 y \in K$.

Full-dimensional: Using Definition 7.8, part 2, this follows from the space being two-dimensional and $K$ containing the two linearly independent vectors $\begin{pmatrix} 1 \\ 0 \end{pmatrix}, \begin{pmatrix} 0 \\ 1 \end{pmatrix} \in K$.

Pointed: We will consider an arbitrary $x \in \mathbb{R}^2$ such that $\pm x \in K$. Using Definition 7.7, if we can then show that $x = 0$ then we are done. We have
$$\left. \begin{aligned} x_1 + 2 x_2 &\geq 0 \\ -x_1 + 2 (-x_2) &\geq 0 \end{aligned} \right\} \Rightarrow x_1 + 2 x_2 = 0, \qquad \left. \begin{aligned} 3 x_1 + x_2 &\geq 0 \\ 3 (-x_1) + (-x_2) &\geq 0 \end{aligned} \right\} \Rightarrow 3 x_1 + x_2 = 0.$$
Therefore
$$x_1 = \tfrac{2}{5} \underbrace{(3 x_1 + x_2)}_{=0} - \tfrac{1}{5} \underbrace{(x_1 + 2 x_2)}_{=0} = 0, \qquad x_2 = \underbrace{(3 x_1 + x_2)}_{=0} - 3 \underbrace{x_1}_{=0} = 0.$$

(b) From Theorem 8.7 we have
$$K^* = \operatorname{cl} \operatorname{conic} \left\{ \begin{pmatrix} 1 \\ 2 \end{pmatrix}, \begin{pmatrix} 3 \\ 1 \end{pmatrix} \right\} = \operatorname{conic} \left\{ \begin{pmatrix} 1 \\ 2 \end{pmatrix}, \begin{pmatrix} 3 \\ 1 \end{pmatrix} \right\}.$$
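The cone and its claimed dual can be probed by sampling. The sketch below assumes numpy; the sample sizes and the rounding tolerance are arbitrary choices, so this is a sanity check rather than a proof.

```python
# Sanity check of Question 5(b): every sampled y in conic{(1,2), (3,1)}
# satisfies y^T x >= 0 for all sampled x in K.
import numpy as np

rng = np.random.default_rng(1)
a1, a2 = np.array([1.0, 2.0]), np.array([3.0, 1.0])

# Sample points of K = {x : a1.x >= 0, a2.x >= 0} by rejection.
X = rng.standard_normal((20000, 2))
K = X[(X @ a1 >= 0) & (X @ a2 >= 0)]

# Sample points of the claimed dual cone conic{a1, a2}.
W = rng.random((1000, 2))            # nonnegative weights
Y = W @ np.vstack([a1, a2])          # y = w1*a1 + w2*a2

print(np.all(Y @ K.T >= -1e-9))      # True: y^T x >= 0 on all samples
```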

6. We will consider bounds on the optimal value of the following problem:
$$(A) \quad \min_x \ 5 x_1^2 - x_1 x_2 - 2 x_1 + x_2^2 + 2 \quad \text{s.t.} \quad x_1^2 + 5 x_2^2 - x_1 x_2 - 8 x_2 = 4, \quad x \in \mathbb{R}^2.$$

(a) Give a finite upper bound on the optimal value of problem (A). [1 point]

(b) Formulate a positive semidefinite optimisation problem to give a lower bound on the optimal value of problem (A). [2 points]

(c) Give the dual problem to the positive semidefinite optimisation problem you formulated in part (b) of this question. [1 point]

Solution.

(a) To find an upper bound we can use any feasible point $x$. If we limit our search to feasible points with $x_2 = 0$ then we have a feasible point if and only if
$$4 = x_1^2 + 5 \cdot 0^2 - x_1 \cdot 0 - 8 \cdot 0 = x_1^2.$$
Therefore both $x = (2, 0)$ and $x = (-2, 0)$ are feasible points. We only need one point to give us an upper bound, and if we consider the feasible point $x = (2, 0)$ then this gives us the upper bound of
$$5 x_1^2 - x_1 x_2 - 2 x_1 + x_2^2 + 2 = 5 \cdot 2^2 - 2 \cdot 0 - 2 \cdot 2 + 0^2 + 2 = 20 - 0 - 4 + 0 + 2 = 18.$$
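The feasibility of $(\pm 2, 0)$ and the bound 18 are simple arithmetic, checked below in a short numpy illustration (not required by the exam).

```python
# Quick check of Question 6(a): x = (2, 0) and x = (-2, 0) satisfy the
# constraint, and (2, 0) gives objective value 18.
import numpy as np

def obj(x):
    return 5 * x[0]**2 - x[0] * x[1] - 2 * x[0] + x[1]**2 + 2

def con(x):
    return x[0]**2 + 5 * x[1]**2 - x[0] * x[1] - 8 * x[1]

for x in (np.array([2.0, 0.0]), np.array([-2.0, 0.0])):
    print(con(x) == 4.0, obj(x))   # True 18.0, then True 26.0
```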

(b) Problem (A) is equivalent to
$$\min \ \left\langle \begin{pmatrix} 5 & -\tfrac{1}{2} \\ -\tfrac{1}{2} & 1 \end{pmatrix}, x x^T \right\rangle - 2 x_1 + 2 \quad \text{s.t.} \quad \left\langle \begin{pmatrix} 1 & -\tfrac{1}{2} \\ -\tfrac{1}{2} & 5 \end{pmatrix}, x x^T \right\rangle - 8 x_2 = 4, \quad \begin{pmatrix} 1 & x^T \\ x & x x^T \end{pmatrix} \in \mathcal{PSD}_3, \quad x \in \mathbb{R}^2.$$
A lower bound on this is then provided by solving the optimisation problem (in variables $x \in \mathbb{R}^2$ and symmetric $X \in \mathbb{R}^{2 \times 2}$)
$$\min \ \left\langle \begin{pmatrix} 5 & -\tfrac{1}{2} \\ -\tfrac{1}{2} & 1 \end{pmatrix}, X \right\rangle - 2 x_1 + 2 \quad \text{s.t.} \quad \left\langle \begin{pmatrix} 1 & -\tfrac{1}{2} \\ -\tfrac{1}{2} & 5 \end{pmatrix}, X \right\rangle - 8 x_2 = 4, \quad \begin{pmatrix} 1 & x^T \\ x & X \end{pmatrix} \in \mathcal{PSD}_3.$$

(c) Writing $Y$ for the matrix $\begin{pmatrix} 1 & x^T \\ x & X \end{pmatrix}$, the previous optimisation problem is equivalent to
$$\min \ \left\langle \begin{pmatrix} 2 & -1 & 0 \\ -1 & 5 & -\tfrac{1}{2} \\ 0 & -\tfrac{1}{2} & 1 \end{pmatrix}, Y \right\rangle \quad \text{s.t.} \quad \left\langle \begin{pmatrix} 0 & 0 & -4 \\ 0 & 1 & -\tfrac{1}{2} \\ -4 & -\tfrac{1}{2} & 5 \end{pmatrix}, Y \right\rangle = 4, \quad \left\langle \begin{pmatrix} 1 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}, Y \right\rangle = 1, \quad Y \in \mathcal{PSD}_3.$$
We then have that the dual problem is
$$\max \ 4 y_1 + y_2 \quad \text{s.t.} \quad \begin{pmatrix} 2 & -1 & 0 \\ -1 & 5 & -\tfrac{1}{2} \\ 0 & -\tfrac{1}{2} & 1 \end{pmatrix} - y_1 \begin{pmatrix} 0 & 0 & -4 \\ 0 & 1 & -\tfrac{1}{2} \\ -4 & -\tfrac{1}{2} & 5 \end{pmatrix} - y_2 \begin{pmatrix} 1 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix} \in \mathcal{PSD}_3.$$
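The relaxation in part (b) can also be solved directly with an SDP modelling tool. A minimal sketch assuming the cvxpy package (not part of the course materials); $Y$ stands for the block matrix $\begin{pmatrix} 1 & x^T \\ x & X \end{pmatrix}$, and the data matrices are the ones written out above.

```python
# Solve the SDP relaxation of Question 6(b); its optimal value is a
# lower bound on the optimal value of problem (A).
import numpy as np
import cvxpy as cp

C  = np.array([[2.0, -1.0, 0.0], [-1.0, 5.0, -0.5], [0.0, -0.5, 1.0]])
A1 = np.array([[0.0, 0.0, -4.0], [0.0, 1.0, -0.5], [-4.0, -0.5, 5.0]])

Y = cp.Variable((3, 3), symmetric=True)
constraints = [Y >> 0,                 # Y positive semidefinite
               Y[0, 0] == 1,           # homogenisation entry
               cp.trace(A1 @ Y) == 4]  # relaxed quadratic constraint
prob = cp.Problem(cp.Minimize(cp.trace(C @ Y)), constraints)
prob.solve()
print(prob.value)  # a lower bound on the optimal value of (A), <= 18
```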

7. (Automatic additional points) [4 points]

Question: 1 | 2 | 3 | 4 | 5 | 6 | 7 | Total
Points:   7 | 6 | 6 | 7 | 6 | 4 | 4 | 40

A copy of the lecture sheets may be used during the examination. Good luck!