Additional Exercises for Introduction to Nonlinear Optimization
Amir Beck
March 16, 2017

Chapter 1 - Mathematical Preliminaries

1.1 Let $S \subseteq \mathbb{R}^n$.
(a) Suppose that $T$ is an open set satisfying $T \subseteq S$. Prove that $T \subseteq \operatorname{int}(S)$.
(b) Prove that the complement of $\operatorname{int}(S)$ is the closure of the complement of $S$.
(c) Do $S$ and $\operatorname{cl}(S)$ always have the same interiors?

1.2 For any $x \in \mathbb{R}^n$ and any nonzero vector $d \in \mathbb{R}^n$, compute the directional derivative $f'(x; d)$ of
$$f(x) = \|x - a\|_2^{\delta} + \max\{c_1^T x + \beta_1, c_2^T x + \beta_2\},$$
where $a, c_1, c_2 \in \mathbb{R}^n$, $\delta \in \mathbb{R}_{++}$ and $\beta_1, \beta_2 \in \mathbb{R}$.

Chapter 2 - Optimality Conditions for Unconstrained Optimization

2.1 For each of the following matrices determine, without computing eigenvalues, the intervals of $\alpha$ for which they are positive definite/negative definite/positive semidefinite/negative semidefinite/indefinite:
(a) $B_{\alpha} = \begin{pmatrix} 1 & \alpha & 1 \\ \alpha & 4 & \alpha \\ 1 & \alpha & 1 \end{pmatrix}$.
(b) $E_{\alpha} = \begin{pmatrix} 1 & \alpha & 0 & 0 \\ \alpha & 2 & \alpha & 0 \\ 0 & \alpha & 2 & \alpha \\ 0 & 0 & \alpha & 1 \end{pmatrix}$.

2.2 Let $A \in \mathbb{R}^{n \times n}$ be a symmetric matrix. Assume that there exist two indices $i \neq j$ for which $A_{jj}, A_{ij} \neq 0$ and $A_{ii} = 0$. Prove that $A$ is indefinite.

2.3 Let $f(x_1, x_2) = x_1^2 - 2x_1 x_2^2 + \frac{1}{2} x_2^4$.
(a) Is the function $f$ coercive? Explain your answer.
(b) Find the stationary points of $f$ and classify them (strict/non-strict, local/global, minimum/maximum or a saddle point).
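Exercise 2.1 asks for an eigenvalue-free argument (for instance, via principal minors), but a quick numerical sweep is a convenient way to sanity-check the intervals one derives. A minimal MATLAB sketch for part (a); the grid and the tolerance are arbitrary choices, and eig is used here only as a cross-check, not as the requested proof device:

    % Numerical cross-check for Exercise 2.1(a) (illustration only).
    for alpha = -3:0.5:3
        B = [1 alpha 1; alpha 4 alpha; 1 alpha 1];
        ev = eig(B);            % cross-check only; the exercise asks to avoid this
        tol = 1e-10;
        if all(ev > tol)
            s = 'positive definite';
        elseif all(ev > -tol)
            s = 'positive semidefinite';
        elseif all(ev < -tol)
            s = 'negative definite';
        elseif all(ev < tol)
            s = 'negative semidefinite';
        else
            s = 'indefinite';
        end
        fprintf('alpha = %5.2f: %s\n', alpha, s);
    end

The same loop with $E_{\alpha}$ in place of $B_{\alpha}$ checks part (b).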

2.4 Consider the function $f(x, y) = x^2 - x^2 y^2 + y^4$. Find all the stationary points of $f$ and classify them (strict/non-strict, local/global, minimum/maximum or a saddle point).

Chapter 3 - Least Squares

3.1 Let $A \in \mathbb{R}^{m \times n}$, $b \in \mathbb{R}^m$, $L \in \mathbb{R}^{p \times n}$ and $\lambda \in \mathbb{R}_{++}$. Consider the function
$$f(x) = \|Ax - b\|_2^2 + \lambda \|Lx\|_1.$$
(a) Show that if $f$ is coercive, then $\operatorname{Null}(A) \cap \operatorname{Null}(L) = \{0\}$.
(b) Show that the converse also holds, i.e., if $\operatorname{Null}(A) \cap \operatorname{Null}(L) = \{0\}$, then $f$ is coercive.

Chapter 6 - Convex Sets

6.1 Show that the following set is not convex:
$$S = \{x \in \mathbb{R}^2 : 3x_1^2 + x_2^2 + 4x_1 x_2 - x_1 + 4x_2 \le 10\}.$$

6.2 (a) Prove that the extreme points of $\Delta_n = \{x \in \mathbb{R}^n : \sum_{i=1}^n x_i = 1, x_i \ge 0\}$ are given by $\{e_1, e_2, \ldots, e_n\}$.
(b) For each of the following sets specify the corresponding extreme points:
(i) $\{x : e^T x \le 1, x \ge 0\}$.
(ii) $B[c, r]$, where $c \in \mathbb{R}^n$ and $r > 0$.
(iii) $\{(x_1, x_2)^T : 9x_1^2 + 16x_2^2 + 24x_1 x_2 - 6x_1 - 8x_2 + 1 \le 0, x_1 \ge 0, x_2 \ge 0\}$.

Chapter 7 - Convex Functions

7.1 Find the optimal solution of the problem

    max  $2x_1^2 + x_2^2 + x_3^2 + 2x_1 - 3x_2 + 4x_3$
    s.t. $x_1 + x_2 + x_3 = 1$,
         $x_1, x_2, x_3 \ge 0$.

7.2 Let $f : \mathbb{R}^n \to \mathbb{R}$ be convex, and let $A \in \mathbb{R}^{m \times n}$ be an $m \times n$ matrix. Consider the function $h$ defined by
$$h(y) = \min_{x} \{f(x) : Ax = y\},$$
where we assume that $h(y) > -\infty$ for all $y$. Show that $h$ is convex.
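A note on Exercise 7.1 above: a convex function attains its maximum over a nonempty compact convex set at an extreme point, and by Exercise 6.2(a) the extreme points of the unit simplex are the standard basis vectors. Under the objective as reconstructed above (some coefficient signs are hard to recover from the source), the problem therefore reduces to comparing three numbers. A minimal MATLAB check:

    % Exercise 7.1 (sketch): compare the objective at the simplex vertices.
    % The objective below follows the reconstruction in the text above.
    f = @(x) 2*x(1)^2 + x(2)^2 + x(3)^2 + 2*x(1) - 3*x(2) + 4*x(3);
    V = eye(3);                              % columns are the vertices e_1, e_2, e_3
    vals = arrayfun(@(j) f(V(:, j)), 1:3);
    [fmax, jmax] = max(vals);                % optimal value and maximizing vertex
    fprintf('max value %g attained at e_%d\n', fmax, jmax);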

Chapter 8 - Convex Optimization

8.1 Consider the problem

    min  $\max\{x_1 - x_2, x_2 - x_3, x_1 - x_3\} + x_1^2 + 2x_2^2 + 3x_3^2 - 2x_2 x_3$
    s.t. $\dfrac{(4x_1^2 + 6x_2^2 - 2x_1 x_2 + 1)^4 + x_3^2}{x_1 + x_2} \le 150$,
         $x_1 + x_2 \ge 1$.

(a) Show that it is convex.
(b) Write a CVX code that solves it.
(c) Write down the optimal solution (by running CVX).

8.2 Consider the following convex optimization problem:

    min  $\sqrt{2x_1^2 + 4x_1 x_2 + 3x_2^2 + 1} + 7$
    s.t. $((x_1^2 + x_2^2)^2 + 1)^2 \le 100x_1 - (x_1^2 + 4x_2^2 + 4x_1 x_2)$,
         $2x_1 + x_2 + x_3 \le 10$,
         $1 \le x_1, x_2, x_3 \le 10$.

(a) Prove that the problem is convex.
(b) Write a CVX code for solving the problem.

8.3 Prove that the following problem is convex in the sense that it consists of minimizing a convex function over a convex feasible set:

    min  $\log\left(e^{x_1 - x_2} + e^{x_2 + x_3}\right)$
    s.t. $x_1^2 + x_2^2 + 2x_3^2 + 2x_1 x_2 + 2x_1 x_3 + 2x_2 x_3 \le 1$,
         $(x_1 + x_2 + 2x_3)(2x_1 + 4x_2 + x_3)(x_1 + x_2 + x_3) \ge 1$,
         $e^{e^{x_1}} + [x_2]_+^3 \le 7$,
         $x_1, x_2, x_3 \ge \frac{1}{10}$.

8.4 Consider the problem

    (Q)  min  $ax_1^2 + bx_2^2 + cx_1 x_2$
         s.t. $1 \le x_1 \le 2$,
              $0 \le x_2 \le x_1$,

where $a, b, c \in \mathbb{R}$.
(a) Prove that there exists a minimizer for problem (Q).
(b) Prove that if $a < 0$, $b < 0$ and $c^2 - 4ab \le 0$, then the optimal value of problem (Q) is $4\min\{a, a + b + c\}$.
(c) Prove that if $a > 0$, $b > 0$ and $c^2 > 4ab$, then problem (Q) has a unique solution.
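Returning to part (b) of Exercise 8.1: the following CVX sketch assumes the formulas as reconstructed above. The fractional constraint is multiplied through by $x_1 + x_2$, which is valid because the second constraint forces $x_1 + x_2 \ge 1 > 0$, and the convex quadratics are expressed through quad_form so that the model is DCP-compliant:

    % CVX sketch for Exercise 8.1 (under the reconstruction given above).
    cvx_begin
        variables x1 x2 x3
        minimize( max([x1 - x2; x2 - x3; x1 - x3]) ...
                  + quad_form([x1; x2; x3], [1 0 0; 0 2 -1; 0 -1 3]) )
        subject to
            % (g(x)^4 + x3^2)/(x1+x2) <= 150, cleared of its denominator:
            pow_pos(quad_form([x1; x2], [4 -1; -1 6]) + 1, 4) ...
                + square(x3) <= 150*(x1 + x2);
            x1 + x2 >= 1;
    cvx_end

A similar pattern (norm/quad_form/pow_pos/square atoms plus denominator clearing) handles Exercise 8.2.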

8.5 Consider the optimization problem

    min  $\sqrt{x_1^2 + 4x_1 x_2 + 5x_2^2 + 2x_1 + 6x_2 + 5} - \dfrac{x_2}{x_2 + 1}$
    s.t. $\left(\dfrac{1}{x_1} + \dfrac{1}{x_2}\right)^2 + \dfrac{x_1^4 - x_2^2}{x_1^2 + x_2} \le 7$,
         $x_2 \ge 1$.

(a) Prove that $x_1^2 + 4x_1 x_2 + 5x_2^2 + 2x_1 + 6x_2 + 5 \ge 0$ for any $x_1, x_2$.
(b) Prove that the problem is convex.
(c) Write a CVX code that solves the problem.

8.6 Consider the problem
$$\max\{g(y) : f_1(y) \le 0, f_2(y) \le 0\},$$
where $g : \mathbb{R}^n \to \mathbb{R}$ is concave and $f_1, f_2 : \mathbb{R}^n \to \mathbb{R}$ are convex. Assume that the problem $\max\{g(y) : f_1(y) \le 0\}$ has a unique solution $\tilde{y}$. Let $Y$ be the optimal set of the first problem. Prove that exactly one of the following two options holds:
(i) $f_2(\tilde{y}) \le 0$, and in this case $Y = \{\tilde{y}\}$;
(ii) $f_2(\tilde{y}) > 0$, and in this case $Y = \operatorname{argmax}\{g(y) : f_1(y) \le 0, f_2(y) = 0\}$.

8.7 Show that the following optimization problem can be cast as a convex optimization problem, and write a CVX code for solving it:

    min  $\dfrac{4x_1^2 + 2x_1 x_2 + 5x_2^2}{x_1 + x_2}$
    s.t. $x_1 + \dfrac{1}{x_1 + x_2} + \dfrac{(x_1^2 + x_2^2 + 1)^2}{x_1^2 + 1} \le x_2 - \dfrac{1}{x_2}$,
         $x_2 \le \sqrt{2x_1 + 3x_2}$,
         $x_1, x_2 \ge 1$.

8.8 Show that the following is a convex optimization problem and write a CVX code for solving it:

    min  $x_1^2 + (4x_1 + 5x_2)^2 - (3x_1 + 4x_2 + 1)^2$
    s.t. $\dfrac{x_1^2 + x_2^2 + 1}{x_2} - \dfrac{x_1^2 - x_2^2}{x_2} \le \min\{x_2 - x_1 + 3x_2, 7\}$,
         $x_2 \ge 1$.

Chapter 9 - Optimization over a Convex Set

9.1 Consider the set
$$\operatorname{Box}[l, u] = \{x \in \mathbb{R}^n : l_i \le x_i \le u_i, i = 1, 2, \ldots, n\},$$

where $l, u \in \mathbb{R}^n$ are given vectors satisfying $l \le u$. Consider the minimization problem
$$\min\{f(x) : x \in \operatorname{Box}[l, u]\},$$
where $f$ is a continuously differentiable function over $\operatorname{Box}[l, u]$. Prove that $x^* \in \operatorname{Box}[l, u]$ is a stationary point of this problem if and only if for every $i = 1, \ldots, n$:
$$\frac{\partial f}{\partial x_i}(x^*) \begin{cases} = 0, & l_i < x_i^* < u_i, \\ \le 0, & x_i^* = u_i, \\ \ge 0, & x_i^* = l_i. \end{cases}$$

9.2 In the source localization problem [1] we are given $m$ locations of sensors $a_1, a_2, \ldots, a_m \in \mathbb{R}^n$ and approximate distances between the sensors and an unknown source located at $x \in \mathbb{R}^n$:
$$d_i \approx \|x - a_i\|_2.$$
The problem is to find/estimate $x$ given the locations $a_1, a_2, \ldots, a_m$ and the approximate distances $d_1, d_2, \ldots, d_m$. The following natural formulation of the problem as a minimization problem was introduced in Exercise 4.5:

    (SL)  $\min\limits_{x \in \mathbb{R}^n} \left\{\sum_{i=1}^m \left(\|x - a_i\|_2 - d_i\right)^2\right\}.$

(a) Show that problem (SL) is equivalent to the problem

    (SL2)  min  $f(x, u_1, u_2, \ldots, u_m) \equiv \sum_{i=1}^m \left[\|x - a_i\|_2^2 - 2d_i u_i^T (x - a_i) + d_i^2\right]$
           s.t. $\|u_i\|_2 \le 1$, $i = 1, \ldots, m$,
                $x \in \mathbb{R}^n$,

in the sense that $x$ is an optimal solution of (SL) if and only if there exists $(u_1, u_2, \ldots, u_m)$ such that $(x, u_1, u_2, \ldots, u_m)$ is an optimal solution of (SL2).
(b) Find a Lipschitz constant of the gradient of $f$.
(c) Consider the two-dimensional problem ($n = 2$) with 5 anchors ($m = 5$) and data generated by the MATLAB commands

    randn('seed', 317);
    A = randn(2,5);
    x = randn(2,1);
    d = sqrt(sum((A - x*ones(1,5)).^2)) + 0.05*randn(1,5);
    d = d';

[1] The description of the problem also appears in Exercise 4.5.

The columns of the $2 \times 5$ matrix A are the locations of the five sensors, x is the true location of the source and d is the vector of noisy measurements of the source-sensor distances. Write a MATLAB function that implements the gradient projection algorithm employed on problem (SL2) for the generated data. Use the following step size selection strategies:
(i) constant step size;
(ii) backtracking with parameters $s = 1$, $\alpha = 0.5$, $\beta = 0.5$.
Start both methods with the initial vectors $(1000, 500)^T$ and $u_i = 0$ for all $i = 1, \ldots, m$. Run both algorithms for 100 iterations and compare their performance.

Chapter 10 - Optimality Conditions for Linearly Constrained Problems

10.1 Let $A \in \mathbb{R}^{m \times n}$. Prove that the following two claims are equivalent:
(A) The system $Ax = 0$, $x > 0$ has no solution.
(B) There exists a vector $y \in \mathbb{R}^m$ for which $A^T y \ge 0$ and $A^T y$ is not the zero vector.

10.2 Consider the problem

    (Q)  min  $\frac{1}{2} x^T Q x + d^T x$
         s.t. $a_1^T x \le b_1$,
              $a_2^T x = b_2$,

where the minimization is in $x \in \mathbb{R}^n$, $Q \in \mathbb{R}^{n \times n}$ ($n \ge 3$) is positive definite, $d, a_1, a_2 \in \mathbb{R}^n$ and $b_1, b_2 \in \mathbb{R}$. Assume that
$$a_1^T Q^{-1} a_1 = a_2^T Q^{-1} a_2 = 2, \quad a_2^T Q^{-1} a_1 = 0, \quad a_1 \neq 0, \quad a_2 \neq 0.$$
(a) Are the KKT conditions necessary and sufficient for problem (Q)? Explain your answer.
(b) Prove that the problem is feasible.
(c) Write the KKT conditions explicitly.
(d) Find the optimal solution of the problem.

Chapter 11 - The KKT Conditions

11.1 Consider the problem

    max  $x_1^3 + x_2^3 + x_3^3$
    s.t. $x_1^2 + x_2^2 + x_3^2 = 1$.

(a) Is the problem convex?
(b) Prove that all the local maximum points of the problem are also KKT points.
(c) Find all the KKT points of the problem.
(d) Find the optimal solution of the problem.

Chapter 12 - Duality

12.1 Consider the following optimization problem:

    (P5)  min  $\frac{1}{2}\|x\|_2^2 + ct$
          s.t. $Ax = b + te$,
               $t \ge 0$,

where the minimization is in $x \in \mathbb{R}^n$ and $t \in \mathbb{R}$, $A \in \mathbb{R}^{m \times n}$, $b \in \mathbb{R}^m$, $c \in \mathbb{R}$, and as usual $e$ is the vector of all ones. Assume in addition that the rows of $A$ are linearly independent.
(i) Find a dual problem to problem (P5) (do not assign a Lagrange multiplier to the nonnegativity constraint).
(ii) Solve the dual problem obtained in part (i) and find the optimal solution of problem (P5).

12.2 Consider the optimization problem (with the convention that $0 \log 0 = 0$)

    min  $a^T x + \sum_{i=1}^n x_i \log x_i$
    s.t. $x \in \Delta_n$,

where $a \in \mathbb{R}^n$ and $\Delta_n$ is the unit simplex.
(i) Show that the problem cannot have more than one optimal solution.
(ii) Find a dual problem in one dual decision variable.
(iii) Solve the dual problem.
(iv) Find the optimal solution of the primal problem.

12.3 Let $E \in \mathbb{R}^{k \times n}$, $f \in \mathbb{R}^n$, $a \in \mathbb{R}^m$, $A \in \mathbb{R}^{m \times n}$, $b \in \mathbb{R}^m$ and $c_1, c_2, \ldots, c_p \in \mathbb{R}^n$. Consider the problem

    min  $\frac{1}{2}\|Ex\|_2^2 + \frac{1}{2}\|z\|_2^2 + f^T x + a^T z + \sum_{i=1}^p e^{c_i^T x}$
    s.t. $Ax + z = b$,
         $z \ge 0$,

where the minimization is in $x \in \mathbb{R}^n$ and $z \in \mathbb{R}^m$. Assume that $E$ has full column rank.

(a) Show that the objective function is coercive.
(b) Show that if the set $P = \{x \in \mathbb{R}^n : Ax \le b\}$ is nonempty, then strong duality holds for the problem.
(c) Write a dual problem for the problem.

12.4 Consider the problem

    min  $\sqrt{\|x\|_2^2 + 4} + a^T y + \|x\|_2$
    s.t. $Bx + Cy \le d$,
         $\|y\|_2 \le 1$,

where the minimization is in $x, y \in \mathbb{R}^n$, and $a \in \mathbb{R}^n$, $B, C \in \mathbb{R}^{m \times n}$, $d \in \mathbb{R}^m$.
(a) Show that if $BB^T \succ 0$, then strong duality holds for the problem.
(b) Find a dual problem.

12.5 Consider the following convex optimization problem:

    min  $\|Ax + b\|_2 + \|Lx\|_1 + \|Mx\|_2^2 + \sum_{i=1}^n x_i \log x_i$
    s.t. $x \ge 0$,

where $A \in \mathbb{R}^{m \times n}$, $b \in \mathbb{R}^m$, $L \in \mathbb{R}^{p \times n}$, $M \in \mathbb{R}^{q \times n}$. Write a dual problem. Do not perform any transformations that ruin the convexity of the problem.

12.6 (a) Prove that for any $a \in \mathbb{R}^n$ the following holds:
$$\min_x \left\{a^T x + \|x\|_\infty\right\} = \begin{cases} 0, & \|a\|_1 \le 1, \\ -\infty, & \text{else.} \end{cases}$$
(b) Consider the following minimization problem:

    min  $\sqrt{\|Ax\|_2^2 + 1} + \|x\|_\infty$
    s.t. $Bx \le c$,

where $A \in \mathbb{R}^{d \times n}$, $B \in \mathbb{R}^{m \times n}$, $c \in \mathbb{R}^m$. Assume that the problem is feasible. Find a dual problem. Do not perform any transformations that might ruin the convexity of the problem.

12.7 Consider the problem

    (G)  min  $x^T Q x + 2b^T x$
         s.t. $x^T Q x + 2c^T x + d \le 0$,

where $Q \in \mathbb{R}^{n \times n}$ is positive definite, $b, c \in \mathbb{R}^n$ and $d \in \mathbb{R}$.
(a) Under which (explicit) condition on the data $(Q, b, c, d)$ is the problem feasible?

(b) Under which (explicit) condition on the data $(Q, b, c, d)$ does strong duality hold?
(c) Find a dual problem to (G) in one variable.
(d) Assume that $Q = I$. Find the optimal solution of the primal problem (G), assuming that the condition of part (b) holds. Hint: recast the problem as the problem of finding an orthogonal projection of a certain point onto a certain set.

12.8 Consider the convex problem

    min  $\|Ax - b\|_1 + \|x\|_\infty - \sum_{i=1}^n \log(x_i)$
    s.t. $Cx \le d$,
         $x \le e$,

where the minimization is in $x \in \mathbb{R}^n$, $A \in \mathbb{R}^{m \times n}$, $b \in \mathbb{R}^m$, $C \in \mathbb{R}^{p \times n}$, $d \in \mathbb{R}^p$ and $e$ is the vector of all ones. Find a dual problem. Do not make any transformations that will ruin the convexity of the problem.

12.9 Let $u \in \mathbb{R}^n_{++}$. Consider the problem

    min  $\sum_{j=1}^n x_j^2$
    s.t. $\sum_{j=1}^n x_j = 1$,
         $0 \le x_j \le u_j$, $j = 1, \ldots, n$.

(a) Write a necessary and sufficient condition (in terms of the vector $u$) under which the problem is feasible.
(b) Write a dual problem in one variable.
(c) Describe an algorithm for solving the optimization problem using the dual problem obtained in part (b). (One possible approach is sketched below.)
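For part (c) of Exercise 12.9, one standard route, sketched here as an illustration rather than as the official solution, is bisection on the single dual variable: attaching a multiplier $\lambda$ to the equality constraint makes the inner minimization separable, each coordinate solves the one-dimensional problem $\min\{x^2 - \lambda x : 0 \le x \le u_j\}$ with clipped solution $x_j(\lambda) = \min\{u_j, \max\{0, \lambda/2\}\}$, and $\sum_j x_j(\lambda)$ is nondecreasing in $\lambda$. A minimal MATLAB sketch (the iteration count is an arbitrary choice):

    % Sketch for Exercise 12.9(c): bisection on the one-dimensional dual.
    % x_j(lam) = min(u_j, max(0, lam/2)) minimizes x^2 - lam*x over [0, u_j],
    % and lam -> sum_j x_j(lam) is nondecreasing, so bisection finds the
    % multiplier at which the sum equals 1 (feasible iff sum(u) >= 1, part (a)).
    function x = ex129_bisection(u)
        assert(sum(u) >= 1, 'problem is infeasible');
        xofl = @(lam) min(u, max(0, lam/2));   % clipped coordinatewise minimizer
        lo = 0;                                % at lam = 0 the sum is 0 <= 1
        hi = 2*max(u);                         % at this lam the sum is sum(u) >= 1
        for k = 1:100
            lam = (lo + hi)/2;
            if sum(xofl(lam)) < 1
                lo = lam;
            else
                hi = lam;
            end
        end
        x = xofl((lo + hi)/2);
    end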