Additional Exercises for Introduction to Nonlinear Optimization
Amir Beck
March 16, 2017
Chapter 1 - Mathematical Preliminaries

1.1 Let S ⊆ R^n.
(a) Suppose that T is an open set satisfying T ⊆ S. Prove that T ⊆ int(S).
(b) Prove that the complement of int(S) is the closure of the complement of S.
(c) Do S and cl(S) always have the same interiors?

1.2 For any x ∈ R^n and any nonzero vector d ∈ R^n, compute the directional derivative f'(x; d) of
    f(x) = ||x - a||_2^δ + max{c_1^T x + β_1, c_2^T x + β_2},
where a, c_1, c_2 ∈ R^n, δ ∈ R_++ and β_1, β_2 ∈ R.

Chapter 2 - Optimality Conditions for Unconstrained Optimization

2.1 For each of the following matrices determine, without computing eigenvalues, the interval of values of α for which the matrix is positive definite/negative definite/positive semidefinite/negative semidefinite/indefinite:
(a) B_α = [ 1  α  1 ;  α  4  α ;  1  α  1 ].
(b) E_α = [ 1  α  0  0 ;  α  2  α  0 ;  0  α  2  α ;  0  0  α  2 ].

2.2 Let A ∈ R^{n×n} be a symmetric matrix. Assume that there exist two indices i ≠ j for which A_jj, A_ij ≠ 0 and A_ii = 0. Prove that A is indefinite.

2.3 Let f(x_1, x_2) = x_1^2 - 2x_1x_2^2 + x_2^4.
(a) Is the function f coercive? Explain your answer.
(b) Find the stationary points of f and classify them (strict/non-strict, local/global, minimum/maximum or saddle point).
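Under the reading f(x_1, x_2) = x_1^2 - 2x_1x_2^2 + x_2^4 = (x_1 - x_2^2)^2 adopted above for Exercise 2.3 (the middle term is a reconstruction from the garbled source), here is a short MATLAB check that the gradient vanishes along the whole curve x_1 = x_2^2, suggesting a continuum of stationary points:

f = @(x1,x2) x1.^2 - 2*x1.*x2.^2 + x2.^4;            % = (x1 - x2.^2).^2 under this reading
g = @(x1,x2) [2*(x1 - x2.^2); -4*x2.*(x1 - x2.^2)];  % gradient of f
for t = -2:1:2
    disp([t, g(t^2, t)']);                           % gradient is zero at every point (t^2, t)
end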
2.4 Consider the function f(x, y) = x^2 - x^2 y^2 + y^4. Find all the stationary points of f and classify them (strict/non-strict, local/global, minimum/maximum or saddle point).

Chapter 3 - Least Squares

3.1 Let A ∈ R^{m×n}, b ∈ R^m, L ∈ R^{p×n} and λ ∈ R_++. Consider the function
    f(x) = ||Ax - b||_2^2 + λ||Lx||_1.
(a) Show that if f is coercive, then Null(A) ∩ Null(L) = {0}.
(b) Show that the converse also holds, i.e., if Null(A) ∩ Null(L) = {0}, then f is coercive.

Chapter 6 - Convex Sets

6.1 Show that the following set is not convex:
    S = {x ∈ R^2 : 3x_1^2 + x_2^2 + x_1x_2 ≥ x_1 + 4x_2 - 10}.

6.2 (a) Prove that the extreme points of the unit simplex Δ_n = {x ∈ R^n : Σ_{i=1}^n x_i = 1, x_i ≥ 0} are e_1, e_2, ..., e_n.
(b) For each of the following sets specify the corresponding extreme points:
    (i) {x ∈ R^n : e^T x ≤ 1, x ≥ 0}.
    (ii) B[c, r], where c ∈ R^n and r > 0.
    (iii) {(x_1, x_2)^T : 9x_1^2 + x_2^2 + 6x_1x_2 - 6x_1 - 8x_2 ≤ 0, x_1 ≥ 0, x_2 ≥ 0}.

Chapter 7 - Convex Functions

7.1 Find the optimal solution of the problem
    max  2x_1^2 + x_2^2 + x_3^2 + x_1 - 3x_2 + 4x_3
    s.t. x_1 + x_2 + x_3 = 1,
         x_1, x_2, x_3 ≥ 0
(a numerical check appears after Exercise 7.2 below).

7.2 Let f : R^n → R be convex, and let A ∈ R^{m×n} be an m × n matrix. Consider the function h defined by
    h(y) = min_x {f(x) : Ax = y},
where we assume that h(y) > -∞ for all y. Show that h is convex.
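Exercise 7.1 maximizes a convex function over the unit simplex, so its optimal value is attained at one of the extreme points e_1, e_2, e_3 (compare Exercise 6.2(a)). A quick MATLAB evaluation at the vertices, assuming the reconstructed objective above:

f = @(x) 2*x(1)^2 + x(2)^2 + x(3)^2 + x(1) - 3*x(2) + 4*x(3);
vals = [f([1;0;0]), f([0;1;0]), f([0;0;1])]   % expected: [3, -2, 5]

Under this reading the maximum over the vertices, and hence over the simplex, is attained at e_3.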
Chapter 8 - Convex Optimization

8.1 Consider the problem
    min  max{|x_1 - x_2|, |x_2 - x_3|, |x_1 - x_3|} + x_1^2 + x_2^2 + x_3^2 - 2x_2x_3
    s.t. (4x_1^2 + x_2^2 - 2x_1x_2 + 1)^4 + x_3^2/(x_1 + x_2) ≤ 10^6,
         x_1 + x_2 ≥ 1.
(a) Show that it is convex.
(b) Write a CVX code that solves it (a sketch appears after Exercise 8.4 below).
(c) Write down the optimal solution (by running CVX).

8.2 Consider the following convex optimization problem:
    min  2x_1^2 + 2x_1x_2 + 3x_2^2
    s.t. ((x_1^2 + x_2^2)^2 + 1)^2 ≤ 100x_1 - (4x_1^2 + x_2^2 + 4x_1x_2)/(2x_1 + x_2 + x_3),
         x_1, x_2, x_3 ≥ 0.
(a) Prove that the problem is convex.
(b) Write a CVX code for solving the problem.

8.3 Prove that the following problem is convex in the sense that it consists of minimizing a convex function over a convex feasible set:
    min  log(e^{x_1 - x_2} + e^{x_2 + x_3})
    s.t. x_1^2 + x_2^2 + x_3^2 + 2x_1x_2 + 2x_1x_3 + 2x_2x_3 ≤ 1,
         (x_1 + x_2 + 2x_3)(2x_1 + 4x_2 + x_3)(x_1 + x_2 + x_3) ≥ 1,
         e^{e^{x_1}} + [x_2]_+^3 ≤ 7,
         x_1, x_2, x_3 ≥ 0.

8.4 Consider the problem
    (Q) min  ax_1^2 + bx_2^2 + cx_1x_2
        s.t. 1 ≤ x_1 ≤ 2,
             0 ≤ x_2 ≤ x_1,
where a, b, c ∈ R.
(a) Prove that there exists a minimizer for problem (Q).
(b) Prove that if a < 0, b < 0 and c^2 - 4ab ≤ 0, then the optimal value of problem (Q) is 4 min{a, a + b + c}.
(c) Prove that if a > 0, b > 0 and c^2 > 4ab, then problem (Q) has a unique solution.
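A minimal CVX sketch for Exercise 8.1(b), assuming the reconstruction of the problem given above (in particular the right-hand side 10^6). Here quad_form encodes the quadratic 4x_1^2 + x_2^2 - 2x_1x_2, quad_over_lin encodes x_3^2/(x_1 + x_2), and the nested square_pos calls realize the fourth power of a nonnegative convex expression:

cvx_begin
    variables x1 x2 x3
    minimize( norm([x1-x2; x2-x3; x1-x3], Inf) + square(x1) + square(x2-x3) )
    subject to
        square_pos( square_pos( quad_form([x1;x2], [4 -1; -1 1]) + 1 ) ) ...
            + quad_over_lin(x3, x1+x2) <= 1e6;
        x1 + x2 >= 1;
cvx_end

Note that x_1^2 + x_2^2 + x_3^2 - 2x_2x_3 was rewritten as x_1^2 + (x_2 - x_3)^2 to keep the objective DCP-compliant.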
8.5 Consider the optimization problem
    min  x_1^2 + 2x_1x_2 + 5x_2^2 + 4x_1 + 6x_2 + x_2^2/(x_2 + 1)
    s.t. (x_1^2 + x_2 - 1)^2 + x_1^4/x_2^2 + x_1^2 + x_2 ≤ 7,
         x_2 ≥ 1.
(a) Prove that x_1^2 + 2x_1x_2 + 5x_2^2 + 4x_1 + 6x_2 ≥ -17/4 for any x_1, x_2.
(b) Prove that the problem is convex.
(c) Write a CVX code that solves the problem (a sketch appears after Exercise 8.8 below).

8.6 Consider the problem
    max{g(y) : f_1(y) ≤ 0, f_2(y) ≤ 0},
where g : R^n → R is concave and f_1, f_2 : R^n → R are convex. Assume that the problem
    max{g(y) : f_1(y) ≤ 0}
has a unique solution ỹ. Let Y be the optimal set of the original problem. Prove that exactly one of the following two options holds:
(i) f_2(ỹ) ≤ 0, and in this case Y = {ỹ};
(ii) f_2(ỹ) > 0, and in this case Y = argmax{g(y) : f_1(y) ≤ 0, f_2(y) = 0}.

8.7 Show that the following optimization problem can be cast as a convex optimization problem and write a CVX code for solving it:
    min  (4x_1^2 + 2x_1x_2 + 5x_2^2)/(x_1 + x_2)
    s.t. x_1 + x_2 ≥ 1,
         ((x_1 - x_2)^2 + x_1^2 + x_2^2)^{1/2} ≤ 2x_1 + 3x_2,
         x_1, x_2 ≥ 0.

8.8 Show that the following is a convex optimization problem and write a CVX code for solving it:
    min  (4x_1 + 5x_2)^2/(3x_1 + 4x_2 + 1)
    s.t. x_1^2 + x_2^2 ≤ x_2,
         x_1^2 + x_2^2 ≤ min{x_1 + 3x_2, 7},
         x_2 ≤ 1.
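A minimal CVX sketch for Exercise 8.5(c), under the reconstruction given above. Here quad_form([x1;x2], [1 1; 1 5]) encodes x_1^2 + 2x_1x_2 + 5x_2^2, quad_over_lin(x2, x2+1) encodes x_2^2/(x_2 + 1), and square_pos(quad_over_lin(x1, x2)) realizes x_1^4/x_2^2 for x_2 > 0:

cvx_begin
    variables x1 x2
    minimize( quad_form([x1;x2], [1 1; 1 5]) + 4*x1 + 6*x2 + quad_over_lin(x2, x2+1) )
    subject to
        square_pos(square(x1) + x2 - 1) + square_pos(quad_over_lin(x1, x2)) ...
            + square(x1) + x2 <= 7;
        x2 >= 1;
cvx_end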
Chapter 9 - Optimization over a Convex Set

9.1 Consider the set
    Box[l, u] = {x ∈ R^n : l_i ≤ x_i ≤ u_i, i = 1, 2, ..., n},
where l, u ∈ R^n are given vectors satisfying l ≤ u. Consider the minimization problem
    min{f(x) : x ∈ Box[l, u]},
where f is a continuously differentiable function over Box[l, u]. Prove that x* ∈ Box[l, u] is a stationary point of this problem if and only if, for every i = 1, 2, ..., n,
    ∂f/∂x_i(x*) = 0  if l_i < x*_i < u_i,
    ∂f/∂x_i(x*) ≤ 0  if x*_i = u_i,
    ∂f/∂x_i(x*) ≥ 0  if x*_i = l_i.

9.2 In the source localization problem (whose description also appears in Exercise 4.5) we are given m locations of sensors a_1, a_2, ..., a_m ∈ R^n and approximate distances between the sensors and an unknown source located at x ∈ R^n:
    d_i ≈ ||x - a_i||_2.
The problem is to find/estimate x given the locations a_1, a_2, ..., a_m and the approximate distances d_1, d_2, ..., d_m. The following natural formulation of the problem as a minimization problem was introduced in Exercise 4.5:
    (SL) min_{x ∈ R^n} Σ_{i=1}^m (||x - a_i||_2 - d_i)^2.
(a) Show that problem (SL) is equivalent to the problem
    (SL2) min  f(x, u_1, u_2, ..., u_m) = Σ_{i=1}^m [ ||x - a_i||_2^2 - 2d_i u_i^T (x - a_i) + d_i^2 ]
          s.t. ||u_i||_2 ≤ 1, i = 1, ..., m,
               x ∈ R^n,
in the sense that x is an optimal solution of (SL) if and only if there exists (u_1, u_2, ..., u_m) such that (x, u_1, u_2, ..., u_m) is an optimal solution of (SL2).
(b) Find a Lipschitz constant of the gradient of f.
(c) Consider the two-dimensional problem (n = 2) with 5 anchors (m = 5) and data generated by the MATLAB commands

randn('seed',317);
A = randn(2,5);
x = randn(2,1);
d = sqrt(sum((A - x*ones(1,5)).^2)) + 0.05*randn(1,5);
d = d';

The columns of the 2 × 5 matrix A are the locations of the five sensors, x is the true location of the source and d is the vector of noisy distance measurements between the source and the sensors. Write a MATLAB function that implements the gradient projection algorithm employed on problem (SL2) for the generated data. Use the following step size selection strategies:
    (i) a constant step size;
    (ii) backtracking with parameters s = 1, α = 0.5, β = 0.5.
Start both methods with the initial vector x = (1000, 500)^T and u_i = 0 for all i = 1, ..., m. Run both algorithms for 100 iterations and compare their performance.
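A minimal MATLAB sketch of the gradient projection method for (SL2) with a constant step size; the value t below is an assumed tuning choice, not the constant derived in part (b), and the backtracking variant would replace the fixed t with the s, α, β rule:

randn('seed',317);
A = randn(2,5); xtrue = randn(2,1);
d = sqrt(sum((A - xtrue*ones(1,5)).^2)) + 0.05*randn(1,5); d = d';
[n, m] = size(A);
x = [1000; 500];                     % initial point from the exercise
U = zeros(n, m);                     % u_i = 0 for all i
t = 0.05;                            % assumed constant step size
for k = 1:100
    gx = zeros(n,1); gU = zeros(n,m);
    for i = 1:m
        r = x - A(:,i);
        gx = gx + 2*r - 2*d(i)*U(:,i);   % gradient of f w.r.t. x
        gU(:,i) = -2*d(i)*r;             % gradient of f w.r.t. u_i
    end
    x = x - t*gx;
    U = U - t*gU;
    for i = 1:m                      % project each u_i back onto the unit ball
        nrm = norm(U(:,i));
        if nrm > 1, U(:,i) = U(:,i)/nrm; end
    end
end
x                                    % estimated source location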
Chapter 10 - Optimality Conditions for Linearly Constrained Problems

10.1 Let A ∈ R^{m×n}. Prove that the following two claims are equivalent:
(A) The system Ax = 0, x > 0 has no solution.
(B) There exists a vector y ∈ R^m for which A^T y ≥ 0 and A^T y is not the zero vector.

10.2 Consider the problem
    (Q) min_{x ∈ R^n}  (1/2) x^T Qx + d^T x
        s.t. a_1^T x ≤ b_1,
             a_2^T x = b_2,
where Q ∈ R^{n×n} (n ≥ 3) is positive definite, d, a_1, a_2 ∈ R^n and b_1, b_2 ∈ R. Assume that
    a_1^T Q^{-1} a_1 = a_2^T Q^{-1} a_2 = 2,  a_2^T Q^{-1} a_1 = 0,  a_1 ≠ 0,  a_2 ≠ 0.
(a) Are the KKT conditions necessary and sufficient for problem (Q)? Explain your answer.
(b) Prove that the problem is feasible.
(c) Write the KKT conditions explicitly.
(d) Find the optimal solution of the problem.
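A numerical sanity check for Exercise 10.2 on a hypothetical instance (not the general solution): the data are generated so that the products a_i^T Q^{-1} a_j match the stated assumptions, the problem is solved with CVX, and the KKT stationarity residual is evaluated. The construction a_i = sqrt(2)*Q^{1/2}*u_i with orthonormal u_1, u_2 is one convenient way to satisfy the assumptions:

n = 5; randn('seed',317);
M = randn(n); Q = M*M' + eye(n);        % positive definite Q
[Uq,~] = qr(randn(n,2), 0);             % orthonormal u_1, u_2
S = sqrtm(Q);
a1 = sqrt(2)*S*Uq(:,1); a2 = sqrt(2)*S*Uq(:,2);
d = randn(n,1); b1 = randn; b2 = randn;
cvx_begin quiet
    variable x(n)
    dual variables lam mu
    minimize( 0.5*quad_form(x, Q) + d'*x )
    subject to
        lam : a1'*x <= b1;
        mu  : a2'*x == b2;
cvx_end
norm(Q*x + d + lam*a1 + mu*a2)   % KKT stationarity residual, ~0 up to CVX's dual sign convention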
Chapter 11 - The KKT Conditions

11.1 Consider the problem
    max  x_1^3 + x_2^3 + x_3^3
    s.t. x_1^2 + x_2^2 + x_3^2 = 1.
(a) Is the problem convex?
(b) Prove that all the local maximum points of the problem are also KKT points.
(c) Find all the KKT points of the problem.
(d) Find the optimal solution of the problem.

Chapter 12 - Duality

12.1 Consider the following optimization problem:
    (P5) min_{x ∈ R^n, t ∈ R}  (1/2)||x||^2 + ct
         s.t. Ax = b + te,
              t ≥ 0,
where A ∈ R^{m×n}, b ∈ R^m, c ∈ R, and as usual e is the vector of all ones. Assume in addition that the rows of A are linearly independent.
(i) Find a dual problem to problem (P5) (do not assign a Lagrange multiplier to the nonnegativity constraint).
(ii) Solve the dual problem obtained in part (i) and find the optimal solution of problem (P5).

12.2 Consider the optimization problem (with the convention that 0 log 0 = 0)
    min  a^T x + Σ_{i=1}^n x_i log x_i
    s.t. x ∈ Δ_n,
where a ∈ R^n and Δ_n is the unit simplex.
(i) Show that the problem cannot have more than one optimal solution.
(ii) Find a dual problem in one dual decision variable.
(iii) Solve the dual problem.
(iv) Find the optimal solution of the primal problem.
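The dual derivation in parts (ii)-(iv) of Exercise 12.2 should lead to a solution of the form x_i proportional to e^{-a_i} (stated here as the expected outcome, not a proof). A quick CVX comparison, using the built-in entr(x) = -x.*log(x):

a = randn(5,1);
xstar = exp(-a)/sum(exp(-a));         % candidate optimal solution
cvx_begin quiet
    variable x(5)
    minimize( a'*x - sum(entr(x)) )   % a'x + sum_i x_i log x_i
    subject to
        sum(x) == 1;
        x >= 0;
cvx_end
norm(x - xstar)                       % expected to be small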
12.3 Let E ∈ R^{k×n}, f ∈ R^n, a ∈ R^m, A ∈ R^{m×n}, b ∈ R^m and c_1, c_2, ..., c_p ∈ R^n. Consider the problem
    min_{x ∈ R^n, z ∈ R^m}  (1/2)||Ex||^2 + f^T x + a^T z + Σ_{i=1}^p e^{c_i^T x}
    s.t. Ax + z = b,
         z ≥ 0.
Assume that E has full column rank.
(a) Show that the objective function is coercive.
(b) Show that if the set P = {x ∈ R^n : Ax ≤ b} is nonempty, then strong duality holds for the problem.
(c) Write a dual problem.

12.4 Consider the problem
    min_{x, y ∈ R^n}  ||x||_1 + a^T y + ||x||_2^2
    s.t. Bx + Cy ≤ d,
         ||y||_2 ≤ 1,
where a ∈ R^n, B, C ∈ R^{m×n} and d ∈ R^m.
(a) Show that if BB^T ≻ 0, then strong duality holds for the problem.
(b) Find a dual problem.

12.5 Consider the following convex optimization problem:
    min  ||Ax + b||_2 + ||Lx||_1 + ||Mx||_∞ + Σ_{i=1}^n x_i log x_i
    s.t. x ≥ 0,
where A ∈ R^{m×n}, b ∈ R^m, L ∈ R^{p×n}, M ∈ R^{q×n}. Write a dual problem. Do not perform any transformations that ruin the convexity of the problem.

12.6 (a) Prove that for any a ∈ R^n the following holds:
    min_x { a^T x + ||x||_∞ } = 0 if ||a||_1 ≤ 1, and -∞ otherwise.
(b) Consider the following minimization problem:
    min  ||Ax||_1 + ||x||_∞
    s.t. Bx ≤ c,
where A ∈ R^{d×n}, B ∈ R^{m×n}, c ∈ R^m. Assume that the problem is feasible. Find a dual problem. Do not perform any transformations that might ruin the convexity of the problem.
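A quick CVX check of the identity in Exercise 12.6(a) for the case ||a||_1 ≤ 1 (the -∞ branch cannot be verified numerically, only suggested by solver divergence):

a = randn(4,1); a = 0.9*a/norm(a,1);   % force ||a||_1 = 0.9 <= 1
cvx_begin quiet
    variable x(4)
    minimize( a'*x + norm(x, Inf) )
cvx_end
cvx_optval                             % expected: 0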
12.7 Consider the problem
    (G) min  x^T Qx + 2b^T x
        s.t. x^T Qx + 2c^T x + d ≤ 0,
where Q ∈ R^{n×n} is positive definite, b, c ∈ R^n and d ∈ R.
(a) Under which (explicit) condition on the data (Q, b, c, d) is the problem feasible?
(b) Under which (explicit) condition on the data (Q, b, c, d) does strong duality hold?
(c) Find a dual problem to (G) in one variable.
(d) Assume that Q = I. Find the optimal solution of the primal problem (G), assuming that the condition of part (b) holds. Hint: recast the problem as the problem of finding the orthogonal projection of a certain point onto a certain set.

12.8 Consider the convex problem
    min_{x ∈ R^n}  ||Ax - b||_1 + ||x||_2 - Σ_{i=1}^n log(x_i)
    s.t. Cx ≤ d,
         x ≤ e,
where A ∈ R^{m×n}, b ∈ R^m, C ∈ R^{p×n}, d ∈ R^p and e is the vector of all ones. Find a dual problem. Do not make any transformations that will ruin the convexity of the problem.

12.9 Let u ∈ R^n_{++}. Consider the problem
    min { Σ_{j=1}^n x_j^2 : Σ_{j=1}^n x_j = 1, 0 ≤ x_j ≤ u_j, j = 1, ..., n }.
(a) Write a necessary and sufficient condition (in terms of the vector u) under which the problem is feasible.
(b) Write a dual problem in one variable.
(c) Describe an algorithm for solving the optimization problem using the dual problem obtained in part (b) (a sketch follows below).
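A sketch of the algorithm asked for in Exercise 12.9(c), assuming the one-variable dual is taken over the multiplier λ of the equality constraint: minimizing x_j^2 + λx_j over [0, u_j] gives x_j(λ) = min{u_j, max{0, -λ/2}}, and Σ_j x_j(λ) is continuous and nonincreasing in λ, so the value of λ with Σ_j x_j(λ) = 1 can be found by bisection:

u = [0.3; 0.8; 0.5];                   % example data with sum(u) >= 1 (part (a))
xfun = @(lam) min(u, max(0, -lam/2));  % coordinatewise minimizer of the Lagrangian
lo = -2*max(u);                        % sum(xfun(lo)) = sum(u) >= 1
hi = 0;                                % sum(xfun(hi)) = 0 <= 1
for k = 1:60
    mid = (lo + hi)/2;
    if sum(xfun(mid)) >= 1, lo = mid; else, hi = mid; end
end
x = xfun(lo);
[sum(x), x']                           % sum(x) is approximately 1; x solves the problem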