Computational Optimization. Constrained Optimization Part 2
- Mervyn Reed
1 Computational Optimization: Constrained Optimization, Part 2
2 Optimality Conditions, Unconstrained Case: x* a local min plus f convex implies x* is a global min; SOSC implies x* is a local min, which implies the SONC: ∇f(x*) = 0.
3 Easiest Problem: linear equality constraints. min f(x), f: Rⁿ → R, s.t. Ax = b, A ∈ R^{m×n}, b ∈ R^m.
4 Assume the world is smooth: the objective and constraints are twice continuously differentiable, for the purposes of this lecture.
5 Next Easiest Problem: linear inequality constraints. min f(x), f: Rⁿ → R, s.t. Ax ≥ b, A ∈ R^{m×n}, b ∈ R^m. The constraints form a polyhedron.
6 Only some constraints are active. [Figure: polyhedron Ax ≥ b bounded by lines aᵢ'x = bᵢ, contour set of the function, the unconstrained minimum, and the constrained minimum x*.] At the optimal solution x*, only one constraint, a₁'x ≥ b₁, is at its bound (active).
7 In this example the problem is equivalent to minimizing over the active set (if you knew it): min f(x) s.t. a₁'x = b₁. Equality FONC: ∇f(x*) = λ₁a₁, a₁'x* = b₁. Note λ₁ ≥ 0.
8 But in this case x* does not solve the inequality problem. [Figure: the same polyhedron with an extra constraint a₅'x = b₅ and shifted contours.] Equality FONC on the guessed active constraint: ∇f(x*) = λ₁a₁, a₁'x* = b₁, but now λ₁ < 0. The SIGN of λ matters!
9 Inequality Case. [Figure: polyhedron Ax ≥ b with x* on the boundary.] Inequality problem: min f(x) s.t. Ax ≥ b. Inequality FONC: ∇f(x*) = A'λ* = Σᵢ λᵢaᵢ, Ax* ≥ b, λ ≥ 0, λᵢ(aᵢ'x* − bᵢ) = 0 for each i. Nonnegative multipliers imply the gradient points to the greater-than side of the constraint.
10 Complementarity Condition: λᵢ(aᵢ'x* − bᵢ) = 0. If λᵢ > 0 then aᵢ'x* = bᵢ, i.e., the constraint is active. If λᵢ = 0 the constraint doesn't matter: it can be dropped without changing the result.
11 Lagrange Multipliers: λᵢ represents sensitivity with respect to bᵢ. If λᵢ > 0 then aᵢ'x* = bᵢ, and tightening the constraint (increasing bᵢ) causes the objective to increase; if bᵢ is changed to make the feasible region bigger, the objective will decrease.
12 First Order Necessary Conditions for Linear Inequality Constraints (KKT): if x* is a local min of f over {x : Ax ≥ b}, then for some vector λ*: ∇f(x*) − A'λ* = 0, Ax* ≥ b, λ* ≥ 0, λ*'(Ax* − b) = 0.
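The four KKT conditions above are easy to check numerically at a candidate pair. A minimal sketch, assuming NumPy; the function `kkt_residuals` and the quadratic objective, A, b, x*, λ* below are all made up for illustration, not taken from the slides:

```python
import numpy as np

def kkt_residuals(grad_f, A, b, x, lam):
    """The four KKT residuals for min f(x) s.t. Ax >= b; all ~0 at a KKT point."""
    stationarity = np.linalg.norm(grad_f(x) - A.T @ lam)   # grad f(x*) = A' lam*
    primal = max(0.0, np.max(b - A @ x))                   # Ax* >= b
    dual = max(0.0, -np.min(lam))                          # lam* >= 0
    complementarity = abs(lam @ (A @ x - b))               # lam*'(Ax* - b) = 0
    return stationarity, primal, dual, complementarity

# Illustrative problem: f(x) = 1/2 ||x - c||^2 with c outside the feasible set.
c = np.array([-1.0, 0.0])
grad_f = lambda x: x - c
A = np.array([[1.0, 0.0]])    # single constraint x1 >= 0
b = np.array([0.0])

x_star = np.array([0.0, 0.0])  # projection of c onto {x : x1 >= 0}
lam_star = np.array([1.0])     # grad f(x*) = (1, 0)' = 1 * a1

print(kkt_residuals(grad_f, A, b, x_star, lam_star))  # all ~0 -> KKT point
```

A point failing any one of the four residuals is not a KKT point, which is exactly the stopping test algorithms use later in the deck.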
13 Second Order Necessary Conditions for Linear Constraints: if x* is a local min of f over {x : Ax ≥ b}, and Z is a null-space matrix for the active constraints, then for some vector λ*: ∇f(x*) − A'λ* = 0, Ax* ≥ b, λ* ≥ 0, λ*'(Ax* − b) = 0 (KKT), and Z'∇²f(x*)Z is p.s.d.
14 Z for the SONC: Z is a null-space matrix for the matrix of active constraints at x*. Index set of active constraints: I = {i : aᵢ'x* = bᵢ}. The matrix of active constraints A_I is A restricted to the rows in I, and Z = null(A_I).
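Forming Z = null(A_I) can be sketched with SciPy's `null_space`; the constraint data and candidate point below are made-up illustrations:

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])   # illustrative constraint rows a_i'
b = np.array([0.0, -1.0])
x = np.array([0.0, 2.0, 3.0])     # candidate point

active = np.isclose(A @ x, b)     # index set I = {i : a_i' x = b_i}
Z = null_space(A[active])         # columns form a basis for Null(A_I)

print(active)    # [ True False]
print(Z.shape)   # (3, 2): n minus rank(A_I) basis columns
```

Any feasible direction that keeps the active constraints active is a combination of Z's columns, which is why the reduced Hessian Z'∇²f(x*)Z is the right object to test.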
15 Intuition: every d satisfying ∇f(x)'d < 0, A_I d ≥ 0 is a feasible descent direction at x. The FONC make sure that this set is empty. By Farkas's Lemma, ∇f(x)'d < 0, Ad ≥ 0 has no solution d if and only if ∇f(x) = A'λ, λ ≥ 0 has a solution.
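The Farkas alternative can be tested numerically: the descent system has no solution exactly when ∇f(x) = A_I'λ, λ ≥ 0 is solvable, and a nonnegative least-squares residual of zero detects the latter. A sketch with made-up data (the helper `has_feasible_descent` is hypothetical):

```python
import numpy as np
from scipy.optimize import nnls

def has_feasible_descent(grad, A_active):
    """True iff some d satisfies grad'd < 0 and A_active d >= 0 (via Farkas)."""
    lam, residual = nnls(A_active.T, grad)   # min ||A' lam - grad||, lam >= 0
    return residual > 1e-8                   # residual 0 -> no descent direction

A_active = np.array([[1.0, 0.0]])
print(has_feasible_descent(np.array([1.0, 0.0]), A_active))   # False: grad = 1*a1
print(has_feasible_descent(np.array([-1.0, 0.0]), A_active))  # True: d = (1, 0) works
```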
16 Second Order Sufficient Conditions for Linear Inequalities: if (x*, λ*) satisfies Ax* ≥ b (primal feasibility), ∇f(x*) = A'λ*, λ* ≥ 0 (dual feasibility), λ*'(Ax* − b) = 0 (complementarity), and the SOSC that Z₊'∇²f(x*)Z₊ is p.d., then x* is a strict local minimizer.
17 Sufficient Conditions for Linear Inequalities (cont.): where Z₊ is a basis matrix for Null(A₊) and A₊ corresponds to the nondegenerate active constraints, i.e., A₊ = A_J with J = {j : aⱼ'x* = bⱼ, λⱼ* > 0}.
18 Sufficient Example: find the solution of a small quadratic program with linear inequality constraints and verify the SOSC; the candidate x* and λ* are given. [Problem data illegible in the transcription.]
19 Linear Inequality Constraints, I: put the problem in standard form min f(x) s.t. Ax ≥ b; the active constraint and x* are then found by inspection. [Problem data illegible in the transcription.]
20 Linear Inequality Constraints, II: evaluate ∇f(x*) and ∇²f(x*) at the candidate and identify the active row of A.
21 Linear Inequality Constraints, III: check the KKT conditions Ax* ≥ b, ∇f(x*) = A'λ*, λ* ≥ 0, λ*'(Ax* − b) = 0. Set λ₂ = 0 since the second constraint is inactive, and solve ∇f(x*) = λ₁a₁ for λ₁; this yields the KKT point (x*, λ*).
22 Linear Inequality Constraints, IV: now look at the SOSC. Form Z₊ from the active row of A and check that Z₊'∇²f(x*)Z₊ is a p.d. matrix; therefore the SOSC are satisfied and x* is a strict local minimum.
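The worked example's numbers are not legible here, so the same four-step check (active set, multipliers, KKT, SOSC) is sketched below on a made-up quadratic program, min ½((x₁+1)² + (x₂−1)²) s.t. x₁ ≥ 0, x₂ ≥ 0; every number in it is an illustration:

```python
import numpy as np
from scipy.linalg import null_space

A = np.eye(2)                    # rows a_i': x1 >= 0, x2 >= 0
b = np.zeros(2)
grad_f = lambda x: x - np.array([-1.0, 1.0])
hess_f = np.eye(2)

x_star = np.array([0.0, 1.0])    # by inspection: project (-1, 1) onto x >= 0

# Step 1: active set.
active = np.isclose(A @ x_star, b)            # only x1 >= 0 is active

# Step 2: multipliers from stationarity grad f = A' lam, lam_i = 0 off the active set.
lam = np.zeros(2)
lam[active] = np.linalg.lstsq(A[active].T, grad_f(x_star), rcond=None)[0]

# Step 3: KKT.
assert np.allclose(grad_f(x_star), A.T @ lam)  # stationarity
assert np.all(lam >= 0)                        # dual feasibility

# Step 4: SOSC on the nondegenerate active constraints.
Z_plus = null_space(A[active & (lam > 0)])
reduced_hessian = Z_plus.T @ hess_f @ Z_plus
print(np.linalg.eigvalsh(reduced_hessian))     # all positive -> strict local min
```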
23 Why Necessary and Sufficient? Sufficient conditions give a way to confirm that a candidate point is a (local) minimum, but not every min satisfies any given SC. Necessary conditions tell you: if they don't hold, you don't have a minimum; under appropriate assumptions, every point that is a min satisfies them. This makes them good stopping criteria: algorithms look for points that satisfy necessary conditions.
24 You Try: solve the problem using the above theorems. Verify x* = [8/9, 4/8]' is optimal by checking the KKT conditions; also check the SOSC. [Problem statement illegible in the transcription.]
25 General Constraints: min f(x) s.t. gᵢ(x) = 0, i ∈ E; gᵢ(x) ≥ 0, i ∈ I.
26 Active Set: the active set at a point x consists of the equality constraints and the inequality constraints at their bound: A(x) = E ∪ {i ∈ I : gᵢ(x) = 0}.
27 Lagrangian Function: optimality conditions are expressed using the Lagrangian function L(x, λ) = f(x) − Σᵢ₌₁ᵐ λᵢgᵢ(x) = f(x) − λ'g(x), and the Jacobian matrix ∇g(x)', where each row is the gradient of a constraint.
28 Lagrangian Function: L(x, λ) = f(x) − Σᵢ₌₁ᵐ λᵢgᵢ(x) = f(x) − λ'g(x). Lagrangian gradient: ∇ₓL(x, λ) = ∇f(x) − Σᵢ₌₁ᵐ λᵢ∇gᵢ(x). Lagrangian Hessian: ∇²ₓₓL(x, λ) = ∇²f(x) − Σᵢ₌₁ᵐ λᵢ∇²gᵢ(x).
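Assembling the Lagrangian gradient and Hessian is mechanical once f and g are differentiable. A sketch on a made-up problem, f(x) = x₁² + x₂² with one constraint g(x) = x₁x₂ − 1 = 0 (all names and numbers are illustrations):

```python
import numpy as np

grad_f = lambda x: 2 * x
hess_f = lambda x: 2 * np.eye(2)
grad_g = lambda x: np.array([x[1], x[0]])
hess_g = lambda x: np.array([[0.0, 1.0], [1.0, 0.0]])

def lagrangian_grad(x, lam):
    # grad_x L(x, lam) = grad f(x) - lam * grad g(x)
    return grad_f(x) - lam * grad_g(x)

def lagrangian_hess(x, lam):
    # hess_x L(x, lam) = hess f(x) - lam * hess g(x)
    return hess_f(x) - lam * hess_g(x)

x_star, lam_star = np.array([1.0, 1.0]), 2.0  # candidate: grad L vanishes here
print(lagrangian_grad(x_star, lam_star))       # [0. 0.]
print(lagrangian_hess(x_star, lam_star))       # 2I - 2*[[0,1],[1,0]]
```

Note that ∇²ₓₓL here is only positive semidefinite on all of R², but the SOSC only asks for positive definiteness on the null space of the constraint Jacobian.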
29 Feasible Descent Directions: a point x is not optimal if we can take a small step s that is feasible and decreases the function.
30 Feasible descent/improvement directions, Case 1: x is an interior point, so every small step s is feasible. x can only be optimal if ∇f(x) = 0.
31 Feasible descent/improvement directions, Case 2: x is at a bound. [Figure: cone of improvement directions between −∇f(x) and the active constraint boundary.] Any s in the cone is an improvement direction. x can only be optimal if ∇f(x)'s < 0, ∇g₁(x)'s ≥ 0 has no solution.
32 Feasible descent/improvement directions: the system ∇f(x*)'s < 0, ∇g₁(x*)'s ≥ 0 having no solution s is the same as saying there exists λ* ≥ 0 with ∇f(x*) = λ*∇g₁(x*), or equivalently ∇ₓL(x*, λ*) = 0, λ* ≥ 0.
33 General Nonlinear Constraints: want to solve min f(x) s.t. g₁(x) = 0, g₂(x) ≥ 0. Take a first-order Taylor series expansion in a neighborhood of x to pick feasible descent directions, assuming both constraints active: gᵢ(x + d) ≈ gᵢ(x) + ∇gᵢ(x)'d and f(x + d) ≈ f(x) + ∇f(x)'d. The set of d with ∇g₁(x)'d = 0, ∇g₂(x)'d ≥ 0, and ∇f(x)'d < 0 is the set of feasible descent directions for the linearized problem.
34 Rough idea: the set ∇g₁(x)'d = 0, ∇g₂(x)'d ≥ 0, ∇f(x)'d < 0 has no solution d if and only if ∇f(x) = ∇g₁(x)λ₁ + ∇g₂(x)λ₂, λ₂ ≥ 0 has a solution, or equivalently ∇ₓL(x, λ₁, λ₂) = 0, λ₂ ≥ 0 has a solution. So the same KKT conditions as in the linear case should work, as long as the linearization is okay (waving hands).
35 Sufficient Conditions, Nonlinear Equality: if (x*, λ*) satisfies g(x*) = 0 (primal feasibility), ∇f(x*) = ∇g(x*)'λ* (equivalently ∇ₓL(x*, λ*) = 0) (dual feasibility), and the SOSC that Z'∇²ₓₓL(x*, λ*)Z is p.d., then x* is a strict local minimizer.
36 Sufficient Conditions, Nonlinear Equality (cont.): where Z is a basis matrix for Null(A) and A is the Jacobian of the active constraints, i.e., the jth row of A is Aⱼ = ∇gⱼ(x*)' for each active constraint gⱼ(x*) = 0.
37 Second Order Sufficient Conditions, Nonlinear Inequality: if (x*, λ*) satisfies g(x*) ≥ 0 (primal feasibility), ∇f(x*) = ∇g(x*)'λ* (equivalently ∇ₓL(x*, λ*) = 0), λ* ≥ 0 (dual feasibility, for inequalities only), λ*'g(x*) = 0 (complementarity), and the SOSC that Z₊'∇²ₓₓL(x*, λ*)Z₊ is p.d., then x* is a strict local minimizer.
38 Second Order Sufficient Conditions, Nonlinear Inequality (cont.): where Z₊ is a basis matrix for Null(A₊) and A₊ is the Jacobian of the nondegenerate active constraints, i.e., the jth row of A₊ is Aⱼ = ∇gⱼ(x*)' for gⱼ(x*) = 0 (active) with λⱼ* > 0 (nondegenerate).
39 Sufficient Example: find the solution and verify the SOSC for a nonlinear program with an inequality constraint; the candidate x* and λ* are given. [Problem data illegible in the transcription.]
40 Nonlinear Inequality Constraints, I: guess x*, then evaluate f(x*), ∇f(x*), and g(x*) at the guess. [Problem data illegible in the transcription.]
41 Nonlinear Inequality Constraints, II: the problem has one active constraint. Form the Jacobian ∇g(x)' and solve the stationarity condition ∇ₓL(x, λ) = ∇f(x) − λ∇g(x) = 0 for the multiplier λ.
42 Nonlinear Inequality Constraints, III: form the Lagrangian Hessian ∇²ₓₓL(x, λ) = ∇²f(x) − Σᵢ λᵢ∇²gᵢ(x) and a basis Z₊ for the null space of the active constraint Jacobian. Z₊'∇²ₓₓL(x*, λ*)Z₊ is positive definite, so the SOSC are satisfied and x* is a strict local minimum.
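Since the example's data is garbled in the transcription, the nonlinear SOSC check is sketched on a made-up problem instead: min x₁² + x₂² s.t. g(x) = x₁x₂ − 1 ≥ 0, with candidate x* = (1, 1) and λ* = 2 (the constraint is active and nondegenerate):

```python
import numpy as np
from scipy.linalg import null_space

x_star, lam_star = np.array([1.0, 1.0]), 2.0

grad_g = np.array([x_star[1], x_star[0]])        # active constraint gradient at x*
hess_L = 2 * np.eye(2) - lam_star * np.array([[0.0, 1.0],
                                              [1.0, 0.0]])   # hess f - lam * hess g

Z_plus = null_space(grad_g.reshape(1, -1))       # basis for Null(A_+)
reduced = Z_plus.T @ hess_L @ Z_plus

print(np.linalg.eigvalsh(reduced))  # positive -> SOSC holds, strict local min
```

Note ∇²ₓₓL itself is singular here; positive definiteness is only needed, and only holds, on the null space of the active constraint gradient.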
43 Sufficient Example: find the solution and verify the SOSC for a problem with a single nonlinear equality constraint; the candidate x* and λ* are given. [Problem data illegible in the transcription.]
44 Nonlinear Inequality Constraints, V: x* is found by observation. Form L(x, λ) = f(x) − λg(x) and solve ∇ₓL(x, λ) = ∇f(x) − λ∇g(x) = 0 at x* for λ*.
45 Nonlinear Inequality Constraints, VI: form ∇g(x*)', a null-space basis Z₊, and ∇²ₓₓL(x*, λ*) = ∇²f(x*) − λ*∇²g(x*). Z₊'∇²ₓₓL(x*, λ*)Z₊ is positive definite, so the SOSC are satisfied and x* is a strict local minimum.
46 CAREFUL: sometimes things aren't nice and the KKT conditions don't hold even when a point is a min. We must impose additional conditions (constraint qualifications) to make sure a KKT point exists.
47 Example with no KKT point: min f(x, y) s.t. g₁(x, y) ≥ 0, g₂(x, y) ≥ 0, where at the minimizer ∇f ≠ λ₁∇g₁ + λ₂∇g₂ for any λ. [Problem data illegible in the transcription.]
48 What's wrong: the set of linearized feasible directions does not match the real set of feasible directions that can be achieved in the limit (called the tangent cone). (Take NLP in the fall to learn more about tangent cones.)
49 Constraint Qualification: Linear Independence CQ (LICQ): the gradients of the active constraints are linearly independent (such a point is also called a regular point). Other CQs: the constraints are linear; the problem is a convex program with a strictly interior feasible point. More exist that make sure the linearization works out okay.
50 First Order Necessary Conditions, Equality: if x* is a local min of f over {x : g(x) = 0}, and LICQ (or another CQ) holds, then there exists λ* with ∇ₓL(x*, λ*) = 0, g(x*) = 0.
51 Second Order Necessary Conditions, Equality: if x* is a local min of f over {x : g(x) = 0}, Z is a null-space matrix of the Jacobian ∇g(x*)', and LICQ holds, then there exists λ* with ∇ₓL(x*, λ*) = 0 (or equivalently Z'∇f(x*) = 0), g(x*) = 0, and Z'∇²ₓₓL(x*, λ*)Z is p.s.d.
52 Necessary Conditions, Inequality: if x* is a local min of f over {x : g(x) ≥ 0}, Z is a null-space matrix of the Jacobian ∇g(x*)' of the active constraints, and x* is a regular point (LICQ), then there exists λ* with ∇ₓL(x*, λ*) = 0 (or equivalently Z'∇f(x*) = 0), g(x*) ≥ 0, λ* ≥ 0, λ*'g(x*) = 0, and Z'∇²ₓₓL(x*, λ*)Z is p.s.d.
53 LICQ = Regular Point: x* is a regular point with respect to the constraints g if the gradients of the active constraints are linearly independent. For equality constraints, all constraints are active, so ∇g(x*)' should have linearly independent rows.
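Regularity is a rank condition, so it can be sketched as a one-line check; the helper name and test data below are made up:

```python
import numpy as np

def is_regular(active_grads):
    """LICQ: active_grads is (k, n), one active-constraint gradient per row;
    the point is regular iff the stacked gradients have full row rank."""
    J = np.atleast_2d(active_grads)
    return bool(np.linalg.matrix_rank(J) == J.shape[0])

print(is_regular(np.array([[1.0, 0.0], [0.0, 1.0]])))  # True
print(is_regular(np.array([[1.0, 1.0], [2.0, 2.0]])))  # False: parallel gradients
```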
54 Necessary Example: show the optimal solution x* is regular and find the KKT point, for a maximization problem rewritten as a minimization with nonlinear inequality constraints. [Problem data illegible in the transcription; the slide reports the KKT point x* and λ*.]
55 Constraint Qualifications: regularity (LICQ) is one example of a constraint qualification (CQ). The KKT conditions are based on linearizations of the constraints; a CQ guarantees that this linearization is not getting us into trouble. Without one, the problem is that a KKT point might not exist. There are many other CQs, e.g., for convex inequalities Slater's condition: there exists a strictly feasible point. Note that a CQ is not needed for linear constraints.
56 Necessary Conditions, General: if x* satisfies LICQ and is a local min of f over {x : g(x) ≥ 0, h(x) = 0}, then there exist λ_I*, λ_E* such that ∇ₓL(x*, λ_I*, λ_E*) = ∇f(x*) − Σᵢ λ_I,i*∇gᵢ(x*) − Σᵢ λ_E,i*∇hᵢ(x*) = 0, g(x*) ≥ 0, h(x*) = 0, λ_I* ≥ 0, λ_I*'g(x*) = 0, and Z'∇²ₓₓL Z is p.s.d., where Z is the null-space matrix for A(x*) (the active constraints).
57 General SOSC: consider the problem min f over {x : g(x) ≥ 0, h(x) = 0}. If there exist x*, λ_I*, λ_E* such that ∇ₓL(x*, λ_I*, λ_E*) = ∇f(x*) − Σᵢ λ_I,i*∇gᵢ(x*) − Σᵢ λ_E,i*∇hᵢ(x*) = 0, g(x*) ≥ 0, h(x*) = 0, λ_I* ≥ 0, λ_I*'g(x*) = 0, and Z₊'∇²ₓₓL Z₊ is p.d., where Z₊ is the null-space matrix for A₊(x*) (the active constraints with positive multipliers), then x* is a strict local minimizer.
58 KKT Summary: x* a local min plus convex f and convex constraints ⇒ x* is a global min; x* a local min plus a CQ ⇒ the KKT conditions are satisfied; KKT plus SOSC ⇒ x* is a strict local min.
OPER 627: Nonlinear Optimization Lecture 2: Math Background and Optimality Conditions Department of Statistical Sciences and Operations Research Virginia Commonwealth University Aug 28, 2013 (Lecture 2)
More information9. Interpretations, Lifting, SOS and Moments
9-1 Interpretations, Lifting, SOS and Moments P. Parrilo and S. Lall, CDC 2003 2003.12.07.04 9. Interpretations, Lifting, SOS and Moments Polynomial nonnegativity Sum of squares (SOS) decomposition Eample
More informationCS / ISyE 730 Spring 2015 Steve Wright
CS / ISyE 730 Spring 2015 Steve Wright (swright@cs.wisc.edu) Convex Analysis Basics. We work mostly in IR n and sometimes in SIR n n. Recall vector notation (subscripts for components), inner products,
More informationAN AUGMENTED LAGRANGIAN AFFINE SCALING METHOD FOR NONLINEAR PROGRAMMING
AN AUGMENTED LAGRANGIAN AFFINE SCALING METHOD FOR NONLINEAR PROGRAMMING XIAO WANG AND HONGCHAO ZHANG Abstract. In this paper, we propose an Augmented Lagrangian Affine Scaling (ALAS) algorithm for general
More informationAM 205: lecture 18. Last time: optimization methods Today: conditions for optimality
AM 205: lecture 18 Last time: optimization methods Today: conditions for optimality Existence of Global Minimum For example: f (x, y) = x 2 + y 2 is coercive on R 2 (global min. at (0, 0)) f (x) = x 3
More information. This matrix is not symmetric. Example. Suppose A =
Notes for Econ. 7001 by Gabriel A. ozada The equation numbers and page numbers refer to Knut Sydsæter and Peter J. Hammond s textbook Mathematics for Economic Analysis (ISBN 0-13- 583600-X, 1995). 1. Convexity,
More informationHow to Characterize Solutions to Constrained Optimization Problems
How to Characterize Solutions to Constrained Optimization Problems Michael Peters September 25, 2005 1 Introduction A common technique for characterizing maximum and minimum points in math is to use first
More informationThe Kuhn-Tucker and Envelope Theorems
The Kuhn-Tucker and Envelope Theorems Peter Ireland EC720.01 - Math for Economists Boston College, Department of Economics Fall 2010 The Kuhn-Tucker and envelope theorems can be used to characterize the
More informationMidterm Review. Yinyu Ye Department of Management Science and Engineering Stanford University Stanford, CA 94305, U.S.A.
Midterm Review Yinyu Ye Department of Management Science and Engineering Stanford University Stanford, CA 94305, U.S.A. http://www.stanford.edu/ yyye (LY, Chapter 1-4, Appendices) 1 Separating hyperplane
More informationChap 2. Optimality conditions
Chap 2. Optimality conditions Version: 29-09-2012 2.1 Optimality conditions in unconstrained optimization Recall the definitions of global, local minimizer. Geometry of minimization Consider for f C 1
More informationCore 1 Inequalities and indices Section 1: Errors and inequalities
Notes and Eamples Core Inequalities and indices Section : Errors and inequalities These notes contain subsections on Inequalities Linear inequalities Quadratic inequalities This is an eample resource from
More informationFIN 550 Exam answers. A. Every unconstrained problem has at least one interior solution.
FIN 0 Exam answers Phil Dybvig December 3, 0. True-False points A. Every unconstrained problem has at least one interior solution. False. An unconstrained problem may not have any solution at all. For
More informationINTERIOR-POINT METHODS FOR NONCONVEX NONLINEAR PROGRAMMING: CONVERGENCE ANALYSIS AND COMPUTATIONAL PERFORMANCE
INTERIOR-POINT METHODS FOR NONCONVEX NONLINEAR PROGRAMMING: CONVERGENCE ANALYSIS AND COMPUTATIONAL PERFORMANCE HANDE Y. BENSON, ARUN SEN, AND DAVID F. SHANNO Abstract. In this paper, we present global
More informationLecture 1: Entropy, convexity, and matrix scaling CSE 599S: Entropy optimality, Winter 2016 Instructor: James R. Lee Last updated: January 24, 2016
Lecture 1: Entropy, convexity, and matrix scaling CSE 599S: Entropy optimality, Winter 2016 Instructor: James R. Lee Last updated: January 24, 2016 1 Entropy Since this course is about entropy maximization,
More informationContinuous Optimisation, Chpt 6: Solution methods for Constrained Optimisation
Continuous Optimisation, Chpt 6: Solution methods for Constrained Optimisation Peter J.C. Dickinson DMMP, University of Twente p.j.c.dickinson@utwente.nl http://dickinson.website/teaching/2017co.html version:
More informationSECTION C: CONTINUOUS OPTIMISATION LECTURE 11: THE METHOD OF LAGRANGE MULTIPLIERS
SECTION C: CONTINUOUS OPTIMISATION LECTURE : THE METHOD OF LAGRANGE MULTIPLIERS HONOUR SCHOOL OF MATHEMATICS OXFORD UNIVERSITY HILARY TERM 005 DR RAPHAEL HAUSER. Examples. In this lecture we will take
More informationALADIN An Algorithm for Distributed Non-Convex Optimization and Control
ALADIN An Algorithm for Distributed Non-Convex Optimization and Control Boris Houska, Yuning Jiang, Janick Frasch, Rien Quirynen, Dimitris Kouzoupis, Moritz Diehl ShanghaiTech University, University of
More informationLecture 24: August 28
10-725: Optimization Fall 2012 Lecture 24: August 28 Lecturer: Geoff Gordon/Ryan Tibshirani Scribes: Jiaji Zhou,Tinghui Zhou,Kawa Cheung Note: LaTeX template courtesy of UC Berkeley EECS dept. Disclaimer:
More informationSupport Vector Machines
Support Vector Machines Support vector machines (SVMs) are one of the central concepts in all of machine learning. They are simply a combination of two ideas: linear classification via maximum (or optimal
More informationA Basic Course in Real Analysis Prof. P. D. Srivastava Department of Mathematics Indian Institute of Technology, Kharagpur
A Basic Course in Real Analysis Prof. P. D. Srivastava Department of Mathematics Indian Institute of Technology, Kharagpur Lecture - 36 Application of MVT, Darbou Theorem, L Hospital Rule (Refer Slide
More information
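The inequality first-order conditions and the complementarity condition above can be checked numerically on a small example. The sketch below (an illustration, not part of the original slides) minimizes a strictly convex quadratic subject to linear inequality constraints Ax >= b, solves it with SciPy's SLSQP solver, and then recovers the multipliers from the FONC grad f(x*) = A'λ on the active set, setting λ_i = 0 for inactive constraints. The specific objective and constraints are made up for the demonstration; any convex problem in this form would do.

```python
import numpy as np
from scipy.optimize import minimize

# min f(x) = (x1 + 1)^2 + (x2 + 1)^2   s.t.   A x >= b
A = np.array([[1.0, 0.0],    # x1 >= 0          (active at the solution)
              [0.0, 1.0],    # x2 >= 0          (active at the solution)
              [1.0, 1.0]])   # x1 + x2 >= -5    (inactive: can be dropped)
b = np.array([0.0, 0.0, -5.0])

f    = lambda x: (x[0] + 1.0)**2 + (x[1] + 1.0)**2
grad = lambda x: np.array([2.0*(x[0] + 1.0), 2.0*(x[1] + 1.0)])

# The unconstrained minimum (-1, -1) is infeasible, so constraints bind.
res = minimize(f, x0=[1.0, 1.0], jac=grad, method="SLSQP",
               constraints=[{"type": "ineq",
                             "fun": lambda x: A @ x - b,
                             "jac": lambda x: A}])
x_star = res.x   # constrained minimizer, approximately (0, 0)

# Recover multipliers: grad f(x*) = A'λ on the active set, λ_i = 0 otherwise
# (complementarity: λ_i (a_i'x* - b_i) = 0 for every i).
active = np.isclose(A @ x_star - b, 0.0, atol=1e-4)
lam = np.zeros(len(b))
lam[active] = np.linalg.lstsq(A[active].T, grad(x_star), rcond=None)[0]

print("x* =", x_star)          # ~ (0, 0)
print("lambda =", lam)         # ~ (2, 2, 0): nonnegative, zero on the slack constraint
```

The nonnegative multipliers confirm that the gradient at x* points to the "greater than" side of the active constraints, and the zero multiplier on the third constraint shows it could be dropped without changing the result, exactly as the complementarity slide states.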