IE 5531: Engineering Optimization I

1 IE 5531: Engineering Optimization I
Lecture 19: Midterm 2 Review
Prof. John Gunnar Carlsson
November 22, 2010

2 Administrivia
- Midterm 2 on 11/24
- Covers the lectures reviewed below (Lectures 10-17)
- Open book, open notes

3 Lecture 10: Introduction to nonlinear methods
- A global minimizer for the problem: minimize $f(x)$ s.t. $x \in \mathcal{F}$, is a vector $\bar{x} \in \mathcal{F}$ such that $f(\bar{x}) \le f(x)$ for all $x \in \mathcal{F}$
- Unlike linear programming, sometimes we must settle for a local minimizer $\bar{x} \in \mathcal{F}$, which is only locally optimal
- A local minimizer $\bar{x} \in \mathcal{F}$ is a vector satisfying $f(\bar{x}) \le f(x)$ for all $x \in \mathcal{F} \cap N(\bar{x})$, where $N(\bar{x})$ is called a neighborhood of $\bar{x}$
- Typically $N(\bar{x})$ is an open ball centered at $\bar{x}$ with sufficiently small radius $\delta > 0$

4 Lecture 10: Introduction to nonlinear methods
- Suppose that $f(x)$ is a differentiable function; if $\bar{x} \in \mathbb{R}^n$ and there exists a vector $d$ such that $\nabla f(\bar{x})^T d < 0$, then there exists a scalar $\bar{\tau} > 0$ such that $f(\bar{x} + \tau d) < f(\bar{x})$ for all $\tau \in (0, \bar{\tau})$
- The vector $d$ is called a descent direction at $\bar{x}$
- The most obvious descent direction is $d = -\nabla f(\bar{x})$
- Sometimes a better descent direction is $d = -H^{-1} \nabla f(\bar{x})$, where $H$ is the Hessian matrix of $f$ (this captures more global behavior of the function); both directions are compared in the sketch below
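As a quick illustration (my own sketch, not from the slides), the following Python snippet computes both directions for a made-up smooth function and verifies the descent test $\nabla f(\bar{x})^T d < 0$; the function f, the point x, and the step size are arbitrary choices.

```python
import numpy as np

def f(x):
    return x[0]**4 + 2*x[1]**2 + x[0]*x[1]

def grad(x):
    return np.array([4*x[0]**3 + x[1], 4*x[1] + x[0]])

def hess(x):
    return np.array([[12*x[0]**2, 1.0],
                     [1.0,        4.0]])

x = np.array([1.0, 1.0])
d_grad = -grad(x)                              # steepest-descent direction
d_newton = -np.linalg.solve(hess(x), grad(x))  # Newton direction

# Both satisfy the descent test (the Hessian is positive definite here),
# so a small step along either direction reduces f.
for d in (d_grad, d_newton):
    assert grad(x) @ d < 0
    print(f(x + 1e-3 * d) < f(x))              # True
```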

5 Lecture 10: Introduction to nonlinear methods
- Clearly a necessary condition for optimality of a feasible point $\bar{x}$ is that there be no feasible descent direction; that is, $D(\bar{x}; \mathcal{F}) \cap D(\bar{x}; f) = \emptyset$
- In an unconstrained problem we have $\mathcal{F} = \mathbb{R}^n$, and therefore every direction is feasible
- It follows that if $\bar{x}$ is optimal, then we must have $D(\bar{x}; f) = \emptyset$
- This means that for all vectors $d$ we must have $\nabla f(\bar{x})^T d \ge 0$, which can only happen if $\nabla f(\bar{x}) = 0$

6 Lecture 12: Nonlinear methods, continued
Theorem. A necessary condition for optimality at a point $\bar{x}$ for the problem: minimize $f(x)$ s.t. $Ax = b$, is that $\nabla f(\bar{x}) = A^T y$ for some vector $y \in \mathbb{R}^m$.
- The geometric interpretation is that the gradient vector must be perpendicular to the constraint hyperplanes

7 Lecture 12: Nonlinear methods, continued
Theorem. Let $\bar{x}$ be a local minimizer of the problem: minimize $f(x)$ s.t. $g_i(x) = 0$, $i \in \{1, \dots, m\}$. If the functions $f(x)$ and $g_i(x)$ are continuously differentiable at $\bar{x}$ and the Jacobian matrix $\nabla g(\bar{x})$ has rank $m$, then there exist scalars $y_1, \dots, y_m$ such that $\nabla f(\bar{x}) = \sum_{i=1}^m y_i \nabla g_i(\bar{x})$, where the $y_i$'s are called Lagrange multipliers. (A small numerical check follows.)
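A small numerical check of the theorem (my own example, not from the lecture): for minimize $x_1 + x_2$ s.t. $x_1^2 + x_2^2 - 1 = 0$, the minimizer is $\bar{x} = -(1/\sqrt{2}, 1/\sqrt{2})$, and the gradient of $f$ should be a multiple of the gradient of the single constraint.

```python
import numpy as np

xbar = -np.ones(2) / np.sqrt(2)   # known minimizer on the unit circle
grad_f = np.array([1.0, 1.0])     # gradient of f(x) = x1 + x2
grad_g = 2 * xbar                 # gradient of g(x) = x1^2 + x2^2 - 1

y = grad_f[0] / grad_g[0]         # solve one component for the multiplier
print(np.allclose(grad_f, y * grad_g))  # True: grad f = y * grad g
print(y)                                # y = -1/sqrt(2)
```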

8 Lecture 12: Nonlinear methods, continued
Theorem. A necessary condition for optimality at a point $\bar{x}$ for the problem: minimize $f(x)$ s.t. $Ax \ge b$, is that $\nabla f(\bar{x}) = A^T y$ for some vector $y \in \mathbb{R}^m$ with $y \ge 0$. Furthermore, we must have $y_i = 0$ if $A_i \bar{x} > b_i$.

9 Lecture 12: Nonlinear methods, continued
Theorem (KKT conditions). If $\bar{x}$ is a local minimizer for the problem: minimize $f(x)$ s.t. $c(x) \ge 0$, $h(x) = 0$, with $c(x) \in \mathbb{R}^m$ and $h(x) \in \mathbb{R}^p$, and certain (technical) constraint qualifications are satisfied at $\bar{x}$, then there exist scalars $y_1, \dots, y_m$ and $z_1, \dots, z_p$ such that
$\nabla f(\bar{x}) = \sum_{i=1}^m y_i \nabla c_i(\bar{x}) + \sum_{i=1}^p z_i \nabla h_i(\bar{x})$, with $y_i \ge 0$ for all $i$ and $y_i c_i(\bar{x}) = 0$ for all $i$.

10 Lecture 12: Nonlinear methods, continued
- In all of the preceding problems, if the objective function and the feasible sets are convex, then the necessary optimality conditions are also sufficient!

11 Lecture 13: Applications of KKT conditions
- Consider an economy consisting of $n = 2$ agents, each of whom consumes a public good $x$ (national defense) and a private good $y$ (cars)
- A public good has two important properties:
  - It is non-rival: consumption of the good by one agent does not reduce the amount available to another agent
  - It is non-excludable: no agent can prevent another agent from using it
- The agents each have initial endowments $y_1, y_2$, which can be spent on the public good (price $p$) or the private good (price 1, say)
- The agents' utility functions are given by $u_1(x, z_1)$ and $u_2(x, z_2)$, where
  - $x$: total amount of the public good that agents 1 and 2 pay for
  - $z_1, z_2$: amounts of the private good that agents 1 and 2 have

12 Lecture 13: Applications of KKT conditions
- At an optimal allocation, $\frac{\partial u_1/\partial x}{\partial u_1/\partial z_1} + \frac{\partial u_2/\partial x}{\partial u_2/\partial z_2} = p$, which is called the Samuelson condition

13 Lecture 13: Applications of KKT conditions
- At an optimal allocation, $\frac{\partial u_1/\partial x}{\partial u_1/\partial z_1} + \frac{\partial u_2/\partial x}{\partial u_2/\partial z_2} = p$, which is called the Samuelson condition
- The quantity $\frac{\partial u_i/\partial x}{\partial u_i/\partial z_i}$ is called the marginal rate of substitution between the public and the private good; it says how many units of the private good the consumer will give up for one extra unit of $x$
- This says that, when public goods are allocated optimally, the unit cost $p$ of the public good should be equal to the sum of the benefits to all agents from the public good

14 Lecture 13: Applications of KKT conditions
- Power allocation: we have a collection of $n$ communication channels and we need to decide how much power to allocate to each of them
- The capacity (communication rate) of channel $i$ is $\log(\alpha_i + x_i)$ with given $\alpha_i > 0$, and we have a budget constraint $\mathbf{1}^T x = 1$, $x \ge 0$
- The optimization problem is: maximize $\sum_{i=1}^n \log(\alpha_i + x_i)$ s.t. $\mathbf{1}^T x = 1$, $x \ge 0$

15 Lecture 13: Applications of KKT conditions
- The optimal solution is $x_i = \max\{0, 1/\nu - \alpha_i\}$ for some $\nu$:
  $x_i = \begin{cases} 1/\nu - \alpha_i & \nu < 1/\alpha_i \\ 0 & \text{otherwise} \end{cases}$
- Think of a set of patches with heights $\alpha_i$; we then flood the region with water of height $1/\nu$
- The total amount of water is $\sum_{i=1}^n \max\{0, 1/\nu - \alpha_i\}$
- The water-filling algorithm is precisely to fill the region with water until an amount 1 has been used! (See the sketch below.)
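A runnable sketch of the water-filling idea, assuming numpy; the bisection on the water level $1/\nu$, the bracket, and the tolerance are my own implementation choices.

```python
import numpy as np

def water_fill(alpha, budget=1.0, tol=1e-10):
    """Maximize sum(log(alpha + x)) subject to sum(x) = budget, x >= 0."""
    lo, hi = 0.0, budget + alpha.max()        # bracket for the level 1/nu
    while hi - lo > tol:
        level = 0.5 * (lo + hi)
        used = np.maximum(0.0, level - alpha).sum()   # water poured so far
        if used > budget:
            hi = level                        # too much water: lower the level
        else:
            lo = level
    return np.maximum(0.0, lo - alpha)

alpha = np.array([0.2, 0.5, 1.0, 2.0])
x = water_fill(alpha)
print(x, x.sum())                             # allocations sum to 1
```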

16 Lecture 13: Applications of KKT conditions
- Buyers have money $w_i$ to buy different goods $j$ and maximize their individual (linear) utility functions
- Producers sell their goods for money
- The equilibrium price is an assignment of prices to goods so that the market clears when every buyer buys his optimal set of goods
- Each buyer's strategy is: maximize $u_i^T x_i$ s.t. $p^T x_i \le w_i$, $x_i \ge 0$

17 Lecture 13: Applications of KKT conditions
- maximize $\sum_i w_i \log(u_i^T x_i)$ s.t. $\sum_i x_i = \mathbf{1}$, $x_i \ge 0$ for all $i$
Theorem (Eisenberg and Gale 1959). The optimal Lagrange multipliers for the equality constraints in the above NLP form an equilibrium price vector. (A modeling sketch follows.)
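A hedged sketch of the Eisenberg-Gale program using the cvxpy modeling library (assumed installed); the utility matrix U and budgets w are made-up test data, and the equilibrium prices are read off as the duals of the supply constraints (dual sign conventions may vary by solver).

```python
import cvxpy as cp
import numpy as np

U = np.array([[2.0, 1.0],        # U[i, j]: buyer i's utility per unit of good j
              [1.0, 3.0]])
w = np.array([1.0, 1.0])         # buyers' money

X = cp.Variable((2, 2), nonneg=True)           # X[i, j]: buyer i's bundle
supply = cp.sum(X, axis=0) == np.ones(2)       # unit supply of each good
utilities = cp.sum(cp.multiply(U, X), axis=1)  # u_i^T x_i for each buyer
prob = cp.Problem(cp.Maximize(cp.sum(cp.multiply(w, cp.log(utilities)))),
                  [supply])
prob.solve()

print(supply.dual_value)   # equilibrium prices, per the theorem above
```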

18 Lecture 14: Unconstrained optimization
- Optimization algorithms tend to be iterative procedures: starting at a given point $x_0$, they generate a sequence $\{x_k\}$ of iterates
- This sequence terminates either when no more progress can be made (out of memory, etc.) or when a solution point has been approximated satisfactorily
- At any given iterate $x_k$, we generally want $x_{k+1}$ to satisfy $f(x_{k+1}) < f(x_k)$
- Furthermore, we want our sequence to converge to a local minimizer $x^*$
- The general approach is a line search: at any given iterate $x_k$, choose a direction $d_k$, and then set $x_{k+1} = x_k + \alpha_k d_k$ for some scalar $\alpha_k > 0$

19 Lecture 14: Unconstrained optimization
- One-dimensional root-finding and search methods:
  - Bisection
  - Golden section search (a sketch follows below)
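A standard golden section search, written from scratch as an illustration; the bracket, test function, and tolerance are arbitrary choices.

```python
import math

def golden_section(f, a, b, tol=1e-8):
    """Minimize a unimodal f on [a, b] by shrinking the bracket."""
    invphi = (math.sqrt(5) - 1) / 2          # 1/phi ~ 0.618
    c = b - invphi * (b - a)
    d = a + invphi * (b - a)
    while b - a > tol:
        if f(c) < f(d):                      # minimum lies in [a, d]
            b, d = d, c
            c = b - invphi * (b - a)
        else:                                # minimum lies in [c, b]
            a, c = c, d
            d = a + invphi * (b - a)
    return 0.5 * (a + b)

print(golden_section(lambda x: (x - 2.0)**2, 0.0, 5.0))   # ~2.0
```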

20 Lecture 14: Unconstrained optimization
- Consider the multi-dimensional problem: minimize $f(x)$ for $x \in \mathbb{R}^n$
- At each iteration $k$ we set $d_k = -\nabla f(x_k)$ and set $x_{k+1} = x_k + \alpha_k d_k$, for appropriately chosen $\alpha_k$
- In the big picture, we want $\alpha_k$ to give us a sufficient reduction in $f(x)$ without spending too much time on it
- Two conditions we can impose are the Wolfe and Goldstein conditions; a backtracking sketch follows
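The following sketch implements steepest descent with a backtracking line search that enforces the sufficient-decrease (Armijo) inequality, which is the first of the Wolfe conditions; the constant c1 = 1e-4, the halving rule, and the Rosenbrock test function are conventional but my own choices.

```python
import numpy as np

def steepest_descent(f, grad, x, c1=1e-4, tol=1e-6, max_iter=20000):
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:           # (near-)stationary point
            break
        d, alpha = -g, 1.0
        # backtrack until f(x + alpha*d) <= f(x) + c1 * alpha * g^T d
        while f(x + alpha * d) > f(x) + c1 * alpha * (g @ d):
            alpha *= 0.5
        x = x + alpha * d
    return x

f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2   # Rosenbrock
grad = lambda x: np.array([-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
                           200*(x[1] - x[0]**2)])
print(steepest_descent(f, grad, np.array([-1.0, 1.0])))   # -> ~(1, 1)
```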

21 Lecture 14: Unconstrained optimization
- The Goldstein conditions (figure slide; image not transcribed)

22 Lecture 14: Unconstrained optimization
- The Wolfe conditions (figure slide; image not transcribed)

23 Lecture 14: Unconstrained optimization
Theorem. Let $f(x)$ be a given continuously differentiable function. Let $x_0 \in \mathbb{R}^n$ be a point for which the sub-level set $X_0 = \{x \in \mathbb{R}^n : f(x) \le f(x_0)\}$ is bounded. Let $\{x_k\}$ be a sequence of points generated by the steepest descent method initiated at $x_0$, using either the Wolfe or Goldstein line search conditions. Then $\{x_k\}$ converges to a stationary point of $f(x)$.

24 Lecture 15: Nonlinear optimization
- Minimizing a function $f(x)$ can be thought of as finding a solution to the nonlinear system of equations $\nabla f(x) = 0$
- Suppose we begin at a point $x_0$ that is thought to be close to a minimizer $x^*$
- We may consider the problem of finding a solution to $\nabla f(x) = 0$ that is close to $x_0$ (we're assuming that there aren't any maximizers that are closer to $x_0$)
- Newton's method is a general method for solving a system of equations $g(x) = 0$ (to minimize/maximize, set $g(x) := \nabla f(x)$)

25 Lecture 15: Nonlinear optimization
- Newton's method is an iterative method that follows the following scheme:
  1. At a given iterate $x_k$, make a linear approximation $L(x)$ to $g(x)$ at $x_k$ by differentiating $g(x)$
  2. Set $x_{k+1}$ to be the solution to the linear system of equations $L(x) = 0$
- It is not hard to show that, in the univariate case, the iteration is $x_{k+1} = x_k - g(x_k)/g'(x_k)$, which is well-defined provided $g'(x_k)$ exists and is nonzero at each step
- Note that the iteration terminates if $g(x_k) = 0$ (see the sketch below)
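A univariate Newton sketch; the test problem (find the minimizer of $f(x) = x - \log x$ by solving $g(x) = f'(x) = 1 - 1/x = 0$) is my own example.

```python
def newton_1d(g, gprime, x, tol=1e-12, max_iter=50):
    for _ in range(max_iter):
        gx = g(x)
        if abs(gx) < tol:            # iteration terminates when g(x_k) ~ 0
            break
        x -= gx / gprime(x)          # assumes g'(x_k) exists and is nonzero
    return x

# g = f' for f(x) = x - log(x); the root (and minimizer of f) is x = 1
print(newton_1d(lambda x: 1 - 1/x, lambda x: 1/x**2, x=0.5))   # -> 1.0
```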

26 Lecture 15: Nonlinear optimization
- Consider the problem of solving $g(x) = 0$
- Define the Jacobian matrix $J = \nabla g$ by $[J]_{ij} = \partial g_i(x)/\partial x_j$ (the rows of $J$ are just the gradient vectors $\nabla g_i(x)$)
- The iterations are $x_{k+1} = x_k - J^{-1} g(x_k)$, where $J$ is constructed at the point $x_k$; a multivariate sketch follows
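The multivariate version, with the Jacobian re-evaluated at each iterate; the 2x2 test system is made up.

```python
import numpy as np

def newton(g, jac, x, tol=1e-12, max_iter=50):
    for _ in range(max_iter):
        gx = g(x)
        if np.linalg.norm(gx) < tol:
            break
        x = x - np.linalg.solve(jac(x), gx)   # x_{k+1} = x_k - J^{-1} g(x_k)
    return x

g = lambda x: np.array([x[0]**2 + x[1]**2 - 4.0, x[0] - x[1]])
jac = lambda x: np.array([[2*x[0], 2*x[1]],
                          [1.0,   -1.0]])
print(newton(g, jac, np.array([1.0, 2.0])))   # -> (sqrt(2), sqrt(2))
```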

27 Lecture 15: Nonlinear optimization
- The ellipsoid method is best introduced by considering the problem of finding an element of a solution set $X$ given by a system of linear inequalities: $X = \{x \in \mathbb{R}^n : a_i^T x \le b_i,\ i = 1, \dots, m\}$
- An ellipsoid is just a set of the form $E_k = \{x \in \mathbb{R}^n : (x - x_k)^T B_k^{-1} (x - x_k) \le 1\}$, where
  - $x_k$ is the center of the ellipsoid
  - $B_k$ is a symmetric positive definite matrix of dimension $n$

29-32 Lecture 15: Nonlinear optimization (figure-only slides; no text beyond the title was transcribed)

33 Lecture 15: Nonlinear optimization
- At a given iteration $k$ with $x_k$ and $E_k$, we construct $E_{k+1}$ as follows: define
  $\tau = \frac{1}{n+1}, \quad \delta = \frac{n^2}{n^2 - 1}, \quad \sigma = 2\tau$
- We set
  $x_{k+1} = x_k - \frac{\tau}{\sqrt{a_j^T B_k a_j}} B_k a_j, \quad B_{k+1} = \delta \left( B_k - \sigma \frac{B_k a_j a_j^T B_k}{a_j^T B_k a_j} \right)$
  (a runnable sketch follows)
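A runnable sketch of the update for the feasibility problem $Ax \le b$, directly transcribing the $\tau$, $\delta$, $\sigma$ formulas above; the starting ellipsoid, the test box, and the iteration cap are arbitrary choices of mine.

```python
import numpy as np

def ellipsoid_feasibility(A, b, x, B, max_iter=1000):
    n = len(x)
    tau, delta, sigma = 1.0/(n + 1), n*n/(n*n - 1.0), 2.0/(n + 1)
    for _ in range(max_iter):
        violated = np.nonzero(A @ x > b)[0]
        if violated.size == 0:
            return x                          # x is in X: done
        a = A[violated[0]]                    # a violated row: a^T x > b_j
        Ba = B @ a
        x = x - tau * Ba / np.sqrt(a @ Ba)    # shift center into E_half
        B = delta * (B - sigma * np.outer(Ba, Ba) / (a @ Ba))
    return None                               # no feasible point found

A = np.array([[1.0, 0.0], [-1.0, 0.0], [0.0, 1.0], [0.0, -1.0]])
b = np.array([1.1, -0.9, 0.6, -0.4])          # the box [0.9,1.1]x[0.4,0.6]
print(ellipsoid_feasibility(A, b, np.zeros(2), 10.0 * np.eye(2)))
```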

34 Lecture 15: Nonlinear optimization
Theorem. The ellipsoid $E_{k+1}$ defined on the preceding slide is the minimum-volume ellipsoid that contains the half-ellipsoid $E_k^{\mathrm{half}} := \{x \in E_k : a_j^T x \le a_j^T x_k\}$. Moreover,
$\frac{\mathrm{vol}(E_{k+1})}{\mathrm{vol}(E_k)} = \left( \frac{n^2}{n^2 - 1} \right)^{(n-1)/2} \frac{n}{n+1} < \exp\left( -\frac{1}{2(n+1)} \right) < 1$
- This establishes that the volume of the ellipsoid decreases by a constant factor at each iteration. It can be shown that the ellipsoid method solves linear programs in $O(n^2 \log(R/\epsilon))$ iterations.

35 Lecture 16: Complexity theory, interior point methods
- If we determine that we can solve problem P in no more than $\alpha f(N)$ operations, where $\alpha$ is a constant and $f(N)$ is some function, then we say that algorithm A solves problem P in running time $O(f(N))$
- If $f(N)$ is bounded above by a polynomial, e.g. $O(mn^2)$, $O(m^3 n)$, etc., then we say that algorithm A solves problem P in polynomial running time (a desirable property)
- An undesirable property: solving problem P in exponential or factorial time
- Long-standing question (resolved long ago): does the simplex method solve an LP in polynomial running time? (Answer: NO)

36 Lecture 17: Interior point methods
- Consider the linearly-constrained problem: minimize $f(x)$ s.t. $Ax = b$, $x \ge 0$
- The KKT conditions are
  $Xs = 0, \quad Ax = b, \quad \nabla f(x) - A^T y - s = 0, \quad x, s \ge 0$
  where $X = \mathrm{diag}(x)$

37 Lecture 17: Interior point methods
- We can treat the constraint $x \ge 0$ in a soft way by applying a penalty, or barrier function:
  minimize $f(x) - \mu \sum_{j=1}^n \log x_j$ s.t. $Ax = b$
- As $x_j \to 0$, the barrier term increases
- The KKT conditions are
  $\nabla f(x) - \mu X^{-1} \mathbf{1} - A^T y = 0, \quad Ax = b, \quad x > 0$
  (a barrier-method sketch follows)
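A minimal barrier-method sketch for the linear case $f(x) = c^T x$: each outer iteration takes damped Newton steps on the barrier objective restricted to $\{Ax = b\}$ (via the usual KKT block system) and then shrinks $\mu$. All data and parameters are illustrative, and x must start strictly feasible.

```python
import numpy as np

def barrier_lp(c, A, b, x, mu=1.0, shrink=0.5, tol=1e-8):
    m = A.shape[0]
    while mu > tol:
        for _ in range(50):                   # Newton steps for this mu
            g = c - mu / x                    # gradient of c^T x - mu*sum(log x)
            H = np.diag(mu / x**2)            # its Hessian
            KKT = np.block([[H, A.T], [A, np.zeros((m, m))]])
            sol = np.linalg.solve(KKT, np.concatenate([-g, np.zeros(m)]))
            dx = sol[:x.size]
            t = 1.0
            while np.any(x + t * dx <= 0):    # damp to stay strictly positive
                t *= 0.5
            x = x + t * dx
            if np.linalg.norm(dx) < 1e-10:
                break
        mu *= shrink                          # follow the central path
    return x

c = np.array([1.0, 2.0])
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
print(barrier_lp(c, A, b, x=np.array([0.5, 0.5])))   # -> ~(1, 0)
```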

38 Lecture 17: Interior point methods
- The central path $\mathcal{C}$ of the barrier problem is defined by $\mathcal{C} = \{(x(\mu), y(\mu), s(\mu)) : x(\mu) > 0,\ s(\mu) > 0,\ 0 < \mu < \infty\}$
- It turns out that as $\mu \to 0$, the central path converges to the minimizer of the original constrained problem
- Interior point methods solve convex optimization problems (including LPs) with running time that is both theoretically and practically fast
