Lagrange Multipliers
Lagrange Multipliers (Com S 477/577 Notes)
Yan-Bin Jia, Nov 9

1 Introduction

We turn now to the study of minimization with constraints. More specifically, we will tackle the following problem:

    minimize f(x)
    subject to h_1(x) = 0, ..., h_m(x) = 0,

where x ∈ Ω ⊆ R^n, and the functions f, h_1, ..., h_m are continuous and usually assumed to be in C² (i.e., to have continuous second partial derivatives). When f and the h_j's are linear, the problem is a linear program solvable with the simplex algorithm [3]. Hence we focus on the case where these functions are nonlinear.

To gain some intuition, let us look at the case where n = 2 and m = 1. The problem becomes

    minimize f(x, y)
    subject to h(x, y) = 0,    x, y ∈ R.

The constraint h(x, y) = 0 defines a curve. Differentiating the equation with respect to x gives

    ∂h/∂x + (∂h/∂y)(dy/dx) = 0.

The tangent of the curve is τ = (1, dy/dx), and the gradient of the curve is ∇h = (∂h/∂x, ∂h/∂y). So the above equation states that τ · ∇h = 0; namely, the tangent of the curve must be normal to the gradient at all times. Suppose we are at a point on the curve. To stay on the curve, any motion must be along the tangent τ.

[Figure: the gradient ∇h drawn normal to the curve h(x, y) = 0.]
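The tangent-gradient orthogonality above is easy to check numerically. The following sketch (not part of the original notes) uses the unit circle h(x, y) = x² + y² − 1 as a hypothetical constraint curve, parametrized as (cos t, sin t), and verifies that the tangent direction is orthogonal to ∇h at a few sample points:

```python
import math

def grad_h(x, y):
    # Gradient of h(x, y) = x^2 + y^2 - 1: (dh/dx, dh/dy) = (2x, 2y).
    return (2 * x, 2 * y)

# Parametrize the curve h = 0 as (cos t, sin t); its tangent is (-sin t, cos t).
dots = []
for t in (0.3, 1.1, 2.5):
    x, y = math.cos(t), math.sin(t)
    tx, ty = -math.sin(t), math.cos(t)   # tangent direction tau at (x, y)
    gx, gy = grad_h(x, y)
    dots.append(tx * gx + ty * gy)       # tau . grad_h, should vanish

print(dots)
```

Each dot product is zero (up to rounding), confirming that motion along the curve is always normal to ∇h.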
In order to increase or decrease f(x, y), motion along the constraint curve must have a component along the gradient of f; that is, τ · ∇f ≠ 0.

[Figure: the gradient ∇f drawn at points of the curve h(x, y) = 0.]

At an extremum of f, a differential motion should not yield a component of motion along ∇f. Thus τ is orthogonal to ∇f; in other words, the condition τ · ∇f = 0 must hold. Now τ is orthogonal to both gradients ∇f and ∇h at an extremum. This means that ∇f and ∇h must be parallel. Phrased differently, there exists some λ ∈ R such that

    ∇f + λ∇h = 0.    (1)

[Figure: the curve h(x, y) = 0 superposed on the level curves f = c_1, ..., f = c_5; the point (x_1, y_1), the point (x_2, y_2), and the point (x*, y*) where the curve f = c_2 is tangent to h(x, y) = 0.]

The figure above explains condition (1) by superposing the curve h(x, y) = 0 onto the family of level curves of f(x, y), that is, the collection of curves f(x, y) = c, where c is any real number in the range of f. In the figure, c_5 > c_4 > c_3 > c_2 > c_1. The tangent of a level curve is always orthogonal to the gradient ∇f; otherwise, moving along the curve would result in an increase or decrease of the value of f. Imagine a point moving on the curve h(x, y) = 0 from (x_1, y_1) to (x_2, y_2). Initially, the motion has a component along the negative gradient direction −∇f, resulting in a decrease of the value of f. This component becomes smaller and smaller. When the moving point reaches (x*, y*), the motion is orthogonal to the gradient. From that point on, the motion starts having a
component along the gradient direction ∇f, so the value of f increases. Thus at (x*, y*), f achieves a local minimum, namely c_2. The motion is in the tangential direction of the curve h(x, y) = 0, which is orthogonal to the gradient ∇h. Therefore, at the point (x*, y*) the two gradients ∇f and ∇h must be collinear. This is exactly what equation (1) says. It is clear that the two curves f(x, y) = c_2 and h(x, y) = 0 are tangent at (x*, y*).

Suppose we find the set S of points satisfying the following equations:

    h(x, y) = 0,
    ∇f + λ∇h = 0    for some λ.

Then S contains the extremal points of f subject to the constraint h(x, y) = 0. The above two equations constitute a nonlinear system in the variables x, y, λ. It can be solved using numerical techniques, for example, Newton's method.

2 Lagrangian

It is convenient to introduce the Lagrangian associated with the constrained problem, defined as

    l(x, y, λ) = f(x, y) + λ h(x, y).

Note that

    ∇l = (f_x + λ h_x, f_y + λ h_y, h) = (∇f + λ∇h, h).

Thus, setting ∇l = 0 yields the same system of nonlinear equations we derived earlier. The value λ is known as the Lagrange multiplier. The approach of constructing the Lagrangian and setting its gradient to zero is known as the method of Lagrange multipliers. Here we are not minimizing the Lagrangian, but merely finding its stationary point (x, y, λ).

Example 1. Find the extremal values of the function f(x, y) = xy subject to the constraint

    h(x, y) = x²/8 + y²/2 − 1 = 0.

We first construct the Lagrangian and find its gradient:

    l(x, y, λ) = xy + λ(x²/8 + y²/2 − 1),
    ∇l(x, y, λ) = (y + λx/4, x + λy, x²/8 + y²/2 − 1).

Setting ∇l = 0 leads to three equations:

    y + λx/4 = 0,       (2)
    x + λy = 0,         (3)
    x² + 4y² = 8.       (4)
Eliminating x from (2) and (3) yields

    y − (λ²/4) y = 0,

which, since y ≠ 0 (otherwise x = 0 would hold by (2)), leads to λ² = 4 and λ = ±2. Now x = ±2y by (3). Substituting this equation into (4) gives y = ±1 and x = ±2. So there are four extremal points of f subject to the constraint h: (2, 1), (−2, −1), (2, −1), and (−2, 1). The maximum value 2 is achieved at the first two points, while the minimum value −2 is achieved at the last two points.

[Figure: the ellipse h(x, y) = 0 together with the level hyperbolas xy = c for c = −2, −1, 1, 2.]

Graphically, the constraint h defines an ellipse. The level contours of f are the hyperbolas xy = c, with c increasing as the curves move out from the origin.

3 General Formulation

Now we would like to generalize to the case with multiple constraints. Let h = (h_1, ..., h_m) be a function from R^n to R^m. Consider the constrained optimization problem below:

    minimize f(x)
    subject to h(x) = 0.    (5)

Each function h_i defines a surface S_i = {x | h_i(x) = 0} of generally n − 1 dimensions in the space R^n. This surface is smooth provided h_i ∈ C¹. The m constraints h(x) = 0 together define a surface S, which is the intersection of the surfaces S_1, ..., S_m; namely,

    {x | h(x) = 0} = {x | h_1(x) = 0} ∩ ··· ∩ {x | h_m(x) = 0}.
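As noted above, the stationarity system can be solved numerically with Newton's method. The following pure-Python sketch (not part of the original notes) applies Newton's method to the system (2)-(4) of Example 1; started near (2, 1), it converges to the extremum (2, 1) with multiplier λ = −2:

```python
def F(v):
    # The system (2)-(4), i.e., the gradient of l = xy + lam*(x^2/8 + y^2/2 - 1).
    x, y, lam = v
    return [y + lam * x / 4,
            x + lam * y,
            x**2 / 8 + y**2 / 2 - 1]

def J(v):
    # Jacobian of F with respect to (x, y, lambda).
    x, y, lam = v
    return [[lam / 4, 1.0, x / 4],
            [1.0, lam, y],
            [x / 4, y, 0.0]]

def solve3(A, b):
    # Gaussian elimination with partial pivoting for a 3x3 linear system.
    n = 3
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[p] = M[p], M[k]
        for i in range(k + 1, n):
            f = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= f * M[k][j]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def newton(v, tol=1e-12, max_iter=50):
    # Newton iteration: solve J(v) d = -F(v), then update v <- v + d.
    for _ in range(max_iter):
        Fv = F(v)
        if max(abs(c) for c in Fv) < tol:
            break
        d = solve3(J(v), [-c for c in Fv])
        v = [vi + di for vi, di in zip(v, d)]
    return v

x, y, lam = newton([2.2, 0.9, -1.8])   # start near the extremum (2, 1)
print(x, y, lam)
```

A different starting point converges to one of the other three extremal points; Newton's method only finds the stationary point in whose basin of attraction the start lies.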
[Figure 1: Tangent space T at a point x* on a surface defined by (a) one constraint h(x) = 0 (T shown as a tangent plane) and (b) two constraints h_1(x) = 0 and h_2(x) = 0 (T shown as a tangent line along a curve x(t)).]

The surface S is of generally n − m dimensions. In Figure 1(a), S is defined by one constraint; in (b), the surface is defined by two constraints. The constrained optimization (5) is carried out over the surface S.

A curve on S is a family of points x(t) ∈ S with a ≤ t ≤ b. The curve is differentiable if x′ = dx/dt exists, and twice differentiable if x″ = d²x/dt² exists. The curve passes through a point x* if x* = x(t*) for some t*, a ≤ t* ≤ b. The tangent space T at x* is the subspace of R^n spanned by the tangents x′(t*) of all curves x(t) on S such that x(t*) = x*. In other words, T is the set of the derivatives at x* of all surface curves through x*.

Since a curve on S through x* is also one on S_i, every tangent vector to S at x* is automatically a tangent vector to S_i. This implies that the tangent space T to S at x* is a subspace of the tangent space T_i to S_i at the same point, for 1 ≤ i ≤ m. Meanwhile, the gradient ∇h_i is orthogonal to T_i because, for any curve x(t) in the surface S_i with x(t*) = x* for some t*, we have

    d/dt h_i(x(t)) |_{t=t*} = ∇h_i(x*) · x′(t*) = 0.

Therefore, ∇h_i must be orthogonal to the subspace T of T_i. In other words, ∇h_i lies in the normal space, which is the orthogonal complement of T.

A point x satisfying h(x) = 0 is a regular point of the constraints if the gradient vectors ∇h_1(x), ..., ∇h_m(x) are linearly independent.

From our previous intuition, we expect that ∇f · v = 0 for all v ∈ T at an extremum. This implies that ∇f also lies in the orthogonal complement of T. We would like to claim that ∇f can be composed from a linear combination of the ∇h_i's. This is valid only provided that these gradients span the orthogonal complement of T, which is true when the extremal point is regular.
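For the two-constraint picture in Figure 1(b) with n = 3 and m = 2, the one-dimensional tangent space at a regular point is spanned by the cross product of the two gradients. The following sketch (not part of the original notes; the sphere and plane are hypothetical constraint surfaces chosen for illustration) verifies that this tangent direction is orthogonal to both ∇h_1 and ∇h_2 at a point of the intersection:

```python
def grad_h1(p):
    # h1(x, y, z) = x^2 + y^2 + z^2 - 1 (unit sphere)
    x, y, z = p
    return (2 * x, 2 * y, 2 * z)

def grad_h2(p):
    # h2(x, y, z) = x + y + z - 1 (a plane)
    return (1.0, 1.0, 1.0)

def cross(u, v):
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

p = (1.0, 0.0, 0.0)            # lies on both surfaces: h1(p) = h2(p) = 0
g1, g2 = grad_h1(p), grad_h2(p)
t = cross(g1, g2)              # spans the 1-dimensional tangent space T at p
print(t, dot(t, g1), dot(t, g2))
```

Since g1 and g2 are linearly independent here, p is a regular point and t is nonzero; both dot products vanish, so t indeed lies in the null space of the constraint Jacobian.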
Theorem 1. At a regular point x of the surface S defined by h(x) = 0, the tangent space is

    T = {y | ∇h(x) y = 0},
where ∇h is the m × n matrix whose rows are the gradient vectors ∇h_j(x), j = 1, ..., m. The theorem says that the tangent space T at x is equal to the null space of ∇h(x). Thus, its orthogonal complement T⊥ must equal the row space of ∇h(x); hence the vectors ∇h_j(x) span T⊥.

Example 2. Suppose h(x_1, x_2) = x_1. Then h(x) = 0 yields the x_2-axis, and ∇h = (1, 0) at all points on the axis. So every point on the axis is regular. The tangent space T is also the x_2-axis and has dimension 1.

If instead h(x_1, x_2) = x_1², then h(x) = 0 still defines the x_2-axis. On this axis, ∇h = (2x_1, 0) = (0, 0). Thus, no point is regular. The dimension of T, which is the x_2-axis, is still one, but the dimension of the space {y | ∇h y = 0} is two.

Lemma 2. Let x* be a local extremum of f subject to the constraints h(x) = 0. Then for all y in the tangent space of the constraint surface at x*,

    ∇f(x*) y = 0.

The next theorem states that the method of Lagrange multipliers gives a necessary condition on an extremum point.

Theorem 3 (First-Order Necessary Conditions). Let x* be a local extremum point of f subject to the constraints h(x) = 0. Assume further that x* is a regular point of these constraints. Then there is a λ ∈ R^m such that

    ∇f(x*) + λᵀ∇h(x*) = 0.

The first-order necessary conditions, together with the constraints h(x*) = 0, give a total of n + m equations in the n + m variables x* and λ. Thus a unique solution can be determined, at least locally.

Example 3. We seek to construct a cardboard box of maximum volume, given a fixed area of cardboard. Denoting the dimensions of the box by x, y, z, the problem can be expressed as

    maximize xyz
    subject to xy + yz + xz = c/2,

where c > 0 is the given area of cardboard. Consider the Lagrangian xyz + λ(xy + yz + xz − c/2). The first-order necessary conditions are easily found to be

    yz + λ(y + z) = 0,    (6)
    xz + λ(x + z) = 0,    (7)
    xy + λ(x + y) = 0,    (8)
together with the original constraint. Before solving the above equations, we note that their sum is

    (xy + yz + xz) + 2λ(x + y + z) = 0,

which, under the original constraint, becomes

    c/2 + 2λ(x + y + z) = 0.

Hence, it is clear that λ ≠ 0. None of x, y, z can be zero since, if any one of them is zero, all must be so according to (6)-(8). To solve the equations (6)-(8), multiply (6) by x and (7) by y, and then subtract the two to obtain

    λ(x − y)z = 0.

Operate similarly on the second and the third to obtain

    λ(y − z)x = 0.

Since no variable can be zero, it follows that

    x = y = z = √(c/6)

is the unique solution to the necessary conditions. The box must be a cube.

We can derive second-order conditions for constrained problems, assuming f and h are twice continuously differentiable.

Theorem 4 (Second-Order Necessary Conditions). Suppose that x* is a local minimum of f subject to h(x) = 0 and that x* is a regular point of these constraints. Denote by F the Hessian of f, and by H_i the Hessian of h_i, for 1 ≤ i ≤ m. Then there is a λ ∈ R^m such that

    ∇f(x*) + λᵀ∇h(x*) = 0.

The matrix

    L(x*) = F(x*) + Σ_{i=1}^m λ_i H_i(x*)    (9)

is positive semidefinite on the tangent space {y | ∇h(x*) y = 0}.

Theorem 5 (Second-Order Sufficient Conditions). Suppose there is a point x* satisfying h(x*) = 0, and a λ such that

    ∇f(x*) + λᵀ∇h(x*) = 0.

Suppose also that the matrix L(x*) defined in (9) is positive definite on the tangent space {y | ∇h(x*) y = 0}. Then x* is a strict local minimum of f subject to h(x) = 0.

Example 4. Consider the problem

    minimize x_1 x_2 + x_2 x_3 + x_1 x_3
    subject to x_1 + x_2 + x_3 = 3.
The first-order necessary conditions become

    x_2 + x_3 + λ = 0,
    x_1 + x_3 + λ = 0,
    x_1 + x_2 + λ = 0.

We can solve these three equations together with the one constraint equation and obtain x_1 = x_2 = x_3 = 1 and λ = −2. Thus x* = (1, 1, 1). Now we need to resort to the second-order sufficient conditions to determine whether the problem achieves a local maximum or minimum at x*. Since the constraint is linear, its Hessian H vanishes, and we find that the matrix

    L(x*) = F(x*) + λH(x*) = ( 0 1 1 )
                             ( 1 0 1 )
                             ( 1 1 0 )

is neither positive nor negative definite. On the tangent space M = {y | y_1 + y_2 + y_3 = 0}, however, we note that

    yᵀL y = y_1(y_2 + y_3) + y_2(y_1 + y_3) + y_3(y_1 + y_2) = −(y_1² + y_2² + y_3²) < 0

for all y ≠ 0, where the second equality follows from squaring y_1 + y_2 + y_3 = 0. Thus L is negative definite on M, and the solution x* we found is at least a local maximum.

4 Inequality Constraints

Finally, we address problems of the general form

    minimize f(x)
    subject to h(x) = 0, g(x) ≤ 0,

where h = (h_1, ..., h_m) and g = (g_1, ..., g_p). A fundamental concept that provides a great deal of insight, as well as simplifies the required theoretical development, is that of an active constraint. An inequality constraint g_i(x) ≤ 0 is said to be active at a feasible point x if g_i(x) = 0 and inactive at x if g_i(x) < 0. By convention, we refer to any equality constraint h_i(x) = 0 as active at any feasible point.

The constraints active at a feasible point x restrict the domain of feasibility in neighborhoods of x. Therefore, in studying the properties of a local minimum point, attention can be restricted to the active constraints. This is illustrated in the figure below, where the local properties satisfied by the solution x* obviously do not depend on the inactive constraints g_2 and g_3.
[Figure: a feasible region bounded by the curves g_1(x) = 0, g_2(x) = 0, and g_3(x) = 0; only g_1 is active at the solution x*.]

Assume that the functions f, h = (h_1, ..., h_m), and g = (g_1, ..., g_p) are twice continuously differentiable. Let x* be a point satisfying the constraints

    h(x*) = 0,    g(x*) ≤ 0,

and let J = {j | g_j(x*) = 0}. Then x* is said to be a regular point of the above constraints if the gradient vectors ∇h_i(x*), ∇g_j(x*), 1 ≤ i ≤ m, j ∈ J, are linearly independent.

Now suppose this regular point x* is also a relative minimum point for the original problem. Then it can be shown that there exist a vector λ ∈ R^m and a vector µ ∈ R^p with µ ≥ 0 such that

    ∇f(x*) + λᵀ∇h(x*) + µᵀ∇g(x*) = 0,
    µᵀ g(x*) = 0.

Since µ ≥ 0 and g(x*) ≤ 0, the second condition above is equivalent to the statement that, for each 1 ≤ j ≤ p, the multiplier µ_j can be nonzero only if g_j is active. To find a solution, we enumerate various combinations of active constraints, that is, constraints where equality is attained at x*, and check the signs of the resulting Lagrange multipliers.

There are a number of distinct theories concerning this problem, based on various regularity conditions or constraint qualifications, which are directed toward obtaining definitive general statements of necessary and sufficient conditions. One can by no means pretend that all such results can be obtained as minor extensions of the theory for problems having equality constraints only. To date, however, their use has been limited to small-scale programming problems of two or three variables. We refer you to [3] for more on the subject.

References

[1] Abell. Lecture notes. The Robotics Institute, School of Computer Science, Carnegie Mellon University.

[2] M. Avriel. Nonlinear Programming: Analysis and Methods. Prentice-Hall, 1976; Dover Publications.
[3] D. G. Luenberger. Linear and Nonlinear Programming. Addison-Wesley, 2nd edition.
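The active-set enumeration described in Section 4 can be sketched on a small made-up quadratic problem (the problem and all names below are illustrative, not from the notes): minimize f(x) = (x_1 − 2)² + (x_2 − 1)² subject to g_1(x) = x_1 + x_2 − 2 ≤ 0. We try each combination of active constraints, solve the resulting equality-constrained stationarity conditions, and keep only candidates that are feasible with nonnegative multipliers:

```python
# Illustrative problem: minimize f(x) = (x1 - 2)^2 + (x2 - 1)^2
# subject to the single inequality g1(x) = x1 + x2 - 2 <= 0.

def grad_f(x):
    return (2 * (x[0] - 2), 2 * (x[1] - 1))

candidates = []

# Active set {} : mu = 0, so grad f = 0 gives the unconstrained minimum (2, 1).
x = (2.0, 1.0)
if x[0] + x[1] - 2 <= 0:           # infeasible here: g1(x) = 1 > 0, so rejected
    candidates.append((x, 0.0))

# Active set {g1} : grad f + mu * grad g1 = 0 with grad g1 = (1, 1), and g1 = 0:
#   2(x1 - 2) + mu = 0,  2(x2 - 1) + mu = 0,  x1 + x2 = 2.
# Subtracting the first two equations gives x1 - x2 = 1, hence x = (1.5, 0.5).
x = (1.5, 0.5)
mu = -2 * (x[0] - 2)               # mu = 1 >= 0, so the sign check passes
gf = grad_f(x)
stationary = (gf[0] + mu, gf[1] + mu)   # residual of the stationarity condition
if mu >= 0 and x[0] + x[1] - 2 <= 0:
    candidates.append((x, mu))

print(candidates, stationary)
```

Only the second case survives both the feasibility check and the multiplier sign check, so the unique KKT point is x = (1.5, 0.5) with µ = 1, exactly the enumerate-and-check procedure the notes describe for small problems.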
Math 291-3: Lecture Notes Northwestern University, Spring 216 Written by Santiago Cañez These are lecture notes for Math 291-3, the third quarter of MENU: Intensive Linear Algebra and Multivariable Calculus,
More informationIntroduction to Optimization Techniques. Nonlinear Optimization in Function Spaces
Introduction to Optimization Techniques Nonlinear Optimization in Function Spaces X : T : Gateaux and Fréchet Differentials Gateaux and Fréchet Differentials a vector space, Y : a normed space transformation
More informationLecture 2: Convex Sets and Functions
Lecture 2: Convex Sets and Functions Hyang-Won Lee Dept. of Internet & Multimedia Eng. Konkuk University Lecture 2 Network Optimization, Fall 2015 1 / 22 Optimization Problems Optimization problems are
More informationSufficient Conditions for Finite-variable Constrained Minimization
Lecture 4 It is a small de tour but it is important to understand this before we move to calculus of variations. Sufficient Conditions for Finite-variable Constrained Minimization ME 256, Indian Institute
More informationConstrained Optimization Theory
Constrained Optimization Theory Stephen J. Wright 1 2 Computer Sciences Department, University of Wisconsin-Madison. IMA, August 2016 Stephen Wright (UW-Madison) Constrained Optimization Theory IMA, August
More informationTaylor Series and stationary points
Chapter 5 Taylor Series and stationary points 5.1 Taylor Series The surface z = f(x, y) and its derivatives can give a series approximation for f(x, y) about some point (x 0, y 0 ) as illustrated in Figure
More informationLecture 4: Optimization. Maximizing a function of a single variable
Lecture 4: Optimization Maximizing or Minimizing a Function of a Single Variable Maximizing or Minimizing a Function of Many Variables Constrained Optimization Maximizing a function of a single variable
More informationIntroduction to Machine Learning Lecture 7. Mehryar Mohri Courant Institute and Google Research
Introduction to Machine Learning Lecture 7 Mehryar Mohri Courant Institute and Google Research mohri@cims.nyu.edu Convex Optimization Differentiation Definition: let f : X R N R be a differentiable function,
More informationSeptember Math Course: First Order Derivative
September Math Course: First Order Derivative Arina Nikandrova Functions Function y = f (x), where x is either be a scalar or a vector of several variables (x,..., x n ), can be thought of as a rule which
More informationf(x, y) = 1 2 x y2 xy 3
Problem. Find the critical points of the function and determine their nature. We have We find the critical points We calculate f(x, y) = 2 x2 + 3 2 y2 xy 3 f x = x y 3 = 0 = x = y 3 f y = 3y 3xy 2 = 0
More informationOptimization: Problem Set Solutions
Optimization: Problem Set Solutions Annalisa Molino University of Rome Tor Vergata annalisa.molino@uniroma2.it Fall 20 Compute the maxima minima or saddle points of the following functions a. f(x y) =
More informationIE 5531: Engineering Optimization I
IE 5531: Engineering Optimization I Lecture 12: Nonlinear optimization, continued Prof. John Gunnar Carlsson October 20, 2010 Prof. John Gunnar Carlsson IE 5531: Engineering Optimization I October 20,
More informationIntroduction to Nonlinear Stochastic Programming
School of Mathematics T H E U N I V E R S I T Y O H F R G E D I N B U Introduction to Nonlinear Stochastic Programming Jacek Gondzio Email: J.Gondzio@ed.ac.uk URL: http://www.maths.ed.ac.uk/~gondzio SPS
More informationLecture 1: Introduction. Outline. B9824 Foundations of Optimization. Fall Administrative matters. 2. Introduction. 3. Existence of optima
B9824 Foundations of Optimization Lecture 1: Introduction Fall 2009 Copyright 2009 Ciamac Moallemi Outline 1. Administrative matters 2. Introduction 3. Existence of optima 4. Local theory of unconstrained
More informationMath 273a: Optimization Basic concepts
Math 273a: Optimization Basic concepts Instructor: Wotao Yin Department of Mathematics, UCLA Spring 2015 slides based on Chong-Zak, 4th Ed. Goals of this lecture The general form of optimization: minimize
More informationSECTION C: CONTINUOUS OPTIMISATION LECTURE 11: THE METHOD OF LAGRANGE MULTIPLIERS
SECTION C: CONTINUOUS OPTIMISATION LECTURE : THE METHOD OF LAGRANGE MULTIPLIERS HONOUR SCHOOL OF MATHEMATICS OXFORD UNIVERSITY HILARY TERM 005 DR RAPHAEL HAUSER. Examples. In this lecture we will take
More informationLAGRANGE MULTIPLIERS
LAGRANGE MULTIPLIERS MATH 195, SECTION 59 (VIPUL NAIK) Corresponding material in the book: Section 14.8 What students should definitely get: The Lagrange multiplier condition (one constraint, two constraints
More informationLecture 1: Introduction. Outline. B9824 Foundations of Optimization. Fall Administrative matters. 2. Introduction. 3. Existence of optima
B9824 Foundations of Optimization Lecture 1: Introduction Fall 2010 Copyright 2010 Ciamac Moallemi Outline 1. Administrative matters 2. Introduction 3. Existence of optima 4. Local theory of unconstrained
More informationLecture 13 - Wednesday April 29th
Lecture 13 - Wednesday April 29th jacques@ucsdedu Key words: Systems of equations, Implicit differentiation Know how to do implicit differentiation, how to use implicit and inverse function theorems 131
More informationB553 Lecture 3: Multivariate Calculus and Linear Algebra Review
B553 Lecture 3: Multivariate Calculus and Linear Algebra Review Kris Hauser December 30, 2011 We now move from the univariate setting to the multivariate setting, where we will spend the rest of the class.
More informationConvex Optimization and Modeling
Convex Optimization and Modeling Duality Theory and Optimality Conditions 5th lecture, 12.05.2010 Jun.-Prof. Matthias Hein Program of today/next lecture Lagrangian and duality: the Lagrangian the dual
More informationComputational Optimization. Constrained Optimization Part 2
Computational Optimization Constrained Optimization Part Optimality Conditions Unconstrained Case X* is global min Conve f X* is local min SOSC f ( *) = SONC Easiest Problem Linear equality constraints
More informationThe Derivative. Appendix B. B.1 The Derivative of f. Mappings from IR to IR
Appendix B The Derivative B.1 The Derivative of f In this chapter, we give a short summary of the derivative. Specifically, we want to compare/contrast how the derivative appears for functions whose domain
More informationIntroduction to unconstrained optimization - direct search methods
Introduction to unconstrained optimization - direct search methods Jussi Hakanen Post-doctoral researcher jussi.hakanen@jyu.fi Structure of optimization methods Typically Constraint handling converts the
More informationExercises for Multivariable Differential Calculus XM521
This document lists all the exercises for XM521. The Type I (True/False) exercises will be given, and should be answered, online immediately following each lecture. The Type III exercises are to be done
More informationN. L. P. NONLINEAR PROGRAMMING (NLP) deals with optimization models with at least one nonlinear function. NLP. Optimization. Models of following form:
0.1 N. L. P. Katta G. Murty, IOE 611 Lecture slides Introductory Lecture NONLINEAR PROGRAMMING (NLP) deals with optimization models with at least one nonlinear function. NLP does not include everything
More informationA new ane scaling interior point algorithm for nonlinear optimization subject to linear equality and inequality constraints
Journal of Computational and Applied Mathematics 161 (003) 1 5 www.elsevier.com/locate/cam A new ane scaling interior point algorithm for nonlinear optimization subject to linear equality and inequality
More informationOptimization Theory. Lectures 4-6
Optimization Theory Lectures 4-6 Unconstrained Maximization Problem: Maximize a function f:ú n 6 ú within a set A f ú n. Typically, A is ú n, or the non-negative orthant {x0ú n x$0} Existence of a maximum:
More informationMATH 5720: Unconstrained Optimization Hung Phan, UMass Lowell September 13, 2018
MATH 57: Unconstrained Optimization Hung Phan, UMass Lowell September 13, 18 1 Global and Local Optima Let a function f : S R be defined on a set S R n Definition 1 (minimizers and maximizers) (i) x S
More information