AM 205: lecture 18. Last time: optimization methods. Today: conditions for optimality


1 AM 205: lecture 18. Last time: optimization methods. Today: conditions for optimality

2 Existence of Global Minimum. Recall that $f$ is coercive if $f(x) \to +\infty$ as $\|x\| \to \infty$; a continuous coercive function on $\mathbb{R}^n$ attains a global minimum. For example: $f(x, y) = x^2 + y^2$ is coercive on $\mathbb{R}^2$ (global min. at $(0, 0)$); $f(x) = x^3$ is not coercive on $\mathbb{R}$ ($f \to -\infty$ for $x \to -\infty$); $f(x) = e^x$ is not coercive on $\mathbb{R}$ ($f \to 0$ for $x \to -\infty$).

3 Convexity. An important concept for uniqueness is convexity. A set $S \subseteq \mathbb{R}^n$ is convex if it contains the line segment between any two of its points. That is, $S$ is convex if for any $x, y \in S$ we have $\{\theta x + (1 - \theta) y : \theta \in [0, 1]\} \subseteq S$.

4 Convexity. Similarly, we define convexity of a function $f : S \subseteq \mathbb{R}^n \to \mathbb{R}$. $f$ is convex if its graph along any line segment in $S$ lies on or below the chord connecting the function values; i.e. $f$ is convex if for any $x, y \in S$ and any $\theta \in (0, 1)$ we have $f(\theta x + (1 - \theta) y) \le \theta f(x) + (1 - \theta) f(y)$. Also, if $f(\theta x + (1 - \theta) y) < \theta f(x) + (1 - \theta) f(y)$, then $f$ is strictly convex.
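As a quick aside (a sketch, not from the original slides), the chord inequality is easy to probe numerically: the NumPy snippet below searches sample point pairs for a violation. The helper name violates_convexity is illustrative.

```python
import numpy as np

def violates_convexity(f, points, n_theta=9):
    # Search sample pairs for a violation of the chord inequality
    # f(theta*x + (1-theta)*y) <= theta*f(x) + (1-theta)*f(y).
    # Finding a violation disproves convexity; finding none merely fails to.
    for x in points:
        for y in points:
            for theta in np.linspace(0.1, 0.9, n_theta):
                lhs = f(theta * x + (1 - theta) * y)
                rhs = theta * f(x) + (1 - theta) * f(y)
                if lhs > rhs + 1e-12:
                    return True
    return False

grid = [np.array([u, v]) for u in np.linspace(-2, 2, 5) for v in np.linspace(-2, 2, 5)]
print(violates_convexity(lambda z: z @ z, grid))             # False: x^2 + y^2 is convex
print(violates_convexity(lambda z: z[0]**2 - z[1]**2, grid)) # True: the saddle is not
```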

5 Convexity. [Figure: a strictly convex function]

6 Convexity. [Figure: a non-convex function]

7 Convexity. [Figure: a convex, but not strictly convex, function with a horizontal section]

8 Convexity. If $f$ is a convex function on a convex set $S$, then any local minimum of $f$ must be a global minimum. (Footnote 1: A global minimum is defined as a point $z$ such that $f(z) \le f(x)$ for all $x \in S$. Note that a global minimum may not be unique, e.g. if $f(x) = \cos x$ then $\pi$ and $3\pi$ are both global minima.) Proof: Suppose $x$ is a local minimum, i.e. $f(x) \le f(y)$ for all $y \in B(x, \epsilon)$, where $B(x, \epsilon) \equiv \{y \in S : \|y - x\| \le \epsilon\}$. Suppose that $x$ is not a global minimum, i.e. that there exists $w \in S$ such that $f(w) < f(x)$. (Then we will show that this gives a contradiction.)

9 Convexity. Proof (continued): For $\theta \in [0, 1]$ we have $f(\theta w + (1 - \theta) x) \le \theta f(w) + (1 - \theta) f(x)$. Let $\sigma \in (0, 1]$ be sufficiently small so that $z \equiv \sigma w + (1 - \sigma) x \in B(x, \epsilon)$. Then $f(z) \le \sigma f(w) + (1 - \sigma) f(x) < \sigma f(x) + (1 - \sigma) f(x) = f(x)$, i.e. $f(z) < f(x)$, which contradicts that $x$ is a local minimum! Hence we cannot have $w \in S$ such that $f(w) < f(x)$.

10 Convexity. Note that convexity does not guarantee uniqueness of the global minimum; e.g. a convex function can clearly have a horizontal section (see earlier plot). If $f$ is a strictly convex function on a convex set $S$, then a local minimum of $f$ is the unique global minimum. Optimization of convex functions over convex sets is called convex optimization, an important subfield of optimization.

11 Optimality Conditions. We have discussed existence and uniqueness of minima, but haven't considered how to find a minimum. The familiar optimization idea from calculus in one dimension is: set the derivative to zero, then check the sign of the second derivative. This can be generalized to $\mathbb{R}^n$.

12 Optimality Conditions. If $f : \mathbb{R}^n \to \mathbb{R}$ is differentiable, then the gradient vector $\nabla f : \mathbb{R}^n \to \mathbb{R}^n$ is $\nabla f(x) \equiv \left[ \frac{\partial f(x)}{\partial x_1}, \frac{\partial f(x)}{\partial x_2}, \ldots, \frac{\partial f(x)}{\partial x_n} \right]^T$. The importance of the gradient is that $\nabla f$ points uphill, i.e. towards points with larger values than $f(x)$; similarly $-\nabla f$ points downhill.

13 Optimality Conditions. This follows from Taylor's theorem for $f : \mathbb{R}^n \to \mathbb{R}$. Recall that $f(x + \delta) = f(x) + \nabla f(x)^T \delta + \text{H.O.T.}$ Let $\delta \equiv -\epsilon \nabla f(x)$ for $\epsilon > 0$ and suppose that $\nabla f(x) \neq 0$; then $f(x - \epsilon \nabla f(x)) \approx f(x) - \epsilon \nabla f(x)^T \nabla f(x) < f(x)$. Also, we see from Cauchy–Schwarz that $-\nabla f(x)$ is the steepest descent direction: $|\nabla f(x)^T s| \le \|\nabla f(x)\| \|s\|$, with equality exactly when $s$ is parallel to $\nabla f(x)$.

14 Optimality Conditions. Similarly, we see that a necessary condition for a local minimum at $x^* \in S$ is that $\nabla f(x^*) = 0$: in this case there is no downhill direction at $x^*$. The condition $\nabla f(x^*) = 0$ is called a first-order necessary condition for optimality, since it only involves first derivatives.

15 Optimality Conditions. A point $x^* \in S$ that satisfies the first-order optimality condition is called a critical point of $f$. But of course a critical point can be a local min., local max., or saddle point. (Recall that a saddle point is where some directions are downhill and others are uphill, e.g. $(x, y) = (0, 0)$ for $f(x, y) = x^2 - y^2$.)

16 Optimality Conditions. As in the one-dimensional case, we can look to second derivatives to classify critical points. If $f : \mathbb{R}^n \to \mathbb{R}$ is twice differentiable, then the Hessian is the matrix-valued function $H_f : \mathbb{R}^n \to \mathbb{R}^{n \times n}$,
$$H_f(x) \equiv \begin{bmatrix} \frac{\partial^2 f(x)}{\partial x_1^2} & \frac{\partial^2 f(x)}{\partial x_1 \partial x_2} & \cdots & \frac{\partial^2 f(x)}{\partial x_1 \partial x_n} \\ \frac{\partial^2 f(x)}{\partial x_2 \partial x_1} & \frac{\partial^2 f(x)}{\partial x_2^2} & \cdots & \frac{\partial^2 f(x)}{\partial x_2 \partial x_n} \\ \vdots & \vdots & \ddots & \vdots \\ \frac{\partial^2 f(x)}{\partial x_n \partial x_1} & \frac{\partial^2 f(x)}{\partial x_n \partial x_2} & \cdots & \frac{\partial^2 f(x)}{\partial x_n^2} \end{bmatrix}$$
The Hessian is the Jacobian matrix of the gradient $\nabla f : \mathbb{R}^n \to \mathbb{R}^n$. If the second partial derivatives of $f$ are continuous, then $\partial^2 f / \partial x_i \partial x_j = \partial^2 f / \partial x_j \partial x_i$, and $H_f$ is symmetric.
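As an aside (a sketch, not from the slides), "the Hessian is the Jacobian of the gradient" suggests a simple finite-difference check; fd_hessian below is a hypothetical helper name.

```python
import numpy as np

def fd_hessian(grad, x, h=1e-6):
    # Central finite-difference Jacobian of the gradient: approximates the Hessian.
    n = x.size
    H = np.zeros((n, n))
    for j in range(n):
        e = np.zeros(n)
        e[j] = 1.0
        H[:, j] = (grad(x + h * e) - grad(x - h * e)) / (2 * h)
    return H

# Gradient of f(x, y) = x^2 + x*y + 3*y^2; the exact Hessian is [[2, 1], [1, 6]].
grad = lambda x: np.array([2 * x[0] + x[1], x[0] + 6 * x[1]])
print(fd_hessian(grad, np.array([1.0, 2.0])))  # ~[[2, 1], [1, 6]], symmetric
```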

17 Optimality Conditions. Suppose we have found a critical point $x^*$, so that $\nabla f(x^*) = 0$. From Taylor's Theorem, for $\delta \in \mathbb{R}^n$ we have $f(x^* + \delta) = f(x^*) + \nabla f(x^*)^T \delta + \frac{1}{2} \delta^T H_f(x^* + \eta \delta) \delta = f(x^*) + \frac{1}{2} \delta^T H_f(x^* + \eta \delta) \delta$, for some $\eta \in (0, 1)$.

18 Optimality Conditions. Recall positive definiteness: $A$ is positive definite if $x^T A x > 0$ for all $x \neq 0$. Suppose $H_f(x^*)$ is positive definite. Then (by continuity) $H_f(x^* + \eta \delta)$ is also positive definite for $\|\delta\|$ sufficiently small, so that $\delta^T H_f(x^* + \eta \delta) \delta > 0$. Hence we have $f(x^* + \delta) > f(x^*)$ for $\delta \neq 0$ sufficiently small, i.e. $x^*$ is a local minimum. Hence, in general, positive definiteness of $H_f$ at a critical point $x^*$ is a second-order sufficient condition for a local minimum.

19 Optimality Conditions. A matrix can also be negative definite ($x^T A x < 0$ for all $x \neq 0$) or indefinite (there exist $x, y$ such that $x^T A x < 0 < y^T A y$). Then we can classify critical points as follows:
- $H_f(x^*)$ positive definite $\implies$ $x^*$ is a local minimum
- $H_f(x^*)$ negative definite $\implies$ $x^*$ is a local maximum
- $H_f(x^*)$ indefinite $\implies$ $x^*$ is a saddle point

20 Optimality Conditions. Also, positive definiteness of the Hessian is closely related to convexity of $f$: if $H_f(x)$ is positive definite, then $f$ is convex on some convex neighborhood of $x$; if $H_f(x)$ is positive definite for all $x \in S$, where $S$ is a convex set, then $f$ is convex on $S$. Question: How do we test for positive definiteness?

21 Optimality Conditions. Answer: $A$ is positive (resp. negative) definite if and only if all eigenvalues of $A$ are positive (resp. negative). (Footnote 2: This is related to the Rayleigh quotient; see Unit V.) Also, a matrix with both positive and negative eigenvalues is indefinite. Hence we can compute all the eigenvalues of $A$ and check their signs.
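For illustration (a sketch, not from the slides), this eigenvalue test is a few lines of NumPy; classify_critical_point is a hypothetical helper name.

```python
import numpy as np

def classify_critical_point(H, tol=1e-10):
    # Classify a critical point from the signs of the symmetric Hessian's eigenvalues.
    lam = np.linalg.eigvalsh(H)  # eigvalsh: eigenvalues of a symmetric matrix
    if np.all(lam > tol):
        return "local minimum (H positive definite)"
    if np.all(lam < -tol):
        return "local maximum (H negative definite)"
    if lam.min() < -tol and lam.max() > tol:
        return "saddle point (H indefinite)"
    return "inconclusive (some eigenvalues approximately zero)"

print(classify_critical_point(np.array([[2.0, 0.0], [0.0, 3.0]])))   # local minimum
print(classify_critical_point(np.array([[2.0, 0.0], [0.0, -3.0]])))  # saddle point
```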

22 Heath Example 6.5. Consider $f(x) = 2x_1^3 + 3x_1^2 + 12 x_1 x_2 + 3x_2^2 - 6x_2 + 6$. Then $\nabla f(x) = \begin{bmatrix} 6x_1^2 + 6x_1 + 12x_2 \\ 12x_1 + 6x_2 - 6 \end{bmatrix}$. We set $\nabla f(x) = 0$ to find the critical points $[1, -1]^T$ and $[2, -3]^T$. (Footnote 3: In general, solving $\nabla f(x) = 0$ requires an iterative method.)

23 Heath Example 6.5, continued. The Hessian is $H_f(x) = \begin{bmatrix} 12x_1 + 6 & 12 \\ 12 & 6 \end{bmatrix}$, and hence $H_f(1, -1) = \begin{bmatrix} 18 & 12 \\ 12 & 6 \end{bmatrix}$, which has eigenvalues $25.4, -1.4$, and $H_f(2, -3) = \begin{bmatrix} 30 & 12 \\ 12 & 6 \end{bmatrix}$, which has eigenvalues $35.0, 1.0$. Hence $[2, -3]^T$ is a local min. whereas $[1, -1]^T$ is a saddle point.
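We can verify these numbers numerically; a small NumPy sketch using the gradient and Hessian from the example above:

```python
import numpy as np

def grad(x):
    return np.array([6 * x[0]**2 + 6 * x[0] + 12 * x[1],
                     12 * x[0] + 6 * x[1] - 6])

def hess(x):
    return np.array([[12 * x[0] + 6, 12.0],
                     [12.0, 6.0]])

for x in (np.array([1.0, -1.0]), np.array([2.0, -3.0])):
    lam = np.linalg.eigvalsh(hess(x))
    # grad should be ~0 at each critical point; eigenvalue signs classify it
    print(x, "grad:", grad(x), "Hessian eigenvalues:", np.round(lam, 1))
# [1,-1]: eigenvalues ~[-1.4, 25.4] -> saddle; [2,-3]: ~[1.0, 35.0] -> local min
```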

24 Optimality Conditions: Equality Constrained Case. So far we have ignored constraints. Let us now consider equality constrained optimization: $\min_{x \in \mathbb{R}^n} f(x)$ subject to $g(x) = 0$, where $f : \mathbb{R}^n \to \mathbb{R}$ and $g : \mathbb{R}^n \to \mathbb{R}^m$, with $m \le n$. Since $g$ maps to $\mathbb{R}^m$, we have $m$ constraints. This situation is treated with Lagrange multipliers.

25 Optimality Conditions: Equality Constrained Case. We illustrate the concept of Lagrange multipliers for $f, g : \mathbb{R}^2 \to \mathbb{R}$. Let $f(x, y) = x + y$ and $g(x, y) = 2x^2 + y^2 - 1$. $\nabla g$ is normal to $S$: at any $x \in S$ we must move in the direction $(\nabla g(x))_\perp$ (the tangent direction) to remain in $S$. (Footnote 4: This follows from Taylor's Theorem: $g(x + \delta) \approx g(x) + \nabla g(x)^T \delta$.)

26 Optimality Conditions: Equality Constrained Case. Also, the change in $f$ due to an infinitesimal step in the direction $\pm(\nabla g(x))_\perp$ is $f(x \pm \epsilon (\nabla g(x))_\perp) = f(x) \pm \epsilon \nabla f(x)^T (\nabla g(x))_\perp + \text{H.O.T.}$ Hence $x^* \in S$ is a stationary point if $\nabla f(x^*)^T (\nabla g(x^*))_\perp = 0$, or equivalently $\nabla f(x^*) = \lambda \nabla g(x^*)$, for some $\lambda \in \mathbb{R}$.

27 Optimality Conditions: Equality Constrained Case. This shows that for a stationary point with $m = 1$ constraints, $\nabla f$ cannot have any component in the tangent direction to $S$. Now consider the case with $m > 1$ equality constraints. Then $g : \mathbb{R}^n \to \mathbb{R}^m$ and we now have a set of constraint gradient vectors $\nabla g_i$, $i = 1, \ldots, m$. Then we have $S = \{x \in \mathbb{R}^n : g_i(x) = 0, \ i = 1, \ldots, m\}$. Any tangent direction at $x \in S$ must be orthogonal to all gradient vectors $\{\nabla g_i(x), \ i = 1, \ldots, m\}$ to remain in $S$.

28 Optimality Conditions: Equality Constrained Case. Let $T(x) \equiv \{v \in \mathbb{R}^n : \nabla g_i(x)^T v = 0, \ i = 1, \ldots, m\}$ denote the orthogonal complement of $\{\nabla g_i(x), \ i = 1, \ldots, m\}$. Then, for $\delta \in T(x)$ and $\epsilon \in \mathbb{R}_{>0}$, $\epsilon \delta$ is a step in a tangent direction of $S$ at $x$. Since we have $f(x^* + \epsilon \delta) = f(x^*) + \epsilon \nabla f(x^*)^T \delta + \text{H.O.T.}$, it follows that for a stationary point we need $\nabla f(x^*)^T \delta = 0$ for all $\delta \in T(x^*)$.

29 Optimality Conditions: Equality Constrained Case. Hence, we require that at a stationary point $x^* \in S$ we have $\nabla f(x^*) \in \operatorname{span}\{\nabla g_i(x^*), \ i = 1, \ldots, m\}$. This can be written succinctly as a linear system $\nabla f(x^*) = (J_g(x^*))^T \lambda$ for some $\lambda \in \mathbb{R}^m$, where $(J_g(x^*))^T \in \mathbb{R}^{n \times m}$. This follows because the columns of $(J_g(x^*))^T$ are the vectors $\{\nabla g_i(x^*), \ i = 1, \ldots, m\}$.

30 Optimality Conditions: Equality Constrained Case. We can write equality constrained optimization problems more succinctly by introducing the Lagrangian function $L : \mathbb{R}^{n+m} \to \mathbb{R}$, $L(x, \lambda) \equiv f(x) + \lambda^T g(x) = f(x) + \lambda_1 g_1(x) + \cdots + \lambda_m g_m(x)$. Then we have
$\partial L(x, \lambda) / \partial x_i = \partial f(x) / \partial x_i + \lambda_1 \partial g_1(x) / \partial x_i + \cdots + \lambda_m \partial g_m(x) / \partial x_i$, for $i = 1, \ldots, n$,
$\partial L(x, \lambda) / \partial \lambda_i = g_i(x)$, for $i = 1, \ldots, m$.

31 Optimality Conditions: Equality Constrained Case. Hence $\nabla L(x, \lambda) = \begin{bmatrix} \nabla_x L(x, \lambda) \\ \nabla_\lambda L(x, \lambda) \end{bmatrix} = \begin{bmatrix} \nabla f(x) + J_g(x)^T \lambda \\ g(x) \end{bmatrix}$, so that the first-order necessary condition for optimality for the constrained problem can be written as a nonlinear system: $\nabla L(x, \lambda) = \begin{bmatrix} \nabla f(x) + J_g(x)^T \lambda \\ g(x) \end{bmatrix} = 0$. (Footnote 5: $n + m$ variables, $n + m$ equations.) (As before, stationary points can be classified by considering the Hessian, though we will not consider this here...)
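For illustration (a sketch, not from the slides), we can hand this nonlinear system to a general-purpose solver such as scipy.optimize.fsolve, using the toy problem from the earlier slide ($f(x, y) = x + y$, $g(x, y) = 2x^2 + y^2 - 1$):

```python
import numpy as np
from scipy.optimize import fsolve

def grad_L(z):
    # Components of grad L for L(x, y, lam) = (x + y) + lam*(2x^2 + y^2 - 1)
    x, y, lam = z
    return [1 + lam * 4 * x,       # dL/dx  = df/dx + lam * dg/dx
            1 + lam * 2 * y,       # dL/dy  = df/dy + lam * dg/dy
            2 * x**2 + y**2 - 1]   # dL/dlam = g(x, y)

x, y, lam = fsolve(grad_L, np.array([1.0, 1.0, 1.0]))
print(x, y, lam)  # a stationary point of the constrained problem
```

Which stationary point fsolve finds (the constrained min or max on the ellipse) depends on the initial guess.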

32 Optimality Conditions: Equality Constrained Case. See lecture demo: constrained optimization of cylinder surface area.

33 Optimality Conditions: Equality Constrained Case. As another example of equality constrained optimization, recall our underdetermined linear least squares problem from I.3: $\min_{b \in \mathbb{R}^n} f(b)$ subject to $g(b) = 0$, where $f(b) \equiv b^T b$, $g(b) \equiv Ab - y$, and $A \in \mathbb{R}^{m \times n}$ with $m < n$.

34 Optimality Conditions: Equality Constrained Case. Introducing Lagrange multipliers gives $L(b, \lambda) \equiv b^T b + \lambda^T (Ab - y)$, where $b \in \mathbb{R}^n$ and $\lambda \in \mathbb{R}^m$. Hence $\nabla L(b, \lambda) = 0$ implies $\begin{bmatrix} \nabla f(b) + J_g(b)^T \lambda \\ g(b) \end{bmatrix} = \begin{bmatrix} 2b + A^T \lambda \\ Ab - y \end{bmatrix} = 0 \in \mathbb{R}^{n+m}$.

35 Optimality Conditions: Equality Constrained Case. Hence, we obtain the $(n + m) \times (n + m)$ square linear system $\begin{bmatrix} 2I & A^T \\ A & 0 \end{bmatrix} \begin{bmatrix} b \\ \lambda \end{bmatrix} = \begin{bmatrix} 0 \\ y \end{bmatrix}$, which we can solve for $\begin{bmatrix} b \\ \lambda \end{bmatrix} \in \mathbb{R}^{n+m}$.

36 Optimality Conditions: Equality Constrained Case. We have $b = -\frac{1}{2} A^T \lambda$ from the first block row. Substituting this into $Ab = y$ (the second block row) yields $\lambda = -2 (A A^T)^{-1} y$, and hence $b = -\frac{1}{2} A^T \lambda = A^T (A A^T)^{-1} y$, which was the solution we introduced (but didn't derive) in I.3.
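As a numerical sanity check (a sketch, not from the slides), we can assemble the block system from the previous slide and compare with the closed-form minimum-norm solution:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 3, 5                        # underdetermined: m < n
A = rng.standard_normal((m, n))
y = rng.standard_normal(m)

# Assemble and solve the (n+m) x (n+m) KKT block system.
K = np.block([[2 * np.eye(n), A.T],
              [A, np.zeros((m, m))]])
rhs = np.concatenate([np.zeros(n), y])
b = np.linalg.solve(K, rhs)[:n]    # first n entries are b; the rest are lambda

# Compare with b = A^T (A A^T)^{-1} y, and check feasibility A b = y.
b_closed = A.T @ np.linalg.solve(A @ A.T, y)
print(np.allclose(b, b_closed), np.allclose(A @ b, y))  # True True
```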

37 Optimality Conditions: Inequality Constrained Case Similar Lagrange multiplier methods can be developed for the more difficult case of inequality constrained optimization

38 Steepest Descent. We first consider the simpler case of unconstrained optimization (as opposed to constrained optimization). Perhaps the simplest method for unconstrained optimization is steepest descent. Key idea: the negative gradient $-\nabla f(x)$ points in the steepest downhill direction for $f$ at $x$. Hence an iterative method for minimizing $f$ is obtained by following $-\nabla f(x_k)$ at each step. Question: How far should we go in the direction of $-\nabla f(x_k)$?

39 Steepest Descent. We can try to find the best step size via a subsidiary (and easier!) optimization problem. For a direction $s \in \mathbb{R}^n$, let $\phi : \mathbb{R} \to \mathbb{R}$ be given by $\phi(\eta) = f(x + \eta s)$. Then minimizing $f$ along $s$ corresponds to minimizing the one-dimensional function $\phi$. This process of minimizing $f$ along a line is called a line search. (Footnote 6: The line search can itself be performed via Newton's method, as described for $f : \mathbb{R}^n \to \mathbb{R}$ shortly, or via a built-in function.)

40 Steepest Descent. Putting these pieces together leads to the steepest descent method:
1: choose initial guess $x_0$
2: for $k = 0, 1, 2, \ldots$ do
3: $s_k = -\nabla f(x_k)$
4: choose $\eta_k$ to minimize $f(x_k + \eta_k s_k)$
5: $x_{k+1} = x_k + \eta_k s_k$
6: end for
However, steepest descent often converges very slowly: the convergence rate is linear, and the scaling factor can be arbitrarily close to 1. (Steepest descent will be covered on Assignment 5.)
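A minimal Python sketch of this algorithm (not from the slides; here the line search uses SciPy's built-in scalar minimizer rather than Newton's method):

```python
import numpy as np
from scipy.optimize import minimize_scalar

def steepest_descent(f, grad, x0, tol=1e-8, max_iter=1000):
    # Steepest descent with a numerically exact line search at each step.
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        s = -grad(x)                        # steepest descent direction
        if np.linalg.norm(s) < tol:
            break
        phi = lambda eta: f(x + eta * s)    # 1-D function along the search line
        eta = minimize_scalar(phi).x        # line search: minimize phi(eta)
        x = x + eta * s
    return x, k

# Example: a poorly scaled quadratic, where convergence is linear and slow.
f = lambda x: x[0]**2 + 10 * x[1]**2
grad = lambda x: np.array([2 * x[0], 20 * x[1]])
print(steepest_descent(f, grad, [1.0, 1.0]))  # converges to ~[0, 0]
```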

41 Newton's Method. We can get faster convergence by using more information about $f$. Note that $\nabla f(x) = 0$ is a system of nonlinear equations, hence we can solve it with quadratic convergence via Newton's method. (Footnote 7: Note that in its simplest form this algorithm searches for stationary points, not necessarily minima.) The Jacobian matrix of $\nabla f(x)$ is $H_f(x)$, and hence Newton's method for unconstrained optimization is:
1: choose initial guess $x_0$
2: for $k = 0, 1, 2, \ldots$ do
3: solve $H_f(x_k) s_k = -\nabla f(x_k)$
4: $x_{k+1} = x_k + s_k$
5: end for
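A minimal Python sketch of this iteration (not from the slides), reusing the gradient and Hessian from Heath Example 6.5:

```python
import numpy as np

def newton_optimize(grad, hess, x0, tol=1e-10, max_iter=50):
    # Newton's method for unconstrained optimization: solve grad f = 0.
    # Note: it converges to stationary points, not necessarily minima.
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        s = np.linalg.solve(hess(x), -g)   # Newton step: H_f(x_k) s_k = -grad f(x_k)
        x = x + s
    return x, k

grad = lambda x: np.array([6 * x[0]**2 + 6 * x[0] + 12 * x[1],
                           12 * x[0] + 6 * x[1] - 6])
hess = lambda x: np.array([[12 * x[0] + 6, 12.0], [12.0, 6.0]])
# Different starting points can reach different critical points; from this
# start the iteration approaches the local minimum [2, -3].
print(newton_optimize(grad, hess, [3.0, -2.0]))
```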

42 Newton's Method. We can also interpret Newton's method as seeking a stationary point based on a sequence of local quadratic approximations. Recall that for small $\delta$, $f(x + \delta) \approx f(x) + \nabla f(x)^T \delta + \frac{1}{2} \delta^T H_f(x) \delta \equiv q(\delta)$, where $q(\delta)$ is quadratic in $\delta$ (for a fixed $x$). We find a stationary point of $q$ in the usual way: $\nabla q(\delta) = \nabla f(x) + H_f(x) \delta = 0$. (Footnote 8: Recall I.4 for differentiation of $\delta^T H_f(x) \delta$.) This leads to $H_f(x) \delta = -\nabla f(x)$, as on the previous slide.
