Sufficient Conditions for Finite-variable Constrained Minimization

Slide 1: Lecture 4. It is a small detour, but it is important to understand this before we move on to the calculus of variations. Sufficient Conditions for Finite-variable Constrained Minimization. ME 256, Indian Institute of Science: Calculus of Variations and Structural Optimization. G. K. Ananthasuresh, Professor, Mechanical Engineering, Indian Institute of Science, Bangalore. suresh@mecheng.iisc.ernet.in

Slide 2: Outline of the lecture.
- Feasible perturbations
- Second-order term in the Taylor series of an n-variable function
- Sufficient conditions for constrained minimization
- Bordered Hessian
What we will learn:
- How to interpret feasible perturbations around a constrained local minimum
- Why positive definiteness of the full Hessian is an overkill
- How to check positive definiteness of the Hessian over the feasible perturbations
- The significance of the bordered Hessian

Slide 3: Recap of KKT conditions. The first of the KKT conditions says that the gradient of the objective function is a linear combination of the gradients of the equality and active inequality constraints. Lagrange multipliers of inequality constraints cannot be negative; those of equality constraints can have any sign. The complementarity conditions (the third line) help decide whether a constraint is active or not.
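Written out in a standard notation (the symbols below are assumed, not taken from the slides), for minimizing f(x) subject to h_i(x) = 0, i = 1, ..., m, and g_j(x) <= 0, j = 1, ..., p, these conditions read:

\nabla f(\mathbf{x}^*) + \sum_{i=1}^{m} \lambda_i \nabla h_i(\mathbf{x}^*) + \sum_{j=1}^{p} \mu_j \nabla g_j(\mathbf{x}^*) = \mathbf{0}
\mu_j \ge 0, \qquad j = 1, \dots, p
\mu_j \, g_j(\mathbf{x}^*) = 0, \qquad j = 1, \dots, p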

Slide 4: What if we maximize? Notice the change in the sign of the Lagrange multipliers: now they need to be non-positive; that is, they cannot be positive.

Slide 5: What if we flip the inequality sign? Again the sign of the Lagrange multipliers changes: they need to be non-positive; that is, they cannot be positive.

Slide 6: What if we maximize and flip the inequality sign? Notice the sign of the Lagrange multipliers: now they need to be non-negative again. The two sign changes annul each other's effect.
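To summarize Slides 3 through 6, writing mu_j for the multiplier of an inequality constraint g_j (symbols assumed):

Problem form              Sign of inequality multipliers
minimize f, g_j(x) <= 0   mu_j >= 0
maximize f, g_j(x) <= 0   mu_j <= 0
minimize f, g_j(x) >= 0   mu_j <= 0
maximize f, g_j(x) >= 0   mu_j >= 0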

Slide 7: Feasible perturbations; constrained subspace. For sufficient conditions, we need to consider only feasible perturbations. Consider the m equality constraints, plus the active inequality constraints, such that their gradients are linearly independent. Together, h_1 = 0, h_2 = 0, ..., h_m = 0 represent a hypersurface of dimension (n - m). We need to verify sufficiency by taking perturbations only in S, which is called the constrained subspace.
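In symbols, writing y for a perturbation of x (this notation is assumed here), the constrained subspace is

S = \{ \mathbf{y} \in \mathbb{R}^n : \nabla h_i(\mathbf{x}^*)^{T} \mathbf{y} = 0 \text{ for every equality and active inequality constraint} \}.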

Slide 8: First-order term of f(x) in the constrained subspace. Recall from Slide 13 in Lecture 5 the partition of x into m dependent variables s and (n - m) independent variables d. After eliminating the s variables, we can think of f as some other function z that depends only on d. So, we can write in shorthand notation:

\frac{\partial z}{\partial \mathbf{d}} = \frac{\partial f}{\partial \mathbf{d}} + \left(\frac{\partial \mathbf{s}}{\partial \mathbf{d}}\right)^{T} \frac{\partial f}{\partial \mathbf{s}},

where \frac{\partial \mathbf{s}}{\partial \mathbf{d}} = -\left(\frac{\partial \mathbf{h}}{\partial \mathbf{s}}\right)^{-1} \frac{\partial \mathbf{h}}{\partial \mathbf{d}}, obtained by differentiating h(s(d), d) = 0.

Slide 9: Second-order derivative (Hessian) of f in the constrained space. By differentiating the first-order term

\frac{\partial z}{\partial \mathbf{d}} = \frac{\partial f}{\partial \mathbf{d}} + \left(\frac{\partial \mathbf{s}}{\partial \mathbf{d}}\right)^{T} \frac{\partial f}{\partial \mathbf{s}}

once more with respect to d, we get the second-order term.
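Carrying out that differentiation gives, as a sketch in the elimination notation above (the grouping of the terms is ours):

\frac{\partial^2 z}{\partial \mathbf{d}^2} = \frac{\partial^2 f}{\partial \mathbf{d}^2} + \left(\frac{\partial^2 f}{\partial \mathbf{s}\,\partial \mathbf{d}}\right)^{T}\frac{\partial \mathbf{s}}{\partial \mathbf{d}} + \left(\frac{\partial \mathbf{s}}{\partial \mathbf{d}}\right)^{T}\frac{\partial^2 f}{\partial \mathbf{s}\,\partial \mathbf{d}} + \left(\frac{\partial \mathbf{s}}{\partial \mathbf{d}}\right)^{T}\frac{\partial^2 f}{\partial \mathbf{s}^2}\,\frac{\partial \mathbf{s}}{\partial \mathbf{d}} + \sum_{i=1}^{m}\frac{\partial f}{\partial s_i}\,\frac{\partial^2 s_i}{\partial \mathbf{d}^2}.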

Slide 10: Hessian of f in the constrained space (contd.). In the above expression, we know how to compute all quantities except the second derivatives \partial^2 s_i / \partial \mathbf{d}^2. These we will compute in the same way as \partial \mathbf{s}/\partial \mathbf{d}, i.e., by using the constraint equations h(s(d), d) = 0.

Slide 11: Hessian of the constraints in the constrained space. Feasibility requires that the second-order perturbation of the m constraints also be zero. Therefore, differentiating h(s(d), d) = 0 twice with respect to d gives the quantity we need.
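For a single constraint h_k, this second-order feasibility condition reads (again a sketch in the same notation):

\mathbf{0} = \frac{\partial^2 h_k}{\partial \mathbf{d}^2} + \left(\frac{\partial^2 h_k}{\partial \mathbf{s}\,\partial \mathbf{d}}\right)^{T}\frac{\partial \mathbf{s}}{\partial \mathbf{d}} + \left(\frac{\partial \mathbf{s}}{\partial \mathbf{d}}\right)^{T}\frac{\partial^2 h_k}{\partial \mathbf{s}\,\partial \mathbf{d}} + \left(\frac{\partial \mathbf{s}}{\partial \mathbf{d}}\right)^{T}\frac{\partial^2 h_k}{\partial \mathbf{s}^2}\,\frac{\partial \mathbf{s}}{\partial \mathbf{d}} + \sum_{i=1}^{m}\frac{\partial h_k}{\partial s_i}\,\frac{\partial^2 s_i}{\partial \mathbf{d}^2},

which can be solved for the \partial^2 s_i / \partial \mathbf{d}^2 terms because \partial \mathbf{h}/\partial \mathbf{s} is invertible.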

Slide 12: From Slides 10 and 11. Recall from Slide 15 in Lecture 5 the definition of the Lagrange multipliers, \boldsymbol{\lambda} = -\left(\partial \mathbf{h}/\partial \mathbf{s}\right)^{-T}\left(\partial f/\partial \mathbf{s}\right). Substituting the result of Slide 11 into the expression of Slide 10 and using this definition collects the second derivatives of f and h together.

Slide 13: And now the complete Hessian in the constrained space. The long expression of the last slide reduces to

\frac{\partial^2 z}{\partial \mathbf{d}^2} = \frac{\partial^2 L}{\partial \mathbf{d}^2} + \left(\frac{\partial^2 L}{\partial \mathbf{s}\,\partial \mathbf{d}}\right)^{T}\frac{\partial \mathbf{s}}{\partial \mathbf{d}} + \left(\frac{\partial \mathbf{s}}{\partial \mathbf{d}}\right)^{T}\frac{\partial^2 L}{\partial \mathbf{s}\,\partial \mathbf{d}} + \left(\frac{\partial \mathbf{s}}{\partial \mathbf{d}}\right)^{T}\frac{\partial^2 L}{\partial \mathbf{s}^2}\,\frac{\partial \mathbf{s}}{\partial \mathbf{d}},

where L = f + \boldsymbol{\lambda}^{T}\mathbf{h}. It reduces to this because of the way we had defined the Lagrangian L. Positive definiteness of this matrix is the sufficient condition for the constrained minimum. Note that the perturbations are only in the independent d variables.

Slide 14: Sufficient condition for a constrained minimum. Therefore, we get: a point satisfying the KKT conditions is a constrained local minimum if the constrained-space Hessian \partial^2 z / \partial \mathbf{d}^2 above is positive definite.

Slide 15: Sufficient condition for a constrained minimum. Only feasible perturbations need be considered:

\mathbf{y}^{T}\,\nabla^2 L(\mathbf{x}^*)\,\mathbf{y} > 0 \quad \text{for all } \mathbf{y} \ne \mathbf{0} \text{ with } \nabla h_i(\mathbf{x}^*)^{T}\mathbf{y} = 0,

where \nabla^2 L is the Hessian of the Lagrangian and the orthogonality conditions run over the equality and active inequality constraints.

Slide 16: How do we check this easily? Note that this is a less stringent sufficient condition than requiring positive definiteness of the Hessian at the minimum point: we want positive definiteness only in the subspace formed by feasible perturbations in the neighborhood of the minimum. So, requiring positive definiteness of the full Hessian is an overkill! But how do we check this restricted positive definiteness? See the next slide.

Slide 17: Bordered Hessian. The above condition is satisfied if the last (n - m) leading principal minors of the bordered Hessian H_b (defined below) have the sign (-1)^m. The bordered Hessian is simply the Hessian of the Lagrangian bordered by the gradients of the equality and active inequality constraints:

H_b = \begin{bmatrix} \mathbf{0}_{m \times m} & \partial \mathbf{h}/\partial \mathbf{x} \\ \left(\partial \mathbf{h}/\partial \mathbf{x}\right)^{T} & \nabla^2 L \end{bmatrix},

where \partial \mathbf{h}/\partial \mathbf{x} is the m-by-n Jacobian whose rows are the constraint gradients.

Slide 18: Bordered Hessian check. The minors are checked starting from the largest: the last principal minor is the determinant of H_b itself; the last-but-one principal minor is obtained by deleting the last row and column; the last-but-two by deleting the last two rows and columns; and so on, until (n - m) minors have been checked.
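A minimal numerical sketch of this check (Python/NumPy; the function name and interface are our own, not from the lecture):

import numpy as np

def bordered_hessian_check(grad_h, hess_L):
    """Check the bordered-Hessian sufficient condition for a minimum.

    grad_h : (m, n) Jacobian of the equality/active inequality constraints
    hess_L : (n, n) Hessian of the Lagrangian at the candidate point
    Returns True if the last (n - m) leading principal minors of the
    bordered Hessian all have the sign (-1)**m.
    """
    m, n = grad_h.shape
    Hb = np.block([[np.zeros((m, m)), grad_h],
                   [grad_h.T,         hess_L]])
    sign = (-1) ** m
    # The last (n - m) leading principal minors have orders 2m+1, ..., m+n.
    for k in range(2 * m + 1, m + n + 1):
        minor = np.linalg.det(Hb[:k, :k])
        if np.sign(minor) != sign:
            return False
    return True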

Slide 19: Example. A stationary point with Lagrange multiplier λ = -1 is a solution of the necessary conditions. Let us check the sufficiency.

Slide 20: Example (contd.). The eigenvalues of H are -1.0000, 0.5858, and 3.4142, so H is not positive definite! So, consider the bordered Hessian:

H_B = \begin{bmatrix} 0 & 1 & 0 & 0 \\ 1 & \lambda & 0 & 0 \\ 0 & 0 & 2+\lambda & 1 \\ 0 & 0 & 1 & 4+\lambda \end{bmatrix} = \begin{bmatrix} 0 & 1 & 0 & 0 \\ 1 & -1 & 0 & 0 \\ 0 & 0 & 1 & 1 \\ 0 & 0 & 1 & 3 \end{bmatrix}

Here n - m = 3 - 1 = 2, so the last two principal minors should have the sign of (-1)^m = -1; that is, they should be negative. Last principal minor = -2: it is fine. Last-but-one principal minor = -1: it is also fine. So, we have a minimum.
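Verifying the numbers of this example (λ = -1) with a few lines of NumPy:

import numpy as np

# Constraint gradient and Lagrangian Hessian from the example (lambda = -1)
grad_h = np.array([[1.0, 0.0, 0.0]])
hess_L = np.array([[-1.0, 0.0, 0.0],
                   [ 0.0, 1.0, 1.0],
                   [ 0.0, 1.0, 3.0]])

print(np.linalg.eigvalsh(hess_L))   # approx [-1.0, 0.5858, 3.4142]
Hb = np.block([[np.zeros((1, 1)), grad_h], [grad_h.T, hess_L]])
print(np.linalg.det(Hb))            # -2.0  (last principal minor)
print(np.linalg.det(Hb[:3, :3]))    # -1.0  (last-but-one principal minor)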

Slide 21: The concept of optimization search algorithms. Optimization search algorithms work the way you would walk blindfolded in rough terrain! They are iterative: they move from one point to another and eventually converge to a minimum at which the KKT conditions are satisfied. They need an initial guess. Various algorithms differ in the way they choose a search direction. Once the search direction is chosen, the algorithm needs a one-variable search to decide how much to move in that direction; this is called line search. The update rule is

\mathbf{x}_{k+1} = \mathbf{x}_k + \alpha_k \mathbf{S}_k,

with k the iteration number, x_{k+1} the updated variable, α_k the line-search parameter, and S_k the search direction.
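As a concrete illustration of these ingredients, here is a minimal sketch of steepest descent with a backtracking line search (Python/NumPy; the test function, tolerances, and constants are our own choices, not from the lecture):

import numpy as np

def gradient_descent(f, grad, x0, tol=1e-8, max_iter=1000):
    """Minimize f by steepest descent with backtracking (Armijo) line search."""
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # stationarity reached
            break
        S = -g                         # search direction: steepest descent
        alpha = 1.0                    # line-search parameter
        while f(x + alpha * S) > f(x) + 1e-4 * alpha * (g @ S):
            alpha *= 0.5               # backtrack until sufficient decrease
        x = x + alpha * S              # update: x_{k+1} = x_k + alpha_k * S_k
    return x

# Example: minimize f(x) = (x1 - 1)^2 + 10*(x2 + 2)^2
f = lambda x: (x[0] - 1)**2 + 10 * (x[1] + 2)**2
grad = lambda x: np.array([2 * (x[0] - 1), 20 * (x[1] + 2)])
print(gradient_descent(f, grad, [0.0, 0.0]))   # approx [1, -2]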

Slide 22: The end note. Sufficient conditions for constrained finite-variable optimization:
- Recap of KKT conditions
- Feasible perturbations
- Second-order term in the Taylor series expansion of an n-variable function with constraints
- Constrained subspace; sufficient conditions for constrained minimization
- Positive definiteness of the Hessian within the constrained subspace
- Constrained positive definiteness using the bordered Hessian
- The concept of search algorithms
Thanks.