Introduction to Optimization Techniques. Nonlinear Optimization in Function Spaces


Gateaux and Fréchet Differentials

Let $X$ be a vector space, $Y$ a normed space, and $T$ a transformation (possibly nonlinear) mapping a domain $D \subset X$ into a range $R \subset Y$.

Definition. Let $x \in D \subset X$ and let $h$ be arbitrary in $X$. If the limit
$$\delta T(x; h) = \lim_{\alpha \to 0} \frac{1}{\alpha}\left[\,T(x + \alpha h) - T(x)\,\right]$$
exists, it is called the Gateaux differential of $T$ at $x$ with increment $h$. If the limit exists for each $h \in X$, the transformation $T$ is said to be Gateaux differentiable at $x$.

The Gateaux differential generalizes the concept of directional derivative familiar in finite-dimensional space. The Fréchet differential, defined next, is a more satisfactory definition.
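In finite dimensions the defining difference quotient can be evaluated directly. A minimal numeric sketch (the transformation `T` and the test point are illustrative choices, not from the slides):

```python
import numpy as np

def gateaux(T, x, h, alpha=1e-6):
    """Approximate the Gateaux differential dT(x; h) by a central difference."""
    return (T(x + alpha * h) - T(x - alpha * h)) / (2 * alpha)

# Illustrative transformation T: R^2 -> R^2
T = lambda x: np.array([x[0]**2 + x[1], np.sin(x[0]) * x[1]])

x = np.array([1.0, 2.0])
h = np.array([0.5, -1.0])
print(gateaux(T, x, h))  # approximately J_T(x) @ h
```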

Gateaux and Fréchet Differentials

Definition. Let $T$ be a transformation defined on an open domain $D$ in a normed space $X$ and having range in a normed space $Y$. If for fixed $x \in D$ and each $h \in X$ there exists $\delta T(x; h) \in Y$ which is linear and continuous with respect to $h$ such that
$$\lim_{\|h\| \to 0} \frac{\|T(x + h) - T(x) - \delta T(x; h)\|}{\|h\|} = 0,$$
then $T$ is said to be Fréchet differentiable at $x$, and $\delta T(x; h)$ is said to be the Fréchet differential of $T$ at $x$ with increment $h$.
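The defining limit can be observed numerically: for a differentiable map the remainder $\|T(x+h) - T(x) - \delta T(x;h)\| / \|h\|$ shrinks with $\|h\|$. A small sketch (the map and its hand-coded Jacobian are illustrative):

```python
import numpy as np

T = lambda x: np.array([x[0]**2 + x[1], np.sin(x[0]) * x[1]])

def jacobian(x):
    # Hand-coded Jacobian of T at x: the candidate Fréchet derivative.
    return np.array([[2 * x[0], 1.0],
                     [np.cos(x[0]) * x[1], np.sin(x[0])]])

x = np.array([1.0, 2.0])
rng = np.random.default_rng(0)
for scale in [1e-1, 1e-2, 1e-3, 1e-4]:
    h = scale * rng.standard_normal(2)
    ratio = np.linalg.norm(T(x + h) - T(x) - jacobian(x) @ h) / np.linalg.norm(h)
    print(f"||h|| ~ {scale:.0e}: remainder ratio = {ratio:.2e}")  # tends to 0
```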

Gateaux and Fréchet Differentials

Proposition 1. If the transformation $T$ has a Fréchet differential, it is unique.

Proposition 2. If the Fréchet differential of $T$ exists at $x$, then the Gateaux differential exists at $x$ and they are equal.

Proposition 3. If the transformation $T$ defined on an open set $D$ in $X$ has a Fréchet differential at $x$, then $T$ is continuous at $x$.

Local Theory of Constrained Optimization: Lagrange Multiplier Theorems, Inverse Function Theorem

Definition. Let $T$ be a continuously Fréchet differentiable transformation from an open set $D$ in a Banach space $X$ into a Banach space $Y$. If $x_0 \in D$ is such that $T'(x_0)$ maps $X$ onto $Y$, the point $x_0$ is said to be a regular point of the transformation $T$.

Ex. If $T$ is a mapping from $E^n$ into $E^m$, a point $x_0$ is a regular point if the Jacobian matrix of $T$ at $x_0$ has rank $m$.
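In the finite-dimensional example the regularity test is just a rank computation on the Jacobian. A minimal sketch, assuming an illustrative map $T: E^3 \to E^2$:

```python
import numpy as np

# Illustrative T: R^3 -> R^2, T(x) = (x0 + x1*x2, x1 - x2); its Jacobian:
def jacobian(x):
    return np.array([[1.0, x[2], x[1]],
                     [0.0, 1.0, -1.0]])

x0 = np.array([1.0, 2.0, 3.0])
J = jacobian(x0)
m = J.shape[0]
print("regular point:", np.linalg.matrix_rank(J) == m)  # True: rank 2 = m
```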

Local Theory of Constrained Optimization

Theorem (Generalized Inverse Function Theorem). Let $x_0$ be a regular point of a transformation $T$ mapping the Banach space $X$ into the Banach space $Y$. Then there is a neighborhood $N(y_0)$ of the point $y_0 = T(x_0)$ (i.e., a sphere centered at $y_0$) and a constant $K$ such that the equation $T(x) = y$ has a solution $x$ for every $y \in N(y_0)$, and the solution satisfies $\|x - x_0\| \le K\,\|y - y_0\|$.
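One way to see the conclusion concretely in $E^n$ is to solve $T(x) = y$ for a target $y$ near $y_0$ by a Newton iteration started at the regular point $x_0$. A sketch under illustrative choices of $T$, $x_0$, and $y$:

```python
import numpy as np

T = lambda x: np.array([x[0] + np.sin(x[1]), x[1] + x[0]**2])
jacobian = lambda x: np.array([[1.0, np.cos(x[1])],
                               [2 * x[0], 1.0]])

x0 = np.array([0.5, 0.5])   # regular point: the Jacobian here is nonsingular
y0 = T(x0)
y = y0 + np.array([0.01, -0.02])   # a target near y0

x = x0.copy()
for _ in range(20):                # Newton iteration for T(x) = y
    x = x + np.linalg.solve(jacobian(x), y - T(x))

print(np.allclose(T(x), y))                             # a solution exists
print(np.linalg.norm(x - x0), np.linalg.norm(y - y0))   # ||x - x0|| = O(||y - y0||)
```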

Local Theory of Constrained Optimization: Equality Constraints

Necessary conditions for an extremum of $f$ subject to $G(x) = 0$, where $f$ is a real-valued functional on a Banach space $X$ and $G$ is a mapping from $X$ into a Banach space $Z$.

Lemma. Let $f$ achieve a local extremum subject to $G(x) = 0$ at the point $x_0$, and assume that $f$ and $G$ are continuously Fréchet differentiable in an open set containing $x_0$ and that $x_0$ is a regular point of $G$. Then $f'(x_0)h = 0$ for all $h$ satisfying $G'(x_0)h = 0$.
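In $E^n$ the Lemma says that $\nabla f(x_0)$ annihilates the nullspace of the constraint Jacobian. A sketch for an illustrative problem: minimize $f(x) = x_1^2 + x_2^2 + x_3^2$ subject to $G(x) = x_1 + x_2 + x_3 - 1 = 0$, whose minimizer is $x_0 = (1/3, 1/3, 1/3)$:

```python
import numpy as np
from scipy.linalg import null_space

grad_f = lambda x: 2 * x                 # f(x) = ||x||^2
G_jac = np.array([[1.0, 1.0, 1.0]])      # Jacobian of G(x) = x1 + x2 + x3 - 1

x0 = np.full(3, 1.0 / 3.0)               # constrained minimizer
N = null_space(G_jac)                    # basis for {h : G'(x0) h = 0}
print(N.T @ grad_f(x0))                  # ~ 0: f'(x0) h = 0 on the tangent space
```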

Local Theory of Constrained Optimization

Proof. To be specific, assume that the local extremum is a local minimum. Consider the transformation $T: X \to R \times Z$ defined by $T(x) = (f(x), G(x))$. If there were an $h$ such that $G'(x_0)h = 0$ but $f'(x_0)h \ne 0$, then $T'(x_0) = (f'(x_0), G'(x_0)): X \to R \times Z$ would be onto $R \times Z$, since $G'(x_0)$ is onto $Z$. By the generalized inverse function theorem, it would then follow that for every sufficiently small $\varepsilon > 0$ there exists a vector $x$ with $\|x - x_0\|$ small such that $T(x) = (f(x_0) - \varepsilon,\, 0)$, i.e., $G(x) = 0$ and $f(x) < f(x_0)$, contradicting the assumption that $x_0$ is a local minimum.

Tangent Space

Tangent space of the constraint surface $M = \{x : G(x) = 0\}$ near $x_0$. The tangent space at $x_0$ is the nullspace
$$N(G'(x_0)) = \{h \in X : G'(x_0)h = 0\}.$$
By the Lemma, $f$ is stationary at $x_0$ with respect to variations in the tangent plane.

[Figure: level sets $f = \text{const}$, the constraint surface $g(x) = 0$, the tangent space at $x_0$, and the gradient $\nabla f(x_0)$ orthogonal to the tangent space.]

Lagrange Multiplier Theorem

Definition. Let $X$ be a normed linear vector space. The space of all bounded linear functionals on $X$ is called the normed dual of $X$ and is denoted $X^*$. The norm of an element $f \in X^*$ is
$$\|f\| = \sup_{\|x\| \le 1} |f(x)|.$$
The value of a linear functional $x^* \in X^*$ at the point $x \in X$ is denoted by $x^*(x)$ or by the more symmetric notation $\langle x, x^* \rangle$.

Theorem. $X^*$ is a Banach space.

Lagrange Multiplier Theorem

Theorem (Lagrange Multiplier). If the continuously Fréchet differentiable functional $f$ has a local extremum under the constraint $G(x) = 0$ at the regular point $x_0$, then there exists an element $z_0^* \in Z^*$ such that the Lagrangian functional
$$L(x, z^*) = f(x) + \langle G(x), z^* \rangle, \quad \text{or} \quad L(x, z^*) = f(x) + z^* G(x),$$
is stationary at $x_0$, i.e., $f'(x_0) + z_0^* G'(x_0) = 0$.
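In $E^n$ the theorem reduces to the classical multiplier rule $\nabla f(x_0) + z_0^\top \nabla G(x_0) = 0$. A sketch for the illustrative problem: minimize $x_1^2 + x_2^2$ subject to $x_1 + x_2 - 1 = 0$ (minimizer $(1/2, 1/2)$, multiplier $z_0 = -1$), solving the stationarity equations together with the constraint:

```python
from scipy.optimize import fsolve

# Stationarity of L(x, z) = f(x) + z * G(x), plus feasibility G(x) = 0:
#   2*x1 + z = 0,   2*x2 + z = 0,   x1 + x2 - 1 = 0
def kkt_system(v):
    x1, x2, z = v
    return [2 * x1 + z, 2 * x2 + z, x1 + x2 - 1]

x1, x2, z = fsolve(kkt_system, [0.0, 0.0, 0.0])
print(x1, x2, z)   # 0.5 0.5 -1.0
```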

Lagrange Multiplier Theorem

Proof. From the Lemma it is clear that $f'(x_0)$ is orthogonal to the nullspace of $G'(x_0)$. Since, however, the range of $G'(x_0)$ is closed, it follows that $f'(x_0) \in R[G'(x_0)^*]$.

Theorem. Let $X$ and $Y$ be normed spaces and let $A \in B(X, Y)$. Then
$$[R(A)]^\perp = N(A^*), \qquad {}^\perp[R(A^*)] = N(A),$$
$$\overline{R(A)} = {}^\perp N(A^*), \qquad \overline{R(A^*)} \subset [N(A)]^\perp.$$
In particular, when $R(A)$ is closed, $R(A^*) = [N(A)]^\perp$, which is the fact used above.

Lagrange Multiplier Theorem

Hence there is a $z_0^* \in Z^*$ such that $f'(x_0) = -G'(x_0)^* z_0^*$, or $f'(x_0) + z_0^* G'(x_0) = 0$.

When $x_0$ is not regular:

Corollary. Assume all the hypotheses of the Theorem except that $G'(x_0)$, while having closed range, is perhaps not onto. Then there exists a nonzero element $(r_0, z_0^*) \in R \times Z^*$ such that the functional $r_0 f(x) + z_0^* G(x)$ is stationary at $x_0$.

Lagrange Multiplier Theorem

Ex. Suppose $G$ consists of two functionals $g_1, g_2$. For optimality the gradient of $f$ must lie in the plane generated by $\nabla g_1$ and $\nabla g_2$; hence
$$f'(x_0) + z_1 \nabla g_1(x_0) + z_2 \nabla g_2(x_0) = 0.$$

[Figure: the gradients $\nabla f$, $\nabla g_1$, $\nabla g_2$ at $x_0$, with $\nabla f$ lying in the plane spanned by $\nabla g_1$ and $\nabla g_2$.]
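Numerically, "the gradient lies in the plane generated by $\nabla g_1$ and $\nabla g_2$" can be tested by solving a small least-squares problem for the multipliers. A sketch with illustrative gradient vectors at a hypothetical optimum:

```python
import numpy as np

# Illustrative gradients at a candidate optimum x0 in R^3.
g1 = np.array([1.0, 0.0, 1.0])
g2 = np.array([0.0, 1.0, -1.0])
grad_f = -2.0 * g1 + 0.5 * g2            # constructed to lie in span{g1, g2}

A = np.column_stack([g1, g2])
z, *_ = np.linalg.lstsq(A, -grad_f, rcond=None)   # solve grad_f + A z = 0
print(z)                                  # multipliers: [2.0, -0.5]
print(np.allclose(grad_f + A @ z, 0))     # True: stationarity holds
```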

Inequality Constraints (Karush-Kuhn-Tucker Theorem)

Derivation of the local necessary conditions for the problem

minimize $f(x)$ subject to $G(x) \le 0$,

where $f: X \to R$, $G: X \to Z$, and $Z$ is a normed space with positive cone $P$.

Inequality Constraints (Karush-Kuhn-Tucker Theorem)

Ex. Consider a problem in two dimensions with three scalar inequalities $g_i(x) \le 0$, $i = 1, 2, 3$, as constraints.

[Figure, panels (a)-(d): the feasible region bounded by the curves $g_1(x) = 0$, $g_2(x) = 0$, $g_3(x) = 0$; (b) a minimum $x_0$ interior to the region; (c) a minimum on the boundary arc $g_1(x) = 0$; (d) a minimum at a corner where $g_1$ and $g_2$ are both active, with $\nabla f$, $\nabla g_1$, $\nabla g_2$ drawn.]

Inequality Constraints (Karush-Kuhn-Tucker Theorem)

(b) $x_0$ in the interior of the region: $\nabla f(x_0) = 0$.

(c) The minimum occurs on the boundary $g_1(x) = 0$: $\nabla f(x_0)$ must be orthogonal to the boundary and point into the region, i.e., $\nabla f(x_0) = -\mu_1 \nabla g_1(x_0)$ for some $\mu_1 \ge 0$.

(d) The minimum point $x_0$ satisfies both $g_1(x_0) = 0$ and $g_2(x_0) = 0$:
$$\nabla f(x_0) + \mu_1 \nabla g_1(x_0) + \mu_2 \nabla g_2(x_0) = 0, \qquad \mu_1, \mu_2 \ge 0.$$

Inequality Constraints (Karush-Kuhn-Tucker Theorem)

General statement:
$$\nabla f(x_0) + \sum_i \mu_i \nabla g_i(x_0) = 0, \qquad \mu_i \ge 0, \quad i = 1, 2, 3,$$
and $\mu_i g_i(x_0) = 0$ for each $i$; that is, $\mu_i = 0$ if $g_i(x_0) < 0$ (complementary slackness).
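A sketch of these conditions on an illustrative problem: minimize $(x_1-2)^2 + (x_2-1)^2$ subject to $g_1(x) = x_1 + x_2 - 2 \le 0$ and $g_2(x) = -x_1 \le 0$. Only $g_1$ is active at the solution, so complementary slackness forces $\mu_2 = 0$. SciPy's SLSQP is used just to locate $x_0$; the multiplier check is then done by hand:

```python
import numpy as np
from scipy.optimize import minimize

f = lambda x: (x[0] - 2)**2 + (x[1] - 1)**2
# SLSQP expects constraints of the form c(x) >= 0, i.e. -g_i(x) >= 0.
cons = [{'type': 'ineq', 'fun': lambda x: 2 - x[0] - x[1]},  # g1 <= 0
        {'type': 'ineq', 'fun': lambda x: x[0]}]             # g2 <= 0

res = minimize(f, x0=[0.0, 0.0], method='SLSQP', constraints=cons)
x = res.x                                 # ~ [1.5, 0.5]

# Multiplier check: only g1 is active, grad g1 = (1, 1), so mu2 = 0.
grad_f = 2 * (x - np.array([2.0, 1.0]))
mu1 = -grad_f[0]                          # from grad_f + mu1 * grad_g1 = 0
print(x, mu1)                             # mu1 ~ 1 >= 0
print(np.allclose(grad_f + mu1 * np.array([1.0, 1.0]), 0, atol=1e-5))
```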

Inequality Constraints (Karush-Kuhn-Tucker Theorem)

Definition. Let $X$ be a vector space and let $Z$ be a normed space with a positive cone $P$ having nonempty interior. Let $G: X \to Z$ be a mapping which has a Gateaux differential that is linear in its increment. A point $x_0 \in X$ is said to be a regular point of the inequality $G(x) \le 0$ if $G(x_0) \le 0$ and there is an $h \in X$ such that $G(x_0) + \delta G(x_0; h) < 0$.
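For finitely many differentiable constraints the condition asks for a direction $h$ with $g_i(x_0) + \nabla g_i(x_0)^\top h < 0$ for every $i$, which can be decided by a small linear program maximizing the worst-case slack. A sketch with illustrative data (`g0` holds the constraint values at $x_0$, the rows of `A` their gradients):

```python
import numpy as np
from scipy.optimize import linprog

g0 = np.array([0.0, -1.0])              # g_i(x0): first constraint active
A = np.array([[1.0, 1.0],               # rows: gradients of g_i at x0
              [1.0, -1.0]])

# Find h and slack s with g0 + A h + s <= 0 and s as large as possible,
# keeping h in a box so the LP stays bounded.
n = A.shape[1]
c = np.zeros(n + 1); c[-1] = -1.0       # maximize s  <=>  minimize -s
A_ub = np.hstack([A, np.ones((A.shape[0], 1))])   # A h + s <= -g0
res = linprog(c, A_ub=A_ub, b_ub=-g0, bounds=[(-1, 1)] * n + [(0, None)])
print("regular point:", res.x[-1] > 1e-9, "h =", res.x[:n])
```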

Inequality Constraints (Karush-Kuhn-Tucker Theorem)

Theorem. Let $X$ be a vector space and $Z$ a normed space having positive cone $P$. Assume that $P$ contains an interior point. Let $f$ be a Gateaux differentiable real-valued functional on $X$ and $G$ a Gateaux differentiable mapping from $X$ into $Z$. Assume that the Gateaux differentials are linear in their increments. Suppose $x_0$ minimizes $f$ subject to $G(x) \le 0$ and that $x_0$ is a regular point of the inequality $G(x) \le 0$. Then there is a $z_0^* \in Z^*$, $z_0^* \ge 0$, such that the Lagrangian
$$f(x) + \langle G(x), z_0^* \rangle$$
is stationary at $x_0$; furthermore, $\langle G(x_0), z_0^* \rangle = 0$ (complementary slackness).