Variational Methods & Optimal Control


Variational Methods & Optimal Control, lecture 02
Matthew Roughan <matthew.roughan@adelaide.edu.au>
Discipline of Applied Mathematics, School of Mathematical Sciences, University of Adelaide
April 14, 2016

Revision, part ii
- Extrema of functions of multiple variables.
- Taylor's theorem and the chain rule in N-D.
- Hessians and classification of extrema.

Extrema of functions of one variable
Local extrema have $f'(x)=0$; this includes maxima, minima, and stationary points of inflection.
[Figure: plot of $\sin^2(x)$ for $x$ from 0 to 7, showing its maxima and minima.]

Classification of extrema
Local extrema have $f'(x)=0$:
- $f''(x)>0$: local minimum
- $f''(x)<0$: local maximum
- $f''(x)=0$: it might be a stationary point of inflection, depending on higher-order derivatives, e.g. $x^4$.
[Figure: a curve with a local max, a stationary point of inflection, and a local min marked.]
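
As a quick computational aside (not part of the original slides), the second-derivative test can be checked mechanically. The SymPy sketch below classifies critical points of two illustrative functions: $x^3 - 3x$ is a made-up example, while $x^4$ is the degenerate case mentioned above.

```python
# Illustrative sketch (not from the slides): the second-derivative test in 1-D.
import sympy as sp

x = sp.symbols('x', real=True)

def classify_critical_points(f):
    """Solve f'(x) = 0 and apply the second-derivative test to each solution."""
    df, d2f = sp.diff(f, x), sp.diff(f, x, 2)
    for c in sp.solve(sp.Eq(df, 0), x):
        curv = d2f.subs(x, c)
        if curv > 0:
            kind = "local minimum"
        elif curv < 0:
            kind = "local maximum"
        else:
            kind = "f''(c) = 0: inspect higher-order derivatives"
        print(f"f = {f}: critical point x = {c} is a {kind}")

classify_critical_points(x**3 - 3*x)   # hypothetical example: max at x = -1, min at x = +1
classify_critical_points(x**4)         # degenerate case from the slide: f''(0) = 0
```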

Functions of n variables
Let $\Omega$ be a closed region of $\mathbb{R}^n$, i.e. $\Omega \subseteq \mathbb{R}^n$. Let $x=(x_1,x_2,\ldots,x_n) \in \Omega$ and let $f : \Omega \to \mathbb{R}$.
A local minimum of $f(\cdot)$ is a point $x$ such that there exists $\delta>0$ with $f(\hat{x}) \geq f(x)$ for all $\hat{x} \in B(x;\delta)$.
A global minimum of $f(\cdot)$ on $\Omega$ is a point $x$ such that $f(\hat{x}) \geq f(x)$ for all $\hat{x} \in \Omega$.

2D example 1
$f(x_1,x_2) = x_1^2 - x_2^2 + x_1^3$
- local maximum at $(-2/3,\,0)$
- saddle point at $(0,0)$
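
A hedged sketch, assuming SymPy is available, that verifies the stated critical points by solving $\nabla f = 0$ and inspecting the Hessian eigenvalues:

```python
# Sketch verifying the slide's critical points with SymPy (assumes SymPy is installed).
import sympy as sp

x1, x2 = sp.symbols('x1 x2', real=True)
f = x1**2 - x2**2 + x1**3

grad = [sp.diff(f, v) for v in (x1, x2)]
H = sp.hessian(f, (x1, x2))

# Solve grad f = 0, then look at the Hessian eigenvalues at each critical point.
for point in sp.solve(grad, (x1, x2), dict=True):
    eigs = list(H.subs(point).eigenvals().keys())
    print(point, "Hessian eigenvalues:", eigs)
# Expected: at (0, 0) the eigenvalues have mixed signs (saddle point);
# at (-2/3, 0) both eigenvalues are negative (local maximum).
```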

2D example 2
$f(x_1,x_2) = r - \tfrac{1}{2}r^2$, where $r = \sqrt{x_1^2 + x_2^2}$
- global maxima on the curve $r = 1$
- local minimum at $r = 0$

2D example 3
$f(x_1,x_2) = x_2^3 - 3x_1^2 x_2$
- monkey saddle at $(0,0)$
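
The monkey saddle illustrates why a second-order test can fail: at the origin the Hessian is the zero matrix, so it says nothing, yet the point is not an extremum. A small SymPy check (an illustrative sketch, not from the slides):

```python
# Sketch (SymPy): at the origin the monkey saddle's Hessian is the zero matrix,
# so the second-order test is inconclusive; checking values along x1 = 0 shows
# f changes sign, so (0,0) is neither a maximum nor a minimum.
import sympy as sp

x1, x2 = sp.symbols('x1 x2', real=True)
f = x2**3 - 3*x1**2*x2

H0 = sp.hessian(f, (x1, x2)).subs({x1: 0, x2: 0})
print(H0)                                        # Matrix([[0, 0], [0, 0]])
print(f.subs({x1: 0, x2: sp.Rational(1, 10)}),   # 1/1000  (> 0)
      f.subs({x1: 0, x2: -sp.Rational(1, 10)}))  # -1/1000 (< 0)
```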

The Chain rule
The derivative of a function $f(x_1,x_2)$ along a line described parametrically by $(x_1(t), x_2(t))$ is
$$\frac{df}{dt} = \frac{\partial f}{\partial x_1}\frac{dx_1}{dt} + \frac{\partial f}{\partial x_2}\frac{dx_2}{dt}$$
Another way to think of this is as the directional derivative formed from the dot product of grad $f$ and the direction of the line, e.g.,
$$\frac{df}{dt} = \nabla f \cdot \frac{d\mathbf{x}}{dt}, \qquad \nabla f = \left(\frac{\partial f}{\partial x_1}, \frac{\partial f}{\partial x_2}\right), \qquad \frac{d\mathbf{x}}{dt} = \left(\frac{dx_1}{dt}, \frac{dx_2}{dt}\right)$$

Chain Rule for more variables
The chain rule for a function of more than one variable, $f(\mathbf{x}) = f(x_1,x_2,\ldots,x_n)$: to find the derivative of $f$ along a line described parametrically by $(x_1(t), x_2(t), \ldots, x_n(t))$ we take
$$\frac{df}{dt} = \frac{\partial f}{\partial x_1}\frac{dx_1}{dt} + \frac{\partial f}{\partial x_2}\frac{dx_2}{dt} + \cdots + \frac{\partial f}{\partial x_n}\frac{dx_n}{dt}$$
or alternatively
$$\frac{df}{dt} = \left(\frac{\partial f}{\partial x_1}, \frac{\partial f}{\partial x_2}, \ldots, \frac{\partial f}{\partial x_n}\right) \cdot \left(\frac{dx_1}{dt}, \frac{dx_2}{dt}, \ldots, \frac{dx_n}{dt}\right) = \nabla f \cdot \frac{d\mathbf{x}}{dt}$$
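
The formula can be sanity-checked numerically. The sketch below (an illustration only; the function $f$ and the path are invented for the example) compares $\nabla f \cdot d\mathbf{x}/dt$ with a centred finite difference of $f(\mathbf{x}(t))$:

```python
# Numerical check of df/dt = grad(f) . dx/dt (illustration only; f and the path are invented).
import numpy as np

def f(x):                        # f(x1, x2, x3) = x1*x2 + sin(x3)
    return x[0] * x[1] + np.sin(x[2])

def grad_f(x):                   # analytic gradient of f
    return np.array([x[1], x[0], np.cos(x[2])])

def path(t):                     # a parametric curve x(t)
    return np.array([np.cos(t), np.sin(t), t])

def dpath(t):                    # dx/dt
    return np.array([-np.sin(t), np.cos(t), 1.0])

t, h = 0.7, 1e-6
chain_rule  = grad_f(path(t)) @ dpath(t)
finite_diff = (f(path(t + h)) - f(path(t - h))) / (2 * h)
print(chain_rule, finite_diff)   # the two numbers should agree to ~1e-9
```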

A graphical example
For a function of two variables $f(x,y)$ we get
$$\frac{df}{dt} = \frac{\partial f}{\partial x}\frac{dx}{dt} + \frac{\partial f}{\partial y}\frac{dy}{dt}$$
Example: $f(x,y) = x+y$ with $x = \cos t$, $y = \sin t$.
[Figure: the surface $z = x+y$ with the curve traced out along $(x(t), y(t))$, and the increments $df$, $dx$, $dy$ marked.]


Chain Rule Derivation
By the definition
$$\frac{df}{dt} = \lim_{\varepsilon \to 0} \frac{f(x(t+\varepsilon),\, y(t+\varepsilon)) - f(x(t),y(t))}{\varepsilon}$$
But note that from Taylor's theorem $x(t+\varepsilon) = x(t) + \varepsilon x'(t) + O(\varepsilon^2)$. As we consider the limit as $\varepsilon \to 0$ we may ignore the $O(\varepsilon^2)$ term, to get
$$\frac{df}{dt} = \lim_{\varepsilon \to 0} \frac{f(x(t)+\varepsilon x'(t),\, y(t)+\varepsilon y'(t)) - f(x(t),y(t))}{\varepsilon}$$

Chain Rule Derivation
$$\begin{aligned}
\frac{df}{dt} &= \lim_{\varepsilon \to 0} \frac{f(x(t)+\varepsilon x'(t),\, y(t)+\varepsilon y'(t)) - f(x(t),y(t))}{\varepsilon} \\
&= \lim_{\varepsilon \to 0} \frac{f(x(t)+\varepsilon x'(t),\, y(t)+\varepsilon y'(t)) - f(x(t),\, y(t)+\varepsilon y'(t))}{\varepsilon} + \lim_{\varepsilon \to 0} \frac{f(x(t),\, y(t)+\varepsilon y'(t)) - f(x(t),y(t))}{\varepsilon} \\
&= x'(t)\lim_{\varepsilon \to 0} \frac{f(x(t)+\varepsilon x'(t),\, y(t)+\varepsilon y'(t)) - f(x(t),\, y(t)+\varepsilon y'(t))}{\varepsilon x'(t)} + y'(t)\lim_{\varepsilon \to 0} \frac{f(x(t),\, y(t)+\varepsilon y'(t)) - f(x(t),y(t))}{\varepsilon y'(t)}
\end{aligned}$$

Chain Rule Derivation
Writing $\varepsilon_x = \varepsilon x'(t)$ and $\varepsilon_y = \varepsilon y'(t)$,
$$\begin{aligned}
\frac{df}{dt} &= x'(t)\lim_{\varepsilon \to 0} \frac{f(x(t)+\varepsilon x'(t),\, y(t)+\varepsilon y'(t)) - f(x(t),\, y(t)+\varepsilon y'(t))}{\varepsilon x'(t)} + y'(t)\lim_{\varepsilon \to 0} \frac{f(x(t),\, y(t)+\varepsilon y'(t)) - f(x(t),y(t))}{\varepsilon y'(t)} \\
&= x'(t)\lim_{\varepsilon_x \to 0} \frac{f(x(t)+\varepsilon_x,\, y(t)+\varepsilon y'(t)) - f(x(t),\, y(t)+\varepsilon y'(t))}{\varepsilon_x} + y'(t)\lim_{\varepsilon_y \to 0} \frac{f(x(t),\, y(t)+\varepsilon_y) - f(x(t),y(t))}{\varepsilon_y} \\
&= x'(t)\frac{\partial f}{\partial x} + y'(t)\frac{\partial f}{\partial y}
\end{aligned}$$
which is the chain rule!

Chain Rule special case
When we only have one variable, we simply want to calculate the derivative of a function $f$ of another function $x$, e.g.
$$\frac{d}{dt} f(x(t)) = \frac{df}{dx}\frac{dx}{dt}$$
Another way of writing this is $\dfrac{d}{dt} f(x(t)) = f'[x(t)]\, x'(t)$, which is the form you learnt in 1st year.

Taylor's theorem in 2D
$$f(x_1 + \delta x_1,\, x_2 + \delta x_2) = f(x_1,x_2) + \delta x_1 \frac{\partial f}{\partial x_1} + \delta x_2 \frac{\partial f}{\partial x_2} + \frac{1}{2}\left[\delta x_1^2 \frac{\partial^2 f}{\partial x_1^2} + 2\,\delta x_1 \delta x_2 \frac{\partial^2 f}{\partial x_1 \partial x_2} + \delta x_2^2 \frac{\partial^2 f}{\partial x_2^2}\right] + \cdots$$
Write $(\delta x_1, \delta x_2) = \varepsilon\,(\eta_1, \eta_2)$:
$$f(\mathbf{x} + \varepsilon\boldsymbol{\eta}) = f(\mathbf{x}) + \varepsilon\left(\eta_1 \frac{\partial f}{\partial x_1} + \eta_2 \frac{\partial f}{\partial x_2}\right) + \frac{\varepsilon^2}{2}\left[\eta_1^2 \frac{\partial^2 f}{\partial x_1^2} + 2\eta_1\eta_2 \frac{\partial^2 f}{\partial x_1 \partial x_2} + \eta_2^2 \frac{\partial^2 f}{\partial x_2^2}\right] + O(\varepsilon^3)$$

Taylor's theorem in N-D
$$f(\mathbf{x} + \delta\mathbf{x}) = f(\mathbf{x}) + \sum_{i=1}^{n} \frac{\partial f}{\partial x_i}\,\delta x_i + \frac{1}{2}\sum_{i,j=1}^{n} \frac{\partial^2 f}{\partial x_i \partial x_j}\,\delta x_i\,\delta x_j + O(\|\delta\mathbf{x}\|^3)$$
$$f(\mathbf{x} + \delta\mathbf{x}) = f(\mathbf{x}) + \delta\mathbf{x}^T \nabla f(\mathbf{x}) + \frac{1}{2}\,\delta\mathbf{x}^T H(\mathbf{x})\,\delta\mathbf{x} + O(\|\delta\mathbf{x}\|^3)$$
where $H(\mathbf{x})$ is the Hessian matrix
$$H(\mathbf{x}) = \begin{pmatrix}
\frac{\partial^2 f}{\partial x_1^2} & \frac{\partial^2 f}{\partial x_1 \partial x_2} & \cdots & \frac{\partial^2 f}{\partial x_1 \partial x_n} \\
\frac{\partial^2 f}{\partial x_2 \partial x_1} & \frac{\partial^2 f}{\partial x_2^2} & \cdots & \frac{\partial^2 f}{\partial x_2 \partial x_n} \\
\vdots & \vdots & \ddots & \vdots \\
\frac{\partial^2 f}{\partial x_n \partial x_1} & \frac{\partial^2 f}{\partial x_n \partial x_2} & \cdots & \frac{\partial^2 f}{\partial x_n^2}
\end{pmatrix}$$
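
As a numerical illustration (the test function $e^{x_1}\cos x_2$ is an assumption, not from the slides), the second-order expansion $f(\mathbf{x}) + \delta\mathbf{x}^T\nabla f + \tfrac12\,\delta\mathbf{x}^T H\,\delta\mathbf{x}$ should match $f(\mathbf{x}+\delta\mathbf{x})$ up to $O(\|\delta\mathbf{x}\|^3)$:

```python
# Comparing f(x + dx) with its second-order Taylor expansion (test function assumed).
import numpy as np

def f(x):
    return np.exp(x[0]) * np.cos(x[1])

def grad(x):
    return np.array([ np.exp(x[0]) * np.cos(x[1]),
                     -np.exp(x[0]) * np.sin(x[1])])

def hessian(x):
    return np.array([[ np.exp(x[0]) * np.cos(x[1]), -np.exp(x[0]) * np.sin(x[1])],
                     [-np.exp(x[0]) * np.sin(x[1]), -np.exp(x[0]) * np.cos(x[1])]])

x  = np.array([0.3, -0.2])
dx = np.array([1e-2, 2e-2])

taylor2 = f(x) + dx @ grad(x) + 0.5 * dx @ hessian(x) @ dx
print(f(x + dx), taylor2)        # should agree to O(||dx||^3), i.e. to about 1e-6
```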

Maxima of N variables
If a smooth function $f(\mathbf{x})$ has a local extremum at $\mathbf{x}$ then
$$\nabla f(\mathbf{x}) = \left(\frac{\partial f}{\partial x_1}, \frac{\partial f}{\partial x_2}, \ldots, \frac{\partial f}{\partial x_n}\right)^T = \mathbf{0}$$
A sufficient condition for the extremum $\mathbf{x}$ to be a local minimum is that the quadratic form
$$Q(\delta x_1, \ldots, \delta x_n) = \delta\mathbf{x}^T H(\mathbf{x})\,\delta\mathbf{x} = \sum_{i=1}^{n}\sum_{j=1}^{n} \frac{\partial^2 f}{\partial x_i \partial x_j}\,\delta x_i\,\delta x_j$$
be strictly positive definite.

Quadratic forms
A quadratic form
$$Q(\mathbf{x}) = \sum_{i,j} a_{ij}\, x_i x_j = \mathbf{x}^T A \mathbf{x}$$
is said to be positive definite if $Q(\mathbf{x}) > 0$ for all $\mathbf{x} \neq \mathbf{0}$.
A quadratic form is positive definite iff every eigenvalue of $A$ is greater than zero.
A quadratic form is positive definite if all the principal minors in the top-left corner of $A$ are positive, in other words
$$a_{11} > 0, \quad \begin{vmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{vmatrix} > 0, \quad \begin{vmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{vmatrix} > 0, \quad \ldots$$
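
Both characterisations are easy to apply in practice. A minimal NumPy sketch (the example matrix is an assumption) checking positive definiteness via the eigenvalues and via the leading principal minors:

```python
# Two equivalent positive-definiteness checks for a symmetric matrix (example matrix assumed).
import numpy as np

def is_pos_def_eigs(A):
    """Positive definite iff all eigenvalues are strictly positive."""
    return bool(np.all(np.linalg.eigvalsh(A) > 0))

def is_pos_def_minors(A):
    """Sylvester's criterion: all leading principal minors strictly positive."""
    n = A.shape[0]
    return all(np.linalg.det(A[:k, :k]) > 0 for k in range(1, n + 1))

A = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])

print(is_pos_def_eigs(A), is_pos_def_minors(A))   # True True for this matrix
```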

Notes on maxima and minima
- Maxima of $f(\mathbf{x})$ are minima of $-f(\mathbf{x})$.
- We haven't said anything about non-differentiable functions.
- A function that is continuous on a closed, bounded region must achieve its maximum (and minimum) on that region.

Calculus of variations
- We are not maximizing the value of a function; we are maximizing a functional, a function of a function.
- Can think of it as an ∞-dimensional max. problem:
  - we can choose between different functions;
  - the function sits in an ∞-dimensional vector space.
- This might take some effort.

Functionals
A functional maps an element of a vector space (e.g. a space containing functions) to a real number, e.g. $F : S \to \mathbb{R}$.
Example functionals:
- $F\{y(x)\} = y(0)$
- $F\{y(x)\} = \max_x\, y(x)$
- $F\{y(x)\} = \left.\dfrac{dy}{dx}\right|_{x=1}$
- $F\{y(x)\} = y(0) + y(1)$
- $F\{y(x)\} = \sum_{n=0}^{N} a_n\, y(n)$
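
Each of these functionals takes a whole function and returns a single number. A small Python sketch (the particular $y(x)=x^2$, the interval for the max, and the coefficients $a_n$ are assumptions chosen for illustration) evaluating them:

```python
# Evaluating the example functionals for one particular y(x) (choices here are assumptions).
import numpy as np

y = lambda x: x**2                                  # the function fed to each functional

F1 = y(0)                                           # F{y} = y(0)
F2 = max(y(x) for x in np.linspace(0, 1, 1001))     # F{y} = max_x y(x), here over [0, 1]
h  = 1e-6
F3 = (y(1 + h) - y(1 - h)) / (2 * h)                # F{y} = dy/dx at x = 1 (finite difference)
F4 = y(0) + y(1)                                    # F{y} = y(0) + y(1)
a  = [1.0, 0.5, 0.25]                               # assumed coefficients a_n
F5 = sum(a[n] * y(n) for n in range(len(a)))        # F{y} = sum_n a_n y(n)

print(F1, F2, F3, F4, F5)                           # each functional returns one real number
```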

Integral functionals
The previous functionals are not very interesting: it is easy to find the $y(x)$ which minimizes them. Integral functionals are more interesting.
Example integral functionals:
- $F\{y\} = \int_a^b y(x)\,dx$
- $F\{y\} = \int_a^b f(x)\,y(x)\,dx$
- $F\{y\} = \int_a^b \sqrt{1 + \left(\frac{dy}{dx}\right)^2}\,dx$
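
The third example is the arc-length functional, and it is easy to evaluate numerically for any given curve. A sketch assuming SciPy is available, with the curve $y=\sin x$ on $[0,\pi]$ chosen purely for illustration:

```python
# Numerically evaluating the arc-length functional (SciPy assumed; the curve is illustrative).
import numpy as np
from scipy.integrate import quad

def arc_length(dy, a, b):
    """F{y} = integral over [a,b] of sqrt(1 + (dy/dx)^2) dx, given the derivative dy."""
    value, _ = quad(lambda x: np.sqrt(1.0 + dy(x)**2), a, b)
    return value

print(arc_length(np.cos, 0.0, np.pi))   # length of one arch of y = sin(x), about 3.82
```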

Some simple cases
$$F\{y\} = \int_{a(\varepsilon)}^{b(\varepsilon)} y(x,\varepsilon)\,dx$$
$$\frac{dF}{d\varepsilon} = y(b,\varepsilon)\frac{db}{d\varepsilon} - y(a,\varepsilon)\frac{da}{d\varepsilon} + \int_{a(\varepsilon)}^{b(\varepsilon)} \frac{\partial y(x,\varepsilon)}{\partial \varepsilon}\,dx$$
If $a$ and $b$ are fixed then
$$\frac{da}{d\varepsilon} = \frac{db}{d\varepsilon} = 0$$
and so the derivative of the integral becomes the integral of the derivative.
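
This is differentiation under the integral sign. A numerical check for the fixed-limits case, with the family $y(x,\varepsilon) = e^{-\varepsilon x^2}$ chosen as an assumption for illustration:

```python
# Fixed limits: d/de of the integral of y equals the integral of dy/de
# (SciPy assumed; the family y(x, e) is invented for the example).
import numpy as np
from scipy.integrate import quad

a, b, eps, h = 0.0, 1.0, 0.8, 1e-5

y     = lambda x, e: np.exp(-e * x**2)
dy_de = lambda x, e: -x**2 * np.exp(-e * x**2)     # partial derivative of y w.r.t. e

F = lambda e: quad(lambda x: y(x, e), a, b)[0]     # F(e) = integral of y(x, e) over [a, b]

lhs = (F(eps + h) - F(eps - h)) / (2 * h)          # derivative of the integral
rhs = quad(lambda x: dy_de(x, eps), a, b)[0]       # integral of the derivative
print(lhs, rhs)                                    # these should agree closely
```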

Crude Brachystochrone
The brachystochrone involves the functional
$$F\{y\} = \int_{x_0}^{x_1} \sqrt{\frac{1 + y'^2}{y}}\,dx$$
Let us guess that the brachystochrone takes the form
$$y(x,\varepsilon) = (1-x)^{\varepsilon}$$
We could calculate the derivative with respect to $\varepsilon$ as above and compute the stationary points by finding $\dfrac{dF}{d\varepsilon} = 0$.
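
This one-parameter-family idea can also be carried out numerically. The sketch below rests on a physical normalisation that is not spelled out on the slide and is therefore an assumption: the bead is released from rest at $(0,1)$, $y$ is measured as height, the speed is $\sqrt{2g(1-y)}$ with $g=1$ (chosen so the straight wire $\varepsilon=1$ gives a descent time of 2.00). It evaluates $T(\varepsilon)$ and searches for the stationary point.

```python
# Crude Brachystochrone, numerically: descent time T(eps) for y(x, eps) = (1-x)^eps and a
# 1-D search for dT/deps = 0.  Modelling assumptions (not stated on the slide): bead
# released from rest at (0, 1), y is height, speed = sqrt(2 g (1 - y)), g = 1.
import numpy as np
from scipy.integrate import quad
from scipy.optimize import minimize_scalar

def descent_time(eps, g=1.0):
    def integrand(x):
        y  = (1.0 - x)**eps
        dy = -eps * (1.0 - x)**(eps - 1.0)
        return np.sqrt((1.0 + dy**2) / (2.0 * g * (1.0 - y)))
    value, _ = quad(integrand, 0.0, 1.0, limit=200)
    return value

print(descent_time(1.0))                 # straight wire: descent time 2.00
best = minimize_scalar(descent_time, bounds=(0.5, 10.0), method='bounded')
print(best.x, best.fun)                  # stationary eps; the slide reports eps = 2.5
```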

Crude Brachystochrone
[Figure: three wires of the form $y=(1-x)^{\varepsilon}$ with the bead's velocity shown along each; the total descent times are 2.00, 1.88, and 2.03.]

Crude Brachystochrone
[Figure: descent time as a function of $\varepsilon$, for $\varepsilon$ from 0 to 10, with $\varepsilon = 2.5$ marked.]
... but what if the family of curves doesn't contain the maximum?

Extra bits

Notation
- $f(\mathbf{x}) : S \to \mathbb{R}$ denotes a function that maps the set $S \subseteq \mathbb{R}^n$ to a real number.
- $\dfrac{\partial^n f}{\partial x_i^n}$ denotes the $n$th partial derivative of $f(\mathbf{x})$ with respect to $x_i$.
- The $\varepsilon$-neighborhood under the Euclidean norm is $B(\mathbf{x};\varepsilon) = \{\hat{\mathbf{x}} \in \mathbb{R}^n : \|\hat{\mathbf{x}} - \mathbf{x}\|_2 < \varepsilon\}$.
- The Euclidean norm in $\mathbb{R}^n$ is $\|\mathbf{x}\|_2 = \sqrt{\sum_{i=1}^n x_i^2}$.
- $F\{y\}$ denotes a functional of the function $y(x)$.