Derivatives in 2D
James K. Peterson
Department of Biological Sciences and Department of Mathematical Sciences, Clemson University
November 9, 2016

Outline: Derivatives in 2D; The Chain Rule



Let's go back to one dimensional calculus. If the function f is defined locally near x_0, that means f is defined in an interval B_r(x_0) = \{x : x_0 - r < x < x_0 + r\} for some positive value of r. In this case, we can attempt to find the usual limit as x approaches x_0 that defines the derivative of f at x_0: if this limit exists, it is called f'(x_0) and

    f'(x_0) = \lim_{x \to x_0} \frac{f(x) - f(x_0)}{x - x_0}.

This can be expressed in a different form. Recall that we can also use the \epsilon-\delta notation to define a limit. In this case, it means that if we choose a positive \epsilon, then there is a positive \delta so that

    0 < |x - x_0| < \delta \implies \left| \frac{f(x) - f(x_0)}{x - x_0} - f'(x_0) \right| < \epsilon.

Now define the error between the function value f(x) and the tangent line value f(x_0) + f'(x_0)(x - x_0) to be E(x, x_0). The statement above can be rewritten as

    0 < |x - x_0| < \delta \implies \frac{|f(x) - f(x_0) - f'(x_0)(x - x_0)|}{|x - x_0|} < \epsilon.

Then, using the definition of the error E(x, x_0), we see

    0 < |x - x_0| < \delta \implies \frac{|E(x, x_0)|}{|x - x_0|} < \epsilon.

This is the same as saying

    \lim_{x \to x_0} \frac{E(x, x_0)}{x - x_0} = 0.
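As a quick numerical sanity check of this error form (my own illustration, not from the notes), we can watch E(x, x_0)/(x - x_0) shrink as x approaches x_0, using the sample function f(x) = sin(x) at x_0 = 1:

```python
import math

def E(f, fp, x, x0):
    """Error between f(x) and its tangent-line value f(x0) + fp(x0)(x - x0)."""
    return f(x) - f(x0) - fp(x0) * (x - x0)

f, fp, x0 = math.sin, math.cos, 1.0

# E(x, x0)/(x - x0) should shrink to 0 as x -> x0 (roughly linearly in h here).
for h in [0.1, 0.01, 0.001]:
    x = x0 + h
    print(h, E(f, fp, x, x0) / (x - x0))
```

The ratio decays like a constant times h, which is exactly the statement that the tangent line is a first order accurate approximation.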

Now rewrite the inequality again to have

    0 < |x - x_0| < \delta \implies |E(x, x_0)| < \epsilon |x - x_0|.

Since we can do this for any positive tolerance, it works for the choice \sqrt{\epsilon}. Hence, there is a positive \delta_1 so that

    0 < |x - x_0| < \delta_1 \implies |E(x, x_0)| < \sqrt{\epsilon} |x - x_0| < \sqrt{\epsilon} \, \delta_1.

But this works as long as 0 < |x - x_0| < \delta_1, so it also works if 0 < |x - x_0| < \delta_2 = \min(\delta_1, \sqrt{\epsilon}) \le \delta_1. So

    0 < |x - x_0| < \delta_2 \implies |E(x, x_0)| < \sqrt{\epsilon} |x - x_0| < \sqrt{\epsilon} \, \delta_2 \le \sqrt{\epsilon} \sqrt{\epsilon} = \epsilon.

So we can say \lim_{x \to x_0} E(x, x_0) = 0 as well. This leads to the following theorem, which we have already seen in the one variable part of these notes.

Theorem Error Form of Differentiability For One Variable: If f is differentiable at x_0, then the error function E(x, x_0) = f(x) - f(x_0) - f'(x_0)(x - x_0) satisfies \lim_{x \to x_0} E(x, x_0) = 0 and \lim_{x \to x_0} E(x, x_0)/(x - x_0) = 0. Conversely, if there is a number L so that the error function E(x, x_0) = f(x) - f(x_0) - L(x - x_0) satisfies the same two limits, then f is differentiable at x_0 with value f'(x_0) = L.

Proof: If f is differentiable at x_0, we have already outlined the argument. The converse argument is quite similar. Since we know \lim_{x \to x_0} E(x, x_0)/(x - x_0) = 0, this tells us

    \lim_{x \to x_0} \frac{f(x) - f(x_0) - L(x - x_0)}{x - x_0} = 0,

or

    \lim_{x \to x_0} \left( \frac{f(x) - f(x_0)}{x - x_0} - L \right) = 0.

But this states that f is differentiable at x_0 with value L. With this argument done, we have shown both sides of the statement are true.

Note that if f is differentiable at x_0, f must be continuous at x_0. This follows because

    f(x) = f(x_0) + f'(x_0)(x - x_0) + E(x, x_0),

and as x \to x_0, we have f(x) \to f(x_0), which is the definition of f being continuous at x_0. Hence, we can say

Theorem Differentiable Implies Continuous: One Variable: If f is differentiable at x_0, then f is continuous at x_0.

Proof: We have sketched the argument already.

We apply this idea to the partial derivatives of f(x, y). As long as f(x, y) is defined locally at (x_0, y_0), the partials f_x(x_0, y_0) and f_y(x_0, y_0) exist if and only if there are error functions E_1 and E_2 so that

    f(x, y_0) = f(x_0, y_0) + f_x(x_0, y_0)(x - x_0) + E_1(x, x_0, y_0)
    f(x_0, y) = f(x_0, y_0) + f_y(x_0, y_0)(y - y_0) + E_2(y, x_0, y_0)

with E_1 \to 0 and E_1/(x - x_0) \to 0 as x \to x_0, and E_2 \to 0 and E_2/(y - y_0) \to 0 as y \to y_0. Using the ideas we have presented here, we can come up with a way to define the differentiability of a function of two variables.

Definition Error Form of Differentiability For Two Variables: If f(x, y) is defined locally at (x_0, y_0), then f is differentiable at (x_0, y_0) if there are two numbers L_1 and L_2 so that the error function

    E(x, y, x_0, y_0) = f(x, y) - f(x_0, y_0) - L_1(x - x_0) - L_2(y - y_0)

satisfies \lim_{(x,y) \to (x_0,y_0)} E(x, y, x_0, y_0) = 0 and \lim_{(x,y) \to (x_0,y_0)} E(x, y, x_0, y_0)/\|(x - x_0, y - y_0)\| = 0. Recall the norm term is \|(x - x_0, y - y_0)\| = \sqrt{(x - x_0)^2 + (y - y_0)^2}.
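A small numerical check of this definition (my own illustration, not from the notes): for the smooth sample function f(x, y) = sin(x) cos(y), take L_1 and L_2 to be the hand-computed partials and watch E/\|(x - x_0, y - y_0)\| shrink as we approach (x_0, y_0):

```python
import math

def f(x, y):
    return math.sin(x) * math.cos(y)

x0, y0 = 0.5, 1.2
L1 = math.cos(x0) * math.cos(y0)    # f_x(x0, y0), computed by hand
L2 = -math.sin(x0) * math.sin(y0)   # f_y(x0, y0), computed by hand

def ratio(x, y):
    """E(x, y, x0, y0) divided by the norm ||(x - x0, y - y0)||."""
    E = f(x, y) - f(x0, y0) - L1 * (x - x0) - L2 * (y - y0)
    return E / math.hypot(x - x0, y - y0)

for h in [0.1, 0.01, 0.001]:
    print(h, ratio(x0 + h, y0 - h))   # shrinks roughly like h
```

If you instead picked wrong values for L_1 or L_2, the ratio would tend to a nonzero constant along most approach directions, signalling that those numbers do not linearize f.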

Note that if f is differentiable at (x_0, y_0), f must be continuous at (x_0, y_0). The argument is simple:

    f(x, y) = f(x_0, y_0) + L_1(x - x_0) + L_2(y - y_0) + E(x, y, x_0, y_0),

and as (x, y) \to (x_0, y_0), we have f(x, y) \to f(x_0, y_0), which is the definition of f being continuous at (x_0, y_0). Hence, we can say

Theorem Differentiable Implies Continuous: Two Variables: If f is differentiable at (x_0, y_0), then f is continuous at (x_0, y_0).

Proof: We have sketched the argument already.

From this definition, we can show that if f is differentiable at the point (x_0, y_0), then L_1 = f_x(x_0, y_0) and L_2 = f_y(x_0, y_0). The argument goes like this: since f is differentiable at (x_0, y_0), we can say

    \lim_{(x,y) \to (x_0,y_0)} \frac{f(x, y) - f(x_0, y_0) - L_1(x - x_0) - L_2(y - y_0)}{\sqrt{(x - x_0)^2 + (y - y_0)^2}} = 0.

We can rewrite this using \Delta x = x - x_0 and \Delta y = y - y_0 as

    \lim_{(\Delta x, \Delta y) \to (0,0)} \frac{f(x_0 + \Delta x, y_0 + \Delta y) - f(x_0, y_0) - L_1 \Delta x - L_2 \Delta y}{\sqrt{(\Delta x)^2 + (\Delta y)^2}} = 0.

In particular, for \Delta y = 0, we find

    \lim_{\Delta x \to 0} \frac{f(x_0 + \Delta x, y_0) - f(x_0, y_0) - L_1 \Delta x}{\sqrt{(\Delta x)^2}} = 0.

For \Delta x > 0, we find \sqrt{(\Delta x)^2} = \Delta x and so

    \lim_{\Delta x \to 0^+} \frac{f(x_0 + \Delta x, y_0) - f(x_0, y_0)}{\Delta x} = L_1.

Thus, the right hand partial derivative (f_x)^+(x_0, y_0) exists and equals L_1. On the other hand, if \Delta x < 0, then \sqrt{(\Delta x)^2} = -\Delta x and we find, with a little manipulation, that we still have

    \lim_{\Delta x \to 0^-} \frac{f(x_0 + \Delta x, y_0) - f(x_0, y_0)}{\Delta x} = L_1.

So the left hand partial derivative (f_x)^-(x_0, y_0) exists and equals L_1 also. Combining, we see f_x(x_0, y_0) = L_1. A similar argument shows that f_y(x_0, y_0) = L_2. Hence, we can say that if f is differentiable at (x_0, y_0), then f_x and f_y exist at this point and we have

    f(x, y) = f(x_0, y_0) + f_x(x_0, y_0)(x - x_0) + f_y(x_0, y_0)(y - y_0) + E_f(x, y, x_0, y_0),

where E_f(x, y, x_0, y_0) \to 0 and E_f(x, y, x_0, y_0)/\|(x - x_0, y - y_0)\| \to 0 as (x, y) \to (x_0, y_0). Note this is a pointwise argument: it only tells us that differentiability at a point implies the existence of the partial derivatives at that point. Next, we look at the 2D version of the chain rule.
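The argument above says the right and left hand difference quotients in x both converge to the same number L_1 = f_x(x_0, y_0). A numerical illustration with the sample function f(x, y) = x^2 y + y^3 (my choice, not from the notes):

```python
def f(x, y):
    return x**2 * y + y**3

x0, y0 = 1.5, 0.5
L1 = 2 * x0 * y0            # f_x = 2xy, computed by hand

for h in [0.1, 0.001, 1e-5]:
    right = (f(x0 + h, y0) - f(x0, y0)) / h       # Delta x > 0
    left = (f(x0 - h, y0) - f(x0, y0)) / (-h)     # Delta x < 0
    print(h, right, left, L1)                     # both quotients approach L1
```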

Now that we know a bit about two dimensional derivatives, let's go for gold and figure out the new version of the chain rule. The argument we make here is very similar in spirit to the one dimensional one; you should go back and check it out! We assume there are two functions u(x, y) and v(x, y) defined locally about (x_0, y_0), and that there is a third function f(u, v) which is defined locally around (u_0, v_0) = (u(x_0, y_0), v(x_0, y_0)). Now assume f(u, v) is differentiable at (u_0, v_0) and u(x, y) and v(x, y) are differentiable at (x_0, y_0). Then we can say

    u(x, y) = u(x_0, y_0) + u_x(x_0, y_0)(x - x_0) + u_y(x_0, y_0)(y - y_0) + E_u(x, y, x_0, y_0)
    v(x, y) = v(x_0, y_0) + v_x(x_0, y_0)(x - x_0) + v_y(x_0, y_0)(y - y_0) + E_v(x, y, x_0, y_0)
    f(u, v) = f(u_0, v_0) + f_u(u_0, v_0)(u - u_0) + f_v(u_0, v_0)(v - v_0) + E_f(u, v, u_0, v_0),

where all the error terms behave as usual as (x, y) \to (x_0, y_0) and (u, v) \to (u_0, v_0). Note that as (x, y) \to (x_0, y_0), u(x, y) \to u_0 = u(x_0, y_0) and v(x, y) \to v_0 = v(x_0, y_0), since u and v are continuous at (x_0, y_0) because they are differentiable there.

Let's consider the partial of f with respect to x. Let \Delta u = u(x_0 + \Delta x, y_0) - u(x_0, y_0) and \Delta v = v(x_0 + \Delta x, y_0) - v(x_0, y_0). Thus, u_0 + \Delta u = u(x_0 + \Delta x, y_0) and v_0 + \Delta v = v(x_0 + \Delta x, y_0). Hence,

    f(u_0 + \Delta u, v_0 + \Delta v) - f(u_0, v_0) = f_u(u_0, v_0) \Delta u + f_v(u_0, v_0) \Delta v + E_f(u_0 + \Delta u, v_0 + \Delta v, u_0, v_0).

Continuing, divide by \Delta x and use the expansions of \Delta u and \Delta v (here \Delta y = 0):

    \frac{f(u_0 + \Delta u, v_0 + \Delta v) - f(u_0, v_0)}{\Delta x}
      = f_u(u_0, v_0) \frac{u_x(x_0, y_0) \Delta x + E_u}{\Delta x} + f_v(u_0, v_0) \frac{v_x(x_0, y_0) \Delta x + E_v}{\Delta x} + \frac{E_f}{\Delta x}
      = f_u(u_0, v_0) u_x(x_0, y_0) + f_v(u_0, v_0) v_x(x_0, y_0) + f_u(u_0, v_0) \frac{E_u}{\Delta x} + f_v(u_0, v_0) \frac{E_v}{\Delta x} + \frac{E_f}{\Delta x}.

As (x, y) \to (x_0, y_0), (\Delta u, \Delta v) \to (0, 0) and so E_f/\Delta x \to 0; here we use that \|(\Delta u, \Delta v)\| is bounded by a multiple of |\Delta x|, since u and v are differentiable. The other two error terms also go to zero as \Delta x \to 0. Hence, we conclude

    \frac{\partial f}{\partial x} = \frac{\partial f}{\partial u} \frac{\partial u}{\partial x} + \frac{\partial f}{\partial v} \frac{\partial v}{\partial x}.

A similar argument shows

    \frac{\partial f}{\partial y} = \frac{\partial f}{\partial u} \frac{\partial u}{\partial y} + \frac{\partial f}{\partial v} \frac{\partial v}{\partial y}.
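This conclusion is easy to test numerically. The sketch below (my own illustration; the sample functions u = xy, v = x + y^2, f(u, v) = u^2 + 3v are arbitrary smooth choices) compares the chain rule value of \partial f/\partial x with a central difference quotient of the composite function:

```python
# Numerical check of df/dx = f_u u_x + f_v v_x for sample functions.

def u(x, y): return x * y
def v(x, y): return x + y**2
def f(u_, v_): return u_**2 + 3.0 * v_

def g(x, y):                      # the composite f(u(x, y), v(x, y))
    return f(u(x, y), v(x, y))

x0, y0 = 1.5, -0.7
u_x, v_x = y0, 1.0                # u_x = y and v_x = 1, computed by hand
f_u = 2.0 * u(x0, y0)             # f_u = 2u
f_v = 3.0                         # f_v = 3
chain = f_u * u_x + f_v * v_x     # chain-rule value of dg/dx at (x0, y0)

h = 1e-6                          # central-difference step
numeric = (g(x0 + h, y0) - g(x0 - h, y0)) / (2 * h)
print(chain, numeric)             # the two values agree closely
```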

This result is known as the Chain Rule.

Theorem The Chain Rule: Assume there are two functions u(x, y) and v(x, y) defined locally about (x_0, y_0), and that there is a third function f(u, v) which is defined locally around (u_0, v_0) = (u(x_0, y_0), v(x_0, y_0)). Further assume f(u, v) is differentiable at (u_0, v_0) and u(x, y) and v(x, y) are differentiable at (x_0, y_0). Then \partial f/\partial x and \partial f/\partial y exist at (x_0, y_0) and are given by

    \frac{\partial f}{\partial x} = \frac{\partial f}{\partial u} \frac{\partial u}{\partial x} + \frac{\partial f}{\partial v} \frac{\partial v}{\partial x},
    \frac{\partial f}{\partial y} = \frac{\partial f}{\partial u} \frac{\partial u}{\partial y} + \frac{\partial f}{\partial v} \frac{\partial v}{\partial y}.

Example: Let f(x, y) = x^2 + 2x + 5y^4. Then if x = r \cos(\theta) and y = r \sin(\theta), using the chain rule we find

    \frac{\partial f}{\partial r} = \frac{\partial f}{\partial x} \frac{\partial x}{\partial r} + \frac{\partial f}{\partial y} \frac{\partial y}{\partial r},
    \frac{\partial f}{\partial \theta} = \frac{\partial f}{\partial x} \frac{\partial x}{\partial \theta} + \frac{\partial f}{\partial y} \frac{\partial y}{\partial \theta}.

This becomes

    \frac{\partial f}{\partial r} = (2x + 2) \cos(\theta) + (20y^3) \sin(\theta),
    \frac{\partial f}{\partial \theta} = (2x + 2)(-r \sin(\theta)) + (20y^3)(r \cos(\theta)).

You can then substitute in for x and y to get the final answer in terms of r and \theta (kind of ugly though!).
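We can check the polar example numerically (my own sketch, not from the notes) by comparing the chain rule formulas against difference quotients of F(r, \theta) = f(r \cos\theta, r \sin\theta) at an arbitrary sample point:

```python
import math

# Check the example f(x, y) = x**2 + 2x + 5y**4 with x = r cos(t), y = r sin(t).

def f(x, y): return x**2 + 2*x + 5*y**4

def F(r, t):                       # f expressed in polar coordinates
    return f(r*math.cos(t), r*math.sin(t))

r0, t0 = 2.0, 0.8                  # arbitrary sample point
x0, y0 = r0*math.cos(t0), r0*math.sin(t0)

# chain-rule formulas from the example
df_dr = (2*x0 + 2)*math.cos(t0) + 20*y0**3*math.sin(t0)
df_dt = (2*x0 + 2)*(-r0*math.sin(t0)) + 20*y0**3*(r0*math.cos(t0))

h = 1e-6                           # central-difference step
num_r = (F(r0 + h, t0) - F(r0 - h, t0)) / (2*h)
num_t = (F(r0, t0 + h) - F(r0, t0 - h)) / (2*h)
print(df_dr, num_r)                # the pairs agree closely
print(df_dt, num_t)
```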

Example: Let u = x^2 + 2y^2 and v = 4x^2 - 5y^2, and let f(u, v) = 10u^2 v^4. Using the chain rule, we find

    \frac{\partial f}{\partial x} = \frac{\partial f}{\partial u} \frac{\partial u}{\partial x} + \frac{\partial f}{\partial v} \frac{\partial v}{\partial x},
    \frac{\partial f}{\partial y} = \frac{\partial f}{\partial u} \frac{\partial u}{\partial y} + \frac{\partial f}{\partial v} \frac{\partial v}{\partial y}.

This becomes

    \frac{\partial f}{\partial x} = (20uv^4)(2x) + (40u^2 v^3)(8x),
    \frac{\partial f}{\partial y} = (20uv^4)(4y) + (40u^2 v^3)(-10y).

You can then substitute in for u and v to get the final answer in terms of x and y (even more ugly though!).

Homework 34

34.1 Let f(x, y) = 2xy/\sqrt{x^2 + y^2}. Prove \lim_{(x,y) \to (0,0)} f(x, y) = 0. This means f(x, y) has a removable discontinuity at (0, 0). Hint: |x| \le \sqrt{x^2 + y^2} and |y| \le \sqrt{x^2 + y^2}. Do an \epsilon-\delta proof here.

34.2 Let

    f(x, y) = 2xy/\sqrt{x^2 + y^2} for (x, y) \ne (0, 0), and f(0, 0) = 0.

Find f_x(0, 0) and f_y(0, 0) (they are both 0). For all (x, y) \ne (0, 0), find f_x and f_y. Look at the paths (x, mx) for m \ne 0 and show \lim_{(x,y) \to (0,0)} f_x(x, y) and \lim_{(x,y) \to (0,0)} f_y(x, y) do not exist. Hint: use x \to 0^+ and x \to 0^-; you'll get limits that depend on both the sign of x and the value of m. Explain how this result shows the partials of f are not continuous at (0, 0).
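The second example can be verified the same way (my own sketch): compare the chain rule values of \partial f/\partial x and \partial f/\partial y with central difference quotients of the composite function at an arbitrary sample point:

```python
# Check the example f(u, v) = 10 u**2 v**4 with u = x**2 + 2y**2, v = 4x**2 - 5y**2.

def g(x, y):                       # the composite f(u(x, y), v(x, y))
    u = x**2 + 2*y**2
    v = 4*x**2 - 5*y**2
    return 10 * u**2 * v**4

x0, y0 = 1.1, 0.4                  # arbitrary sample point
u0 = x0**2 + 2*y0**2
v0 = 4*x0**2 - 5*y0**2

# chain-rule values from the example
df_dx = (20*u0*v0**4)*(2*x0) + (40*u0**2*v0**3)*(8*x0)
df_dy = (20*u0*v0**4)*(4*y0) + (40*u0**2*v0**3)*(-10*y0)

h = 1e-6                           # central-difference step
num_x = (g(x0 + h, y0) - g(x0 - h, y0)) / (2*h)
num_y = (g(x0, y0 + h) - g(x0, y0 - h)) / (2*h)
print(df_dx, num_x)                # the pairs agree to high relative accuracy
print(df_dy, num_y)
```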

34.3 Let

    f(x, y) = 2xy/\sqrt{x^2 + y^2} for (x, y) \ne (0, 0), and f(0, 0) = 0.

We already know the partials of f exist at all points but that f_x and f_y are not continuous at (0, 0). If f were differentiable at (0, 0), there would be numbers L_1 and L_2 so that the error term

    E(x, y, 0, 0) = 2xy/\sqrt{x^2 + y^2} - L_1 x - L_2 y

would satisfy \lim_{(x,y) \to (0,0)} E(x, y, 0, 0) = 0 and \lim_{(x,y) \to (0,0)} E(x, y, 0, 0)/\|(x, y)\| = 0. Show \lim_{(x,y) \to (0,0)} E(x, y, 0, 0) = 0 holds for every choice of L_1 and L_2. Then show \lim_{(x,y) \to (0,0)} E(x, y, 0, 0)/\|(x, y)\| = 0 fails for every choice of L_1 and L_2 by looking at the paths (x, mx) as we did in HW 34.2. This example shows an f where the partials exist at all points locally around a point (x_0, y_0) (here that point is (0, 0)) but fail to be continuous at (x_0, y_0), and f fails to be differentiable at (x_0, y_0).
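A numerical sketch of the path argument (assuming the homework's function is f(x, y) = 2xy/\sqrt{x^2 + y^2} with f(0, 0) = 0, as reconstructed from the hints): with L_1 = L_2 = 0, the ratio E(x, y, 0, 0)/\|(x, y)\| along the path (x, mx) equals 2m/(1 + m^2), which depends on m, so it cannot tend to 0 along every path:

```python
import math

def f(x, y):
    # the homework's candidate function, with the removable value at the origin
    return 0.0 if (x, y) == (0.0, 0.0) else 2*x*y / math.hypot(x, y)

def ratio(x, y, L1=0.0, L2=0.0):
    """E(x, y, 0, 0)/||(x, y)|| for the candidate linearization L1, L2."""
    E = f(x, y) - f(0.0, 0.0) - L1*x - L2*y
    return E / math.hypot(x, y)

for m in [0.5, 1.0, 2.0]:
    x = 1e-8                       # approach the origin along y = m x
    print(m, ratio(x, m*x), 2*m/(1 + m**2))   # ratio matches 2m/(1 + m^2)
```

Since the limiting value changes with the slope m, no choice of L_1 and L_2 can make the ratio vanish, which is exactly the failure of differentiability the problem asks you to prove.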