Derivatives in 2D

James K. Peterson
Department of Biological Sciences and Department of Mathematical Sciences
Clemson University

November 9, 2016

Outline: Derivatives in 2D! The Chain Rule.
Let's go back to one dimensional calculus. If the function $f$ is defined locally near $x_0$, that means $f$ is defined in a ball $B_r(x_0) = \{x : x_0 - r < x < x_0 + r\}$ for some positive value of $r$. In this case, we can attempt to find the usual limit as $x$ approaches $x_0$ that defines the derivative of $f$ at $x_0$: if this limit exists, it is called $f'(x_0)$ and
$$f'(x_0) = \lim_{x \to x_0} \frac{f(x) - f(x_0)}{x - x_0}.$$
This can be expressed in a different form. Recall that we can also use the $\epsilon$-$\delta$ notation to define a limit. In this case, it means that if we choose a positive $\epsilon$, then there is a positive $\delta$ so that
$$0 < |x - x_0| < \delta \implies \left| \frac{f(x) - f(x_0)}{x - x_0} - f'(x_0) \right| < \epsilon.$$
Now define the error between the function value $f(x)$ and the tangent line value $f(x_0) + f'(x_0)(x - x_0)$ to be $E(x, x_0)$. The above statement can be rewritten as
$$0 < |x - x_0| < \delta \implies \frac{\left| f(x) - f(x_0) - f'(x_0)(x - x_0) \right|}{|x - x_0|} < \epsilon.$$
Then, using the definition of the error $E(x, x_0)$, we see
$$0 < |x - x_0| < \delta \implies \frac{|E(x, x_0)|}{|x - x_0|} < \epsilon.$$
This is the same as saying
$$\lim_{x \to x_0} \frac{E(x, x_0)}{x - x_0} = 0.$$
Now rewrite the inequality again to have
$$0 < |x - x_0| < \delta \implies |E(x, x_0)| < \epsilon \, |x - x_0|.$$
Since we can do this for any positive $\epsilon$, given $\epsilon > 0$ there is a positive $\delta_1$ so that
$$0 < |x - x_0| < \delta_1 \implies |E(x, x_0)| < \epsilon \, |x - x_0| < \epsilon \, \delta_1.$$
But this works as long as $0 < |x - x_0| < \delta_1$, so it also works if $0 < |x - x_0| < \delta_2 = \min(\delta_1, 1) \le \delta_1$. So
$$0 < |x - x_0| < \delta_2 \implies |E(x, x_0)| < \epsilon \, |x - x_0| < \epsilon \, \delta_2 \le \epsilon.$$
So we can say $\lim_{x \to x_0} E(x, x_0) = 0$ as well. This leads to the following theorem, which we have already seen in the one variable part of these notes.

Theorem (Error Form of Differentiability For One Variable):
Let $f$ be defined locally at $x_0$. If $f$ is differentiable at $x_0$, then the error function $E(x, x_0) = f(x) - f(x_0) - f'(x_0)(x - x_0)$ satisfies $\lim_{x \to x_0} E(x, x_0) = 0$ and $\lim_{x \to x_0} E(x, x_0)/(x - x_0) = 0$. Conversely, if there is a number $L$ so that the error function $E(x, x_0) = f(x) - f(x_0) - L(x - x_0)$ satisfies the same behavior, then $f$ is differentiable at $x_0$ with value $f'(x_0) = L$.
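As a quick numerical illustration (not part of the original notes; the sample function $f(x) = x^3$ and the point $x_0 = 1$ are choices made here for concreteness), the following Python sketch checks that both $|E(x, x_0)|$ and $|E(x, x_0)|/|x - x_0|$ shrink as $x \to x_0$:

```python
# Numerical check of the error form of differentiability (one variable).
# Sample choices (not from the notes): f(x) = x**3 at x0 = 1, so f'(x0) = 3.

def f(x):
    return x**3

x0, fprime_x0 = 1.0, 3.0

def E(x):
    # E(x, x0) = f(x) - f(x0) - f'(x0)(x - x0): the tangent-line error
    return f(x) - f(x0) - fprime_x0 * (x - x0)

for h in [1e-1, 1e-2, 1e-3]:
    x = x0 + h
    print(h, abs(E(x)), abs(E(x) / (x - x0)))
# Here E(x, x0) = 3h**2 + h**3, so |E| shrinks like h**2 and the ratio
# |E|/|x - x0| shrinks like h, consistent with both limits being 0.
```

Running this shows the ratio column decreasing proportionally to $h$, which is exactly the statement $\lim_{x \to x_0} E(x, x_0)/(x - x_0) = 0$.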
Proof:
If $f$ is differentiable at $x_0$, we have already outlined the argument. The converse argument is quite similar. Since we know $\lim_{x \to x_0} E(x, x_0)/(x - x_0) = 0$, this tells us
$$\lim_{x \to x_0} \frac{f(x) - f(x_0) - L(x - x_0)}{x - x_0} = 0 \quad \text{or} \quad \lim_{x \to x_0} \left( \frac{f(x) - f(x_0)}{x - x_0} - L \right) = 0.$$
But this states that $f$ is differentiable at $x_0$ with value $L$. With this argument done, we have shown both sides of the statement are true.

Note if $f$ is differentiable at $x_0$, then $f$ must be continuous at $x_0$. This follows because
$$f(x) = f(x_0) + f'(x_0)(x - x_0) + E(x, x_0)$$
and as $x \to x_0$, we have $f(x) \to f(x_0)$, which is the definition of $f$ being continuous at $x_0$. Hence, we can say

Theorem (Differentiable Implies Continuous: One Variable):
If $f$ is differentiable at $x_0$, then $f$ is continuous at $x_0$.

Proof:
We have sketched the argument already.
We apply this idea to the partial derivatives of $f(x, y)$. As long as $f(x, y)$ is defined locally at $(x_0, y_0)$, the partials $f_x(x_0, y_0)$ and $f_y(x_0, y_0)$ exist if and only if there are error functions $E_1$ and $E_2$ so that
$$f(x, y_0) = f(x_0, y_0) + f_x(x_0, y_0)(x - x_0) + E_1(x, x_0, y_0)$$
$$f(x_0, y) = f(x_0, y_0) + f_y(x_0, y_0)(y - y_0) + E_2(y, x_0, y_0)$$
with $E_1 \to 0$ and $E_1/(x - x_0) \to 0$ as $x \to x_0$, and $E_2 \to 0$ and $E_2/(y - y_0) \to 0$ as $y \to y_0$. Using the ideas we have presented here, we can come up with a way to define the differentiability of a function of two variables.

Definition (Error Form of Differentiability For Two Variables):
If $f(x, y)$ is defined locally at $(x_0, y_0)$, then $f$ is differentiable at $(x_0, y_0)$ if there are two numbers $L_1$ and $L_2$ so that the error function
$$E(x, y, x_0, y_0) = f(x, y) - f(x_0, y_0) - L_1(x - x_0) - L_2(y - y_0)$$
satisfies
$$\lim_{(x,y) \to (x_0,y_0)} E(x, y, x_0, y_0) = 0 \quad \text{and} \quad \lim_{(x,y) \to (x_0,y_0)} \frac{E(x, y, x_0, y_0)}{\|(x - x_0, y - y_0)\|} = 0.$$
Recall the norm term is $\|(x - x_0, y - y_0)\| = \sqrt{(x - x_0)^2 + (y - y_0)^2}$.
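Again as a numerical aside (the sample function $f(x, y) = x^2 y$ and the point $(1, 2)$ are choices made here, not from the notes), we can watch the two-variable error ratio vanish. For this $f$, the exact partials $f_x(1, 2) = 4$ and $f_y(1, 2) = 1$ play the roles of $L_1$ and $L_2$:

```python
import math

# Numerical check of the two-variable error form of differentiability.
# Sample choices (not from the notes): f(x, y) = x**2 * y at (x0, y0) = (1, 2),
# where fx = 2*x0*y0 = 4 and fy = x0**2 = 1.

def f(x, y):
    return x**2 * y

x0, y0 = 1.0, 2.0
L1, L2 = 4.0, 1.0  # the exact partials, playing the roles of L1 and L2

def E(x, y):
    # E(x, y, x0, y0) = f(x, y) - f(x0, y0) - L1 (x - x0) - L2 (y - y0)
    return f(x, y) - f(x0, y0) - L1 * (x - x0) - L2 * (y - y0)

for h in [1e-1, 1e-2, 1e-3]:
    x, y = x0 + h, y0 + h  # approach (x0, y0) along a diagonal path
    norm = math.hypot(x - x0, y - y0)
    print(h, abs(E(x, y)) / norm)
# The ratio |E| / ||(x - x0, y - y0)|| shrinks like h, so the limit is 0.
```

Only one path is shown here; the definition requires the ratio to vanish along every approach to $(x_0, y_0)$, which holds for this polynomial $f$.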
Note if $f$ is differentiable at $(x_0, y_0)$, then $f$ must be continuous at $(x_0, y_0)$. The argument is simple:
$$f(x, y) = f(x_0, y_0) + L_1(x - x_0) + L_2(y - y_0) + E(x, y, x_0, y_0)$$
and as $(x, y) \to (x_0, y_0)$, we have $f(x, y) \to f(x_0, y_0)$, which is the definition of $f$ being continuous at $(x_0, y_0)$. Hence, we can say

Theorem (Differentiable Implies Continuous: Two Variables):
If $f$ is differentiable at $(x_0, y_0)$, then $f$ is continuous at $(x_0, y_0)$.

Proof:
We have sketched the argument already.

From this definition, we can show that if $f$ is differentiable at the point $(x_0, y_0)$, then $L_1 = f_x(x_0, y_0)$ and $L_2 = f_y(x_0, y_0)$. The argument goes like this: since $f$ is differentiable at $(x_0, y_0)$, we can say
$$\lim_{(x,y) \to (x_0,y_0)} \frac{f(x, y) - f(x_0, y_0) - L_1(x - x_0) - L_2(y - y_0)}{\sqrt{(x - x_0)^2 + (y - y_0)^2}} = 0.$$
We can rewrite this using $\Delta x = x - x_0$ and $\Delta y = y - y_0$ as
$$\lim_{(\Delta x, \Delta y) \to (0,0)} \frac{f(x_0 + \Delta x, y_0 + \Delta y) - f(x_0, y_0) - L_1 \Delta x - L_2 \Delta y}{\sqrt{(\Delta x)^2 + (\Delta y)^2}} = 0.$$
In particular, for $\Delta y = 0$, we find
$$\lim_{\Delta x \to 0} \frac{f(x_0 + \Delta x, y_0) - f(x_0, y_0) - L_1 \Delta x}{\sqrt{(\Delta x)^2}} = 0.$$
For $\Delta x > 0$, we find $\sqrt{(\Delta x)^2} = \Delta x$ and so
$$\lim_{\Delta x \to 0^+} \frac{f(x_0 + \Delta x, y_0) - f(x_0, y_0)}{\Delta x} = L_1.$$
Thus, the right hand partial derivative $f_x^+(x_0, y_0)$ exists and equals $L_1$. On the other hand, if $\Delta x < 0$, then $\sqrt{(\Delta x)^2} = -\Delta x$ and we find, with a little manipulation, that we still have
$$\lim_{\Delta x \to 0^-} \frac{f(x_0 + \Delta x, y_0) - f(x_0, y_0)}{\Delta x} = L_1.$$
So the left hand partial derivative $f_x^-(x_0, y_0)$ exists and equals $L_1$ also. Combining, we see $f_x(x_0, y_0) = L_1$. A similar argument shows that $f_y(x_0, y_0) = L_2$. Hence, we can say that if $f$ is differentiable at $(x_0, y_0)$, then $f_x$ and $f_y$ exist at this point and we have
$$f(x, y) = f(x_0, y_0) + f_x(x_0, y_0)(x - x_0) + f_y(x_0, y_0)(y - y_0) + E_f(x, y, x_0, y_0)$$
where $E_f(x, y, x_0, y_0) \to 0$ and $E_f(x, y, x_0, y_0)/\|(x - x_0, y - y_0)\| \to 0$ as $(x, y) \to (x_0, y_0)$.

Note this argument is a pointwise argument. It only tells us that differentiability at a point implies the existence of the partial derivatives at that point. Next, we look at the 2D version of the chain rule.
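The two one-sided limits above can be watched numerically. In this sketch (the sample function $f(x, y) = x^2 y$ at $(1, 2)$ is a choice made here, not from the notes), both the right-hand and left-hand difference quotients in $x$ converge to the same number $L_1 = f_x(1, 2) = 4$:

```python
# The argument above used Delta x > 0 and Delta x < 0 separately: for a
# differentiable f, both one-sided difference quotients converge to the
# same L1 = fx(x0, y0). Sample choice (not from the notes): f(x, y) = x**2 * y
# at (x0, y0) = (1, 2), where fx(1, 2) = 4 exactly.

def f(x, y):
    return x**2 * y

x0, y0 = 1.0, 2.0

def quotient(dx):
    # One-sided difference quotient in x with y held fixed at y0
    return (f(x0 + dx, y0) - f(x0, y0)) / dx

for h in [1e-2, 1e-4, 1e-6]:
    print(h, quotient(h), quotient(-h))  # right- and left-hand quotients
# Both columns approach 4.0, the common value fx(x0, y0) = L1.
```

For this $f$ the quotients work out to $4 + 2\,\Delta x$ exactly, so the approach to $L_1$ from each side is visible directly.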
Now that we know a bit about two dimensional derivatives, let's go for gold and figure out the new version of the chain rule. The argument we make here is very similar in spirit to the one dimensional one. You should go back and check it out!

We assume there are two functions $u(x, y)$ and $v(x, y)$ defined locally about $(x_0, y_0)$ and that there is a third function $f(u, v)$ which is defined locally around $(u_0 = u(x_0, y_0), v_0 = v(x_0, y_0))$. Now assume $f(u, v)$ is differentiable at $(u_0, v_0)$ and $u(x, y)$ and $v(x, y)$ are differentiable at $(x_0, y_0)$. Then we can say
$$u(x, y) = u(x_0, y_0) + u_x(x_0, y_0)(x - x_0) + u_y(x_0, y_0)(y - y_0) + E_u(x, y, x_0, y_0)$$
$$v(x, y) = v(x_0, y_0) + v_x(x_0, y_0)(x - x_0) + v_y(x_0, y_0)(y - y_0) + E_v(x, y, x_0, y_0)$$
$$f(u, v) = f(u_0, v_0) + f_u(u_0, v_0)(u - u_0) + f_v(u_0, v_0)(v - v_0) + E_f(u, v, u_0, v_0)$$
where all the error terms behave as usual as $(x, y) \to (x_0, y_0)$ and $(u, v) \to (u_0, v_0)$. Note that as $(x, y) \to (x_0, y_0)$, $u(x, y) \to u_0 = u(x_0, y_0)$ and $v(x, y) \to v_0 = v(x_0, y_0)$, as $u$ and $v$ are continuous at $(x_0, y_0)$ since they are differentiable there.

Let's consider the partial of $f$ with respect to $x$. Let $\Delta u = u(x_0 + \Delta x, y_0) - u(x_0, y_0)$ and $\Delta v = v(x_0 + \Delta x, y_0) - v(x_0, y_0)$. Thus, $u_0 + \Delta u = u(x_0 + \Delta x, y_0)$ and $v_0 + \Delta v = v(x_0 + \Delta x, y_0)$. Hence,
$$f(u_0 + \Delta u, v_0 + \Delta v) - f(u_0, v_0) = f_u(u_0, v_0)\,\Delta u + f_v(u_0, v_0)\,\Delta v + E_f(u_0 + \Delta u, v_0 + \Delta v, u_0, v_0).$$
Continuing, with $y = y_0$ held fixed, the error forms for $u$ and $v$ give $\Delta u = u_x(x_0, y_0)\,\Delta x + E_u(x_0 + \Delta x, y_0, x_0, y_0)$ and $\Delta v = v_x(x_0, y_0)\,\Delta x + E_v(x_0 + \Delta x, y_0, x_0, y_0)$. Dividing the difference $f(u_0 + \Delta u, v_0 + \Delta v) - f(u_0, v_0)$ by $\Delta x$, we obtain
$$\frac{f(u_0 + \Delta u, v_0 + \Delta v) - f(u_0, v_0)}{\Delta x} = f_u(u_0, v_0)\left( u_x(x_0, y_0) + \frac{E_u}{\Delta x} \right) + f_v(u_0, v_0)\left( v_x(x_0, y_0) + \frac{E_v}{\Delta x} \right) + \frac{E_f}{\Delta x}.$$
As $\Delta x \to 0$, $(u, v) \to (u_0, v_0)$ and so $E_f/\Delta x \to 0$. The other two error terms over $\Delta x$ go to zero also as $\Delta x \to 0$. Hence, we conclude
$$\frac{\partial f}{\partial x} = \frac{\partial f}{\partial u} \frac{\partial u}{\partial x} + \frac{\partial f}{\partial v} \frac{\partial v}{\partial x}.$$
A similar argument shows
$$\frac{\partial f}{\partial y} = \frac{\partial f}{\partial u} \frac{\partial u}{\partial y} + \frac{\partial f}{\partial v} \frac{\partial v}{\partial y}.$$
This result is known as the Chain Rule.

Theorem (The Chain Rule):
Assume there are two functions $u(x, y)$ and $v(x, y)$ defined locally about $(x_0, y_0)$ and that there is a third function $f(u, v)$ which is defined locally around $(u_0 = u(x_0, y_0), v_0 = v(x_0, y_0))$. Further assume $f(u, v)$ is differentiable at $(u_0, v_0)$ and $u(x, y)$ and $v(x, y)$ are differentiable at $(x_0, y_0)$. Then $\partial f/\partial x$ and $\partial f/\partial y$ exist at $(x_0, y_0)$ and are given by
$$\frac{\partial f}{\partial x} = \frac{\partial f}{\partial u} \frac{\partial u}{\partial x} + \frac{\partial f}{\partial v} \frac{\partial v}{\partial x}, \qquad \frac{\partial f}{\partial y} = \frac{\partial f}{\partial u} \frac{\partial u}{\partial y} + \frac{\partial f}{\partial v} \frac{\partial v}{\partial y}.$$

Example: Let $f(x, y) = x^2 + 2x + 5y^4$. Then if $x = r \cos(\theta)$ and $y = r \sin(\theta)$, using the chain rule we find
$$\frac{\partial f}{\partial r} = \frac{\partial f}{\partial x} \frac{\partial x}{\partial r} + \frac{\partial f}{\partial y} \frac{\partial y}{\partial r}, \qquad \frac{\partial f}{\partial \theta} = \frac{\partial f}{\partial x} \frac{\partial x}{\partial \theta} + \frac{\partial f}{\partial y} \frac{\partial y}{\partial \theta}.$$
This becomes
$$\frac{\partial f}{\partial r} = \left( 2x + 2 \right) \cos(\theta) + \left( 20y^3 \right) \sin(\theta)$$
$$\frac{\partial f}{\partial \theta} = \left( 2x + 2 \right)\left( -r \sin(\theta) \right) + \left( 20y^3 \right)\left( r \cos(\theta) \right).$$
You can then substitute in for $x$ and $y$ to get the final answer in terms of $r$ and $\theta$ (kind of ugly though!).
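The polar example above is easy to sanity-check numerically: the chain rule formulas should match central finite differences of the composite function $g(r, \theta) = f(r\cos\theta, r\sin\theta)$. This Python sketch does that at an arbitrarily chosen point $(r, \theta) = (1.5, 0.7)$ (the point and step size are choices made here, not from the notes):

```python
import math

# Numerical check of the polar-coordinate chain rule example:
# f(x, y) = x**2 + 2x + 5y**4 with x = r cos(theta), y = r sin(theta).

def f(x, y):
    return x**2 + 2*x + 5*y**4

def g(r, th):
    # The composite function of (r, theta)
    return f(r * math.cos(th), r * math.sin(th))

def chain_rule_partials(r, th):
    # The closed forms derived in the example above
    x, y = r * math.cos(th), r * math.sin(th)
    df_dr = (2*x + 2) * math.cos(th) + 20*y**3 * math.sin(th)
    df_dth = (2*x + 2) * (-r * math.sin(th)) + 20*y**3 * (r * math.cos(th))
    return df_dr, df_dth

r, th, h = 1.5, 0.7, 1e-6
fr, fth = chain_rule_partials(r, th)
fr_fd = (g(r + h, th) - g(r - h, th)) / (2*h)    # central differences
fth_fd = (g(r, th + h) - g(r, th - h)) / (2*h)
print(fr, fr_fd)    # the two columns agree closely
print(fth, fth_fd)
```

Central differences have $O(h^2)$ truncation error, so the agreement is to many digits; this is a check of the algebra, not a proof.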
Example: Let $u = x^2 + 2y^2$ and $v = 4x^2 - 5y^2$, and let $f(u, v) = 10u^2 v^4$. Using the chain rule, we find
$$\frac{\partial f}{\partial x} = \frac{\partial f}{\partial u} \frac{\partial u}{\partial x} + \frac{\partial f}{\partial v} \frac{\partial v}{\partial x}, \qquad \frac{\partial f}{\partial y} = \frac{\partial f}{\partial u} \frac{\partial u}{\partial y} + \frac{\partial f}{\partial v} \frac{\partial v}{\partial y}.$$
This becomes
$$\frac{\partial f}{\partial x} = \left( 20uv^4 \right)\left( 2x \right) + \left( 40u^2 v^3 \right)\left( 8x \right)$$
$$\frac{\partial f}{\partial y} = \left( 20uv^4 \right)\left( 4y \right) + \left( 40u^2 v^3 \right)\left( -10y \right).$$
You can then substitute in for $u$ and $v$ to get the final answer in terms of $x$ and $y$ (even more ugly though!).

Homework 34

34.1 Let
$$f(x, y) = \frac{2xy^2}{x^2 + y^2}, \quad (x, y) \neq (0, 0).$$
Prove $\lim_{(x,y) \to (0,0)} f(x, y) = 0$. This means $f(x, y)$ has a removable discontinuity at $(0, 0)$. Hint: $|x| \le \sqrt{x^2 + y^2}$ and $|y| \le \sqrt{x^2 + y^2}$. Do an $\epsilon$-$\delta$ proof here.

34.2 Let
$$f(x, y) = \begin{cases} \dfrac{2xy^2}{x^2 + y^2}, & (x, y) \neq (0, 0) \\ 0, & (x, y) = (0, 0). \end{cases}$$
Find $f_x(0, 0)$ and $f_y(0, 0)$ (they are both $0$). For all $(x, y) \neq (0, 0)$, find $f_x$ and $f_y$. Look at the paths $(x, mx)$ for $m \neq 0$ and show $\lim_{(x,y) \to (0,0)} f_x(x, y)$ and $\lim_{(x,y) \to (0,0)} f_y(x, y)$ do not exist. Hint: use $x \to 0^+$ and $x \to 0^-$. You'll get limits that depend on both the sign of $x$ and the value of $m$. Explain how the result above shows the partials of $f$ are not continuous at $(0, 0)$.
34.3 Let
$$f(x, y) = \begin{cases} \dfrac{2xy^2}{x^2 + y^2}, & (x, y) \neq (0, 0) \\ 0, & (x, y) = (0, 0). \end{cases}$$
We already know the partials of $f$ exist at all points but that $f_x$ and $f_y$ are not continuous at $(0, 0)$. If $f$ were differentiable at $(0, 0)$, there would be numbers $L_1$ and $L_2$ so that the error term
$$E(x, y, 0, 0) = \frac{2xy^2}{x^2 + y^2} - L_1 x - L_2 y$$
would satisfy $\lim_{(x,y) \to (0,0)} E(x, y, 0, 0) = 0$ and $\lim_{(x,y) \to (0,0)} E(x, y, 0, 0)/\|(x, y)\| = 0$.

Show $\lim_{(x,y) \to (0,0)} E(x, y, 0, 0) = 0$ holds for every choice of $L_1$ and $L_2$.

Show $\lim_{(x,y) \to (0,0)} E(x, y, 0, 0)/\|(x, y)\| = 0$ fails for every choice of $L_1$ and $L_2$ by looking at the paths $(x, mx)$ like we did in HW 34.2.

This example shows an $f$ where the partials exist at all points locally around a point $(x_0, y_0)$ (here that point is $(0, 0)$) but they fail to be continuous at $(x_0, y_0)$, and $f$ fails to be differentiable at $(x_0, y_0)$.
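The second worked example with $f(u, v) = 10u^2 v^4$, $u = x^2 + 2y^2$, $v = 4x^2 - 5y^2$ can be checked the same way as the polar one: compare the chain rule formulas against central finite differences of the composite. The evaluation point $(x, y) = (1, 0.5)$ below is an arbitrary choice made for this sketch:

```python
# Numerical check of the chain rule example f(u, v) = 10 u**2 v**4 with
# u = x**2 + 2y**2 and v = 4x**2 - 5y**2.

def composite(x, y):
    u = x**2 + 2*y**2
    v = 4*x**2 - 5*y**2
    return 10 * u**2 * v**4

def chain_rule_partials(x, y):
    # The closed forms from the example above
    u = x**2 + 2*y**2
    v = 4*x**2 - 5*y**2
    fu, fv = 20 * u * v**4, 40 * u**2 * v**3
    fx = fu * (2*x) + fv * (8*x)      # df/dx = fu * ux + fv * vx
    fy = fu * (4*y) + fv * (-10*y)    # df/dy = fu * uy + fv * vy
    return fx, fy

x, y, h = 1.0, 0.5, 1e-6
fx, fy = chain_rule_partials(x, y)
fx_fd = (composite(x + h, y) - composite(x - h, y)) / (2*h)
fy_fd = (composite(x, y + h) - composite(x, y - h)) / (2*h)
print(fx, fx_fd)   # the two columns agree closely
print(fy, fy_fd)
```

The same pattern works for any chain rule computation: if the symbolic partials disagree with the finite differences by more than a small amount, the algebra deserves a second look.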