Mon Apr 23: Second derivative test, and maybe another conic diagonalization example.


Math 2270-004 Week 15 Notes

We will not necessarily finish the material from a given day's notes on that day. We may also add or subtract some material as the week progresses, but these notes represent an in-depth outline of what we plan to cover. These notes cover material for Monday; I'll add course review material for Tuesday later.

Mon Apr 23: 7.2 Second derivative test, and maybe another conic diagonalization example.

Announcements:

Warm-up Exercise:

From last week: Spectral Theorem. Let A be a symmetric n × n matrix. Then all of the eigenvalues of A are real, and there exists an orthonormal eigenbasis B = {u_1, u_2, ..., u_n} consisting of eigenvectors for A. Eigenspaces with different eigenvalues are automatically orthogonal to each other. If any eigenspace has dimension greater than 1, its orthonormal basis may be constructed via Gram-Schmidt.

Diagonalization of quadratic forms: Let

    Q(x) = Σ_{i,j=1}^n a_{ij} x_i x_j = xᵀ A x

for a symmetric matrix A with real entries. By the spectral theorem there exists an orthonormal eigenbasis B = {u_1, u_2, ..., u_n}. For the corresponding orthogonal matrix P = [u_1 u_2 ... u_n] we have

    D = Pᵀ A P,

where D is the diagonal matrix of eigenvalues corresponding to the eigenvectors in P. And we have x = P y, where y = [x]_B and P = P_{E←B}. Thus

    Q(x) = xᵀ A x = yᵀ Pᵀ A P y = yᵀ D y = Σ_{i=1}^n λ_i y_i².

So by the orthogonal change of variables all cross terms have been removed. Applications include conic curves, quadric surfaces, the multivariable second derivative test, principal component analysis (PCA) in statistics, the singular value decomposition (SVD) in geometry and computer science, and more.
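As a sanity check on the diagonalization, here is a small numerical sketch. The matrix A = [[3, 1], [1, 3]] and the sample point are our own choices, not from the notes; the code builds the orthonormal eigenbasis of a 2 × 2 symmetric matrix by hand and verifies that Q(P y) = λ₁ y₁² + λ₂ y₂², i.e. that the cross term is gone in the new coordinates.

```python
import math

# A matrix of our own choosing (not from the notes):
# A = [[3, 1], [1, 3]], so Q(x) = 3 x1^2 + 2 x1 x2 + 3 x2^2.
a, b, d = 3.0, 1.0, 3.0

# Eigenvalues from the characteristic polynomial lam^2 - tr*lam + det = 0.
tr, det = a + d, a * d - b * b
disc = math.sqrt(tr * tr - 4 * det)      # equals sqrt((a-d)^2 + 4 b^2) >= 0
lam1, lam2 = (tr + disc) / 2, (tr - disc) / 2

# Orthonormal eigenvectors: (b, lam - a) is an eigenvector for lam when b != 0.
n1, n2 = math.hypot(b, lam1 - a), math.hypot(b, lam2 - a)
u1 = (b / n1, (lam1 - a) / n1)
u2 = (b / n2, (lam2 - a) / n2)

def Q(x1, x2):
    return a * x1 * x1 + 2 * b * x1 * x2 + d * x2 * x2

# Orthogonal change of variables x = P y removes the cross term:
y1, y2 = 0.3, -0.7                       # arbitrary B-coordinates
x1 = u1[0] * y1 + u2[0] * y2
x2 = u1[1] * y1 + u2[1] * y2
assert abs(Q(x1, x2) - (lam1 * y1**2 + lam2 * y2**2)) < 1e-12
```

For this A the eigenvalues come out 4 and 2, with eigenvectors (1, 1)/√2 and (1, −1)/√2, matching a quick hand computation.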

Definition: The quadratic form

    Q(x) = Σ_{i,j=1}^n a_{ij} x_i x_j = xᵀ A x   (for A a symmetric matrix)

is called positive definite if Q(x) > 0 for all x ≠ 0. We see that this is the same as saying that all of the eigenvalues of A are positive.

Definition: The quadratic form Q(x) = xᵀ A x (for A a symmetric matrix) is called negative definite if Q(x) < 0 for all x ≠ 0. We see that this is the same as saying that all of the eigenvalues of A are negative.
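The eigenvalue characterization of definiteness is easy to automate in the 2 × 2 case. A minimal sketch (the helper function and its name are ours, not part of the notes):

```python
import math

def definiteness_2x2(a, b, d):
    """Classify the symmetric matrix [[a, b], [b, d]] by the signs
    of its eigenvalues."""
    tr, det = a + d, a * d - b * b
    disc = math.sqrt((a - d) ** 2 + 4 * b * b)   # always real for symmetric A
    lam1, lam2 = (tr + disc) / 2, (tr - disc) / 2
    if lam1 > 0 and lam2 > 0:
        return "positive definite"
    if lam1 < 0 and lam2 < 0:
        return "negative definite"
    if lam1 * lam2 < 0:
        return "indefinite"
    return "semidefinite (some zero eigenvalue)"

print(definiteness_2x2(2, 0, 3))   # eigenvalues 2 and 3: positive definite
print(definiteness_2x2(1, 2, 1))   # eigenvalues 3 and -1: indefinite
```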

First and second derivative tests from multivariable calculus, revisited.

It turns out that a lot of multivariable calculus is easier to understand once you know linear algebra. This is just one example of where that happens. (Math majors will see this, and quite a bit more, in Math 3220.)

Let f: ℝⁿ → ℝ, f(x) = f(x_1, x_2, ..., x_n), let p ∈ ℝⁿ, and let u be a unit vector. Then

    D_u f(p) := d/dt f(p + t u) |_{t=0}

is the rate of change of f in the direction of u, at p ("the directional derivative of f, at p, in the direction of u". This generalizes pure partial derivatives, which are rates of change in the standard coordinate directions.) Using the multivariable version of the chain rule we compute

    d/dt f(p + t u) = Σ_{i=1}^n ∂f/∂x_i (p + t u) · d/dt (p_i + t u_i) = Σ_{i=1}^n ∂f/∂x_i (p + t u) u_i.

So at t = 0 this rate of change is computed via

    D_u f(p) = Σ_{i=1}^n ∂f/∂x_i (p) u_i = ∇f(p) · u.

Definition: Let f be a differentiable function as above. Then p is a critical point for f if and only if

    ∇f(p) = ( ∂f/∂x_1 (p), ∂f/∂x_2 (p), ..., ∂f/∂x_n (p) ) = 0.

In other words, a critical point is a point at which all directional derivatives are zero. Local extrema of differentiable functions will only occur at critical points, but not all critical points are the locations of local extrema. The ways in which things can go wrong are more interesting than in the single-variable case, where we used the second derivative test.
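The chain-rule formula D_u f(p) = ∇f(p) · u can be checked numerically with a centered difference in t. A small sketch, using a function f and point p invented for this illustration:

```python
import math

# Invented example: f(x, y) = x^2 * y + sin(y).
f = lambda x, y: x * x * y + math.sin(y)

p = (1.0, 0.5)
u = (3 / 5, 4 / 5)                       # a unit vector

# Analytic gradient: grad f = (2xy, x^2 + cos(y)).
grad = (2 * p[0] * p[1], p[0] ** 2 + math.cos(p[1]))
directional = grad[0] * u[0] + grad[1] * u[1]

# Centered difference approximation of d/dt f(p + t u) at t = 0.
h = 1e-6
approx = (f(p[0] + h * u[0], p[1] + h * u[1])
          - f(p[0] - h * u[0], p[1] - h * u[1])) / (2 * h)

assert abs(directional - approx) < 1e-6
```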

In your first multivariable calculus class you were probably shown a second derivative test for functions of (only) two variables. The one you were probably shown obscures what's really going on, which is actually simpler to understand in general once you know linear algebra. Here's what you were probably shown (taken from the beginning of the Wikipedia article on this topic): https://en.wikipedia.org/wiki/Second_partial_derivative_test

Continuing the discussion about directional derivatives:

Definition: Let f, p, u be as above. The second derivative of f at p, in the u direction, is defined by

    D_u D_u f(p) := d²/dt² f(p + t u) |_{t=0}.

We compute this expression with the chain rule, starting with our expression for the first directional derivative, from the previous pages:

    d²/dt² f(p + t u) = d/dt Σ_{i=1}^n ∂f/∂x_i (p + t u) u_i = Σ_{i=1}^n Σ_{j=1}^n ∂²f/∂x_j ∂x_i (p + t u) u_j u_i.

At t = 0, and recalling that ∂²f/∂x_j ∂x_i = ∂²f/∂x_i ∂x_j, this reads

    D_u D_u f(p) = Σ_{i=1}^n Σ_{j=1}^n ∂²f/∂x_i ∂x_j (p) u_i u_j.

Definition: The Hessian matrix of f at p, D²f(p), is the (symmetric) matrix of second partial derivatives; its i, j entry is ∂²f/∂x_i ∂x_j (p):

    D²f = [ f_{x_1 x_1}  f_{x_1 x_2}  ...  f_{x_1 x_n} ]
          [ f_{x_2 x_1}  f_{x_2 x_2}  ...  f_{x_2 x_n} ]
          [     ...          ...      ...      ...     ]
          [ f_{x_n x_1}  f_{x_n x_2}  ...  f_{x_n x_n} ]

So

    D_u D_u f(p) = uᵀ D²f(p) u.
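The identity D_u D_u f(p) = uᵀ D²f(p) u can likewise be checked with a second centered difference. A sketch with the same invented function f(x, y) = x² y + sin(y):

```python
import math

# Invented example: f(x, y) = x^2 * y + sin(y).
f = lambda x, y: x * x * y + math.sin(y)

p = (1.0, 0.5)
u = (3 / 5, 4 / 5)                       # a unit vector

# Analytic Hessian: [[2y, 2x], [2x, -sin(y)]].
H = [[2 * p[1], 2 * p[0]], [2 * p[0], -math.sin(p[1])]]
quad = sum(H[i][j] * u[i] * u[j] for i in range(2) for j in range(2))

# Second centered difference of t -> f(p + t u) at t = 0.
h = 1e-4
g = lambda t: f(p[0] + t * u[0], p[1] + t * u[1])
approx = (g(h) - 2 * g(0) + g(-h)) / (h * h)

assert abs(quad - approx) < 1e-4
```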

Theorem: The function f is concave up in every direction u at p if and only if the Hessian matrix D²f(p) is positive definite. The function f is concave down in every direction u at p if and only if the Hessian matrix D²f(p) is negative definite. The first case happens if and only if all of the eigenvalues of D²f(p) are positive, and the second case happens if and only if they are all negative.

If p is a critical point for f, then in the first case f(p) is a local minimum value, and in the second case it is a local maximum value. If the Hessian has some negative and some positive eigenvalues, then f(p) is neither a local minimum nor a local maximum. If all the eigenvalues are non-negative, or if they are all non-positive, but some are zero, then further work is required to determine whether f(p) is a local extreme value.
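Putting the theorem to work: a sketch that classifies the critical point of an invented function f(x, y) = x² + 3xy + 2y² at the origin (its Hessian there is the constant matrix [[2, 3], [3, 4]]) by the signs of the Hessian eigenvalues:

```python
import math

# Hessian of the invented f(x, y) = x^2 + 3xy + 2y^2 at its critical
# point (0, 0): [[2, 3], [3, 4]].
a, b, d = 2.0, 3.0, 4.0

tr, det = a + d, a * d - b * b
disc = math.sqrt((a - d) ** 2 + 4 * b * b)
lam1, lam2 = (tr + disc) / 2, (tr - disc) / 2

if lam1 > 0 and lam2 > 0:
    verdict = "local minimum"
elif lam1 < 0 and lam2 < 0:
    verdict = "local maximum"
elif lam1 * lam2 < 0:
    verdict = "saddle point"
else:
    verdict = "inconclusive (a zero eigenvalue)"

print(verdict)   # here det = -1 < 0, so the eigenvalues have opposite signs
```

Since det = 8 − 9 = −1 < 0, the eigenvalues have opposite signs and the origin is a saddle point, even though f is positive along both coordinate axes.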

Exercise 1: Explain the (more complicated) second derivative test you were taught in multivariable calculus for functions of just two variables, as a special case of the more general one that uses eigenvalues. Hint:

    det [ f_xx − λ    f_xy     ]
        [ f_yx        f_yy − λ ]  =  λ² − (f_xx + f_yy) λ + (f_xx f_yy − f_xy²)

has roots

    λ = ( (f_xx + f_yy) ± √( (f_xx + f_yy)² − 4 (f_xx f_yy − f_xy²) ) ) / 2.
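A numeric nudge toward Exercise 1 (the sample Hessian entries below are ours, chosen for illustration): the product of the two eigenvalues equals the Calc III discriminant D = f_xx f_yy − f_xy² and their sum equals the trace, which is what makes the classic two-variable test a special case of the eigenvalue test.

```python
import math

# Hypothetical sample Hessian entries:
fxx, fxy, fyy = 5.0, 1.5, 2.0

tr = fxx + fyy
D = fxx * fyy - fxy * fxy            # the discriminant from the Calc III test
disc = math.sqrt(tr * tr - 4 * D)    # = sqrt((fxx - fyy)^2 + 4 fxy^2) >= 0
lam1, lam2 = (tr + disc) / 2, (tr - disc) / 2

# Product of the eigenvalues is det = D, and their sum is the trace.  So
# D > 0 forces the eigenvalues to share a sign (min or max, decided by the
# sign of fxx, which matches the trace in that case), while D < 0 means
# opposite signs, i.e. a saddle.
assert abs(lam1 * lam2 - D) < 1e-12
assert abs(lam1 + lam2 - tr) < 1e-12
```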

Exercise 2: Which of the following functions has a local minimum at the origin, if any? Could you diagonalize the associated quadratic forms and sketch level sets?

2a) f(x, y) = x² + 4xy + y²

2b) f(x, y) = x² + xy + y²

> with(plots):
  plot3d(x^2 + 4*x*y + y^2, x = -1 .. 1, y = -1 .. 1);
  plot3d(x^2 + x*y + y^2, x = -1 .. 1, y = -1 .. 1);
>