[Page 2 of the transcription consists of worked matrix computations whose surrounding text and entries were lost; only isolated symbols (A, B, C, f, k, x) survive.]
The inverse of a nonsingular upper triangular matrix is upper triangular.

Write I_{m×m} = [e_1, …, e_m], let R = (r_ij) be upper triangular and nonsingular, and let A = R^{-1} = [a_1, …, a_m], so that

    [e_1, …, e_m] = [a_1, …, a_m] R.

Since det R = r_11 r_22 ⋯ r_mm ≠ 0, each r_ii ≠ 0. Define the nested subspaces

    C_m(k) = { v ∈ C^m : v_i = 0 for i > k },   C_m(1) ⊂ C_m(2) ⊂ ⋯ ⊂ C_m(m) = C^m,

so A is upper triangular exactly when a_k ∈ C_m(k) for every k. Comparing first columns of I = AR gives e_1 = r_11 a_1, hence a_1 = e_1/r_11 ∈ C_m(1). Assume a_k ∈ C_m(k) for k ≤ i. Comparing column i+1,

    e_{i+1} = Σ_{k=1}^{i+1} a_k r_{k(i+1)},   so   a_{i+1} = (1/r_{(i+1)(i+1)}) ( e_{i+1} − Σ_{k=1}^{i} a_k r_{k(i+1)} ) ∈ C_m(i+1),

since e_{i+1} and each a_k with k ≤ i lie in C_m(i+1). By induction a_k ∈ C_m(k) for all k, so A = R^{-1} is upper triangular.

For the interpolation exercise: with sample values d ∈ C^8, coefficients c ∈ C^8, functions f_1, …, f_8, and F the 8×8 matrix with F_ij = f_j(i), the map c ↦ Fc sends coefficients to sample values, so d = Fc. If a matrix A satisfies Ad = c for every d, then AFc = c for every c, which forces A = F^{-1}.
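The triangularity of the inverse can be checked numerically; the matrix R below is an arbitrary illustrative choice, not one from the text:

```python
import numpy as np

# A nonsingular upper triangular matrix (nonzero diagonal).
R = np.array([[2.0, 1.0, 3.0],
              [0.0, 1.0, 4.0],
              [0.0, 0.0, 5.0]])
A = np.linalg.inv(R)

# The strictly lower triangular part of R^{-1} vanishes,
# as the induction over the columns predicts.
assert np.allclose(np.tril(A, k=-1), 0.0)
assert np.allclose(R @ A, np.eye(3))
```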
Additivity and homogeneity of finite sums: componentwise, Σ_{i=1}^m (x_i + y_i) = Σ_{i=1}^m x_i + Σ_{i=1}^m y_i and Σ_{i=1}^m k x_i = k Σ_{i=1}^m x_i; by induction on n ∈ N,

    Σ_{i=1}^{n+1} x_i = Σ_{i=1}^{n} x_i + x_{n+1},

so the two-term identities extend to any finite number of terms.

Eigenvalues of a Hermitian matrix are real. Let A = A* and Ax = λx with x ∈ C^m \ {0}. Then

    λ x*x = x*(λx) = x*Ax = x*A*x = (Ax)*x = (λx)*x = λ̄ x*x,

and x*x > 0, so λ̄ = λ: λ is real.

Eigenvectors of distinct eigenvalues are orthogonal. If Ax = λx and Ay = µy with λ ≠ µ, then

    λ y*x = y*(Ax) = (y*A)x = (y*A*)x = (Ay)*x = (µy)*x = µ̄ y*x = µ y*x,

so (λ − µ) y*x = 0 and hence y*x = 0.

Eigenvalues of a unitary matrix have modulus 1. If A*A = I and Ax = λx with x ≠ 0, then

    x*x = x*(A*A)x = (Ax)*(Ax) = (λx)*(λx) = |λ|² x*x,

so |λ| = 1.

Skew-Hermitian matrices. Any A splits as A = ½(A + A*) + ½(A − A*), a Hermitian part plus a skew-Hermitian part. If S* = −S, then iS is Hermitian: (iS)* = −i S* = iS. Hence the eigenvalues of iS are real, and the eigenvalues of S are purely imaginary. Finally, I − S is nonsingular: if (I − S)x = 0 then x = Sx and

    x*x = (Sx)*x = x*S*x = −x*Sx = −x*x,

so x*x = 0 and x = 0; thus null(I − S) = {0} and I − S is nonsingular.
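These three spectral facts can be checked numerically; the complex matrix below is a random illustrative choice:

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = B + B.conj().T          # Hermitian: A* = A
S = B - B.conj().T          # skew-Hermitian: S* = -S

lam = np.linalg.eigvals(A)
assert np.allclose(lam.imag, 0.0, atol=1e-8)   # eigenvalues of A are real

mu = np.linalg.eigvals(S)
assert np.allclose(mu.real, 0.0, atol=1e-8)    # eigenvalues of S are purely imaginary

# I - S is nonsingular, as proved above.
assert abs(np.linalg.det(np.eye(4) - S)) > 0
```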
The Cayley transform A = (I − S)^{-1}(I + S) of a skew-Hermitian S is unitary. Using (U^{-1})* = (U*)^{-1}, (I + S)* = I − S, (I − S)* = I + S, and the fact that I − S and I + S commute,

    A*A = (I + S)* [(I − S)^{-1}]* (I − S)^{-1} (I + S)
        = (I − S) [(I + S)(I − S)]^{-1} (I + S)
        = (I − S)(I − S)^{-1} (I + S)^{-1}(I + S) = I,

and the computation for AA* is identical, so A is unitary.

Rank-one perturbations of the identity: A = I + uv*. If Ax = 0 for some x ≠ 0, then x = −u(v*x), so x = αu for some α ∈ C \ {0}; then A(αu) = αu + u v*(αu) = αu(1 + v*u) = 0 forces v*u = −1. Conversely, if v*u = −1 then A(αu) = 0 for all α. So A is singular iff v*u = −1, and then null(A) = {αu : α ∈ C}. When v*u ≠ −1, try A^{-1} = I − uθ*:

    I = AA^{-1} = (I + uv*)(I − uθ*) = I − uθ* + uv* − u(v*u)θ*,

so θ_i + (v*u)θ_i = v_i, i.e. θ_i = v_i/(1 + v*u), giving

    A^{-1} = I − uv*/(1 + v*u).

Hadamard matrices: H_{k+1} = [[H_k, H_k],[H_k, −H_k]] in 2×2 block form. Then

    H_{k+1}^T H_{k+1} = [[2 H_k^T H_k, 0],[0, 2 H_k^T H_k]],

so by induction H_k^T H_k = 2^k I for all k ∈ N, and α_k H_k is unitary with α_k = 2^{-k/2}.
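Both identities are easy to verify in floating point; the real skew matrix and vectors below are random illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((4, 4))
S = B - B.T                                   # real skew: S^T = -S
I = np.eye(4)

# Cayley transform: A = (I - S)^{-1} (I + S) is orthogonal.
A = np.linalg.solve(I - S, I + S)
assert np.allclose(A.T @ A, I)

# Rank-one update of the identity: (I + u v^T)^{-1} = I - u v^T / (1 + v^T u).
u = rng.standard_normal(4)
v = rng.standard_normal(4)
if abs(1 + v @ u) > 1e-8:                     # nonsingular case only
    M = I + np.outer(u, v)
    Minv = I - np.outer(u, v) / (1 + v @ u)
    assert np.allclose(M @ Minv, I)
```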
‖x‖_W = ‖Wx‖ is a norm when W is nonsingular: ‖x‖_W = ‖Wx‖ = 0 iff Wx = 0 iff x = 0;

    ‖x + y‖_W = ‖W(x + y)‖ = ‖Wx + Wy‖ ≤ ‖Wx‖ + ‖Wy‖ = ‖x‖_W + ‖y‖_W;
    ‖αx‖_W = ‖W(αx)‖ = |α| ‖Wx‖ = |α| ‖x‖_W.

Every eigenvalue is bounded by the operator norm: if Ax = λx with x ∈ C^m \ {0}, then

    |λ| = ‖Ax‖/‖x‖ ≤ sup_{y ∈ C^m \ {0}} ‖Ay‖/‖y‖ = ‖A‖.

Comparing the 1- and ∞-norms on C^m:

    ‖x‖_∞ = max_i |x_i| ≤ Σ_{i=1}^m |x_i| = ‖x‖_1,   with equality iff x = αe_i (α ∈ C, 1 ≤ i ≤ m);
    ‖x‖_1 = Σ_{i=1}^m |x_i| ≤ m max_i |x_i| = m ‖x‖_∞,   with equality iff all |x_i| are equal.

For A ∈ C^{m×n}, the chains

    ‖Ax‖_∞ ≤ ‖Ax‖_2 ≤ ‖A‖_2 ‖x‖_2 ≤ √n ‖A‖_2 ‖x‖_∞,
    ‖Ax‖_2 ≤ √m ‖Ax‖_∞ ≤ √m ‖A‖_∞ ‖x‖_∞ ≤ √m ‖A‖_∞ ‖x‖_2

give

    ‖A‖_∞ ≤ √n ‖A‖_2   and   ‖A‖_2 ≤ √m ‖A‖_∞,

and each bound is attained for suitable A.
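The vector inequalities, their equality cases, and the operator-norm bounds can all be exercised numerically (random illustrative data):

```python
import numpy as np

rng = np.random.default_rng(2)
m = 6
x = rng.standard_normal(m)

inf = np.linalg.norm(x, np.inf)
one = np.linalg.norm(x, 1)
assert inf <= one <= m * inf                  # ||x||_inf <= ||x||_1 <= m ||x||_inf

e = np.zeros(m); e[3] = 2.5                   # multiple of a coordinate vector
assert np.isclose(np.linalg.norm(e, np.inf), np.linalg.norm(e, 1))
ones = np.full(m, 1.7)                        # all entries of equal modulus
assert np.isclose(np.linalg.norm(ones, 1), m * np.linalg.norm(ones, np.inf))

# Operator norms: ||A||_inf <= sqrt(n) ||A||_2 and ||A||_2 <= sqrt(m) ||A||_inf.
A = rng.standard_normal((4, 3))
n2 = np.linalg.norm(A, 2)
ninf = np.linalg.norm(A, np.inf)
assert ninf <= np.sqrt(3) * n2 + 1e-12
assert n2 <= np.sqrt(4) * ninf + 1e-12
```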
Rank-one matrices E = uv* with u ∈ C^m, v ∈ C^n. For x ∈ C^n \ {0},

    ‖Ex‖_2 = ‖u(v*x)‖_2 = |v*x| ‖u‖_2 ≤ ‖u‖_2 ‖v‖_2 ‖x‖_2,

with equality for x = v, so ‖E‖_2 = ‖u‖_2 ‖v‖_2. Entrywise, |E_ij| = |u_i||v_j|, so

    ‖E‖_F² = Σ_{i=1}^m Σ_{j=1}^n |u_i v_j|² = ( Σ_{i=1}^m |u_i|² )( Σ_{j=1}^n |v_j|² ),

hence ‖E‖_F = ‖u‖_2 ‖v‖_2 as well.

Equality in the triangle inequality. Suppose ‖x_1 + x_2‖_2 = ‖x_1‖_2 + ‖x_2‖_2 with x_1 + x_2 ≠ 0, and set y = (x_1 + x_2)/‖x_1 + x_2‖_2, so ‖y‖_2 = 1. Then

    ‖x_1‖_2 + ‖x_2‖_2 = ‖x_1 + x_2‖_2 = y*(x_1 + x_2) = y*x_1 + y*x_2 ≤ ‖x_1‖_2 + ‖x_2‖_2,

so equality holds throughout and y*x_i = ‖x_i‖_2 for i = 1, 2, which forces x_i = ‖x_i‖_2 y: the two vectors are nonnegative multiples of each other.

Extremal vectors. Given z ∈ C^m \ {0}, the unit vector x = e^{iθ} z/‖z‖_2 with suitable phase θ attains z*x = ‖z‖_2, so sup_{‖x‖_2 = 1} |z*x| = ‖z‖_2. For B = yz*, Bx = y(z*x), so range(B) ⊆ {αy}, and B = 0 only if y = 0 or z = 0.

Riesz representation. Every linear functional l on C^m has the form l(x) = z*x: writing x = Σ_{i=1}^m x_i e_i gives l(x) = Σ_{i=1}^m x_i l(e_i) = [l(e_1), …, l(e_m)] x, so the vector z with z̄_i = l(e_i) works, and by the extremal-vector computation ‖l‖ = sup_{x≠0} |l(x)|/‖x‖_2 = ‖z‖_2.
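The coincidence of the spectral and Frobenius norms for a rank-one matrix is a quick numerical check (random illustrative vectors):

```python
import numpy as np

rng = np.random.default_rng(3)
u = rng.standard_normal(5)
v = rng.standard_normal(3)
E = np.outer(u, v)                            # E = u v^T has rank one

nu, nv = np.linalg.norm(u), np.linalg.norm(v)
assert np.isclose(np.linalg.norm(E, 2), nu * nv)      # spectral norm
assert np.isclose(np.linalg.norm(E, 'fro'), nu * nv)  # Frobenius norm
```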
Singular values via AA* and A*A. If A = UΣV*, then AA* = U(ΣΣ*)U*, i.e. (AA*)U = U(ΣΣ*), so with p = min{m, n} the columns u_1, …, u_m of U are eigenvectors of AA*:

    (AA*)[u_1, …, u_m] = [σ_1² u_1, …, σ_p² u_p, 0, …, 0]   (m − n trailing zeros when m > n).

Thus the eigenvalues of AA* are σ_1² ≥ σ_2² ≥ ⋯ ≥ σ_p², padded with zeros, with eigenvectors u_j; similarly (A*A)V = V(Σ*Σ) shows A*A has the same nonzero eigenvalues σ_j² with eigenvectors v_j.

The page then works several small SVD examples, with U, Σ, V read off by inspection after reordering so the singular values decrease; the matrix entries are only partly legible in the transcription (visible fragments include Σ entries 3 and identity factors U = I, U = I₃).
Let B be obtained from A ∈ C^{m×n} by permuting its rows and columns (the legible indices are consistent with reversing both orderings, B_ij = a_{(m+1−i)(n+1−j)}). Then B = PAQ for permutation matrices P and Q, which are unitary, so B has the same singular values as A; the same holds for A^T, since A^T = V̄ Σ^T Ū* is an SVD of A^T.

[Figure: the unit ball on the preimage plane and its image ellipse on the image plane.]
If A and B are unitarily equivalent, A = QBQ* with Q unitary, write B = U₁Σ₁V₁*; then

    A = Q(U₁Σ₁V₁*)Q* = (QU₁) Σ₁ (QV₁)*,

and QU₁, QV₁ are unitary, so this is an SVD of A: unitarily equivalent matrices have the same singular values.

The converse is false: equal singular values do not imply unitary equivalence. For instance (the matrices of the original example are illegible), A = [[1, 0],[0, 1]] and B = [[0, 1],[1, 0]] both have singular values {1, 1}, but A = QBQ* is impossible because unitary similarity preserves eigenvalues, and these are {1, 1} for A versus {1, −1} for B.
Existence of the SVD via the eigendecomposition of A*A. Let A ∈ C^{m×n}. Since A*A is Hermitian (and positive semidefinite), it has orthonormal eigenvectors v_1, …, v_n; set V = [v_1, …, v_n] with A*A v_i = λ_i v_i (i = 1, …, n), ordered λ_1 ≥ λ_2 ≥ ⋯ ≥ λ_n ≥ 0, and let r be the index with λ_r > λ_{r+1} = ⋯ = λ_n = 0.

First, null(A*A) = null(A): if A*Ax = 0 then

    0 = x*A*Ax = (Ax)*(Ax) = ‖Ax‖_2²,

so Ax = 0; the reverse inclusion is immediate.

Put σ_i = √λ_i and u_i = Av_i/σ_i for i = 1, …, r. These are orthonormal:

    u_i* u_j = (Av_i)*(Av_j)/(σ_i σ_j) = v_i*(A*A)v_j/√(λ_i λ_j) = λ_j v_i* v_j /√(λ_i λ_j) = δ_ij.

Extend {u_1, …, u_r} to an orthonormal basis {u_1, …, u_m} of C^m and set U = [u_1, …, u_m]. For j > r, v_j ∈ null(A*A) = null(A), so Av_j = 0; for j ≤ r, u_i* A v_j = σ_j u_i* u_j = σ_i δ_ij. Hence U*AV = Σ is diagonal with entries σ_1 ≥ ⋯ ≥ σ_r > 0, and A = UΣV*.

Example: for the 2×2 matrix of the exercise, AA* = [[5, 4],[4, 4]], so det(λI − AA*) = λ² − 9λ + 4 = 0 and λ_{1,2} = (9 ± √65)/2. Therefore

    ‖A‖_2 = σ_1(A) = ((9 + √65)/2)^{1/2},   σ_2(A) = ((9 − √65)/2)^{1/2}.

Full-rank matrices are dense in C^{m×n}: given A = UΣV* and ε > 0, set A_ε = U(Σ + εI_{m×n})V*, where (I_{m×n})_ij = δ_ij. Every diagonal entry of Σ + εI_{m×n} is positive, so A_ε has full rank, while

    ‖A − A_ε‖_2 = ‖U(εI_{m×n})V*‖_2 = ε.

Letting ε → 0 shows every A ∈ C^{m×n} is a limit of full-rank matrices.
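The 2×2 example can be reproduced with any A satisfying AA* = [[5, 4],[4, 4]]; the matrix below is one such choice (the original matrix is illegible in the transcription):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 2.0]])                   # one matrix with A A^T = [[5,4],[4,4]]
assert np.allclose(A @ A.T, [[5, 4], [4, 4]])

lam = np.linalg.eigvalsh(A @ A.T)            # roots of lambda^2 - 9 lambda + 4
assert np.allclose(lam, [(9 - np.sqrt(65)) / 2, (9 + np.sqrt(65)) / 2])

sigma = np.linalg.svd(A, compute_uv=False)   # singular values = sqrt(eigenvalues)
assert np.isclose(sigma[0], np.sqrt((9 + np.sqrt(65)) / 2))
```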
To find U, use AA^T = U(ΣΣ^T)U^T. Here

    AA^T = [[5, 3],[3, 5]],   det(AA^T − λI) = (5 − λ)² − 9 = λ² − 10λ + 16 = (λ − 8)(λ − 2),

so the eigenvalues of AA^T are 8 and 2, and the singular values are σ_1 = 2√2, σ_2 = √2. Eigenvectors of AA^T are (1, 1)^T/√2 and (1, −1)^T/√2, so

    U = (1/√2) [[a, b],[a, −b]],   a, b ∈ {−1, 1}

(the sign of each column is free). Then V = A^T U Σ^{-1}; the legible fragments (V entries ±3/5, ±4/5) are consistent with A = (1/5)[[−2, 11],[−10, 5]], for which

    V = [[−(3/5) a, (4/5) b],[(4/5) a, (3/5) b]].

Finally ‖A‖_2 = σ_1 = 2√2 and ‖A‖_F = (σ_1² + σ_2²)^{1/2} = √10.

[Figure: the unit circle on the preimage plane and its image ellipse, semi-axes 2√2 and √2, on the image plane.]
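The reconstructed matrix A = (1/5)[[−2, 11],[−10, 5]] (inferred from the legible fragments) does reproduce every number on this page:

```python
import numpy as np

A = np.array([[-2.0, 11.0],
              [-10.0, 5.0]]) / 5.0
assert np.allclose(A @ A.T, [[5, 3], [3, 5]])             # eigenvalues 8 and 2

sigma = np.linalg.svd(A, compute_uv=False)
assert np.allclose(sigma, [2 * np.sqrt(2), np.sqrt(2)])   # sqrt(8), sqrt(2)
assert np.isclose(np.linalg.norm(A, 'fro'), np.sqrt(10))  # sqrt(8 + 2)
```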
With A = UΣV^T and U, V orthogonal, the inverse comes straight from the SVD:

    A^{-1} = (UΣV^T)^{-1} = V Σ^{-1} U^T = (1/20) [[5, −11],[10, −2]]

for the matrix A = (1/5)[[−2, 11],[−10, 5]] of the previous page. Its eigenvalues come from

    det(A − λI) = λ² − (3/5)λ + 4 = 0,   λ_{1,2} = (3 ± √391 i)/10,

so |λ_1| = |λ_2| = ((9 + 391)/100)^{1/2} = 2 and λ_1 λ_2 = 4 = det A = σ_1 σ_2: the eigenvalues are complex while the singular values are 2√2 and √2, so eigenvalues and singular values are genuinely different data.

The Jordan–Wielandt matrix. For A ∈ C^{m×m} with SVD A = UΣV*, consider the 2m×2m Hermitian matrix [[0, A*],[A, 0]]. With the unitary

    X = (1/√2) [[V, V],[U, −U]],

a direct block computation gives

    X [[Σ, 0],[0, −Σ]] X* = [[0, VΣU*],[UΣV*, 0]] = [[0, A*],[A, 0]],

since A* = VΣU*. Hence [[0, A*],[A, 0]] is unitarily similar to diag(σ_1, …, σ_m, −σ_1, …, −σ_m): its eigenvalues are ± the singular values of A.
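The ±σ spectrum of the Jordan–Wielandt matrix is easy to confirm numerically (random illustrative A):

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((3, 3))
m = A.shape[0]

# Eigenvalues of [[0, A^T], [A, 0]] are +/- the singular values of A.
Z = np.zeros((m, m))
block = np.block([[Z, A.T],
                  [A, Z]])
eigs = np.sort(np.linalg.eigvalsh(block))
sigma = np.sort(np.linalg.svd(A, compute_uv=False))
assert np.allclose(eigs[m:], sigma)           # positive half: +sigma_i
assert np.allclose(eigs[:m], -sigma[::-1])    # negative half: -sigma_i
```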
(This diagonalizes [[0, A*],[A, 0]], so its eigenvalues are ±σ_i(A).)

If P is a projector, so is its complement: P² = P gives

    (I − P)² = I − 2P + P² = I − P.

If F² = I, then E = ½(I + F) is a projector:

    E² = ¼ (I + 2F + F²) = ½ (I + F) = E,

and if moreover F* = F then E* = E, so E is an orthogonal projector.

Norms from the SVD. If A = UΣV*, then A*A = V(Σ*Σ)V*, so

    ‖A‖_2 = σ_1   and   ‖A*A‖_2 = σ_1² = ‖A‖_2².

Directly: ‖Ax‖_2² = x*(A*A)x ≤ ‖A*A‖_2 ‖x‖_2² gives ‖A‖_2² ≤ ‖A*A‖_2, while ‖A*A‖_2 ≤ ‖A*‖_2 ‖A‖_2 = ‖A‖_2² gives the reverse inequality, hence ‖A*A‖_2 = ‖A‖_2².
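The projector identities and the norm identity ‖A*A‖₂ = ‖A‖₂² can be checked on random illustrative matrices:

```python
import numpy as np

rng = np.random.default_rng(5)
I = np.eye(4)

# If P is a projector (P^2 = P), so is I - P.
B = rng.standard_normal((4, 2))
P = B @ np.linalg.inv(B.T @ B) @ B.T          # projector onto range(B)
assert np.allclose(P @ P, P)
assert np.allclose((I - P) @ (I - P), I - P)

# If F^2 = I, then E = (I + F)/2 is a projector.
q = rng.standard_normal(4); q /= np.linalg.norm(q)
F = I - 2 * np.outer(q, q)                    # a reflector, F^2 = I
E = (I + F) / 2
assert np.allclose(E @ E, E)

# ||A^T A||_2 = ||A||_2^2
A = rng.standard_normal((3, 4))
assert np.isclose(np.linalg.norm(A.T @ A, 2), np.linalg.norm(A, 2) ** 2)
```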
Projection example. The columns a_1 = (1/√2, 0, 1/√2)^T and a_2 = (0, 1, 0)^T are orthonormal, so with A = [a_1, a_2] the orthogonal projector onto range(A) is

    P = AA* = a_1 a_1* + a_2 a_2* = [[1/2, 0, 1/2],[0, 1, 0],[1/2, 0, 1/2]],

and P(1, 2, 3)^T = (2, 2, 2)^T. For a non-orthonormal basis B of the same subspace the projector is P = B(B*B)^{-1}B*, which is the same matrix, so again P(1, 2, 3)^T = (2, 2, 2)^T.

Norm of a projector. If v ∈ range(P), v ≠ 0, then Pv = v, so ‖P‖_2 ≥ 1 for every nonzero projector. If P is an orthogonal projector (P² = P, P* = P), then

    ‖Px‖_2² = x*P*Px = x*P²x = x*Px ≤ ‖x‖_2 ‖Px‖_2,

so ‖Px‖_2 ≤ ‖x‖_2 and ‖P‖_2 = 1. Conversely, writing P = UΣV* one can show that a projector with ‖P‖_2 = 1 has Σ with entries in {0, 1} and matching singular vectors, hence P = P*: among projectors, ‖P‖_2 = 1 characterizes the orthogonal ones.

Gram–Schmidt example. For B = (b_1, b_2), the factorization (b_1, b_2) = (q_1, q_2) [[r_11, r_12],[0, r_22]] starts with r_11 = ‖b_1‖_2 and q_1 = b_1/‖b_1‖_2 = (1/√2, 0, 1/√2)^T; then r_12 = q_1* b_2 and v_2 = b_2 − r_12 q_1. The legible fragments are consistent with b_1 = (1, 0, 1)^T and b_2 = (1, 1, 0)^T, giving r_12 = 1/√2 and v_2 = (1/2, 1, −1/2)^T.
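The projector computation on this page can be replayed exactly; the non-orthonormal basis B below is an illustrative choice spanning the same subspace:

```python
import numpy as np

# Orthonormal columns a1 = (1,0,1)/sqrt(2), a2 = (0,1,0).
A = np.array([[1 / np.sqrt(2), 0.0],
              [0.0,            1.0],
              [1 / np.sqrt(2), 0.0]])
P = A @ A.T                                   # orthogonal projector onto range(A)
assert np.allclose(P, [[0.5, 0, 0.5], [0, 1, 0], [0.5, 0, 0.5]])
assert np.allclose(P @ np.array([1.0, 2.0, 3.0]), [2.0, 2.0, 2.0])

# Same subspace, non-orthonormal basis: P = B (B^T B)^{-1} B^T is the same matrix.
B = np.array([[1.0, 0.0], [0.0, 3.0], [1.0, 0.0]])
P2 = B @ np.linalg.inv(B.T @ B) @ B.T
assert np.allclose(P2, P)
assert np.isclose(np.linalg.norm(P, 2), 1.0)  # orthogonal projector has norm 1
```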
Continuing, r_22 = ‖v_2‖_2 = √(3/2) and q_2 = v_2/r_22 = (1/√6)(1, 2, −1)^T. A third orthonormal vector is

    q_3 = q_1 × q_2 = (1/√3)(−1, 1, 1)^T,

so that

    Q = [q_1, q_2, q_3] = [[1/√2, 1/√6, −1/√3],[0, 2/√6, 1/√3],[1/√2, −1/√6, 1/√3]].

In R̂ = (r_ij), r_ij = 0 for i > j (upper triangular), and each r_jj may be taken nonnegative.

Hadamard's inequality from QR. If A = Q̂R̂ is a reduced QR factorization of a square A, then |det A| = ∏_{j=1}^m |r_jj|. Since a_j = Σ_{i=1}^{j} r_ij q_i with orthonormal q_i,

    ‖a_j‖_2² = Σ_{i=1}^{j} |r_ij|²,   so   |r_jj|² = ‖a_j‖_2² − Σ_{i=1}^{j−1} |r_ij|² ≤ ‖a_j‖_2²,

hence

    |det A| = ∏_{j=1}^m |r_jj| ≤ ∏_{j=1}^m ‖a_j‖_2.

[A transcription-damaged passage follows, apparently a diagram comparing how classical and modified Gram–Schmidt compute q_3 from intermediate quantities x^(1), y^(1), x^(2), y^(2).]

In matrix form, Gram–Schmidt computes A = (a_1, a_2, …, a_n) = Q̂R̂ with Q̂ = (q_1, q_2, …, q_n) and

    R̂ = [[r_11, r_12, …, r_1n],[0, r_22, …, r_2n],[⋮, ⋱],[0, …, 0, r_nn]].
Breakdown of Gram–Schmidt. At step k (1 ≤ k ≤ n), r_kk = ‖a_k − Σ_{i=1}^{k−1} r_ik q_i‖_2, so r_kk = 0 exactly when

    a_k = Σ_{i=1}^{k−1} r_ik q_i ∈ ⟨a_1, …, a_{k−1}⟩,

i.e. when a_k depends linearly on a_1, …, a_{k−1}. Hence the algorithm runs to completion, producing A = Q̂R̂ with nonsingular R̂, if and only if the columns of A are linearly independent, and at every step

    ⟨a_1, …, a_k⟩ = ⟨q_1, …, q_k⟩.

Operation count for modified Gram–Schmidt. Step j builds v_j = P_{⊥q_{j−1}} ⋯ P_{⊥q_2} P_{⊥q_1} a_j. One projection P_{⊥q} a = a − q(q*a) costs about m multiplications and m − 1 additions for q*a, m multiplications for q(q*a), and m subtractions: roughly 4m − 1 flops. So v_j costs (4m − 1)(j − 1) flops, and the normalization q_j = v_j/‖v_j‖_2 about 3m more. In total

    Σ_{j=1}^{n} [ (4m − 1)(j − 1) + 3m ] = (4m − 1) n(n − 1)/2 + 3mn ≈ 2mn² flops.
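The procedure counted above can be sketched directly; this is a minimal modified Gram–Schmidt for a full-rank matrix, run on random illustrative data:

```python
import numpy as np

def mgs(A):
    """Modified Gram-Schmidt: reduced QR of a full-column-rank m x n matrix."""
    m, n = A.shape
    V = A.astype(float).copy()
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for i in range(n):
        R[i, i] = np.linalg.norm(V[:, i])     # breakdown here iff columns dependent
        Q[:, i] = V[:, i] / R[i, i]
        for j in range(i + 1, n):             # project the remaining columns
            R[i, j] = Q[:, i] @ V[:, j]
            V[:, j] -= R[i, j] * Q[:, i]
    return Q, R

rng = np.random.default_rng(6)
A = rng.standard_normal((5, 3))
Q, R = mgs(A)
assert np.allclose(Q.T @ Q, np.eye(3))        # orthonormal columns
assert np.allclose(Q @ R, A)                  # A = QR
assert np.allclose(np.tril(R, -1), 0.0)       # R upper triangular
```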
18
Modified Gram–Schmidt as triangular orthogonalization. Step j multiplies on the right by an upper triangular R_j equal to the identity except in row j:

    (R_j)_jj = 1/r_jj,   (R_j)_{ji} = −r_ji/r_jj for i > j,

which realizes q_j = v_j/r_jj and the updates v_i ← v_i − r_ji q_j for the remaining columns. After all steps

    A R_1 R_2 ⋯ R_n = Q̂,   so   R̂^{-1} = R_1 R_2 ⋯ R_n.
20
[Figure: log–log plot; slope = logarithm of the max error between the first 4 discrete and continuous Legendre polynomials (base 1/2) against the power of the grid spacing.]

A bidiagonal example. Let A = I + 2N ∈ R^{m×m}, with ones on the diagonal and twos on the superdiagonal, where N is the nilpotent shift, N^m = 0. Then

    A^{-1} = (I + 2N)^{-1} = I − 2N + (2N)² − ⋯ + (−1)^{m−1}(2N)^{m−1};

for m = 3,

    A = [[1, 2, 0],[0, 1, 2],[0, 0, 1]],   A^{-1} = [[1, −2, 4],[0, 1, −2],[0, 0, 1]].

The smallest singular value satisfies σ_m(A) = 1/‖A^{-1}‖_2. The vector x = ((−2)^{m−1}, (−2)^{m−2}, …, −2, 1)^T satisfies Ax = e_m and

    ‖x‖_2² = 1 + 4 + ⋯ + 4^{m−1} = (4^m − 1)/3,

so

    σ_m(A) ≤ ‖Ax‖_2/‖x‖_2 = (3/(4^m − 1))^{1/2}.

Numerically the bound is ≈ 0.218 for m = 3 and ≈ 0.0542 for m = 5: σ_m decays geometrically, roughly like √3 · 2^{−m}.
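The explicit vector x with Ax = e_m, the norm formula, and the resulting bound on σ_m can all be verified at once:

```python
import numpy as np

def bidiag(m):
    """A = I + 2N: ones on the diagonal, twos on the superdiagonal."""
    return np.eye(m) + 2 * np.eye(m, k=1)

for m in (3, 5, 8):
    A = bidiag(m)
    x = np.array([(-2.0) ** (m - 1 - i) for i in range(m)])
    assert np.allclose(A @ x, np.eye(m)[:, -1])          # A x = e_m
    assert np.isclose(x @ x, (4 ** m - 1) / 3)           # ||x||^2 = (4^m - 1)/3
    sigma_min = np.linalg.svd(A, compute_uv=False)[-1]
    assert sigma_min <= np.sqrt(3 / (4 ** m - 1)) + 1e-12
```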
Eigenstructure of a Householder reflector F = I − 2qq* with ‖q‖_2 = 1. If Fx = λx, then

    (1 − λ)x = 2(q*x) q.

If q*x = 0 the right side vanishes, so λ = 1: every x in the hyperplane H = {x : q*x = 0} is an eigenvector with eigenvalue 1, of multiplicity m − 1. Otherwise x = µq for some µ ∈ C \ {0}, and Fq = q − 2q = −q gives the remaining eigenvalue λ = −1 with eigenspace {µq}. Consequently

    det F = (+1)^{m−1} (−1) = −1.

Alternatively, count signs: F is both unitary and Hermitian, so its eigenvalues a_i satisfy a_i = ±1. With p_+ = #{i : a_i = +1} and p_− = #{i : a_i = −1},

    p_+ + p_− = m,   p_+ − p_− = tr F = tr I − 2 tr(qq*) = m − 2,

so p_+ = m − 1, p_− = 1, and det F = ∏_{i=1}^m a_i = −1. In particular, a product of Householder reflectors Q = H_1 H_2 ⋯ H_k has det Q = (−1)^k.
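The spectrum {−1, 1, …, 1}, the determinant, and the trace of a reflector are quick to confirm (random illustrative q):

```python
import numpy as np

rng = np.random.default_rng(7)
m = 5
q = rng.standard_normal(m); q /= np.linalg.norm(q)
F = np.eye(m) - 2 * np.outer(q, q)            # Householder reflector

lam = np.sort(np.linalg.eigvalsh(F))
assert np.allclose(lam, [-1.0] + [1.0] * (m - 1))   # one -1, m-1 ones
assert np.isclose(np.linalg.det(F), -1.0)
assert np.isclose(np.trace(F), m - 2)
```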
23
[Three worked Householder QR factorizations Q, R appear here in the original; their matrix entries did not survive transcription.]

A Givens rotation acts on a coordinate pair by

    F [x; y] = [[c, s],[−s, c]] [x; y] = [cx + sy; −sx + cy],   c² + s² = 1.
The rotation preserves length: (cx + sy)² + (−sx + cy)² = (c² + s²)(x² + y²) = x² + y². Writing x = r cos α, y = r sin α and c = cos θ, s = sin θ,

    J(θ) [x; y] = [[cos θ, sin θ],[−sin θ, cos θ]] [r cos α; r sin α] = [r cos(α − θ); r sin(α − θ)],

so J(θ) rotates the plane through the angle −θ (clockwise through θ).

Least squares via QR. For m = n and A nonsingular, x = A^{-1}b. For m > n with full column rank, take a reduced QR factorization A = Q̂R̂ (Q̂ ∈ C^{m×n} with Q̂*Q̂ = I, R̂ ∈ C^{n×n} nonsingular). Then

    A⁺ = (A*A)^{-1}A* = (R̂*Q̂*Q̂R̂)^{-1} R̂*Q̂* = R̂^{-1} R̂^{-*} R̂* Q̂* = R̂^{-1} Q̂*,

and the orthogonal projector onto range(A) is P = AA⁺ = Q̂R̂ R̂^{-1} Q̂* = Q̂Q̂*.

Extend Q̂ by H ∈ C^{m×(m−n)} so that (Q̂, H) is unitary: H*H = I_{(m−n)×(m−n)} and Q̂*H = 0. For b ∈ C^m,

    (Q̂, H)* b = [Q̂*b; H*b],   and   b = (Q̂, H)(Q̂, H)* b = Q̂(Q̂*b) + H(H*b),

so ‖b − Ax‖_2² = ‖Q̂*b − R̂x‖_2² + ‖H*b‖_2², minimized exactly when R̂x = Q̂*b.

Partitioned unitary matrices. Let Q ∈ C^{m×m} be unitary and partition

    Q = [[Q_11, Q_12],[Q_21, Q_22]],   Q_11 ∈ C^{n×n}, Q_12 ∈ C^{n×(m−n)}, Q_21 ∈ C^{(m−n)×n}, Q_22 ∈ C^{(m−n)×(m−n)},

with Q*Q = QQ* = I. For z ∈ C^n the first block column [Q_11; Q_21] has orthonormal columns, so

    ‖Q_11 z‖_2² + ‖Q_21 z‖_2² = ‖z‖_2².
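The identity x = R̂^{-1}Q̂*b for the least-squares solution can be checked against a reference solver (random illustrative overdetermined system):

```python
import numpy as np

rng = np.random.default_rng(8)
A = rng.standard_normal((7, 3))               # m > n, full rank (almost surely)
b = rng.standard_normal(7)

Q, R = np.linalg.qr(A)                        # reduced QR: Q is 7x3
x = np.linalg.solve(R, Q.T @ b)               # x = R^{-1} Q* b

x_ref = np.linalg.lstsq(A, b, rcond=None)[0]
assert np.allclose(x, x_ref)                  # matches the least-squares solution

r = b - A @ x                                 # residual is orthogonal to range(A)
assert np.allclose(A.T @ r, 0.0)
```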
Least squares in a function space. To fit a target function with the basis {e^x, x, Γ(x)}, choose a_1, a_2, a_3 minimizing

    F(a_1, a_2, a_3) = ∫ [ a_1 e^x + a_2 x + a_3 Γ(x) − √x ]² dx

(the target function is only partly legible; the right-hand sides below pair each basis function with √x). Setting ∂F/∂a_1 = ∂F/∂a_2 = ∂F/∂a_3 = 0 gives the normal equations

    [[∫ e^{2x} dx, ∫ x e^x dx, ∫ Γ(x) e^x dx],
     [∫ x e^x dx, ∫ x² dx, ∫ x Γ(x) dx],
     [∫ e^x Γ(x) dx, ∫ x Γ(x) dx, ∫ Γ²(x) dx]] [a_1; a_2; a_3]
        = [∫ e^x √x dx; ∫ x √x dx; ∫ Γ(x) √x dx],

a 3×3 linear system: the Gram matrix of the basis on the left, inner products of the basis with the target on the right.
28
More informationVector Space and Linear Transform
32 Vector Space and Linear Transform Vector space, Subspace, Examples Null space, Column space, Row space of a matrix Spanning sets and Linear Independence Basis and Dimension Rank of a matrix Vector norms
More informationMa/CS 6b Class 20: Spectral Graph Theory
Ma/CS 6b Class 20: Spectral Graph Theory By Adam Sheffer Eigenvalues and Eigenvectors A an n n matrix of real numbers. The eigenvalues of A are the numbers λ such that Ax = λx for some nonzero vector x
More information4 Differential Equations
Advanced Calculus Chapter 4 Differential Equations 65 4 Differential Equations 4.1 Terminology Let U R n, and let y : U R. A differential equation in y is an equation involving y and its (partial) derivatives.
More informationSome notes on Linear Algebra. Mark Schmidt September 10, 2009
Some notes on Linear Algebra Mark Schmidt September 10, 2009 References Linear Algebra and Its Applications. Strang, 1988. Practical Optimization. Gill, Murray, Wright, 1982. Matrix Computations. Golub
More informationProblems of Eigenvalues/Eigenvectors
5 Problems of Eigenvalues/Eigenvectors Reveiw of Eigenvalues and Eigenvectors Gerschgorin s Disk Theorem Power and Inverse Power Methods Jacobi Transform for Symmetric Matrices Singular Value Decomposition
More informationThe Laplace Transform. Background: Improper Integrals
The Laplace Transform Background: Improper Integrals Recall: Definite Integral: a, b real numbers, a b; f continuous on [a, b] b a f(x) dx 1 Improper integrals: Type I Infinite interval of integration
More informationUNIT 6: The singular value decomposition.
UNIT 6: The singular value decomposition. María Barbero Liñán Universidad Carlos III de Madrid Bachelor in Statistics and Business Mathematical methods II 2011-2012 A square matrix is symmetric if A T
More informationSolutions and Proofs: Optimizing Portfolios
Solutions and Proofs: Optimizing Portfolios An Undergraduate Introduction to Financial Mathematics J. Robert Buchanan Covariance Proof: Cov(X, Y) = E [XY Y E [X] XE [Y] + E [X] E [Y]] = E [XY] E [Y] E
More informationElementary operation matrices: row addition
Elementary operation matrices: row addition For t a, let A (n,t,a) be the n n matrix such that { A (n,t,a) 1 if r = c, or if r = t and c = a r,c = 0 otherwise A (n,t,a) = I + e t e T a Example: A (5,2,4)
More informationBasic Calculus Review
Basic Calculus Review Lorenzo Rosasco ISML Mod. 2 - Machine Learning Vector Spaces Functionals and Operators (Matrices) Vector Space A vector space is a set V with binary operations +: V V V and : R V
More information1. The graph of a function f is given above. Answer the question: a. Find the value(s) of x where f is not differentiable. Ans: x = 4, x = 3, x = 2,
1. The graph of a function f is given above. Answer the question: a. Find the value(s) of x where f is not differentiable. x = 4, x = 3, x = 2, x = 1, x = 1, x = 2, x = 3, x = 4, x = 5 b. Find the value(s)
More information