Eigenvalues and Eigenvectors


1 Chia-Ping Chen, Department of Computer Science and Engineering, National Sun Yat-sen University. Linear Algebra 1/88

2 Eigenvalue Problem 2/88

3 Eigenvalue Equation By definition, the eigenvalue equation for a matrix A is Ax = λx. Note that A must be square. This equation is satisfied by a vector x that, under the transformation by A, is mapped to a vector in the subspace spanned by x itself. Solving an eigenvalue equation is an eigenvalue problem. 3/88

4 Consider the unknowns λ and x in the eigenvalue equation for A, Ax = λx. A solution for λ is an eigenvalue of A. The set of eigenvalues of A is called the spectrum of A. A solution for x is an eigenvector of A. Note that if x is an eigenvector of A, then cx is also an eigenvector of A for any c ≠ 0. 4/88

5 Solving an Eigenvalue Equation An eigenvalue equation Ax = λx is non-linear, as it involves the product λx of two unknowns. How do we find a solution? The key is to turn this non-linear equation into linear problems: first find the eigenvalues by solving a non-linear equation in λ alone; then, for each eigenvalue, find the eigenvectors with that eigenvalue by solving a system of linear equations for x. 5/88

6 Finding Eigenvalues An eigenvalue λ of A must satisfy the characteristic equation of A, |A − λI| = 0. Why? There exists x ≠ 0 such that Ax = λx ⟺ there exists x ≠ 0 such that (A − λI)x = 0 ⟺ (A − λI) is singular ⟺ |A − λI| = 0. 6/88
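
As a quick numerical sanity check (an addition, not part of the original slides), the NumPy sketch below compares the roots of the characteristic polynomial with the eigenvalues returned by a library routine; the 2×2 matrix is an arbitrary stand-in.

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])              # any square matrix will do

    # np.poly(A) returns the coefficients of the characteristic polynomial of A;
    # its roots are exactly the eigenvalues of A.
    coeffs = np.poly(A)
    print(np.sort(np.roots(coeffs)))        # [1. 3.]
    print(np.sort(np.linalg.eigvals(A)))    # same values, computed directly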

7 Finding Eigenvectors For an eigenvalue λ of A, the eigenspace of A with λ is {x ≠ 0 : Ax = λx}. This set contains the eigenvectors of A with eigenvalue λ. Apart from the null vector, the eigenspace with eigenvalue λ is the same as the nullspace of (A − λI). 7/88

8 Example Solve the eigenvalue equation for a given 2 × 2 matrix A. Setting |A − λI| = 0 gives a quadratic equation in λ whose two roots (one of them 5 in this example) are the eigenvalues. Substituting each eigenvalue λ back into (A − λI)s = 0 and solving for s gives the corresponding eigenvectors s_1 and s_2. 8/88

9 Example: Projection Matrix Solve the eigenvalue equation for a 2 × 2 projection matrix P. From |P − λI| = 0 the eigenvalues are λ = 1 and λ = 0, as expected for a projection. The eigenvector s_1 with λ = 1 spans the line onto which P projects (it is left unchanged), and the eigenvector s_2 with λ = 0 spans the direction that P annihilates. 9/88

10 Example: Complex Eigenvalues Solve the eigenvalue equation for the rotation matrix K = [[0, −1], [1, 0]]. From |K − λI| = 0, λ² + 1 = 0, so λ = i, −i. For λ = i the eigenvector is s_1 = [1; −i], and for λ = −i it is s_2 = [1; i]. 10/88

11 Number of Eigenvalues A matrix of order n × n has at most n distinct eigenvalues. Why? The determinant |A − λI|, with diagonal entries a_11 − λ, ..., a_nn − λ, is a polynomial in λ of degree n, which has no more than n distinct roots. Thus we can denote the spectrum of A by {λ_1, ..., λ_k}, k ≤ n. 11/88

12 Eigenvalues vs. Distinct Eigenvalues Let the spectrum of an n × n matrix A be {λ_1, ..., λ_k}. Then the characteristic polynomial of A can be written as |A − λI| = c ∏_{i=1}^{k} (λ − λ_i)^{γ_i}, where ∑_{i=1}^{k} γ_i = n. Sometimes we denote the (not-necessarily-distinct) eigenvalues by λ_1, ..., λ_n. 12/88

13 Trace and Sum of Eigenvalues The sum of the diagonal elements of a square matrix is its trace. The sum of the eigenvalues of a matrix equals its trace. |A − λI| = c(λ − λ_1) ⋯ (λ − λ_n). Equating the λ^n terms on both sides gives (−λ)^n = cλ^n, so c = (−1)^n. Equating the λ^{n−1} terms gives (−λ)^{n−1}(a_11 + ⋯ + a_nn) = cλ^{n−1}((−λ_1) + ⋯ + (−λ_n)), so λ_1 + ⋯ + λ_n = a_11 + ⋯ + a_nn. 13/88

14 Determinant and Product of Eigenvalues The product of the eigenvalues of a matrix equals its determinant. Why? |A − λI| = (−1)^n ∏_{i=1}^{n} (λ − λ_i). Setting λ = 0, |A| = (−1)^n ∏_{i=1}^{n} (−λ_i) = ∏_{i=1}^{n} λ_i. 14/88
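
These two identities are easy to check numerically; the sketch below (not from the original slides) uses NumPy on an arbitrary 2×2 matrix.

    import numpy as np

    A = np.array([[4.0, -5.0],
                  [2.0, -3.0]])
    lam = np.linalg.eigvals(A)

    print(np.isclose(lam.sum(),  np.trace(A)))        # True: sum of eigenvalues = trace
    print(np.isclose(lam.prod(), np.linalg.det(A)))   # True: product of eigenvalues = determinant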

15 Diagonalization and Decomposition 15/88

16 Multiplicity of Eigenvalue Let A be a matrix of order n × n with spectrum {λ_1, ..., λ_k}. The characteristic polynomial of A can be written as f(λ) = |A − λI| = (−1)^n ∏_{i=1}^{k} (λ − λ_i)^{γ_i}, where γ_i is the algebraic multiplicity of λ_i. For the eigenspace of A associated with λ_i, N(A − λ_i I) ≠ {0}, and g_i = dim N(A − λ_i I) is the geometric multiplicity of λ_i. 16/88

17 Sum of Multiplicities For any i, γ_i ≥ g_i. Since ∑_{i=1}^{k} γ_i = n, we have ∑_{i=1}^{k} g_i ≤ n. 17/88

18 Defective Matrix An n × n matrix is defective if ∑_{i=1}^{k} g_i < n. It is non-defective if ∑_{i=1}^{k} g_i = n. For example, the matrix [[0, 1], [0, 0]] is defective (λ = 0 has algebraic multiplicity 2 but geometric multiplicity 1), while any diagonal matrix is not. 18/88

19 Set of Independent Eigenvectors For a non-defective matrix of order n × n, there is a set of n linearly independent eigenvectors (i.e. a basis for R^n) of the matrix. Why? Eigenvalue λ_i corresponds to an eigenspace of dimension g_i, contributing g_i linearly independent eigenvectors to the basis. Eigenvectors associated with distinct eigenvalues are linearly independent. So n linearly independent eigenvectors can be found, forming a basis. 19/88

20 Eigenvalue Decomposition Let s_1, ..., s_n be linearly independent eigenvectors of a non-defective matrix A with eigenvalues λ_1, ..., λ_n. Then A = SΛS⁻¹, where S = [s_1 ... s_n] and Λ = diag(λ_1, ..., λ_n). This is called the eigenvalue decomposition of a (non-defective) matrix. Indeed, As_i = λ_i s_i ⟹ AS = SΛ ⟹ A = SΛS⁻¹. 20/88

21 Diagonalization It follows that Λ = S⁻¹AS. This is called diagonalization of a (non-defective) matrix. For the earlier examples, the projection matrix P diagonalizes as P = SΛS⁻¹ with Λ = diag(1, 0), and the rotation matrix K diagonalizes as K = SΛS⁻¹ with S = [s_1 s_2] = [[1, 1], [−i, i]] and Λ = diag(i, −i). 21/88
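
A short NumPy sketch (an addition, not from the slides) verifying both identities A = SΛS⁻¹ and Λ = S⁻¹AS for an arbitrary non-defective matrix:

    import numpy as np

    A = np.array([[4.0, -5.0],
                  [2.0, -3.0]])
    lam, S = np.linalg.eig(A)          # columns of S are eigenvectors s_i
    Lam = np.diag(lam)

    print(np.allclose(A, S @ Lam @ np.linalg.inv(S)))    # A = S Lam S^-1
    print(np.allclose(Lam, np.linalg.inv(S) @ A @ S))    # Lam = S^-1 A S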

22 Common Eigenvectors Non-defective matrices A and B have a common eigenvector matrix if they commute. Suppose AB = BA. Let x be an eigenvector of B with eigenvalue λ. Consider BAx and ABx: BAx = ABx = Aλx = λAx. So Ax is also an eigenvector of B with eigenvalue λ. If λ is a simple eigenvalue, Ax must then be a multiple of x, which means Ax = λ′x; that is, x is an eigenvector of A for some eigenvalue λ′. (When eigenvalues of B repeat, a common eigenvector matrix can still be chosen, but the argument needs more care.) 22/88

23 Simultaneous Diagonalization Non-defective matrices A and B have a common eigenvector matrix only if they commute, AB = BA. Suppose A and B have a common eigenvector matrix S, with A = SΛ_1 S⁻¹ and B = SΛ_2 S⁻¹. Then AB = SΛ_1 S⁻¹ SΛ_2 S⁻¹ = SΛ_1 Λ_2 S⁻¹ = SΛ_2 Λ_1 S⁻¹ = SΛ_2 S⁻¹ SΛ_1 S⁻¹ = BA. 23/88

24 Difference Equations 24/88

25 Power of a Matrix Suppose A is diagonalizable. The power of A can be simplified: A^k = (SΛS⁻¹)^k = (SΛS⁻¹)(SΛS⁻¹) ⋯ (SΛS⁻¹) = SΛ(S⁻¹S)Λ(S⁻¹S) ⋯ (S⁻¹S)ΛS⁻¹ = SΛ^k S⁻¹. Note that SΛ^k S⁻¹ is much easier to compute than A^k by repeated multiplication. 25/88
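
A minimal NumPy check (added here, not in the slides) that SΛ^kS⁻¹ reproduces the k-th power computed by repeated multiplication; the Fibonacci matrix of the next slides is used as the example.

    import numpy as np

    A = np.array([[1.0, 1.0],
                  [1.0, 0.0]])
    lam, S = np.linalg.eig(A)
    k = 10

    Ak_direct = np.linalg.matrix_power(A, k)           # A multiplied by itself k times
    Ak_eig = S @ np.diag(lam**k) @ np.linalg.inv(S)    # S Lam^k S^-1
    print(np.allclose(Ak_direct, Ak_eig))              # True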

26 Fibonacci Sequence Fibonacci numbers are defined recursively by F_{k+2} = F_{k+1} + F_k. We will look at the Fibonacci sequence which starts with F_0 = 0, F_1 = 1. The first few Fibonacci numbers are 0, 1, 1, 2, 3, 5, 8, 13, 21, ... 26/88

27 Representation of Recursion with Matrix Combining the recurrence relation with a trivial equality, F_{k+2} = F_{k+1} + F_k and F_{k+1} = F_{k+1} + 0 · F_k, we have a matrix equation for the Fibonacci numbers: [F_{k+2}; F_{k+1}] = [[1, 1], [1, 0]] [F_{k+1}; F_k], with initial condition [F_1; F_0] = [1; 0]. 27/88

28 Simplification by Matrix Define u_k = [F_{k+1}; F_k] and A = [[1, 1], [1, 0]]. Let S be an eigenvector matrix of A and Λ the associated eigenvalue matrix. We have u_k = Au_{k−1} = A(Au_{k−2}) = ⋯ = A^k u_0 = SΛ^k S⁻¹ u_0 = [s_1 s_2] [[λ_1^k, 0], [0, λ_2^k]] [c_1; c_2] = c_1 λ_1^k s_1 + c_2 λ_2^k s_2, where [c_1; c_2] = S⁻¹ u_0. 28/88

29 Eigenvalue Problem of A Eigenvalues: |A − λI| = 0 gives λ_1 = (1 + √5)/2 and λ_2 = (1 − √5)/2. Eigenvectors: solving (A − λ_i I) s_i = 0 gives s_i = [λ_i; 1]. Fitting the initial condition, [c_1; c_2] = S⁻¹ u_0 = [[λ_1, λ_2], [1, 1]]⁻¹ [1; 0] = [1/(λ_1 − λ_2); −1/(λ_1 − λ_2)]. 29/88

30 Fibonacci Numbers Thus u_k = c_1 λ_1^k s_1 + c_2 λ_2^k s_2 = (λ_1^k [λ_1; 1] − λ_2^k [λ_2; 1]) / (λ_1 − λ_2). So the Fibonacci numbers are F_k = (λ_1^k − λ_2^k) / (λ_1 − λ_2) = (1/√5) [((1 + √5)/2)^k − ((1 − √5)/2)^k], k = 0, 1, 2, ... 30/88
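
The closed form can be checked directly against the recursion; the small Python sketch below is an added illustration.

    import numpy as np

    lam1 = (1 + np.sqrt(5)) / 2
    lam2 = (1 - np.sqrt(5)) / 2

    def fib_closed(k):
        # F_k = (lam1^k - lam2^k) / (lam1 - lam2)
        return (lam1**k - lam2**k) / (lam1 - lam2)

    F = [0, 1]
    for _ in range(10):
        F.append(F[-1] + F[-2])                       # the recursion F_{k+2} = F_{k+1} + F_k

    print(F)                                          # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89]
    print([round(fib_closed(k)) for k in range(12)])  # identical list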

31 A Markov Process Every year 1/10 of the people outside Asia move in, and 2/10 of the people inside Asia move out. Let y_k be the number of people outside Asia and z_k be the number of people inside Asia at the end of year k. Then [y_{k+1}; z_{k+1}] = [[0.9, 0.2], [0.1, 0.8]] [y_k; z_k], or simply x_{k+1} = Ax_k, k = 0, 1, ... Matrix A is Markov, as the elements of A are non-negative and the elements in every column of A sum to 1. 31/88

32 Population as a Function of k Using the eigenvalue decomposition of A, A = SΛS⁻¹ with S = [[2, 1], [1, −1]] and Λ = diag(1, 0.7), we have x_k = Ax_{k−1} = A(Ax_{k−2}) = ⋯ = A^k x_0 = SΛ^k S⁻¹ x_0. That is, [y_k; z_k] = (y_0 + z_0)(1)^k [2/3; 1/3] + (y_0 − 2z_0)(0.7)^k [1/3; −1/3]. 32/88

33 Steady State We can see that x_k converges to the first term, x_∞ = [y_∞; z_∞] = (y_0 + z_0) [2/3; 1/3]. Note that x_∞ is an eigenvector of A with eigenvalue 1: Ax_∞ = x_∞. Thus x_∞ is called the steady state, since it stays once entered. 33/88
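
A small NumPy simulation (an added illustration; the migration rates 1/10 and 2/10 are as reconstructed above) showing the population vector converging to the steady state:

    import numpy as np

    A = np.array([[0.9, 0.2],
                  [0.1, 0.8]])          # Markov: non-negative entries, columns sum to 1
    x = np.array([200.0, 800.0])        # arbitrary initial split (y_0, z_0)

    for _ in range(100):
        x = A @ x                        # x_{k+1} = A x_k

    print(x)                                     # ~ [666.67, 333.33]
    print(x.sum() * np.array([2/3, 1/3]))        # the predicted steady state (y0 + z0) * [2/3, 1/3]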

34 Linear Differential Equations 34/88

35 Linear Differential Equation Consider an unknown function u(t) that satisfies du(t)/dt = a u(t), where a is a constant. The equation is a differential equation as it involves a derivative. What is the solution of such a linear differential equation? 35/88

36 Solution It is easy to verify that the solution is u(t) = u(0) e^{at}, where u(0) is the initial value. 36/88

37 System of Linear Differential Equations Next, consider two unknown functions, v(t) and w(t), that satisfy a system of linear differential equations dv/dt = av + bw, dw/dt = cv + dw, where a, b, c, d are constants. 37/88

38 Matrix Representation A system of linear differential equations can be represented by matrices. For the system of linear differential equations on the previous slide, we have du/dt = Au, where u = [v; w] and A = [[a, b], [c, d]]. 38/88

39 Sneak Preview It turns out the solution of du/dt = Au can be expressed as u(t) = e^{At} u(0), with an appropriate definition of the matrix exponential e^{At}. 39/88

40 Working out an Example Consider the case with u(t) = [v(t); w(t)] and A = [[4, −5], [2, −3]], that is, dv/dt = 4v − 5w, dw/dt = 2v − 3w. 40/88

41 Looking for a Solution An exponential function, after differentiation, is still an exponential function. So we assume v(t) = e^{αt} y, w(t) = e^{αt} z. Substituting into the differential equations, we have αe^{αt} y = 4e^{αt} y − 5e^{αt} z and αe^{αt} z = 2e^{αt} y − 3e^{αt} z. Eliminating e^{αt}, we get αy = 4y − 5z, αz = 2y − 3z, which has the form of an eigenvalue equation Ax = αx. 41/88

42 General Solution via A A solution of the differential equations consists of y, z and α, which are an eigenvector and eigenvalue of A. If λ is an eigenvalue of A with associated eigenvector x, then d/dt (e^{λt} x) = λe^{λt} x = e^{λt} Ax = A(e^{λt} x). By linearity, any linear combination u = ∑_i c_i e^{λ_i t} x_i, where λ_i is an eigenvalue and x_i an associated eigenvector, satisfies the differential equation du/dt = Au. 42/88

43 The Solution Given initial condition u(0) = u_0, the solution of du/dt = Au is u(t) = S diag(e^{λ_1 t}, e^{λ_2 t}) S⁻¹ u_0, where S is an eigenvector matrix of A. Indeed, u(t) = c_1 e^{λ_1 t} s_1 + c_2 e^{λ_2 t} s_2 = S diag(e^{λ_1 t}, e^{λ_2 t}) [c_1; c_2], and u(0) = u_0 = S[c_1; c_2] gives [c_1; c_2] = S⁻¹ u_0. 43/88

44 An Example of Initial Condition Solve du/dt = Au with u = [v(t); w(t)], A = [[4, −5], [2, −3]], and initial condition v(0) = 8, w(0) = 5. From |A − λI| = 0, λ_1 = −1 with s_1 = [1; 1], and λ_2 = 2 with s_2 = [5; 2]. Then u(t) = S diag(e^{−t}, e^{2t}) S⁻¹ u_0 = 3e^{−t} [1; 1] + e^{2t} [5; 2]. 44/88
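
A NumPy sketch (added for illustration) that solves this initial-value problem with the eigenvalue decomposition and checks it against the closed-form answer above:

    import numpy as np

    A = np.array([[4.0, -5.0],
                  [2.0, -3.0]])
    u0 = np.array([8.0, 5.0])

    lam, S = np.linalg.eig(A)
    c = np.linalg.solve(S, u0)                # c = S^-1 u0

    def u(t):
        # u(t) = S diag(e^{lam t}) S^-1 u0 = sum_i c_i e^{lam_i t} s_i
        return S @ (np.exp(lam * t) * c)

    print(u(0.0))                             # [8. 5.], the initial condition
    t = 1.0
    print(u(t))
    print(3*np.exp(-t)*np.array([1.0, 1.0]) + np.exp(2*t)*np.array([5.0, 2.0]))   # same vector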

45 Matrix Exponential Let A be a square matrix. The matrix exponential is defined by e^A ≡ ∑_{k=0}^{∞} A^k / k! = I + A + A²/2! + A³/3! + ... It has the same form as a scalar exponential e^x = ∑_{k=0}^{∞} x^k / k! = 1 + x + x²/2! + x³/3! + ... 45/88

46 Diagonal Matrix For a diagonal matrix D = diag(d_1, ..., d_n), e^D = I + D + D²/2! + ⋯ = diag(1 + d_1 + d_1²/2! + ⋯, ..., 1 + d_n + d_n²/2! + ⋯) = diag(e^{d_1}, ..., e^{d_n}). 46/88

47 Diagonalizable Matrices For a diagonalizable matrix A = SΛS⁻¹, e^A = I + A + A²/2! + ⋯ = I + SΛS⁻¹ + (SΛS⁻¹)²/2! + ⋯ = S(I + Λ + Λ²/2! + ⋯)S⁻¹ = S e^Λ S⁻¹. 47/88

48 Solution of Differential Equation in Matrix Form The solution for du/dt = Au is u(t) = ∑_i c_i e^{λ_i t} s_i = S e^{Λt} S⁻¹ u(0) = e^{At} u(0), since S e^{Λt} S⁻¹ = S(I + Λt + Λ²t²/2! + ⋯)S⁻¹ = I + (SΛS⁻¹)t + (SΛS⁻¹)² t²/2! + ⋯ = I + At + A²t²/2! + ⋯ = e^{At}. 48/88
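
The identity e^{At} = S e^{Λt} S⁻¹ can be checked against a truncated power series; the NumPy sketch below is an added illustration.

    import numpy as np

    A = np.array([[4.0, -5.0],
                  [2.0, -3.0]])
    t = 0.5

    # e^{At} via the eigenvalue decomposition
    lam, S = np.linalg.eig(A)
    expAt_eig = S @ np.diag(np.exp(lam * t)) @ np.linalg.inv(S)

    # e^{At} via the truncated series I + At + (At)^2/2! + ...
    X = A * t
    term = np.eye(2)
    expAt_series = np.eye(2)
    for k in range(1, 30):
        term = term @ X / k                        # (At)^k / k!
        expAt_series = expAt_series + term

    print(np.allclose(expAt_eig, expAt_series))    # True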

49 Higher-order Linear Differential Equations A high-order linear differential equation can be decomposed into a system of first-order linear differential equations. For example, the third-order linear differential equation y′′′ + by′′ + cy′ = 0 can be converted to u′ = Au, where u = [y; v; w] = [y; y′; y′′] and A = [[0, 1, 0], [0, 0, 1], [0, −c, −b]]. 49/88

50 Linear Partial Differential Equations A partial differential equation can be transformed to a system of linear differential equations if one variable is discretized. Consider the heat equation ∂u/∂t = ∂²u/∂x². When x is discretized, we have du/dt = Au, where u = [u_1; ...; u_N] collects the values at the grid points and A is the second-difference matrix approximating ∂²/∂x². 50/88

51 Complex Matrix 51/88

52 Vectors with Complex Numbers Inner product: (x, y) = x̄_1 y_1 + ⋯ + x̄_n y_n. Length: ‖x‖² = (x, x) = x̄_1 x_1 + ⋯ + x̄_n x_n = |x_1|² + ⋯ + |x_n|². Orthogonality: (x, y) = 0 ⟺ x ⊥ y. 52/88

53 Example Decide the inner product, the lengths, and the orthogonality for a given pair of complex vectors x and y. 53/88

54 Hermitian The Hermitian of a matrix A is defined as A^H = (A*)^T. Basically, Hermitian = conjugation + transposition. It can be shown that (A^H)^H = A and (AB)^H = B^H A^H. 54/88

55 Hermitian and Inner Product The inner product of two vectors can be written as (x, y) = x^H y. It follows that (x, Ay) = x^H Ay = (A^H x)^H y = (A^H x, y), and (Ax, y) = (Ax)^H y = x^H A^H y = (x, A^H y). 55/88

56 Hermitian Matrix By definition, a matrix A is Hermitian if A^H = A. For example, A = [[2, 3 − 3i], [3 + 3i, 5]] is Hermitian, since A^H = (A*)^T = ([[2, 3 + 3i], [3 − 3i, 5]])^T = [[2, 3 − 3i], [3 + 3i, 5]] = A. 56/88

57 Properties of a Hermitian Matrix Let A be Hermitian. (1) x^H Ax is real for any x: (x^H Ax)^H = x^H A^H (x^H)^H = x^H Ax, and a scalar equal to its own conjugate is real. (2) The eigenvalues of A are real: As_i = λ_i s_i ⟹ s_i^H A s_i = λ_i s_i^H s_i ⟹ λ_i = (s_i^H A s_i)/(s_i^H s_i) ∈ R. (3) Eigenvectors of A associated with distinct eigenvalues are orthogonal: (As_1, s_2) = (s_1, As_2) ⟹ λ_1(s_1, s_2) = λ_2(s_1, s_2) ⟹ (λ_1 − λ_2)(s_1, s_2) = 0 ⟹ (s_1, s_2) = 0. 57/88
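
These properties can be observed numerically; the sketch below (an addition) uses NumPy's Hermitian eigensolver on the example matrix of the surrounding slides, with its missing diagonal entry assumed to be 2 (consistent with the eigenvalues 8 and −1 that follow).

    import numpy as np

    A = np.array([[2.0,     3 - 3j],
                  [3 + 3j,  5.0   ]])
    assert np.allclose(A, A.conj().T)              # A^H = A, so A is Hermitian

    lam, U = np.linalg.eigh(A)                     # eigh is the solver for Hermitian matrices
    print(lam)                                     # [-1.  8.] : real eigenvalues
    print(np.allclose(U.conj().T @ U, np.eye(2)))  # True: orthonormal eigenvectors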

58 Example A = [[2, 3 − 3i], [3 + 3i, 5]]. From |A − λI| = 0, λ = 8, −1. Solving (A − 8I)s_1 = 0 gives s_1 = [1; 1 + i]; solving (A + I)s_2 = 0 gives s_2 = [−1 + i; 1]. Check: (s_1, s_2) = 1 · (−1 + i) + (1 + i)* · 1 = (−1 + i) + (1 − i) = 0. 58/88

59 Unitary Matrix By definition, a matrix U is unitary if U⁻¹ = U^H, i.e. UU^H = U^H U = I. (1) Vector length is invariant under transformation by U: ‖Ux‖² = (Ux, Ux) = (x, U^H Ux) = (x, x) = ‖x‖². (2) The eigenvalues are of unit modulus: Us_i = λ_i s_i ⟹ ‖s_i‖ = ‖Us_i‖ = ‖λ_i s_i‖ = |λ_i| ‖s_i‖ ⟹ |λ_i| = 1. (3) Eigenvectors associated with distinct eigenvalues are orthogonal: (Us_1, Us_2) = (s_1, U^H U s_2) = (s_1, s_2), while (Us_1, Us_2) = (λ_1 s_1, λ_2 s_2) = λ̄_1 λ_2 (s_1, s_2), so (1 − λ̄_1 λ_2)(s_1, s_2) = 0; since λ̄_1 λ_2 ≠ 1 for distinct unit-modulus eigenvalues, (s_1, s_2) = 0. 59/88

60 Examples of Unitary Matrix Rotation matrix U = [[cos t, −sin t], [sin t, cos t]]. Fourier matrix, e.g. the 4 × 4 case F = (1/2)[[1, 1, 1, 1], [1, ω, ω², ω³], [1, ω², ω⁴, ω⁶], [1, ω³, ω⁶, ω⁹]] with ω = e^{2πi/4} (in general, (1/√n)[ω^{jk}] with ω = e^{2πi/n}). Any permutation matrix P. 60/88
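
A short NumPy check (added here) of the rotation-matrix example: it is unitary, preserves length, and its eigenvalues have modulus 1.

    import numpy as np

    t = 0.8
    U = np.array([[np.cos(t), -np.sin(t)],
                  [np.sin(t),  np.cos(t)]])

    print(np.allclose(U.conj().T @ U, np.eye(2)))      # U^H U = I
    print(np.abs(np.linalg.eigvals(U)))                # [1. 1.] : eigenvalues e^{it}, e^{-it}

    x = np.array([3.0, 4.0])
    print(np.linalg.norm(x), np.linalg.norm(U @ x))    # both 5.0 : length preserved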

61 Skew-Hermitian By definition, a matrix K is skew-Hermitian if K^H = −K. Let K = iA = i[[2, 3 − 3i], [3 + 3i, 5]] = [[2i, 3 + 3i], [−3 + 3i, 5i]]. Indeed K^H = (iA)^H = −iA^H = −iA = −K; directly, K^H = [[−2i, −3 − 3i], [3 − 3i, −5i]] = −K. 61/88

62 Similarity Transformation 62/88

63 Similar Matrices By definition, matrix B is similar to matrix A if there exists an invertible M such that B = M⁻¹AM. The similarity relation is denoted by B ∼ A. Note that similarity is an equivalence relation. Thus, we also say A and B are similar. 63/88

64 Example A = [[1, 0], [0, 0]]. With M_1 = [[0, 1], [1, 0]], B_1 = M_1⁻¹ A M_1 = [[0, 0], [0, 1]]. With M_2 = [[1, b], [0, 1]], B_2 = M_2⁻¹ A M_2 = [[1, b], [0, 0]]. Both B_1 and B_2 are similar to A. 64/88

65 Eigenvalues of Similar Matrices If matrix A and matrix B are similar, then they have the same eigenvalues. Suppose A and B are similar, with B = M⁻¹AM. Let λ be an eigenvalue of B. Then |B − λI| = 0 ⟹ |M⁻¹AM − λM⁻¹M| = 0 ⟹ |M⁻¹(A − λI)M| = 0 ⟹ |A − λI| = 0. So λ is an eigenvalue of A. 65/88

66 Eigenvectors of Similar Matrices An eigenvector x of A corresponds to an eigenvector y = M⁻¹x of B. Furthermore, x and y are associated with the same eigenvalue. Let λ be an eigenvalue of A with an associated eigenvector x. Then Ax = λx ⟹ M⁻¹Ax = λM⁻¹x ⟹ (M⁻¹AM)(M⁻¹x) = λ(M⁻¹x) ⟹ By = λy. 66/88
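
A NumPy sketch (added for illustration) of both facts, using A = [[1, 0], [0, 0]] and the similarity transform M = [[1, b], [0, 1]] with b = 2 as a concrete stand-in:

    import numpy as np

    A = np.array([[1.0, 0.0],
                  [0.0, 0.0]])
    M = np.array([[1.0, 2.0],              # b = 2; any invertible M works
                  [0.0, 1.0]])
    Minv = np.linalg.inv(M)
    B = Minv @ A @ M

    print(np.sort(np.linalg.eigvals(A)))   # [0. 1.]
    print(np.sort(np.linalg.eigvals(B)))   # [0. 1.] : same eigenvalues

    x = np.array([1.0, 0.0])               # eigenvector of A with eigenvalue 1
    y = Minv @ x                           # corresponding eigenvector of B
    print(np.allclose(B @ y, 1.0 * y))     # True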

67 Example A = [[1, 0], [0, 0]], B_1 = [[0, 0], [0, 1]], B_2 = [[1, b], [0, 0]]. The eigenvalues of A are 1 and 0, with eigenvectors [1; 0] and [0; 1]. Since B_1 ∼ A, the eigenvectors of B_1 are M_1⁻¹[1; 0] = [0; 1] and M_1⁻¹[0; 1] = [1; 0]. Since B_2 ∼ A, the eigenvectors of B_2 are M_2⁻¹[1; 0] = [1; 0] and M_2⁻¹[0; 1] = [−b; 1]. 67/88

68 Change of Basis The matrix representation for a linear transformation depends on the basis used. When there is a change of basis, the matrix changes to a similar matrix (via a similarity transformation). 68/88

69 Representation of a Vector Through a basis, a vector in a vector space can be represented by a column. The representation depends on the basis used. Specifically, suppose B = {v_1, ..., v_n} and x = c_1 v_1 + ⋯ + c_n v_n; then [x]_B = {c_i}. 69/88

70 Representation of a Linear Transformation Through a basis for the domain D and a basis for the range R, a linear transformation from D to R can be represented by a matrix. The matrix representation of a linear transformation T : D → R depends on the bases used. Here we use the notation [T]_UV for the matrix representation of T using basis U for D and basis V for R. 70/88

71 Construction of Matrix Representation Let U = {u_1, ..., u_n} be a basis for D and V = {v_1, ..., v_m} be a basis for R. Consider a linear transformation T : D → R. Suppose T(u_j) = ∑_{i=1}^{m} a_ij v_i, j = 1, ..., n. Using basis U for D and basis V for R, we have [T]_UV = {a_ij}, so that T(x) = y ⟺ [y]_V = [T]_UV [x]_U for x ∈ D and y ∈ R. 71/88

72 Proof Using basis U for D and basis V for R, suppose x = ∑_{j=1}^{n} x_j u_j and T(x) = y = ∑_{i=1}^{m} y_i v_i. Using linearity of T, T(x) = ∑_{j=1}^{n} x_j T(u_j) = ∑_{j=1}^{n} x_j ∑_{i=1}^{m} a_ij v_i = ∑_{i=1}^{m} (∑_{j=1}^{n} a_ij x_j) v_i. Thus y_i = ∑_j a_ij x_j, i.e. [y]_V = [T]_UV [x]_U. 72/88

73 Two Bases Consider a linear transformation within a vector space, T : S → S. With basis G = {V_1, ..., V_n}, the matrix representation for T is given by T(V_j) = ∑_{i=1}^{n} b_ij V_i, so [T]_GG = {b_ij} = B. With basis H = {v_1, ..., v_n}, the matrix representation for T is given by T(v_j) = ∑_{i=1}^{n} a_ij v_i, so [T]_HH = {a_ij} = A. 73/88

74 Identity Transformation Consider the identity transformation I(x) = x. Obviously I is linear, with [I]_GG = I and [I]_HH = I. How about [I]_GH? Suppose the basis vectors are related by V_j = ∑_{i=1}^{n} m_ij v_i, j = 1, ..., n. We have I(V_j) = V_j = ∑_{i=1}^{n} m_ij v_i, so [I]_GH = {m_ij} = M. 74/88

75 Change of Vector Representation How are [x]_G = {x′_j} and [x]_H = {x_i} related? x = ∑_{j=1}^{n} x′_j V_j = ∑_{j=1}^{n} x′_j ∑_{i=1}^{n} m_ij v_i = ∑_{i=1}^{n} (∑_{j=1}^{n} m_ij x′_j) v_i = ∑_{i=1}^{n} x_i v_i. Thus x_i = ∑_j m_ij x′_j, or in matrix form [x]_H = [I]_GH [x]_G. We also have [x]_G = [I]_GH⁻¹ [x]_H = [I]_HG [x]_H, so [I]_HG = [I]_GH⁻¹. 75/88

76 Change of Matrix Representation How are [T]_GG and [T]_HH related? T(V_j) = T(∑_i m_ij v_i) = ∑_i m_ij T(v_i) = ∑_i m_ij ∑_k a_ki v_k = ∑_k (∑_i a_ki m_ij) v_k. Also T(V_j) = ∑_i b_ij V_i = ∑_i b_ij ∑_k m_ki v_k = ∑_k (∑_i m_ki b_ij) v_k. Thus ∑_i a_ki m_ij = ∑_i m_ki b_ij for all j, k, or in matrix form [T]_HH [I]_GH = [I]_GH [T]_GG. 76/88

77 Relation to Similarity Transformation The identity [T]_HH [I]_GH = [I]_GH [T]_GG can be re-arranged as a similarity transformation: [T]_HH = [I]_GH [T]_GG [I]_GH⁻¹, and [T]_GG = [I]_GH⁻¹ [T]_HH [I]_GH = [I]_HG [T]_HH [I]_HG⁻¹ (the pattern B = M⁻¹AM). 77/88

78 Eigenvector of a Linear Transformation Consider a linear transformation T within a vector space. An eigenvector of T is defined by T(s) = λs. The matrix representation of T using an eigenbasis G = {s_1, ..., s_n} consisting of eigenvectors of T is particularly simple: [T]_GG = diag(λ_1, ..., λ_n), since T(s_i) = λ_i s_i. 78/88

79 Eigenvalue Decomposition Let H = {v_1, ..., v_n} be another basis. Suppose s_j = ∑_i s_ij v_i. Then [I]_GH = {s_ij} = S. The representation of T with H can be constructed by [T]_HH = [I]_GH [T]_GG [I]_GH⁻¹. 79/88

80 Example: Projection Let T be the projection onto the line L at angle θ to the x-axis. With an eigenbasis, [T]_GG = Λ = diag(1, 0). Let H be the standard basis. Then [I]_GH = S = [[cos θ, −sin θ], [sin θ, cos θ]], and [T]_HH = A = SΛS⁻¹ = [[cos²θ, cos θ sin θ], [cos θ sin θ, sin²θ]]. 80/88
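
A quick NumPy verification (added here) that building the projection from its eigenbasis reproduces the closed form above:

    import numpy as np

    theta = 0.6
    S = np.array([[np.cos(theta), -np.sin(theta)],     # column 1: direction of L (eigenvalue 1)
                  [np.sin(theta),  np.cos(theta)]])    # column 2: normal direction (eigenvalue 0)
    Lam = np.diag([1.0, 0.0])
    A = S @ Lam @ np.linalg.inv(S)

    c, s = np.cos(theta), np.sin(theta)
    expected = np.array([[c*c, c*s],
                         [c*s, s*s]])
    print(np.allclose(A, expected))                    # True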

81 Normal Matrix and Orthonormal Eigenbasis 81/88

82 Schur Lemma For any matrix A, there exists a unitary matrix U such that T = U^H AU is an upper-triangular matrix. Since U^H AU = T, A is similar to an upper-triangular matrix. 82/88

83 Normal Matrix By definition, a matrix N is normal if it commutes with its Hermitian, i.e. NN^H = N^H N. 83/88

84 Diagonalizability of a Normal Matrix A normal matrix can be diagonalized. Let A be a normal matrix. According to the Schur lemma, there exists a unitary matrix U such that T = U^H AU is upper-triangular. Furthermore, TT^H = U^H AU(U^H AU)^H = U^H AA^H U = U^H A^H AU = U^H A^H UU^H AU = T^H T. 84/88

85 Consider the first diagonal element of TT^H = T^H T: (TT^H)_11 = (T^H T)_11 ⟹ ∑_k t_1k t̄_1k = t̄_11 t_11 ⟹ ∑_{k>1} |t_1k|² = 0 ⟹ t_1k = 0 for k > 1. Similarly, (TT^H)_22 = (T^H T)_22 ⟹ ∑_k t_2k t̄_2k = t̄_22 t_22 (using t_12 = 0) ⟹ t_2k = 0 for k > 2. The same argument applies to the remaining rows. Thus T is diagonal, i.e. A is diagonalizable. 85/88

86 Complete Orthonormal Eigenvectors A normal matrix has sufficient orthonormal eigenvectors for a basis. Since T is diagonal, U^H AU = T ⟹ AU = UT ⟹ Au_i = t_ii u_i, so u_i is an eigenvector of A associated with eigenvalue λ_i = t_ii. Since U is unitary, U^H U = I, so the vectors u_1, ..., u_n are orthonormal. 86/88
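
As a numerical illustration (added), the 90° rotation from the earlier complex-eigenvalue example is normal but not Hermitian, and its eigenvectors come out orthonormal:

    import numpy as np

    N = np.array([[0.0, -1.0],
                  [1.0,  0.0]])                          # normal: N N^H = N^H N
    print(np.allclose(N @ N.conj().T, N.conj().T @ N))   # True

    lam, U = np.linalg.eig(N)                            # eigenvalues i and -i
    print(lam)
    print(np.allclose(U.conj().T @ U, np.eye(2)))        # True: orthonormal eigenvectors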

87 Spectral Theorem A real symmetric matrix A can be factorized as A = QΛQ^T, where Λ is real and diagonal and Q is real and orthogonal. A real symmetric matrix is Hermitian, so its eigenvalues are real; the eigenvectors can be chosen real and orthonormal. 87/88

88 Example For a 2 × 2 real symmetric A, the spectral theorem gives A = QΛQ^T = λ_1 q_1 q_1^T + λ_2 q_2 q_2^T, a combination of rank-one projections onto the orthonormal eigenvectors (in this example with leading eigenvalue λ_1 = 3). 88/88
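
A final NumPy sketch (added; the 2×2 symmetric matrix is an arbitrary stand-in) of the factorization A = QΛQ^T and the equivalent sum of rank-one projections:

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])                       # real symmetric, eigenvalues 1 and 3
    lam, Q = np.linalg.eigh(A)                       # lam real, Q real orthogonal

    print(np.allclose(Q @ np.diag(lam) @ Q.T, A))    # A = Q Lam Q^T
    print(np.allclose(Q.T @ Q, np.eye(2)))           # Q^T Q = I

    # Spectral decomposition: A = sum_i lam_i q_i q_i^T
    A_sum = sum(l * np.outer(q, q) for l, q in zip(lam, Q.T))
    print(np.allclose(A_sum, A))                     # True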
