Linear System Theory
1 Linear System Theory
Wonhee Kim
Lecture 4, Apr. 4
2 Recap
- Vector space (linear space, linear vector space)
- Subspace
- Linear independence and dependence
- Dimension, basis, change of basis
3 Linear Maps (Linear Operators)
Recall that a function f : X → Y assigns to each x ∈ X a unique f(x) = y ∈ Y.
- X: domain, Y: codomain
- {f(x) : x ∈ X}: range
- f may be called a function, operator, map, or transformation
- f can be injective, surjective, or bijective
4 Linear Maps (Linear Operators)
Linear mapping: Let (V, F) and (W, F) be (finite-dimensional) linear vector spaces over the SAME FIELD. Let A be a map from V to W, i.e., A : V → W with A(v) = w, v ∈ V, w ∈ W. Then A is said to be a linear mapping (equivalently, a linear transformation, linear operator, or linear map) if and only if
A(α_1 v_1 + α_2 v_2) = α_1 A(v_1) + α_2 A(v_2) for all v_1, v_2 ∈ V and α_1, α_2 ∈ F.
We will see that if A is linear, then A can be represented by a matrix. In other words, any linear mapping between finite-dimensional linear spaces can be represented as matrix multiplication.
5 Linear Maps (Linear Operators)
Example: Consider the following mapping on the set of polynomials of degree at most 2:
A : a s^2 + b s + c ↦ c s^2 + b s + a, where a, b, c ∈ R.
Is this a linear map? Let v_1 = a_1 s^2 + b_1 s + c_1 and v_2 = a_2 s^2 + b_2 s + c_2. Then for any α_1, α_2 ∈ R,
A(α_1 v_1 + α_2 v_2) = A(α_1 (a_1 s^2 + b_1 s + c_1) + α_2 (a_2 s^2 + b_2 s + c_2))
= (α_1 c_1 + α_2 c_2) s^2 + (α_1 b_1 + α_2 b_2) s + (α_1 a_1 + α_2 a_2)
= α_1 A(v_1) + α_2 A(v_2),
so A is linear.
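In coordinates, this map is just a fixed matrix acting on the coefficient vector (a, b, c), so its linearity can be checked numerically. A minimal NumPy sketch (the coefficient-vector representation is our choice, not the slide's):

```python
import numpy as np

# The map A: a s^2 + b s + c -> c s^2 + b s + a, with a polynomial stored as
# its coefficient vector (a, b, c). In this basis A is the flip matrix.
A = np.array([[0, 0, 1],
              [0, 1, 0],
              [1, 0, 0]], dtype=float)

rng = np.random.default_rng(0)
v1, v2 = rng.standard_normal(3), rng.standard_normal(3)
a1, a2 = 2.0, -3.5

# Linearity: A(a1*v1 + a2*v2) == a1*A(v1) + a2*A(v2)
lhs = A @ (a1 * v1 + a2 * v2)
rhs = a1 * (A @ v1) + a2 * (A @ v2)
assert np.allclose(lhs, rhs)
```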
6 Linear Maps (Linear Operators)
Example: a numerical mapping A(v) = w (the matrix and vectors on this slide were lost in transcription).
7 Linear Maps (Linear Operators)
Theorem: Let (V, F) have a basis {v_1, ..., v_n}, and let (W, F) have a basis {w_1, ..., w_m}. Then, w.r.t. these bases, the linear map A is represented by an m × n matrix A; namely, y = A(x) = Ax.
Proof: For any x ∈ V, there exist unique α_1, ..., α_n ∈ F such that
x = Σ_{j=1}^n α_j v_j = [v]α.
By linearity,
y = A(x) = α_1 A(v_1) + ⋯ + α_n A(v_n) = Σ_{j=1}^n α_j A(v_j).
Note that A : V → W; hence each A(v_j) ∈ W has a unique representation w.r.t. {w_1, ..., w_m}:
A(v_j) = a_{1j} w_1 + ⋯ + a_{mj} w_m, j = 1, 2, ..., n.
8 Linear Maps (Linear Operators)
Hence,
(A(v_1) A(v_2) ⋯ A(v_n)) = (w_1 w_2 ⋯ w_m) [a_{11} a_{12} ⋯ a_{1n}; a_{21} a_{22} ⋯ a_{2n}; ⋯ ; a_{m1} a_{m2} ⋯ a_{mn}] = [w]A.
Note that A is an m × n matrix, and the jth column of A is the representation of A(v_j) w.r.t. the basis of W.
Now, for any y = Ax ∈ W, there exist unique β_1, ..., β_m ∈ F w.r.t. the basis {w_1, ..., w_m} such that
y = β_1 w_1 + ⋯ + β_m w_m = [w]β = α_1 A(v_1) + ⋯ + α_n A(v_n) = [A(v)]α = [w]Aα.
Hence β = Aα. Since α and β are unique, we have the desired result.
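The construction in the proof is mechanical: feed each basis vector through the map and stack the resulting coordinate vectors as columns. A small NumPy sketch using the standard basis of R^3 and a made-up linear map f (hypothetical, for illustration only):

```python
import numpy as np

# Column j of the matrix representation is the coordinate vector of the
# image of the j-th basis vector.
def matrix_of(linmap, n):
    cols = [linmap(e) for e in np.eye(n)]   # images of the standard basis
    return np.column_stack(cols)

# A hypothetical linear map on R^3, written without matrices on purpose.
f = lambda x: np.array([x[0] + 2 * x[1], 3 * x[2], x[0] - x[1]])
A = matrix_of(f, 3)

# The matrix reproduces the map: beta = A alpha for every coordinate vector.
x = np.array([1.0, -2.0, 0.5])
assert np.allclose(A @ x, f(x))
```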
9 Linear Maps (Linear Operators)
We will now write A instead of A(·) for a given basis.
i) The matrix A gives the relation between the representations (coordinates) α and β, not between x and y themselves.
ii) This also means that A depends on the bases: with different bases we obtain different matrix representations of the same operator, so the matrix A is not unique.
The most important case is V = W, in which case we can use the same basis for the domain and codomain. For a different basis we obtain Ā, a different representation of the same operator A.
10 Similarity Transformation
Suppose that v = {v_1, ..., v_n} and e = {e_1, ..., e_n} are bases for V. Then the linear operator A has different representations: A w.r.t. v and Ā w.r.t. e. We now examine the relationship between A and Ā.
Recall the change of basis: given two bases, we can change the coordinates (representation) of a vector using a transformation matrix. If the domain and the codomain are the same space, then we can use the same transformation matrix for both.
11 Similarity Transformation
(The diagram on this slide, relating the coordinates in the two bases, was lost in transcription.)
12 Similarity Transformation
We have
β̄ = Āᾱ and β̄ = Pβ = PAα = PAP^{-1}ᾱ.
This implies Ā = PAP^{-1}. Similarly, Ā = Q^{-1}AQ. Moreover,
A = P^{-1}ĀP = QĀQ^{-1}.
Similar matrices: A and Ā are similar if there exists a nonsingular matrix P (or Q) satisfying the above relation. If such a P (or Q) exists, the above transformation is called a similarity transformation.
Note: all the matrix representations (w.r.t. different bases) of the same operator are similar.
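A quick numerical check that similar matrices share their eigenvalues, assuming a randomly drawn (generically nonsingular) P:

```python
import numpy as np

# Abar = P A P^{-1} represents the same operator in another basis,
# so it has the same characteristic polynomial and eigenvalues as A.
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
P = rng.standard_normal((3, 3))          # generically nonsingular
Abar = P @ A @ np.linalg.inv(P)

assert np.allclose(np.sort(np.linalg.eigvals(A)),
                   np.sort(np.linalg.eigvals(Abar)))
```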
13 Similarity Transformation
Example: second-order differential equation with x_1 = x and x_2 = ẋ = ẋ_1:
ẍ + 3ẋ + 2x = u  ⇒  ẋ = [0 1; -2 -3] x + [0; 1] u.
Eigenvalues and eigenvectors of A: det(sI - A) = s^2 + 3s + 2 = 0 ⇒ s = -1, -2, with eigenvectors e_1 = (1, -1)^T, e_2 = (1, -2)^T. Take
Q = (e_1 e_2) = [1 1; -1 -2].
Let x = Qz, i.e., z = Q^{-1}x. Then ż = Q^{-1}ẋ, and hence
ż = Q^{-1}AQ z + Q^{-1}[0; 1]u = [-1 0; 0 -2] z + [1; -1] u.
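The diagonalization can be verified numerically (with the unnormalized eigenvectors chosen here; scaling the columns of Q changes Q^{-1}b but not the diagonal part):

```python
import numpy as np

# State-space form of xdd + 3 xd + 2 x = u and its diagonalization.
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
b = np.array([[0.0], [1.0]])

# Eigenvectors for s = -1 and s = -2 as columns of Q.
Q = np.array([[1.0, 1.0],
              [-1.0, -2.0]])
Qinv = np.linalg.inv(Q)

Lam = Qinv @ A @ Q          # decoupled dynamics in z = Q^{-1} x
assert np.allclose(Lam, np.diag([-1.0, -2.0]))
assert np.allclose(Qinv @ b, np.array([[1.0], [-1.0]]))
```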
14 Range and Null Spaces
Definition (Range Space): Suppose that (V, F) and (W, F) are n- and m-dimensional vector spaces, respectively. The range of a linear operator A from V to W is the set
R(A) = {y ∈ W : y = Ax for some x ∈ V}.
Theorem: The range of a linear operator A is a subspace of (W, F).
Proof: 1) We have R(A) ⊆ W. 2) For any y_1, y_2 ∈ R(A) and α_1, α_2 ∈ F, we have α_1 y_1 + α_2 y_2 = α_1 Ax_1 + α_2 Ax_2 = A(α_1 x_1 + α_2 x_2) (why?). Hence, by the definition of R(A), α_1 y_1 + α_2 y_2 ∈ R(A).
15 Range and Null Spaces
R(A) ⊆ W, and R(A) is a linear space.
Let x = (x_1, ..., x_n)^T, and A = (A_1, ..., A_n), where A_i is the ith column of A. Then Ax = y can be written as
y = A_1 x_1 + ⋯ + A_n x_n.
So R(A) is the set of all possible linear combinations of the columns of A. Since R(A) is a linear subspace of (W, F), dim(R(A)) is the maximum number of linearly independent columns of A.
16 Range and Null Spaces
Definition (Null Space): Suppose that (V, F) and (W, F) are n- and m-dimensional vector spaces, respectively. The null space of a linear operator A from V to W is the set
N(A) = {x ∈ V : Ax = 0}.
The null space of a linear operator A is a subspace of (V, F). Note that the null space is a subspace of the domain (V, F), whereas the range space is a subspace of the codomain (W, F).
17 Range and Null Spaces
Rank and nullity:
- The dimension of R(A) is denoted by rank(A).
- The dimension of N(A) is denoted by nullity(A).
Theorem (Dimension Theorem or Rank-Nullity Theorem): Suppose that (V, F) and (W, F) are n- and m-dimensional vector spaces, respectively. Then for a linear operator A from V to W, we have
rank(A) + nullity(A) = dim(V) = n.
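A numerical illustration of the rank-nullity theorem, computing the nullity from the singular values of a rank-1 example matrix:

```python
import numpy as np

# A rank-1 map from R^3 to R^2: the second row is twice the first.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])

rank = np.linalg.matrix_rank(A)
# nullity = (number of columns) - (number of nonzero singular values)
s = np.linalg.svd(A, compute_uv=False)
nullity = A.shape[1] - int(np.sum(s > 1e-10))

assert rank == 1 and nullity == 2
assert rank + nullity == A.shape[1]      # rank-nullity: 1 + 2 = 3
```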
18 Range and Null Spaces
R(A^T) = {x ∈ V : x = A^T y for some y ∈ W} (row space)
N(A^T) = {y ∈ W : A^T y = 0}
rank(A^T) + nullity(A^T) = m
- rank(A) = number of linearly independent columns of A
- rank(A^T) = number of linearly independent rows of A
- rank(A) = rank(A^T)
A square matrix A is nonsingular if and only if all the rows and columns of A are linearly independent.
19 Range and Null Spaces
Example (the matrices on this slide were garbled in transcription): for a 1 × 3 matrix A_1 we can find two linearly independent null vectors, so nullity(A_1) = 2 and rank(A_1) = 1; for a matrix A_2 with a single independent null vector v = (3, 1)^T, nullity(A_2) = 1 and rank(A_2) = 1.
20 Invariant Subspace
Invariant subspace: Let A be a linear operator from (V, F) into itself. A subspace Y of V is said to be an invariant subspace of V under A, or an A-invariant subspace of V, if
A(Y) ⊆ Y, i.e., Ay ∈ Y for all y ∈ Y.
Trivial example: for A : R^n → R^n, R^n itself is an invariant subspace.
21 Eigenvalues and Eigenvectors
Definition (Eigenvalues and Eigenvectors): Let A be a linear operator that maps (C^n, C) into itself. A scalar λ ∈ C is called an eigenvalue of A if there exists a nonzero x ∈ C^n such that Ax = λx. Any nonzero x satisfying Ax = λx is called an eigenvector of A associated with the eigenvalue λ.
Equivalently, (A - λI)x = 0 or (λI - A)x = 0. Since x must be nonzero, λ is an eigenvalue of A if and only if det(λI - A) = 0, equivalently det(A - λI) = 0.
det(λI - A) is called the characteristic polynomial of A. Its degree is n; hence the n × n matrix A has n eigenvalues (counted with multiplicity).
22 Eigenvalues and Eigenvectors
Question: For an n × n square matrix A, are all the eigenvalues of A distinct?
Answer: Not necessarily. Some eigenvalues may repeat; that is, λ_i = λ_j for some i ≠ j, i, j = 1, 2, ..., n.
Case I: all the eigenvalues of A are distinct.
Theorem: Let λ_1, ..., λ_n be distinct eigenvalues of A, and let v_1, ..., v_n be associated eigenvectors. Then the set {v_1, ..., v_n} is linearly independent, and the n × n matrix V = [v] = (v_1, ..., v_n) satisfies rank(V) = n.
23 Eigenvalues and Eigenvectors
We prove the first part. Assume that the set {v_1, ..., v_n} is linearly dependent. Then there exist α_1, ..., α_n ∈ C, not all zero, such that
α_1 v_1 + ⋯ + α_n v_n = 0.
Without loss of generality, assume α_1 ≠ 0. Apply the product (A - λ_2 I)(A - λ_3 I) ⋯ (A - λ_n I) to both sides:
(A - λ_2 I)(A - λ_3 I) ⋯ (A - λ_n I) Σ_{i=1}^n α_i v_i = 0.
Since (A - λ_j I)v_i = (λ_i - λ_j)v_i for j ≠ i and (A - λ_i I)v_i = 0, every term with i ≥ 2 vanishes, leaving
α_1 (λ_1 - λ_2)(λ_1 - λ_3) ⋯ (λ_1 - λ_n) v_1 = α_1 Π_{i=2}^n (λ_1 - λ_i) v_1 = 0.
Since all the eigenvalues are distinct and v_1 ≠ 0, this implies α_1 = 0, a contradiction. Hence {v_1, ..., v_n} is linearly independent.
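The theorem can be sanity-checked numerically: for a matrix with distinct eigenvalues, the matrix of eigenvectors has full rank. A sketch using the companion matrix from the earlier example:

```python
import numpy as np

# A has distinct eigenvalues -1 and -2, so its eigenvector matrix V
# must have rank n = 2 (linearly independent eigenvectors).
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
lam, V = np.linalg.eig(A)

assert len(set(np.round(lam, 8))) == len(lam)     # eigenvalues are distinct
assert np.linalg.matrix_rank(V) == A.shape[0]     # eigenvectors independent
```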
24 Eigenvalues and Eigenvectors
Theorem: A is diagonalizable if and only if it has n linearly independent eigenvectors.
A consequence of diagonalization Λ = Q^{-1}AQ, i.e., A = QΛQ^{-1}:
det(λI - A) = det(λI - QΛQ^{-1}) = det(Q(λI - Λ)Q^{-1}) = det(λI - Λ).
Example: second-order differential equation with x_1 = x and x_2 = ẋ = ẋ_1:
ẍ + 3ẋ + 2x = u  ⇒  ẋ = [0 1; -2 -3] x + [0; 1] u.
det(sI - A) = 0 ⇒ s = -1, -2, with eigenvectors e_1 = (1, -1)^T, e_2 = (1, -2)^T, and Q = (e_1 e_2).
Let x = Qz. Then ż = Q^{-1}ẋ, and hence
ż = Q^{-1}AQ z + Q^{-1}[0; 1]u = [-1 0; 0 -2] z + [1; -1] u.
25 Eigenvalues and Eigenvectors
Case II: the eigenvalues of A are not distinct.
Example I: A has repeated eigenvalues but still has linearly independent eigenvectors, so A is diagonalizable (by the theorem) and [v_1 v_2 v_3] forms a basis:
A = [1 0 -1; 0 1 0; 0 0 2], det(λI - A) = 0 ⇒ λ = 1, 1, 2,
v_1 = (1, 1, 0)^T, v_2 = (0, 1, 0)^T, v_3 = (-1, 0, 1)^T.
Example II: A has repeated eigenvalues and linearly dependent eigenvectors:
A = [1 1 2; 0 1 3; 0 0 2], det(λI - A) = 0 ⇒ λ = 1, 1, 2,
v_1 = (1, 0, 0)^T, v_2 = (1, 0, 0)^T, v_3 = (5, 3, 1)^T.
[v_1 v_2 v_3] is not sufficient to form a basis.
26 Jordan Form
As we have seen, when A has repeated eigenvalues, A may or may not be diagonalizable depending on the corresponding eigenvectors.
Assume that A has an eigenvalue λ_i with multiplicity m_i. The number of linearly independent eigenvectors associated with λ_i equals the dimension of N(λ_i I - A). Hence we have
q_i = n - rank(λ_i I - A) = nullity(λ_i I - A).
Previous examples: λ_i = 1 with m_i = 2. In Example I, rank(I - A) = 1, so q_i = 2; in Example II, rank(I - A) = 2, so q_i = 1.
27 Jordan Form
We want to "diagonalize" a matrix A as far as possible when A has repeated eigenvalues; to do so we need to find additional vectors beyond the eigenvectors of A.
Generalized eigenvectors: a vector v is said to be a generalized eigenvector of grade k ≥ 1 if and only if
(A - λI)^k v = 0 and (A - λI)^{k-1} v ≠ 0.
For k = 1, v is an ordinary eigenvector of A.
Example II: A = [1 1 2; 0 1 3; 0 0 2], det(λI - A) = 0 ⇒ λ = 1, 1, 2, with v_1 = (1, 0, 0)^T (= v_2) and v_3 = (5, 3, 1)^T. We compute a generalized eigenvector of A associated with λ = 1.
28 Jordan Form
B := (A - I)^2 = [0 0 5; 0 0 3; 0 0 1], and Bv_2 = 0 gives v_2 = (0, 1, 0)^T. Note that (A - I)v_2 ≠ 0, so v_2 is a generalized eigenvector of grade 2. Then
Q = (v_1, v_2, v_3) = [1 0 5; 0 1 3; 0 0 1], J = Q^{-1}AQ = [1 1 0; 0 1 0; 0 0 2].
J: Jordan matrix. N := J - diag{1, 1, 2} = [0 1 0; 0 0 0; 0 0 0] is nilpotent.
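The Jordan decomposition of Example II can be checked numerically. Some entries of the slide's matrices were lost in transcription; the matrix below is a reconstruction consistent with the data that survived (eigenvalues 1, 1, 2, eigenvectors (1, 0, 0)^T and (5, 3, 1)^T, and the row (0, 1, 3)), so treat the exact entries as an assumption:

```python
import numpy as np

A = np.array([[1.0, 1.0, 2.0],
              [0.0, 1.0, 3.0],
              [0.0, 0.0, 2.0]])
v1 = np.array([1.0, 0.0, 0.0])   # eigenvector for lambda = 1
v2 = np.array([0.0, 1.0, 0.0])   # generalized eigenvector of grade 2
v3 = np.array([5.0, 3.0, 1.0])   # eigenvector for lambda = 2

# v2 satisfies (A - I)^2 v2 = 0 but (A - I) v2 != 0.
B = np.linalg.matrix_power(A - np.eye(3), 2)
assert np.allclose(B @ v2, 0)
assert not np.allclose((A - np.eye(3)) @ v2, 0)

# Q^{-1} A Q is the Jordan form: a 2x2 block for lambda = 1, plus lambda = 2.
Q = np.column_stack([v1, v2, v3])
J = np.linalg.inv(Q) @ A @ Q
assert np.allclose(J, np.array([[1.0, 1.0, 0.0],
                                [0.0, 1.0, 0.0],
                                [0.0, 0.0, 2.0]]))
```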
29 Jordan Form
Assume that A is a 5 × 5 matrix with eigenvalue λ_1 of multiplicity 4 and eigenvalue λ_2 of multiplicity 1 (simple). Then, up to ordering of the blocks, the possible Jordan forms are:
A_1 = diag{J_4(λ_1), λ_2}: one block of order 4
A_2 = diag{J_3(λ_1), λ_1, λ_2}: one block of order 3 and one of order 1
A_3 = diag{J_2(λ_1), J_2(λ_1), λ_2}: two blocks of order 2
A_4 = diag{J_2(λ_1), λ_1, λ_1, λ_2}: one block of order 2 and two of order 1
A_5 = diag{λ_1, λ_1, λ_1, λ_1, λ_2}: all blocks of order 1 (diagonal)
Here J_k(λ) denotes the k × k Jordan block with λ on the diagonal and 1 on the superdiagonal.
30 Jordan Form
Theorem: Suppose that A ∈ C^{n×n}. Then there exist a nonsingular matrix T ∈ C^{n×n} and an integer 1 ≤ p ≤ n such that
T^{-1}AT = J = diag{J_1, J_2, ..., J_p},
where each J_k is a Jordan matrix (or Jordan block) of order n_k.
31 Cayley-Hamilton Theorem
Cayley-Hamilton theorem: Define the characteristic polynomial of A,
Δ(λ) = det(λI - A) = λ^n + α_1 λ^{n-1} + α_2 λ^{n-2} + ⋯ + α_{n-1} λ + α_n.
Then
Δ(A) = A^n + α_1 A^{n-1} + ⋯ + α_{n-1} A + α_n I = 0.
Simple proof when A is diagonalizable (general proof: HW): the similarity transformation Λ = Q^{-1}AQ gives A = QΛQ^{-1}, and note that det(A) = det(Q) det(Λ) det(Q)^{-1} = det(Λ). Moreover,
A^2 = QΛ^2 Q^{-1}, ..., A^k = QΛ^k Q^{-1}.
Hence
Δ(A) = Q[Λ^n + α_1 Λ^{n-1} + ⋯ + α_{n-1} Λ + α_n I]Q^{-1} = QΔ(Λ)Q^{-1} = 0,
since Δ(λ_i) = 0 for every eigenvalue λ_i on the diagonal of Λ.
32 Cayley-Hamilton Theorem
From Δ(A) = 0,
A^n = -α_1 A^{n-1} - ⋯ - α_n I.
It is a linear combination of {A^{n-1}, ..., A, I}. Similarly,
A^{n+1} = -α_1 A^n - ⋯ - α_n A,
and since A^n is itself a linear combination of {A^{n-1}, ..., I}, so is A^{n+1}.
Consequently, given any polynomial f(λ), f(A) can be expressed as
f(A) = β_0 I + β_1 A + ⋯ + β_{n-1} A^{n-1}.
We can extend this to any (sufficiently smooth) function f(λ); that is, f(λ) need not be a polynomial.
33 Cayley-Hamilton Theorem
For an n × n matrix A, let λ_1, ..., λ_m be its distinct eigenvalues with multiplicities n_1, ..., n_m, respectively (note that Σ_{i=1}^m n_i = n).
Theorem (Theorem 3.5 of the textbook): Given f(λ) and an n × n matrix A with
Δ(λ) = Π_{i=1}^m (λ - λ_i)^{n_i},
define
h(λ) = β_0 + β_1 λ + ⋯ + β_{n-1} λ^{n-1},
where the coefficients {β_0, ..., β_{n-1}} are solved from the following set of n equations:
d^l f(λ)/dλ^l |_{λ=λ_i} = d^l h(λ)/dλ^l |_{λ=λ_i}, l = 0, 1, ..., n_i - 1, i = 1, 2, ..., m.
Then f(A) = h(A).
34 Cayley-Hamilton Theorem
Example:
A = [0 1; -1 -2].
We want to compute A^100. det(λI - A) = λ^2 + 2λ + 1 = (λ + 1)^2. Let h(λ) = β_0 + β_1 λ and f(λ) = λ^100. We have
f(-1) = h(-1): (-1)^100 = 1 = β_0 - β_1
f'(-1) = h'(-1): 100(-1)^99 = -100 = β_1
so β_1 = -100, β_0 = 1 + β_1 = -99, and h(λ) = -99 - 100λ. Hence
A^100 = f(A) = h(A) = -99I - 100A = [-99 -100; 100 101].
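This computation is easy to verify by brute force (the minus signs in β_0 and β_1 were lost in the transcription; the values used below follow from solving the two interpolation equations):

```python
import numpy as np

# Cayley-Hamilton shortcut: A^100 = h(A) with h(lambda) = -99 - 100*lambda.
A = np.array([[0, 1],
              [-1, -2]])

direct = np.linalg.matrix_power(A, 100)          # brute-force power
via_h = -99 * np.eye(2, dtype=int) - 100 * A     # degree-1 polynomial in A

assert np.array_equal(direct, via_h)
assert np.array_equal(direct, np.array([[-99, -100], [100, 101]]))
```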
35 Cayley-Hamilton Theorem
Example: consider a 3 × 3 matrix A with det(λI - A) = (λ - 1)^2(λ - 2). We want to compute e^{At}. Let h(λ) = β_0 + β_1 λ + β_2 λ^2 and f(λ) = e^{λt}. Then
f(1) = h(1): e^t = β_0 + β_1 + β_2
f'(1) = h'(1): te^t = β_1 + 2β_2
f(2) = h(2): e^{2t} = β_0 + 2β_1 + 4β_2
Solving,
β_0 = -2te^t + e^{2t}, β_1 = 3te^t + 2e^t - 2e^{2t}, β_2 = e^{2t} - e^t - te^t,
and
f(A) = e^{At} = h(A) = β_0 I + β_1 A + β_2 A^2.
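The β_i can be checked on any matrix with characteristic polynomial (λ - 1)^2(λ - 2). The slide's matrix entries were lost in transcription, so the Jordan-form matrix below is a stand-in for which e^{At} is known in closed form:

```python
import numpy as np

t = 0.7
# A hypothetical A in Jordan form with char. poly (lambda-1)^2 (lambda-2).
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 2.0]])

# Coefficients of h(lambda) from the three interpolation equations.
b0 = -2 * t * np.exp(t) + np.exp(2 * t)
b1 = 3 * t * np.exp(t) + 2 * np.exp(t) - 2 * np.exp(2 * t)
b2 = np.exp(2 * t) - np.exp(t) - t * np.exp(t)
hA = b0 * np.eye(3) + b1 * A + b2 * (A @ A)

# Closed-form matrix exponential of this Jordan matrix.
expAt = np.array([[np.exp(t), t * np.exp(t), 0.0],
                  [0.0, np.exp(t), 0.0],
                  [0.0, 0.0, np.exp(2 * t)]])
assert np.allclose(hA, expAt)
```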
36 Positive (Semi-)Definite Matrices
- A matrix A is symmetric if A = A^T.
- For any n × n matrix A and x ∈ R^n, x^T Ax is called a quadratic form.
Definition: A symmetric matrix A is said to be
- positive definite if x^T Ax > 0 for all x ≠ 0;
- positive semi-definite if x^T Ax ≥ 0 for all x.
Theorem: A symmetric matrix A is positive definite (positive semi-definite) if and only if all eigenvalues of A are positive (nonnegative).
- A symmetric positive definite matrix is invertible.
- For a symmetric positive definite matrix A, det(A) > 0.
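A quick numerical check of the eigenvalue criterion on a small symmetric matrix:

```python
import numpy as np

# A symmetric matrix with eigenvalues 1 and 3: positive definite.
A = np.array([[2.0, -1.0],
              [-1.0, 2.0]])

eigs = np.linalg.eigvalsh(A)        # eigenvalues of a symmetric matrix
assert np.all(eigs > 0)             # positive definite

# Hence invertible, with det(A) = product of eigenvalues > 0.
assert np.isclose(np.linalg.det(A), np.prod(eigs))
assert np.linalg.det(A) > 0
```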
37 Norm and Inner Product
Normed vector space (normed linear space): a norm measures the length of a vector.
Let (V, F) be an n-dimensional vector space. A function ‖·‖ : V → R is said to be a norm if the following properties hold:
- ‖x‖ ≥ 0, and ‖x‖ = 0 if and only if x = 0 (separates points)
- ‖αx‖ = |α| ‖x‖ (absolute homogeneity)
- ‖x + y‖ ≤ ‖x‖ + ‖y‖ (triangle inequality)
Example: on (R^n, R) the norm can be chosen as
‖x‖_1 := Σ_{i=1}^n |x_i|, ‖x‖_2 := (Σ_{i=1}^n |x_i|^2)^{1/2}, ‖x‖_∞ := max_i |x_i|.
Example: signal norm for a real-valued continuous function f(t):
‖f‖_p = (∫_0^t |f(τ)|^p dτ)^{1/p}, where 1 ≤ p < ∞.
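The three vector norms can be computed directly with NumPy:

```python
import numpy as np

x = np.array([3.0, -4.0, 0.0])

assert np.isclose(np.linalg.norm(x, 1), 7.0)        # sum of |x_i|
assert np.isclose(np.linalg.norm(x, 2), 5.0)        # Euclidean length
assert np.isclose(np.linalg.norm(x, np.inf), 4.0)   # max |x_i|
```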
38 Norm and Inner Product
Inner product: measures the angle between two vectors.
An inner product of two vectors x, y in the vector space (V, C) is a function from V × V to C such that the following properties hold:
- ⟨x, y⟩ = conj(⟨y, x⟩) (complex conjugate)
- ⟨x, α_1 y_1 + α_2 y_2⟩ = α_1 ⟨x, y_1⟩ + α_2 ⟨x, y_2⟩ for all x, y_1, y_2 ∈ V and α_1, α_2 ∈ C
- ⟨x, x⟩ ≥ 0, and ⟨x, x⟩ = 0 if and only if x = 0
Example: on (R^n, R) the inner product is
⟨x, y⟩ = x^T y = Σ_{i=1}^n x_i y_i, so that ‖x‖_2^2 = ⟨x, x⟩ = Σ_{i=1}^n |x_i|^2.
Example: signal inner product for real-valued continuous functions f(t), g(t):
⟨f, g⟩ = ∫_0^t f(τ)g(τ) dτ, so that ‖f‖_2^2 = ∫_0^t |f(τ)|^2 dτ.
39 Orthogonal and Orthonormal Vectors
Two vectors x and y are said to be orthogonal if and only if ⟨x, y⟩ = 0. If ⟨x, y⟩ = x^T y, then x and y are orthogonal if and only if x^T y = 0; in that case the angle between the two vectors is 90°.
Fact: Let A be a symmetric matrix, that is, A = A^T. If A has distinct eigenvalues, then the corresponding eigenvectors are orthogonal.
Example:
A_1 = [2 1; 1 2], det(λI - A_1) = 0 ⇒ λ = 1, 3, with eigenvectors v_1 = (1, -1)^T, v_2 = (1, 1)^T.
v_1^T v_2 = 0: the eigenvectors are orthogonal.
40 Orthogonal and Orthonormal Vectors
A set of vectors x_1, x_2, ..., x_m is said to be orthonormal if
x_i^T x_j = 0 if i ≠ j, and x_i^T x_j = 1 if i = j.
1 A linear system of equations of the form Sections 75, 78 & 81 a 11 x 1 + a 12 x 2 + + a 1n x n = b 1 a 21 x 1 + a 22 x 2 + + a 2n x n = b 2 a m1 x 1 + a m2 x 2 + + a mn x n = b m can be written in matrix
More informationhomogeneous 71 hyperplane 10 hyperplane 34 hyperplane 69 identity map 171 identity map 186 identity map 206 identity matrix 110 identity matrix 45
address 12 adjoint matrix 118 alternating 112 alternating 203 angle 159 angle 33 angle 60 area 120 associative 180 augmented matrix 11 axes 5 Axiom of Choice 153 basis 178 basis 210 basis 74 basis test
More informationMath 24 Spring 2012 Sample Homework Solutions Week 8
Math 4 Spring Sample Homework Solutions Week 8 Section 5. (.) Test A M (R) for diagonalizability, and if possible find an invertible matrix Q and a diagonal matrix D such that Q AQ = D. ( ) 4 (c) A =.
More informationNumerical Linear Algebra
University of Alabama at Birmingham Department of Mathematics Numerical Linear Algebra Lecture Notes for MA 660 (1997 2014) Dr Nikolai Chernov April 2014 Chapter 0 Review of Linear Algebra 0.1 Matrices
More informationNotes on Eigenvalues, Singular Values and QR
Notes on Eigenvalues, Singular Values and QR Michael Overton, Numerical Computing, Spring 2017 March 30, 2017 1 Eigenvalues Everyone who has studied linear algebra knows the definition: given a square
More informationLinear Algebra II Lecture 13
Linear Algebra II Lecture 13 Xi Chen 1 1 University of Alberta November 14, 2014 Outline 1 2 If v is an eigenvector of T : V V corresponding to λ, then v is an eigenvector of T m corresponding to λ m since
More informationLinear Algebra Highlights
Linear Algebra Highlights Chapter 1 A linear equation in n variables is of the form a 1 x 1 + a 2 x 2 + + a n x n. We can have m equations in n variables, a system of linear equations, which we want to
More informationMTH 2032 SemesterII
MTH 202 SemesterII 2010-11 Linear Algebra Worked Examples Dr. Tony Yee Department of Mathematics and Information Technology The Hong Kong Institute of Education December 28, 2011 ii Contents Table of Contents
More informationLINEAR ALGEBRA 1, 2012-I PARTIAL EXAM 3 SOLUTIONS TO PRACTICE PROBLEMS
LINEAR ALGEBRA, -I PARTIAL EXAM SOLUTIONS TO PRACTICE PROBLEMS Problem (a) For each of the two matrices below, (i) determine whether it is diagonalizable, (ii) determine whether it is orthogonally diagonalizable,
More informationMAC Module 12 Eigenvalues and Eigenvectors. Learning Objectives. Upon completing this module, you should be able to:
MAC Module Eigenvalues and Eigenvectors Learning Objectives Upon completing this module, you should be able to: Solve the eigenvalue problem by finding the eigenvalues and the corresponding eigenvectors
More informationIMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET
IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET This is a (not quite comprehensive) list of definitions and theorems given in Math 1553. Pay particular attention to the ones in red. Study Tip For each
More informationRecall the convention that, for us, all vectors are column vectors.
Some linear algebra Recall the convention that, for us, all vectors are column vectors. 1. Symmetric matrices Let A be a real matrix. Recall that a complex number λ is an eigenvalue of A if there exists
More informationIMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET
IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET This is a (not quite comprehensive) list of definitions and theorems given in Math 1553. Pay particular attention to the ones in red. Study Tip For each
More informationLecture Summaries for Linear Algebra M51A
These lecture summaries may also be viewed online by clicking the L icon at the top right of any lecture screen. Lecture Summaries for Linear Algebra M51A refers to the section in the textbook. Lecture
More informationSolutions to Final Exam
Solutions to Final Exam. Let A be a 3 5 matrix. Let b be a nonzero 5-vector. Assume that the nullity of A is. (a) What is the rank of A? 3 (b) Are the rows of A linearly independent? (c) Are the columns
More informationMath 215 HW #9 Solutions
Math 5 HW #9 Solutions. Problem 4.4.. If A is a 5 by 5 matrix with all a ij, then det A. Volumes or the big formula or pivots should give some upper bound on the determinant. Answer: Let v i be the ith
More information4. Linear transformations as a vector space 17
4 Linear transformations as a vector space 17 d) 1 2 0 0 1 2 0 0 1 0 0 0 1 2 3 4 32 Let a linear transformation in R 2 be the reflection in the line = x 2 Find its matrix 33 For each linear transformation
More informationft-uiowa-math2550 Assignment OptionalFinalExamReviewMultChoiceMEDIUMlengthForm due 12/31/2014 at 10:36pm CST
me me ft-uiowa-math255 Assignment OptionalFinalExamReviewMultChoiceMEDIUMlengthForm due 2/3/2 at :3pm CST. ( pt) Library/TCNJ/TCNJ LinearSystems/problem3.pg Give a geometric description of the following
More information3. Associativity: (a + b)+c = a +(b + c) and(ab)c = a(bc) for all a, b, c F.
Appendix A Linear Algebra A1 Fields A field F is a set of elements, together with two operations written as addition (+) and multiplication ( ), 1 satisfying the following properties [22]: 1 Closure: a
More informationBasic Elements of Linear Algebra
A Basic Review of Linear Algebra Nick West nickwest@stanfordedu September 16, 2010 Part I Basic Elements of Linear Algebra Although the subject of linear algebra is much broader than just vectors and matrices,
More informationOnline Exercises for Linear Algebra XM511
This document lists the online exercises for XM511. The section ( ) numbers refer to the textbook. TYPE I are True/False. Lecture 02 ( 1.1) Online Exercises for Linear Algebra XM511 1) The matrix [3 2
More informationMath 3191 Applied Linear Algebra
Math 9 Applied Linear Algebra Lecture 9: Diagonalization Stephen Billups University of Colorado at Denver Math 9Applied Linear Algebra p./9 Section. Diagonalization The goal here is to develop a useful
More informationLecture 7: Positive Semidefinite Matrices
Lecture 7: Positive Semidefinite Matrices Rajat Mittal IIT Kanpur The main aim of this lecture note is to prepare your background for semidefinite programming. We have already seen some linear algebra.
More informationAMS526: Numerical Analysis I (Numerical Linear Algebra)
AMS526: Numerical Analysis I (Numerical Linear Algebra) Lecture 16: Eigenvalue Problems; Similarity Transformations Xiangmin Jiao Stony Brook University Xiangmin Jiao Numerical Analysis I 1 / 18 Eigenvalue
More informationW2 ) = dim(w 1 )+ dim(w 2 ) for any two finite dimensional subspaces W 1, W 2 of V.
MA322 Sathaye Final Preparations Spring 2017 The final MA 322 exams will be given as described in the course web site (following the Registrar s listing. You should check and verify that you do not have
More informationFinal A. Problem Points Score Total 100. Math115A Nadja Hempel 03/23/2017
Final A Math115A Nadja Hempel 03/23/2017 nadja@math.ucla.edu Name: UID: Problem Points Score 1 10 2 20 3 5 4 5 5 9 6 5 7 7 8 13 9 16 10 10 Total 100 1 2 Exercise 1. (10pt) Let T : V V be a linear transformation.
More informationChapter 4 Euclid Space
Chapter 4 Euclid Space Inner Product Spaces Definition.. Let V be a real vector space over IR. A real inner product on V is a real valued function on V V, denoted by (, ), which satisfies () (x, y) = (y,
More informationand let s calculate the image of some vectors under the transformation T.
Chapter 5 Eigenvalues and Eigenvectors 5. Eigenvalues and Eigenvectors Let T : R n R n be a linear transformation. Then T can be represented by a matrix (the standard matrix), and we can write T ( v) =
More informationSECTION 3.3. PROBLEM 22. The null space of a matrix A is: N(A) = {X : AX = 0}. Here are the calculations of AX for X = a,b,c,d, and e. =
SECTION 3.3. PROBLEM. The null space of a matrix A is: N(A) {X : AX }. Here are the calculations of AX for X a,b,c,d, and e. Aa [ ][ ] 3 3 [ ][ ] Ac 3 3 [ ] 3 3 [ ] 4+4 6+6 Ae [ ], Ab [ ][ ] 3 3 3 [ ]
More informationDiagonalization. MATH 322, Linear Algebra I. J. Robert Buchanan. Spring Department of Mathematics
Diagonalization MATH 322, Linear Algebra I J. Robert Buchanan Department of Mathematics Spring 2015 Motivation Today we consider two fundamental questions: Given an n n matrix A, does there exist a basis
More informationLecture 1 Review: Linear models have the form (in matrix notation) Y = Xβ + ε,
2. REVIEW OF LINEAR ALGEBRA 1 Lecture 1 Review: Linear models have the form (in matrix notation) Y = Xβ + ε, where Y n 1 response vector and X n p is the model matrix (or design matrix ) with one row for
More informationLecture notes: Applied linear algebra Part 1. Version 2
Lecture notes: Applied linear algebra Part 1. Version 2 Michael Karow Berlin University of Technology karow@math.tu-berlin.de October 2, 2008 1 Notation, basic notions and facts 1.1 Subspaces, range and
More informationMath 310 Final Exam Solutions
Math 3 Final Exam Solutions. ( pts) Consider the system of equations Ax = b where: A, b (a) Compute deta. Is A singular or nonsingular? (b) Compute A, if possible. (c) Write the row reduced echelon form
More information(a) II and III (b) I (c) I and III (d) I and II and III (e) None are true.
1 Which of the following statements is always true? I The null space of an m n matrix is a subspace of R m II If the set B = {v 1,, v n } spans a vector space V and dimv = n, then B is a basis for V III
More informationEigenvalues and Eigenvectors
Eigenvalues and Eigenvectors Definition 0 Let A R n n be an n n real matrix A number λ R is a real eigenvalue of A if there exists a nonzero vector v R n such that A v = λ v The vector v is called an eigenvector
More informationAgenda: Understand the action of A by seeing how it acts on eigenvectors.
Eigenvalues and Eigenvectors If Av=λv with v nonzero, then λ is called an eigenvalue of A and v is called an eigenvector of A corresponding to eigenvalue λ. Agenda: Understand the action of A by seeing
More information