BASIC ALGORITHMS IN LINEAR ALGEBRA

STEVEN DALE CUTKOSKY

Matrices and Applications of Gaussian Elimination

1. Systems of Equations. Suppose that $A$ is an $m \times n$ matrix with coefficients in a field $F$, and $x = (x_1, \dots, x_n)^T \in F^n$. Let $v \cdot w = v^T w$ be the dot product of the vectors $v, w \in F^n$. Writing $A = (A_1, A_2, \dots, A_n)$, where $A_i \in F^m$ are the columns of $A$, we obtain the formula
$$Ax = x_1 A_1 + \cdots + x_n A_n.$$
Writing
$$A = \begin{pmatrix} A_1^T \\ A_2^T \\ \vdots \\ A_m^T \end{pmatrix},$$
where $A_j \in F^n$ are the rows of $A$ written as column vectors, we obtain the formula
$$Ax = \begin{pmatrix} A_1 \cdot x \\ A_2 \cdot x \\ \vdots \\ A_m \cdot x \end{pmatrix} = \begin{pmatrix} A_1^T x \\ A_2^T x \\ \vdots \\ A_m^T x \end{pmatrix}.$$

2. Computation of the inverse of a matrix. Suppose that $A$ is an $n \times n$ matrix. Transform the $n \times 2n$ matrix $(A \mid I_n)$ into a reduced row echelon form $(C \mid B)$. $A$ is invertible if and only if $C = I_n$. If $A$ is invertible, then $B = A^{-1}$.

3. Computation of a basis of the span of a set of row vectors. Suppose that $v_1, \dots, v_m \in F^n$. Transform the $m \times n$ matrix
$$\begin{pmatrix} v_1 \\ v_2 \\ \vdots \\ v_m \end{pmatrix}$$
into a reduced row echelon form $B$. The nonzero rows of $B$ form a basis of $\operatorname{Span}(\{v_1, \dots, v_m\})$.

4. Computation of a subset of a set of column vectors which is a basis of the span of the set. Suppose that $w_1, \dots, w_n \in F^m$. Transform the $m \times n$ matrix $(w_1, w_2, \dots, w_n)$ into a reduced row echelon form $B$. Let $\sigma(1) < \sigma(2) < \cdots < \sigma(r)$ be the indices of the columns $B_i$ of $B$ which contain a leading $1$. Then $\{w_{\sigma(1)}, \dots, w_{\sigma(r)}\}$ is a basis of $\operatorname{Span}(\{w_1, w_2, \dots, w_n\})$.
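As a computational companion to algorithms 2 and 4 above, here is a minimal Python sketch using sympy (assumed to be available); the matrices are made-up examples, not taken from the notes. `rref()` returns both the reduced row echelon form and the pivot column indices, which is exactly what these two algorithms need.

```python
# Algorithms 2 and 4 via sympy's rref(): inverse from (A | I_n), and a basis
# of the column span from the pivot columns.  Example matrices only.
from sympy import Matrix, eye

A = Matrix([[2, 1], [5, 3]])
aug = A.row_join(eye(2))              # the n x 2n matrix (A | I_n)
R, pivots = aug.rref()                # reduced row echelon form, pivot indices
C, B = R[:, :2], R[:, 2:]             # (C | B); A is invertible iff C = I_n
assert C == eye(2) and B == A.inv()   # so B = A^{-1}

# Algorithm 4: the columns of W at the pivot positions form a basis of the column span.
W = Matrix([[1, 2, 0], [1, 2, 1]])
_, piv = W.rref()
basis = [W[:, j] for j in piv]        # {w_sigma(1), ..., w_sigma(r)}
print(piv, basis)                     # (0, 2) and the corresponding columns of W
```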

5. Extension of a set of linearly independent row vectors to a basis of $F^n$. Suppose that $w_1, \dots, w_m \in F^n$ are linearly independent. Let $\{e_1, \dots, e_n\}$ be the standard basis of $F^n$. Transform the $m \times n$ matrix
$$\begin{pmatrix} w_1 \\ w_2 \\ \vdots \\ w_m \end{pmatrix}$$
into a reduced row echelon form $B$. Let $\sigma(1) < \sigma(2) < \cdots < \sigma(n-m)$ be the indices of the columns of $B$ which do not contain a leading $1$. Then $\{w_1, \dots, w_m, e_{\sigma(1)}, \dots, e_{\sigma(n-m)}\}$ is a basis of $F^n$. (Some different algorithms are given later in the pages on inner product spaces.)

6. Computation of a basis of the solution space of a homogeneous system of equations. Let $A = (a_{ij})$ be an $m \times n$ matrix, and $X = (x_i)$ be an $n \times 1$ matrix of indeterminates. Let $N(A)$ be the null space of the matrix $A$ (the subspace of $F^n$ of all $X \in F^n$ such that $AX = 0$). A basis for $N(A)$ can be found by solving the system $AX = 0$ using Gaussian elimination to find the general solution, putting the general solution into a column vector and expanding with indeterminate coefficients. The vectors in this expansion are a basis of $N(A)$.
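The null space computation of algorithm 6 is what sympy's `nullspace()` method carries out (solve $AX = 0$ and expand the general solution in the free variables); a small sketch with an illustrative matrix follows.

```python
# Algorithm 6: a basis of N(A) from Gaussian elimination.  The matrix A is an
# arbitrary example; nullspace() returns one basis vector per free variable.
from sympy import Matrix, zeros

A = Matrix([[1, 2, 1, 0],
            [2, 4, 0, 2]])
basis = A.nullspace()                 # list of column vectors spanning N(A)
for v in basis:
    assert A * v == zeros(2, 1)
print(len(basis), basis)              # nullity = n - rank = 2
```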

Calculation of the Matrix of a Linear Map

1. Coordinate vectors. Suppose that $V$ is a vector space, with a basis $\beta = \{v_1, \dots, v_n\}$. Suppose that $v \in V$. Then there is a unique expansion $v = c_1 v_1 + \cdots + c_n v_n$ with $c_i$ scalars in the base field. The coordinate vector of $v$ with respect to the basis $\beta$ is $(v)_\beta = (c_1, \dots, c_n)^T \in M_{n \times 1}$.

2. The transition matrix between bases. Suppose that $V$ is a vector space, and $\beta = \{v_1, \dots, v_n\}$, $\beta' = \{w_1, \dots, w_n\}$ are bases of $V$. The transition matrix $M_\beta^{\beta'}$ from the basis $\beta$ to the basis $\beta'$ is the unique $n \times n$ matrix which has the property that
$$M_\beta^{\beta'} (v)_\beta = (v)_{\beta'} \quad\text{for all } v \in V.$$
It follows that
$$M_\beta^{\beta'} = ((v_1)_{\beta'}, (v_2)_{\beta'}, \dots, (v_n)_{\beta'}).$$
We have that $M_{\beta'}^{\beta} = (M_\beta^{\beta'})^{-1}$, and if $\beta''$ is a third basis of $V$, then $M_{\beta'}^{\beta''} M_\beta^{\beta'} = M_\beta^{\beta''}$. The $n \times 2n$ matrix $(w_1, w_2, \dots, w_n, v_1, \dots, v_n)$ is transformed by elementary row operations into the reduced row echelon form $(I_n, M_\beta^{\beta'})$.

3. The matrix of a linear map. Suppose that $F : V \to W$ is a linear map. Let $\beta = \{v_1, \dots, v_n\}$ be a basis of $V$, and $\beta' = \{w_1, \dots, w_m\}$ be a basis of $W$. The matrix $M_\beta^{\beta'}(F)$ of the linear map $F$ with respect to the bases $\beta$ of $V$ and $\beta'$ of $W$ is the unique $m \times n$ matrix which has the property that
$$M_\beta^{\beta'}(F)(v)_\beta = (F(v))_{\beta'} \quad\text{for all } v \in V.$$
It follows that
$$M_\beta^{\beta'}(F) = ((F(v_1))_{\beta'}, (F(v_2))_{\beta'}, \dots, (F(v_n))_{\beta'}).$$
If $F$ is the identity map $\mathrm{id}$ (so that $V = W$), then $M_\beta^{\beta'}(\mathrm{id})$ is the transition matrix $M_\beta^{\beta'}$ defined above. Suppose that $G : W \to X$ is a linear map, and $\beta''$ is a basis of $X$. The composition $G \circ F : V \to X$ of $F$ and $G$ can be represented by the diagram
$$V \xrightarrow{F} W \xrightarrow{G} X.$$
We have
$$M_\beta^{\beta''}(G \circ F) = M_{\beta'}^{\beta''}(G)\, M_\beta^{\beta'}(F).$$
A particularly important application of this formula is
$$M_{\beta'}^{\beta'}(F) = S^{-1} M_\beta^{\beta}(F)\, S,$$
where $F : V \to V$ is linear, $\beta$ and $\beta'$ are bases of $V$, and $S = M_{\beta'}^{\beta}$. A convenient method for computing $M_\beta^{\beta'}(F)$ is the following. Let $\beta''$ be a basis of $W$ which is easy to compute with (such as a standard basis of $W$). The $m \times (m+n)$ matrix
$$((w_1)_{\beta''}, (w_2)_{\beta''}, \dots, (w_m)_{\beta''}, (F(v_1))_{\beta''}, \dots, (F(v_n))_{\beta''})$$
is transformed by elementary row operations into the reduced row echelon form $(I_m, M_\beta^{\beta'}(F))$.
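The row-reduction recipe of 2 above can be carried out directly; the following sympy sketch computes a transition matrix for two made-up bases of $\mathbb{R}^2$ (so the coordinate vectors are the vectors themselves) and checks the defining property.

```python
# The transition matrix M_beta^beta' via row reduction of (w_1 w_2 | v_1 v_2)
# into (I_2 | M).  The two bases of R^2 are illustrative examples.
from sympy import Matrix, eye

v1, v2 = Matrix([1, 1]), Matrix([1, -1])     # basis beta
w1, w2 = Matrix([1, 0]), Matrix([1, 1])      # basis beta'
aug = w1.row_join(w2).row_join(v1).row_join(v2)
R, _ = aug.rref()
M = R[:, 2:]                                 # M_beta^beta'
assert R[:, :2] == eye(2)
# Defining property on v1: (v1)_beta = e_1, and M * e_1 must equal (v1)_beta'.
assert M * Matrix([1, 0]) == Matrix([0, 1])  # since v1 = 0*w1 + 1*w2
print(M)
```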

Inner Product Spaces

1. The Orthogonal Space. Suppose that $A$ is an $m \times n$ matrix with coefficients in a field $F$. $R(A)$ is the column space of $A$, and $N(A)$ is the solution space to $Ax = 0$. $R(A)$ is a subspace of $F^m$, which has a nondegenerate inner product given by the dot product $v \cdot w = v^T w$ for $v, w \in F^m$. For $x \in F^n$, we have the formula
$$Ax = \begin{pmatrix} A_1^T x \\ \vdots \\ A_m^T x \end{pmatrix},$$
where $A_1, \dots, A_m \in F^n$ are the rows of $A$ written as column vectors, and thus we have the formulas
$$N(A) = [R(A^T)]^\perp \quad\text{and}\quad N(A^T) = R(A)^\perp.$$

2. Pythagoras's Theorem. Suppose that $V$ is a finite dimensional real vector space with positive definite inner product $\langle\ ,\ \rangle$. Suppose that $v, w \in V$ and $v \perp w$. Then
$$\|v + w\|^2 = \|v\|^2 + \|w\|^2.$$

3. The Gram Schmidt Process. Suppose that $V$ is a finite dimensional real vector space with positive definite inner product $\langle\ ,\ \rangle$, and that $\{x_1, \dots, x_n\}$ is a basis of $V$. Let
$$v_1 = x_1, \qquad u_1 = \frac{v_1}{\|v_1\|},$$
$$v_2 = x_2 - \langle x_2, u_1\rangle u_1, \qquad u_2 = \frac{v_2}{\|v_2\|},$$
$$v_3 = x_3 - \langle x_3, u_1\rangle u_1 - \langle x_3, u_2\rangle u_2, \qquad u_3 = \frac{v_3}{\|v_3\|},$$
and so on. Then $\{u_1, u_2, \dots, u_n\}$ is an orthonormal (ON) basis of $V$.

4. Coordinate Vector With Respect to an ON Basis. Suppose that $V$ is a finite dimensional real vector space with positive definite inner product $\langle\ ,\ \rangle$. Suppose that $\beta = \{u_1, \dots, u_n\}$ is an ON basis of $V$ and $v \in V$. Then
$$(v)_\beta = (\langle v, u_1\rangle, \langle v, u_2\rangle, \dots, \langle v, u_n\rangle)^T.$$

5. Projection Onto a Subspace. Suppose that $V$ is a finite dimensional real vector space with positive definite inner product $\langle\ ,\ \rangle$, and that $W$ is a subspace of $V$. Then $V = W \oplus W^\perp$; that is, every element $v \in V$ has a unique decomposition $v = w + w'$ with $w \in W$ and $w' \in W^\perp$. This allows us to define the projection $\pi_W : V \to W$ by $\pi_W(v) = w$. $\pi_W$ is a linear map onto $W$. $\pi_W(v)$ is the element of $W$ which is the closest to $v$; if $x \in W$ and $x \neq \pi_W(v)$ then $\|v - \pi_W(v)\| < \|v - x\|$.
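Here is a short numpy implementation of the Gram Schmidt process of 3 for the dot product on $\mathbb{R}^n$; numpy is assumed available and the starting basis is an arbitrary example.

```python
# Gram Schmidt: v_k = x_k - sum_j <x_k, u_j> u_j, then u_k = v_k / ||v_k||.
import numpy as np

def gram_schmidt(X):
    """Columns of X are a basis of R^n; returns a matrix with ON columns."""
    U = []
    for x in X.T:                          # iterate over the columns of X
        v = x.astype(float)
        for u in U:
            v = v - np.dot(x, u) * u       # subtract the component of x along u
        U.append(v / np.linalg.norm(v))    # normalize
    return np.column_stack(U)

X = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
U = gram_schmidt(X)
assert np.allclose(U.T @ U, np.eye(3))     # the columns form an ON basis
```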

$\pi_W(v)$ can be computed as follows. Let $\{u_1, \dots, u_s\}$ be an orthonormal basis of $W$. For $v \in V$, let $c_i = \langle v, u_i\rangle$ be the component of $v$ along $u_i$. Then
$$\pi_W(v) = \sum_{k=1}^{s} c_k u_k.$$
Now let us restrict to the case where $V$ is $\mathbb{R}^n$ with $\langle v, w\rangle = v^T w$. Let $\{u_1, \dots, u_s\}$ be an orthonormal basis of $W$. Let $U$ be the $n \times s$ matrix $U = (u_1, \dots, u_s)$. Let $A = U U^T$, an $n \times n$ matrix. The linear map $L_A : \mathbb{R}^n \to \mathbb{R}^n$ is the projection $\pi_W : \mathbb{R}^n \to W$, followed by inclusion of $W$ into $\mathbb{R}^n$.

6. Orthogonal Matrices. An $n \times n$ real matrix $A$ is orthogonal if $A^T A = I_n$. In the following theorem, we view $\mathbb{R}^n$ as an inner product space with the dot product.

Theorem 0.1. The following are equivalent for an $n \times n$ real matrix $A$:
1) $A$ is an orthogonal matrix.
2) $A^T = A^{-1}$.
3) The linear map $L_A : \mathbb{R}^n \to \mathbb{R}^n$ preserves length ($\|Ax\| = \|x\|$ for all $x \in \mathbb{R}^n$).
4) The columns of $A$ form an ON basis of $\mathbb{R}^n$.

7. Least Squares Solutions. Let $A$ be an $m \times n$ real matrix and $b \in \mathbb{R}^m$. A least squares solution of the system $Ax = b$ is a vector $x = \hat{x} \in \mathbb{R}^n$ which minimizes $\|b - Ax\|$ for $x \in \mathbb{R}^n$. The least squares solutions of the system $Ax = b$ are the solutions to the (consistent) system $A^T A x = A^T b$.

8. Fourier Series. Let $C[-\pi, \pi]$ be the continuous (real valued) functions on $[-\pi, \pi]$. $C[-\pi, \pi]$ is a real vector space, with the positive definite inner product
$$\langle f, g\rangle = \frac{1}{\pi}\int_{-\pi}^{\pi} f(t) g(t)\, dt.$$
The norm of $f \in C[-\pi, \pi]$ is defined by $\|f\|^2 = \langle f, f\rangle$. For a positive integer $n$, let $T_n$ be the subspace of $C[-\pi, \pi]$ which has the orthonormal basis
$$\left\{\tfrac{1}{\sqrt{2}}, \sin(t), \cos(t), \sin(2t), \cos(2t), \dots, \sin(nt), \cos(nt)\right\}.$$
Suppose that $f \in C[-\pi, \pi]$. The projection of $f$ on $T_n$ ($T$ is for "trigonometric functions") is
$$f_n(t) = a_0 + b_1 \sin(t) + c_1 \cos(t) + \cdots + b_n \sin(nt) + c_n \cos(nt)$$
where
$$a_0 = \left\langle f(t), \tfrac{1}{\sqrt{2}}\right\rangle \tfrac{1}{\sqrt{2}} = \frac{1}{2\pi}\int_{-\pi}^{\pi} f(t)\, dt$$
and for $1 \le k \le n$,
$$b_k = \langle f(t), \sin(kt)\rangle = \frac{1}{\pi}\int_{-\pi}^{\pi} f(t)\sin(kt)\, dt,$$
$$c_k = \langle f(t), \cos(kt)\rangle = \frac{1}{\pi}\int_{-\pi}^{\pi} f(t)\cos(kt)\, dt.$$
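The projection matrix of 5 and the normal equations of 7 translate directly into numpy; the data in the sketch below are illustrative only.

```python
# 5: the projection onto W is U U^T; 7: least squares via A^T A x = A^T b.
import numpy as np

U = np.array([[1.0, 0.0],                 # two orthonormal columns spanning W in R^3
              [0.0, 1.0],
              [0.0, 0.0]])
P = U @ U.T                               # the matrix of pi_W followed by inclusion
v = np.array([1.0, 2.0, 3.0])
print(P @ v)                              # [1, 2, 0]: the point of W closest to v

A = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])
x_hat = np.linalg.solve(A.T @ A, A.T @ b) # the (here unique) least squares solution
print(x_hat)                              # agrees with np.linalg.lstsq(A, b, rcond=None)[0]
```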

$a_0$, $b_k$, $c_k$ are called the Fourier coefficients of $f$. $f_n$ is the best approximation of $f$ in $T_n$, in the sense that $g = f_n$ minimizes $\|f - g\|$ for $g \in T_n$. The infinite series
$$g(t) = a_0 + b_1 \sin(t) + c_1 \cos(t) + \cdots + b_n \sin(nt) + c_n \cos(nt) + \cdots$$
converges to $f$ on the interval $[-\pi, \pi]$. $g(t)$ is defined everywhere on $\mathbb{R}$. $g(t)$ is periodic of period $2\pi$; that is, $g(a + 2\pi) = g(a)$ for all $a \in \mathbb{R}$. Thus in general, $g(t)$ will only be equal to $f(t)$ on the interval $[-\pi, \pi]$. There is an infinite Parseval's formula,
$$\|f\|^2 = 2a_0^2 + b_1^2 + c_1^2 + \cdots.$$

9. Extension of a set of LI vectors to a basis of $\mathbb{R}^n$. Let $v_1, \dots, v_s$ be a set of linearly independent vectors in $\mathbb{R}^n$. Let
$$A = \begin{pmatrix} v_1^T \\ v_2^T \\ \vdots \\ v_s^T \end{pmatrix}.$$
Let $\{v_{s+1}, \dots, v_n\}$ be a basis of $N(A)$ (which can be computed using Gaussian elimination). Then $\{v_1, \dots, v_s, v_{s+1}, \dots, v_n\}$ is a basis of $\mathbb{R}^n$.

Warning: This algorithm does not work in $\mathbb{C}^n$. The reason is that $\mathbb{C}^n$ might not be equal to $W \oplus W^\perp$ if $W$ is a subspace of $\mathbb{C}^n$. To fix this problem, we need the notion of Hermitian inner product. An extension of this algorithm that works over any subfield $F$ of $\mathbb{C}$ will be given below in 11.
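The Fourier coefficients of 8 can also be computed numerically; the sketch below uses scipy.integrate.quad (assumed available) and the test function $f(t) = t^2$, which is an illustrative choice and not an example from the notes.

```python
# Fourier coefficients with respect to <f, g> = (1/pi) * integral of f*g on [-pi, pi].
import numpy as np
from scipy.integrate import quad

f = lambda t: t**2
a0 = quad(f, -np.pi, np.pi)[0] / (2 * np.pi)                                  # constant term
b = lambda k: quad(lambda t: f(t) * np.sin(k * t), -np.pi, np.pi)[0] / np.pi
c = lambda k: quad(lambda t: f(t) * np.cos(k * t), -np.pi, np.pi)[0] / np.pi

def f_n(t, n=5):
    """The projection of f onto T_n, evaluated at t."""
    return a0 + sum(b(k) * np.sin(k * t) + c(k) * np.cos(k * t) for k in range(1, n + 1))

print(a0, c(1), f_n(1.0))   # for t^2: a0 = pi^2/3, c_1 = -4, and every b_k = 0
```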

10. Hermitian Inner Product Spaces. Suppose that $V$ is a complex vector space. Then the notion of positive definite inner product is generalized to that of Hermitian inner product (warning: a Hermitian inner product is not a bilinear form, so it is not a nondegenerate inner product). The dot product on $\mathbb{C}^n$ is not Hermitian. $\mathbb{C}^n$ has the Hermitian inner product $\langle v, w\rangle = v^T \bar{w}$ for $v, w \in \mathbb{C}^n$ (here $\bar{w}$ is the complex conjugate of $w$). The statements of 1 through 9 above all generalize to Hermitian inner products (with $\mathbb{R}$ replaced by $\mathbb{C}$). The statement of 1 for a complex matrix $A$ becomes
$$Ax = \begin{pmatrix} A_1^T x \\ \vdots \\ A_m^T x \end{pmatrix} = \begin{pmatrix} \langle x, \bar{A}_1\rangle \\ \vdots \\ \langle x, \bar{A}_m\rangle \end{pmatrix},$$
where $A_1, \dots, A_m$ are the rows of $A$ written as column vectors, and thus
$$[R(\bar{A}^T)]^\perp = N(A) \quad\text{and}\quad [R(A)]^\perp = N(\bar{A}^T).$$
The projection matrix $A$ of 5 becomes $A = U\bar{U}^T$. The orthogonal matrix defined in 6 generalizes to a unitary matrix. An $n \times n$ complex matrix $A$ is unitary if $\bar{A}^T A = I_n$. The criterion of 2) of the theorem of 6 then becomes $\bar{A}^T = A^{-1}$. The least squares solutions to $Ax = b$ are the solutions to $\bar{A}^T A x = \bar{A}^T b$. The inner product in 8 becomes
$$\langle f, g\rangle = \frac{1}{\pi}\int_{-\pi}^{\pi} f(t)\overline{g(t)}\, dt.$$

11. Extension of a set of LI vectors to a basis of $F^n$. Let $F$ be a subfield of $\mathbb{C}$, and let $v_1, \dots, v_s$ be a set of linearly independent vectors in $F^n$. Let
$$A = \begin{pmatrix} \bar{v}_1^T \\ \bar{v}_2^T \\ \vdots \\ \bar{v}_s^T \end{pmatrix}.$$
Let $\{v_{s+1}, \dots, v_n\}$ be a basis of $N(A)$ (which can be computed using Gaussian elimination). Then $\{v_1, \dots, v_s, v_{s+1}, \dots, v_n\}$ is a basis of $F^n$.
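A brief numpy check of the Hermitian analogues: the projection matrix picks up a complex conjugate and the normal equations use the conjugate transpose. The complex data below are illustrative only.

```python
# Hermitian versions: projection U conj(U)^T and least squares via A^* A x = A^* b.
import numpy as np

u = np.array([1.0, 1j]) / np.sqrt(2)        # a unit vector in C^2
U = u.reshape(-1, 1)
P = U @ U.conj().T                          # projection onto the line spanned by u
v = np.array([1.0 + 0j, 0.0])
w = P @ v
assert np.allclose(P @ w, w)                # P is idempotent
assert np.isclose(np.vdot(u, v - w), 0)     # v - pi_W(v) is orthogonal to u

A = np.array([[1.0, 1j], [0.0, 1.0], [1j, 0.0]])
b = np.array([1.0, 1j, 0.0])
x_hat = np.linalg.solve(A.conj().T @ A, A.conj().T @ b)
print(x_hat)
```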

Eigenvalues and Diagonalization

1. Eigenvalues and Eigenvectors. Suppose that $A \in M_{n \times n}(F)$ is an $n \times n$ matrix. $\lambda \in F$ is an eigenvalue of $A$ if there exists a nonzero vector $v \in F^n$ such that $Av = \lambda v$. Such a $v$ is called an eigenvector of $A$ with eigenvalue $\lambda$. For $\lambda \in F$,
$$E(\lambda) = \{v \in F^n \mid Av = \lambda v\}$$
is a subspace of $F^n$. $\lambda$ is an eigenvalue of $A$ if and only if $E(\lambda) \neq \{0\}$. The nonzero elements of $E(\lambda)$ are the eigenvectors of $A$ with eigenvalue $\lambda$. If $\lambda$ is an eigenvalue of $A$, then $E(\lambda)$ is called an eigenspace of $A$. Thus $A$ is not invertible if and only if $\lambda = 0$ is an eigenvalue of $A$. The eigenspace $E(\lambda)$ is the solution space $N(A - \lambda I_n)$.

2. The Characteristic Polynomial. The characteristic polynomial of $A \in M_{n \times n}(F)$ is $P_A(t) = \operatorname{Det}(tI_n - A)$. Observe that $P_A(t) = (-1)^n \operatorname{Det}(A - tI_n)$. The roots of $P_A(t) = 0$ are the eigenvalues of $A$.

3. Diagonalization of Matrices. Suppose that $A \in M_{n \times n}(F)$. We say that $A$ is diagonalizable (over $F$) if $A$ is similar to a diagonal matrix; that is, there exists an invertible $n \times n$ matrix $B \in M_{n \times n}(F)$ such that $B^{-1}AB = D$ is a diagonal matrix. Let $\beta = \{e_1, \dots, e_n\}$ be the standard basis of $F^n$. By 2, we have that a matrix $A \in M_{n \times n}(F)$ has only finitely many distinct eigenvalues, say $\lambda_1, \lambda_2, \dots, \lambda_r$. Suppose that $\dim E(\lambda_i) = s_i$ for $1 \le i \le r$. For $1 \le i \le r$, let $v_{i,1}, \dots, v_{i,s_i}$ be a basis of $E(\lambda_i)$. Then $v_{1,1}, \dots, v_{1,s_1}, v_{2,1}, \dots, v_{r,s_r}$ is a linearly independent set of vectors. (It can be proven that if $w_1, \dots, w_s$ are eigenvectors for $A$ with distinct eigenvalues, then $w_1 + w_2 + \cdots + w_s = 0$ implies $w_i = 0$ for all $i$.) If they form a basis $\beta'$ of $F^n$, then we have an equation
$$M_{\beta}^{\beta'} A\, M_{\beta'}^{\beta} = D = \begin{pmatrix} \lambda_1 & & & \\ & \lambda_2 & & \\ & & \ddots & \\ & & & \lambda_r \end{pmatrix},$$
where all nondiagonal entries of $D$ are zero (each $\lambda_i$ appears $s_i$ times on the diagonal). Thus we have diagonalized $A$. The matrix
$$M_{\beta'}^{\beta} = (v_{1,1}, \dots, v_{1,s_1}, v_{2,1}, \dots, v_{r,s_r})$$
and $M_{\beta}^{\beta'} = (M_{\beta'}^{\beta})^{-1}$. Working backwards through this construction, we see that an $n \times n$ matrix $A$ is diagonalizable over $F$ if and only if $F^n$ has a basis of eigenvectors of $A$. In summary, we always have that $s_1 + \cdots + s_r \le n$, and $A$ is diagonalizable if and only if $s_1 + \cdots + s_r = n$.

4. Eigenvalues and Diagonalization of Operators. Everything above generalizes to an operator $T : V \to V$, where $V$ is an $n$ dimensional vector space over a field $F$. $\lambda \in F$ is an eigenvalue of $T$ if there exists a nonzero vector $v \in V$ such that $Tv = \lambda v$. Such a $v$ is called an eigenvector of $T$ with eigenvalue $\lambda$. We can then form the eigenspace $E(\lambda)$ of an eigenvalue $\lambda$ of $T$, which is a subspace of $V$. Suppose that $\beta$ is a basis of $V$. Then we can compute the matrix $M_\beta^\beta(T)$ of $T$ with respect to the basis $\beta$.
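The diagonalization algorithm of 3 is easy to carry out in sympy: collect bases of the eigenspaces into the columns of a matrix and conjugate. The $2 \times 2$ matrix in the sketch below is an arbitrary diagonalizable example.

```python
# Diagonalization: B = (eigenvector columns), then B^{-1} A B = D.
from sympy import Matrix, diag

A = Matrix([[4, 1], [2, 3]])
cols, lams = [], []
for lam, mult, vecs in A.eigenvects():   # (eigenvalue, algebraic multiplicity, basis of E(lambda))
    for v in vecs:
        cols.append(v)
        lams.append(lam)
B = Matrix.hstack(*cols)                 # columns are the eigenvectors
D = diag(*lams)
assert B.inv() * A * B == D              # A has been diagonalized
print(D)                                 # the eigenvalues 2 and 5 on the diagonal
```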

Further, we can compute the characteristic polynomial $P_{M_\beta^\beta(T)}(t)$ of $M_\beta^\beta(T)$. This polynomial is independent of the choice of basis $\beta$ of $V$. Thus we can define the characteristic polynomial of $T$ to be $P_T(t) = P_{M_\beta^\beta(T)}(t)$, computed from any choice of basis $\beta$ of $V$. We have that the roots of $P_T(t) = 0$ are the eigenvalues of $T$. We say that $T$ is diagonalizable if there exists a basis $\beta$ of $V$ consisting of eigenvectors of $T$. In this case, the matrix $M_\beta^\beta(T)$ is a diagonal matrix.

5. Diagonalization of Real Symmetric Matrices. Suppose that $A \in M_{n \times n}(\mathbb{R})$ is a symmetric matrix. Then the spectral theorem tells us that all eigenvalues of $A$ are real and that $\mathbb{R}^n$ has a basis of eigenvectors of $A$. Further, eigenvectors with distinct eigenvalues are perpendicular. Thus $\mathbb{R}^n$ has an orthonormal basis of eigenvectors. This means that we may refine our diagonalization algorithm of 3, adding an extra step, using Gram Schmidt to obtain an ON basis $u_{i,1}, \dots, u_{i,s_i}$ of $E(\lambda_i)$ from the basis $v_{i,1}, \dots, v_{i,s_i}$. Since eigenvectors with distinct eigenvalues are perpendicular, we may put all of these ON sets of vectors together to obtain an ON basis $u_{1,1}, \dots, u_{1,s_1}, u_{2,1}, \dots, u_{r,s_r}$ of $\mathbb{R}^n$. Let $U = (u_{1,1}, \dots, u_{1,s_1}, u_{2,1}, \dots, u_{r,s_r})$. $U$ is an orthogonal matrix, so that $U^{-1} = U^T$. We have orthogonally diagonalized $A$:
$$U^T A U = D = \begin{pmatrix} \lambda_1 & & & \\ & \lambda_2 & & \\ & & \ddots & \\ & & & \lambda_r \end{pmatrix},$$
where all nondiagonal entries of $D$ are zero, and $U$ is an orthogonal matrix.
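For a real symmetric matrix, numpy's `eigh` already returns an orthonormal basis of eigenvectors, so the orthogonal diagonalization of 5 is a single call; the matrix below is an illustrative symmetric example.

```python
# Orthogonal diagonalization of a real symmetric matrix: U^T A U = D, U^{-1} = U^T.
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])
eigvals, U = np.linalg.eigh(A)             # columns of U: an ON basis of eigenvectors
assert np.allclose(U.T @ U, np.eye(3))     # U is orthogonal
assert np.allclose(U.T @ A @ U, np.diag(eigvals))
print(eigvals)                             # [1., 3., 3.]
```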

6. Triangularization of Matrices. Suppose that $A \in M_{n,n}(F)$. A triangularization of $A$ (over $F$) is a factorization $P^{-1}AP = T$ where $P, T \in M_{n,n}(F)$, $P$ is invertible and $T$ is upper triangular. $A$ is triangularizable over $F$ if and only if all of the eigenvalues of $A$ are in $F$ (this will always be true if $F = \mathbb{C}$ is the complex numbers). The following algorithm produces a triangularization of $A$.

1. Let $v_1, \dots, v_s$ be a maximal set of linearly independent eigenvectors for $A$, with respective eigenvalues $\lambda_1, \lambda_2, \dots, \lambda_s$. Extend $\{v_1, \dots, v_s\}$ to a basis $v_1, \dots, v_n$ of $F^n$. (This can be done for any $F$ by algorithm 5 in Matrices and applications, or by algorithm 9 or its extension algorithm in Inner product spaces if $F$ is contained in $\mathbb{R}$ or $\mathbb{C}$.) Then $P = (v_1, v_2, \dots, v_n)$ satisfies
$$P^{-1}AP = \begin{pmatrix} \lambda_1 & & & \ast \\ & \ddots & & \\ & & \lambda_s & \\ 0 & & & B \end{pmatrix},$$
where $B$ is an $(n-s) \times (n-s)$ matrix.
2. The eigenvalues of $B$ are a subset of the eigenvalues of $A$.
3. If $Q^{-1}BQ = S$ is an upper triangular matrix ($Q$ triangularizes $B$), then
$$P_2 = \begin{pmatrix} I_s & 0_{s \times (n-s)} \\ 0_{(n-s) \times s} & Q \end{pmatrix}$$
and $P P_2$ triangularizes $A$.
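The algorithm above builds the invertible $P$ column by column; in floating point practice one usually computes the closely related Schur form instead, where the change of basis is even unitary. A scipy sketch (a standard routine, not the notes' algorithm), using an example rotation matrix:

```python
# Complex Schur form: Z is unitary and Z^{-1} A Z = T is upper triangular over C.
import numpy as np
from scipy.linalg import schur

A = np.array([[0.0, -1.0],
              [1.0,  0.0]])                 # rotation by 90 degrees: eigenvalues +i, -i
T, Z = schur(A, output='complex')
assert np.allclose(Z @ T @ Z.conj().T, A)   # A = Z T Z^*
assert np.allclose(np.tril(T, -1), 0)       # T is upper triangular
print(np.diag(T))                           # the eigenvalues of A appear on the diagonal
```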

Jordan Form

For $\lambda \in \mathbb{C}$, the Jordan block $B_n(\lambda)$ is the $n \times n$ matrix
$$B_n(\lambda) = \begin{pmatrix} \lambda & 1 & & & \\ & \lambda & 1 & & \\ & & \ddots & \ddots & \\ & & & \lambda & 1 \\ & & & & \lambda \end{pmatrix}.$$
$B_n(\lambda)$ has the characteristic polynomial
$$P_{B_n(\lambda)}(t) = \operatorname{Det}(tI_n - B_n(\lambda)) = (t - \lambda)^n.$$
The only eigenvalue of $B_n(\lambda)$ is $\lambda$. The eigenspace of $\lambda$ for $B_n(\lambda)$ is the solution space to $(B_n(\lambda) - \lambda I_n)X = 0$, where
$$B_n(\lambda) - \lambda I_n = \begin{pmatrix} 0 & 1 & & & \\ & 0 & 1 & & \\ & & \ddots & \ddots & \\ & & & 0 & 1 \\ & & & & 0 \end{pmatrix}.$$
So the solutions are $x_2 = x_3 = \cdots = x_n = 0$, and a basis of the eigenspace $E(\lambda)$ of $B_n(\lambda)$ consists of the single vector $e_1 = (1, 0, \dots, 0)^T$. In particular, $B_n(\lambda)$ is diagonalizable if and only if $n = 1$. In this special case, $B_1(\lambda) = (\lambda)$.

A Jordan Matrix $J$ is a matrix
$$J = \begin{pmatrix} B_{n_{11}}(\lambda_1) & & & & 0 \\ & \ddots & & & \\ & & B_{n_{1r_1}}(\lambda_1) & & \\ & & & B_{n_{21}}(\lambda_2) & \\ 0 & & & & \ddots \end{pmatrix},$$
where $J$ is a block (partitioned) matrix whose diagonal elements are the Jordan blocks $B_{n_{ij}}(\lambda_i)$, for $1 \le i \le s$ and $1 \le j \le r_i$, the last block being $B_{n_{sr_s}}(\lambda_s)$. Set $t_i = n_{i1} + n_{i2} + \cdots + n_{ir_i}$ for $1 \le i \le s$. $J$ is an $n \times n$ matrix where $n = t_1 + t_2 + \cdots + t_s$. The characteristic polynomial of $J$ is
$$P_J(t) = \operatorname{Det}(tI_n - J) = (t - \lambda_1)^{t_1}(t - \lambda_2)^{t_2}\cdots(t - \lambda_s)^{t_s}.$$
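A quick sympy check of the Jordan block facts: $B_n(\lambda)$ has characteristic polynomial $(t-\lambda)^n$ but a one dimensional eigenspace, so it is not diagonalizable for $n > 1$. Here $n = 3$ and $\lambda = 2$, chosen arbitrarily.

```python
# The Jordan block B_3(2): lambda on the diagonal, 1 on the superdiagonal.
from sympy import Matrix, symbols, factor

t = symbols('t')
B = Matrix(3, 3, lambda i, j: 2 if i == j else (1 if j == i + 1 else 0))
print(factor(B.charpoly(t).as_expr()))      # (t - 2)**3
E = (B - 2 * Matrix.eye(3)).nullspace()     # the eigenspace E(2)
assert len(E) == 1                          # dim E(2) = 1: a single Jordan block
assert not B.is_diagonalizable()
```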

The eigenvalues of $J$ are $\lambda_1, \dots, \lambda_s$. Let $e(i)$ be the column vector of length $n$ with a $1$ in the $i$th place and zeros everywhere else. A basis for $E(\lambda_1)$ is
$$\{e(1), e(n_{11} + 1), \dots, e(n_{11} + \cdots + n_{1,r_1-1} + 1)\}.$$
A basis for $E(\lambda_2)$ is
$$\{e(t_1 + 1), \dots, e(t_1 + n_{21} + \cdots + n_{2,r_2-1} + 1)\},$$
and a basis of $E(\lambda_s)$ is
$$\{e(t_1 + \cdots + t_{s-1} + 1), \dots, e(t_1 + \cdots + t_{s-1} + n_{s1} + \cdots + n_{s,r_s-1} + 1)\}.$$
In particular, $E(\lambda_i)$ has dimension $r_i$, the number of Jordan blocks of $J$ with eigenvalue $\lambda_i$.

Example 0.1. [Displayed: a Jordan matrix $A$ with 3 Jordan blocks, one of which is the $1 \times 1$ block $(2)$.]

Theorem 0.2. Every square matrix $A$ with complex coefficients is similar to a Jordan Matrix $J$; that is, there is an invertible complex matrix $C$ such that
$$J = C^{-1}AC.$$
$J$ is called a Jordan form of $A$. The Jordan form of a matrix $A$ is uniquely determined, up to permuting the Jordan blocks of a Jordan form.

This theorem fails over the reals. Even if $A$ is a real matrix, it will in general not be similar to a real Jordan matrix. The essential point that makes everything work out over the complex numbers is the fundamental theorem of algebra, which states that a nonconstant polynomial with complex coefficients has a complex root, so that it must factor into a product of linear factors (with complex coefficients). Thus every complex matrix has a complex eigenvalue (since the characteristic polynomial must have a complex root). However, there are real matrices which do not have a real eigenvalue.

Example 0.2. Suppose that $P_A(t) = (t-2)^2(t+3)^2$. Then $A$ has (up to permuting Jordan blocks) one of the following Jordan forms:
$$F_1 = \begin{pmatrix} 2 & 0 & 0 & 0 \\ 0 & 2 & 0 & 0 \\ 0 & 0 & -3 & 0 \\ 0 & 0 & 0 & -3 \end{pmatrix}, \qquad F_2 = \begin{pmatrix} 2 & 1 & 0 & 0 \\ 0 & 2 & 0 & 0 \\ 0 & 0 & -3 & 0 \\ 0 & 0 & 0 & -3 \end{pmatrix},$$
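sympy computes Jordan forms directly, which gives a quick way to check examples like these; `jordan_form()` returns $C$ and $J$ with $A = CJC^{-1}$. The $2 \times 2$ matrix below is an arbitrary example with $P_A(t) = (t-2)^2$ and $\dim E(2) = 1$.

```python
# Theorem 0.2 in practice: the Jordan form of a non-diagonalizable example.
from sympy import Matrix

A = Matrix([[3, 1],
            [-1, 1]])                 # P_A(t) = (t - 2)^2, but E(2) is one dimensional
C, J = A.jordan_form()
assert A == C * J * C.inv()
print(J)                              # Matrix([[2, 1], [0, 2]]), i.e. the block B_2(2)
```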

$$F_3 = \begin{pmatrix} 2 & 0 & 0 & 0 \\ 0 & 2 & 0 & 0 \\ 0 & 0 & -3 & 1 \\ 0 & 0 & 0 & -3 \end{pmatrix}, \qquad F_4 = \begin{pmatrix} 2 & 1 & 0 & 0 \\ 0 & 2 & 0 & 0 \\ 0 & 0 & -3 & 1 \\ 0 & 0 & 0 & -3 \end{pmatrix}.$$

Suppose that $A$ is an $n \times n$ matrix with complex coefficients. Let $J$ be a Jordan form of $A$ (with all of the above notation), so that $P_A(t) = P_J(t)$. There is a factorization
$$P_A(t) = (t - \lambda_1)^{t_1}(t - \lambda_2)^{t_2}\cdots(t - \lambda_s)^{t_s}$$
where the $\lambda_i$ are the distinct complex eigenvalues of $A$, and $t_1 + t_2 + \cdots + t_s = n$. The algebraic multiplicity of $A$ for $\lambda_i$ is $t_i$, and the geometric multiplicity of $A$ for $\lambda_i$ is $\dim E(\lambda_i)$, the dimension of the eigenspace of $\lambda_i$ for $A$. For each eigenvalue $\lambda_i$ of $A$, we have $\dim E(\lambda_i) \le t_i$. $A$ is diagonalizable if and only if we have equality of the algebraic and geometric multiplicities for all eigenvalues $\lambda_i$ of $A$.

A polynomial $f(t) \in \mathbb{C}[t]$ is monic if its leading coefficient is $1$; that is, $f(t)$ has the form
$$f(t) = t^n + a_{n-1}t^{n-1} + \cdots + a_0$$
with $a_0, a_1, \dots, a_{n-1} \in \mathbb{C}$. The minimal polynomial $q_A(t)$ of $A$ is the (unique) monic polynomial in $\mathbb{C}[t]$ which has the property that $q_A(A) = 0$, and if $f(t) \in \mathbb{C}[t]$ satisfies $f(A) = 0$, then $q_A(t)$ divides $f(t)$. If $B$ is similar to $A$, then $q_B(t) = q_A(t)$, so that $q_A(t) = q_J(t)$. Let $\varphi(i) = \max\{n_{ij} \mid 1 \le j \le r_i\}$. Then
$$q_A(t) = (t - \lambda_1)^{\varphi(1)}(t - \lambda_2)^{\varphi(2)}\cdots(t - \lambda_s)^{\varphi(s)}.$$
The Cayley-Hamilton theorem tells us that $P_A(A) = 0$, so that $q_A(t)$ divides $P_A(t)$. This gives us a method of computing $q_A(t)$.

Assuming that we are able to factor the characteristic polynomial of a matrix $A$, we can thus calculate fairly easily a lot of information about the Jordan form. For matrices of small size, just knowing the characteristic polynomial, the minimal polynomial, and the geometric multiplicities will often uniquely determine the Jordan form. Of course, this is not enough information to compute the Jordan form for general matrices!

Exercises on Jordan Form

1. Which of the following are Jordan matrices? [Displayed: four example matrices.]

2. What are the possible Jordan forms of $A$ (up to permutation of Jordan blocks) if $A$ has the given characteristic polynomial?
a) $P_A(t) = (t-4)^2\, t\, (t+2)^2$
b) $P_A(t) = (t-2)^3$

3. Suppose that $A$ is a $4 \times 4$ matrix with eigenvalues $2$ and $5$. Suppose that $E(2)$ has dimension $1$ and $E(5)$ has dimension $3$. What are the possible Jordan forms of $A$ (up to permutation of Jordan blocks)?
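The multiplicity and minimal polynomial bookkeeping above is easy to automate; the sympy sketch below factors the characteristic polynomial, reads off geometric multiplicities, and finds the minimal polynomial by testing divisors of $P_A(t)$, as licensed by the Cayley-Hamilton theorem. The $4 \times 4$ matrix $A$ is an arbitrary example with $P_A(t) = (t-2)^2(t+3)^2$.

```python
# Algebraic vs geometric multiplicities, and the minimal polynomial by trial division.
from sympy import Matrix, symbols, factor, eye, zeros

t = symbols('t')
A = Matrix([[2, 1, 0, 0],
            [0, 2, 0, 0],
            [0, 0, -3, 0],
            [0, 0, 0, -3]])
print(factor(A.charpoly(t).as_expr()))        # (t - 2)**2 * (t + 3)**2
for lam, alg_mult, vecs in A.eigenvects():
    print(lam, alg_mult, len(vecs))           # eigenvalue, t_i, dim E(lambda_i)

def annihilates(i, j):
    """Does (A - 2 I)^i (A + 3 I)^j vanish?"""
    return ((A - 2 * eye(4)) ** i * (A + 3 * eye(4)) ** j) == zeros(4, 4)

exponents = min((i, j) for i in range(1, 3) for j in range(1, 3) if annihilates(i, j))
print(exponents)                              # (2, 1), so q_A(t) = (t - 2)^2 (t + 3)
```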
