1. Elements of linear algebra

Contents

1 Solving systems of linear equations
2 Diagonal form of a square matrix
3 The Jordan normal form of a square matrix
4 The Gram-Schmidt orthogonalization process
5 The matrix exponential function

1 Solving systems of linear equations

Consider the following system of linear equations:

\[
\begin{aligned}
a_{11}x_1 + a_{12}x_2 + \dots + a_{1n}x_n &= b_1\\
a_{21}x_1 + a_{22}x_2 + \dots + a_{2n}x_n &= b_2\\
&\;\;\vdots\\
a_{m1}x_1 + a_{m2}x_2 + \dots + a_{mn}x_n &= b_m
\end{aligned}
\tag{1}
\]

Here $a_{ij} \in \mathbb{R}$, $b_i \in \mathbb{R}$. The system (1) can be written in the form

\[ A\,X = B, \tag{2} \]

where

\[
A = \begin{pmatrix} a_{11} & a_{12} & \dots & a_{1n}\\ a_{21} & a_{22} & \dots & a_{2n}\\ \vdots & & & \vdots\\ a_{m1} & a_{m2} & \dots & a_{mn} \end{pmatrix},\qquad
X = \begin{pmatrix} x_1\\ x_2\\ \vdots\\ x_n \end{pmatrix},\qquad
B = \begin{pmatrix} b_1\\ b_2\\ \vdots\\ b_m \end{pmatrix}.
\]

It is well known that if the system (1) (equivalently, (2)) is consistent, that is, there is a vector (a particular solution)

\[ X^{(0)} = \begin{pmatrix} x_1^{(0)}\\ x_2^{(0)}\\ \vdots\\ x_n^{(0)} \end{pmatrix} \]

satisfying (2), then each solution $X$ of (2) can be written in the form

\[ X = X^{(0)} + t_1 X^{(1)} + t_2 X^{(2)} + \dots + t_r X^{(r)}. \tag{3} \]

Here $t_1,\dots,t_r \in \mathbb{R}$ are free variables and $\{X^{(1)}, X^{(2)}, \dots, X^{(r)}\}$ is a fundamental system of solutions of the homogeneous system

\[ A\,X = 0, \tag{4} \]

\[ X^{(j)} = \begin{pmatrix} x_1^{(j)}\\ x_2^{(j)}\\ \vdots\\ x_n^{(j)} \end{pmatrix}, \qquad j = 1,\dots,r. \]

Thus, the general solution of (1) can be represented as follows:

\[
\begin{aligned}
x_1 &= x_1^{(0)} + t_1 x_1^{(1)} + t_2 x_1^{(2)} + \dots + t_r x_1^{(r)}\\
x_2 &= x_2^{(0)} + t_1 x_2^{(1)} + t_2 x_2^{(2)} + \dots + t_r x_2^{(r)}\\
&\;\;\vdots\\
x_n &= x_n^{(0)} + t_1 x_n^{(1)} + t_2 x_n^{(2)} + \dots + t_r x_n^{(r)}
\end{aligned}
\tag{5}
\]

The solution (5) can be found, for instance, with the help of the Gauss-Jordan algorithm.
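The decomposition (3) can be checked directly in Maple: NullSpace returns a fundamental system of solutions of (4), and adding any multiple of its vectors to a particular solution of (2) again solves (2). The following is a minimal sketch with assumed data (a small 2x3 system chosen only for illustration; it is not one of the worksheets below).

> with(LinearAlgebra):
> A := Matrix([[1,1,1],[1,2,3]]):            # assumed coefficient matrix
> B := Vector([6,14]):                       # assumed right-hand side
> N := NullSpace(A);                         # a fundamental system of solutions of (4); here r = 1
> X0 := LinearSolve(A, B, free='s'):         # general solution of (2) with free parameter s[3]
> map(simplify, A . (X0 + t*N[1]) - B);      # the zero vector: X0 + t*N[1] solves (2) for every t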

Example 1.

\[
\begin{aligned}
x_1 + 2x_2 + 3x_3 &= 10\\
4x_1 + 5x_2 + 6x_3 &= 11\\
7x_1 + 8x_2 + 9x_3 &= 12
\end{aligned}
\tag{6}
\]

The system (6) can be expressed in the matrix form (2); its augmented matrix is

\[ \begin{pmatrix} 1 & 2 & 3 & 10\\ 4 & 5 & 6 & 11\\ 7 & 8 & 9 & 12 \end{pmatrix}. \tag{7} \]

Using the Gauss-Jordan algorithm one can reduce (7) first to the row echelon form

\[ \begin{pmatrix} 1 & 2 & 3 & 10\\ 0 & -3 & -6 & -29\\ 0 & 0 & 0 & 0 \end{pmatrix} \tag{8} \]

and then to the reduced row echelon form

\[ \begin{pmatrix} 1 & 0 & -1 & -28/3\\ 0 & 1 & 2 & 29/3\\ 0 & 0 & 0 & 0 \end{pmatrix}. \tag{9} \]

The variable $x_3$ can be regarded as a free variable, $x_3 = t$. It follows from (9) that

\[ x_1 = -\tfrac{28}{3} + t,\qquad x_2 = \tfrac{29}{3} - 2t,\qquad x_3 = t \]

is the solution of (6).

> restart; with(LinearAlgebra):
> A := Matrix([[1,2,3],[4,5,6],[7,8,9]]);

A :=
  [ 1  2  3 ]
  [ 4  5  6 ]
  [ 7  8  9 ]

> GaussianElimination(A);

  [ 1   2   3 ]
  [ 0  -3  -6 ]
  [ 0   0   0 ]

> ReducedRowEchelonForm(<A>);

  [ 1  0  -1 ]
  [ 0  1   2 ]
  [ 0  0   0 ]

> B := Vector([10,11,12]);

B :=
  [ 10 ]
  [ 11 ]
  [ 12 ]

> X := LinearSolve(A, B, free='t');  # the general solution

X :=
  [ -28/3 + t[3]  ]
  [ 29/3 - 2 t[3] ]
  [ t[3]          ]

Example 2.

\[
\begin{aligned}
x_1 + 2x_2 + 3x_3 &= 10\\
4x_1 + 5x_2 + 6x_3 &= 11\\
7x_1 + 8x_2 + 9x_3 &= 13
\end{aligned}
\tag{10}
\]

If we try to solve this system with Maple we obtain the following result:

> with(LinearAlgebra):
> A := Matrix([[1,2,3],[4,5,6],[7,8,9]]);
> B := Vector([10,11,13]);
> X := LinearSolve(A, B, free='t');
Error, (in LinearAlgebra:-LA_Main:-LinearSolve) inconsistent system

Thus, the system (10) is inconsistent.
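A quick way to see why Example 2 fails is the Kronecker-Capelli criterion: the system (2) is consistent if and only if the rank of A equals the rank of the augmented matrix <A|B>. The following check is a small addition to the worksheet above, using the same data as Example 2.

> with(LinearAlgebra):
> A := Matrix([[1,2,3],[4,5,6],[7,8,9]]):
> B := Vector([10,11,13]):
> Rank(A), Rank(<A|B>);   # returns 2, 3: the ranks differ, hence the system (10) has no solutions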

2 Diagonal form of a square matrix

Consider a matrix $A = (a_{ij})_{n\times n}$, that is, a square $n \times n$ matrix with real (or complex) entries. The main objective is to reduce this matrix to a diagonal form. It means that one should find a square non-degenerate $n \times n$ matrix $C$, a transition matrix, and a matrix $D$ for which the equality

\[ C^{-1} A\, C = D \tag{11} \]

holds. Here $D$ is a diagonal matrix,

\[ D = \begin{pmatrix} \lambda_1 & & & \\ & \lambda_2 & & \\ & & \ddots & \\ & & & \lambda_n \end{pmatrix}. \]

Remark. Note that the problem of diagonalization can have no solution even if we consider complex matrices. For instance, a non-trivial Jordan cell such as

\[ M = \begin{pmatrix} 0 & 1\\ 0 & 0 \end{pmatrix} \]

is not diagonalizable. In general, there is a matrix equality of type (11) with a matrix $D$ having Jordan normal form. The diagonal form is a particular case of the Jordan normal form.

Eigenvectors and eigenvalues of a square matrix

Let us suppose that the matrices $C$ and $D$ exist for a given matrix $A$. In order to obtain the diagonal matrix $D$ we should find all eigenvalues of $A$. Then the matrix $C$ can be determined in terms of eigenvectors of $A$ (see the general case below).

Definition. A non-trivial vector

\[ X = \begin{pmatrix} x_1\\ x_2\\ \vdots\\ x_n \end{pmatrix}, \qquad X \ne 0, \]

is called an eigenvector of a square $n \times n$ matrix $A$ if there is a number $\lambda \in \mathbb{C}$ such that the equality

\[ A\,X = \lambda X \tag{12} \]

holds. In this case $\lambda$ is called an eigenvalue of $A$.

Note that the equality (12) can be rewritten in the form

\[ (A - \lambda E)X = 0, \tag{13} \]

where $E$ is the identity matrix. The matrix equality (13) can be regarded as a homogeneous system of linear equations. It is well known that it has a non-trivial solution if and only if $\det(A - \lambda E) = 0$. Recall that the polynomial $\chi_A(\lambda) = \det(A - \lambda E)$ is called the characteristic polynomial of the matrix $A$. Thus, a complex number $\lambda$ is an eigenvalue of a matrix $A$ if and only if $\lambda$ is a root of the characteristic polynomial of $A$.

Example 3. Find the eigenvalues of the matrix

\[ A = \begin{pmatrix} -5 & -8 & -16\\ -18 & -25 & -46\\ 12 & 17 & 32 \end{pmatrix}. \]

The characteristic polynomial is

\[ \det\begin{pmatrix} -5-\lambda & -8 & -16\\ -18 & -25-\lambda & -46\\ 12 & 17 & 32-\lambda \end{pmatrix} = -6 + 5\lambda + 2\lambda^2 - \lambda^3. \]

The roots of the characteristic polynomial, the eigenvalues of $A$, are equal to $-2$, $1$, $3$.

Remark. The diagonal entries of the matrix $D$ in (11) are eigenvalues of $A$. In view of (11) there exists a non-degenerate matrix $C$ such that

\[ C^{-1} A\, C = \begin{pmatrix} -2 & 0 & 0\\ 0 & 1 & 0\\ 0 & 0 & 3 \end{pmatrix}. \]

It can be shown that the corresponding eigenvectors can be chosen as the columns of such a transition matrix $C$ (see the general case below).

> restart; with(LinearAlgebra):
> A := Matrix([[-5,-8,-16],[-18,-25,-46],[12,17,32]]);

A :=
  [  -5   -8  -16 ]
  [ -18  -25  -46 ]
  [  12   17   32 ]

> CharacteristicPolynomial(A, lambda);

6 - 5 λ - 2 λ^2 + λ^3

> solve(%, lambda);

1, -2, 3

> L := Eigenvalues(A, output='list');

L := [1, -2, 3]

> E := Matrix(3, 3, shape=identity);  # the identity matrix

E :=
  [ 1  0  0 ]
  [ 0  1  0 ]
  [ 0  0  1 ]

> B := Vector(3, shape=zero);

B :=
  [ 0 ]
  [ 0 ]
  [ 0 ]

> X := LinearSolve(A - L[2]*E, B, free='t');
> # this is an eigenvector for the eigenvalue L[2] = -2

X :=
  [ 0        ]
  [ -2 t[3]  ]
  [ t[3]     ]

> X2 := LinearSolve(A - L[1]*E, B, free='t');
> # this is an eigenvector for the eigenvalue L[1] = 1

X2 :=
  [ -4 t[3] ]
  [ t[3]    ]
  [ t[3]    ]

> X3 := LinearSolve(A - L[3]*E, B, free='t');
> # this is an eigenvector for the eigenvalue L[3] = 3
> t[3] := 1;

X3 :=
  [ -t[3] ]
  [ -t[3] ]
  [ t[3]  ]

t[3] := 1

> C := Matrix([X, X2, X3]);  # a transition matrix

C :=
  [  0  -4  -1 ]
  [ -2   1  -1 ]
  [  1   1   1 ]

> Diag := C^(-1) . A . C;  # this is to check the result

Diag :=
  [ -2  0  0 ]
  [  0  1  0 ]
  [  0  0  3 ]

Thus, we have obtained a transition matrix

\[ C = \begin{pmatrix} 0 & -4 & -1\\ -2 & 1 & -1\\ 1 & 1 & 1 \end{pmatrix}. \]

Remark. The characteristic polynomial is defined up to a sign $\pm$.

The diagonal form and the transition matrix in the general case

Suppose that $A$ is a diagonalizable matrix and let

\[ \chi_A(\lambda) = \det(A - \lambda E) = \pm(\lambda - \lambda_1)^{k_1} \cdots (\lambda - \lambda_s)^{k_s} \tag{14} \]

be a decomposition of the characteristic polynomial of $A$ over $\mathbb{C}$. The eigenvalues $\lambda_1, \dots, \lambda_s$ are pairwise distinct and $k_1 + \dots + k_s = n$, where $n$ is the dimension of $A$. The following theorem provides a necessary and sufficient condition for the existence of the diagonal form of $A$.

Theorem. In the previous notation, suppose that for each $l = 1,\dots,s$ the system $(A - \lambda_l E)X = 0$ has $k_l$ linearly independent solutions $X_{l,1}, \dots, X_{l,k_l}$, which are eigenvectors of $A$ corresponding to the eigenvalue $\lambda_l$. Then $C^{-1} A\, C = D$, where

\[ D = \mathrm{diag}(\underbrace{\lambda_1,\dots,\lambda_1}_{k_1},\;\dots,\;\underbrace{\lambda_s,\dots,\lambda_s}_{k_s}) \]

is a diagonal matrix with the eigenvalues of $A$ as diagonal entries ($\lambda_l$ appears $k_l$ times), and the columns of $C$ are the linearly independent eigenvectors

\[ X_{1,1},\dots,X_{1,k_1},\;\dots,\;X_{s,1},\dots,X_{s,k_s}. \]

In particular, if the characteristic polynomial of $A$ has no multiple roots (i.e. all $k_l = 1$ in (14), so that the total number of different eigenvalues is equal to $n$), then the diagonal form of $A$ exists.
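The criterion of the Theorem can be tested in Maple by comparing, for each eigenvalue $\lambda_l$, the number of linearly independent solutions of $(A - \lambda_l E)X = 0$ with the multiplicity $k_l$. The following sketch is an addition to the text; it uses the non-diagonalizable matrix $M$ from the Remark at the beginning of this section.

> with(LinearAlgebra):
> M := Matrix([[0,1],[0,0]]):
> factor(CharacteristicPolynomial(M, lambda));   # lambda^2: the only eigenvalue 0 has multiplicity k = 2
> nops(NullSpace(M));                            # 1: only one independent solution of (M - 0*E)X = 0,
>                                                # so the hypothesis of the Theorem fails and M has no diagonal form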

Example 4. Let us find the transition matrix $C$ in the equality $C^{-1} A\, C = \mathrm{diag}(-2, 1, 3)$ of Example 3. For the eigenvalue $\lambda_1 = -2$ we obtain an eigenvector $X_1 = (0, -2, 1)^T$, for the eigenvalue $\lambda_2 = 1$ an eigenvector $X_2 = (-4, 1, 1)^T$, and finally, for the eigenvalue $\lambda_3 = 3$ an eigenvector $X_3 = (-1, -1, 1)^T$. Then

\[ C = \begin{pmatrix} 0 & -4 & -1\\ -2 & 1 & -1\\ 1 & 1 & 1 \end{pmatrix}. \]

> restart; with(LinearAlgebra):
> A := <<-5,-18,12> | <-8,-25,17> | <-16,-46,32>>;

A :=
  [  -5   -8  -16 ]
  [ -18  -25  -46 ]
  [  12   17   32 ]

> K := CharacteristicPolynomial(A, lambda);

K := 6 - 5 λ - 2 λ^2 + λ^3

> factor(K);

(λ - 1) (λ - 3) (λ + 2)

> (lambda, C) := Eigenvectors(A);  # lambda is the vector of eigenvalues, the columns of C are eigenvectors
> C;
> C^(-1);  # this is the inverse matrix
> Diag := C^(-1) . A . C;

Diag :=
  [ 3  0   0 ]
  [ 0  1   0 ]
  [ 0  0  -2 ]

3 The Jordan normal form of a square matrix

Sometimes the characteristic polynomial of a square matrix $A$ has multiple roots and $A$ has no diagonal form. However, instead of the diagonal form there is a Jordan normal form of $A$; that is, there is a matrix equality $C^{-1} A\, C = J$ with a block diagonal matrix $J$ and some non-degenerate matrix $C$. The matrix $J$ is of block diagonal type

\[ J = \begin{pmatrix} J_{m_1}(\lambda_1) & & & \\ & \ddots & & \\ & & J_{m_i}(\lambda_j) & \\ & & & \ddots \end{pmatrix}, \tag{15} \]

where each diagonal block is a Jordan cell attached to one of the eigenvalues $\lambda_1, \dots, \lambda_s$ of $A$, and

\[ J_p(\lambda) = \begin{pmatrix} \lambda & 1 & & \\ & \lambda & \ddots & \\ & & \ddots & 1\\ & & & \lambda \end{pmatrix} \]

is a $p \times p$ matrix with entries equal to $\lambda$ on the diagonal and entries equal to $1$ immediately above the diagonal. The other entries are trivial. $J_p(\lambda)$ is called a Jordan cell.

In the form (15) the number and the dimensions of the Jordan cells for each eigenvalue $\lambda_k$ are uniquely defined by the matrix $A$.

Remark. The diagonal form is a particular case of the Jordan form. In the diagonal form all Jordan cells are $1 \times 1$ matrices.

Example 5. Let us find the Jordan form of the matrix

\[ A = \begin{pmatrix} 23 & 28 & 52\\ 10 & 16 & 27\\ -14 & -19 & -34 \end{pmatrix}. \]

The characteristic polynomial is equal (up to the sign) to $(\lambda + 1)(\lambda - 3)^2$. The Jordan form

\[ J = \begin{pmatrix} -1 & 0 & 0\\ 0 & 3 & 1\\ 0 & 0 & 3 \end{pmatrix} \]

has one $1$-cell $J_1(-1)$ and one $2$-cell $J_2(3)$. Note that $A$ has no diagonal form.

> with(LinearAlgebra):
> A := Matrix([[23,28,52],[10,16,27],[-14,-19,-34]]);

A :=
  [  23   28   52 ]
  [  10   16   27 ]
  [ -14  -19  -34 ]

> CharacteristicPolynomial(A, lambda);

9 + 3 λ - 5 λ^2 + λ^3

> factor(%);

(λ + 1) (λ - 3)^2

> Eigenvalues(A);

  [ -1 ]
  [  3 ]
  [  3 ]

> J := JordanForm(A);

J :=
  [ -1  0  0 ]
  [  0  3  1 ]
  [  0  0  3 ]
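The transition matrix $C$ in the equality $C^{-1} A\, C = J$ can also be requested from Maple: the option output='Q' of JordanForm returns the transformation matrix. The following lines are a small addition to the worksheet of Example 5 (the explicit entries of Q are not reproduced here).

> with(LinearAlgebra):
> A := Matrix([[23,28,52],[10,16,27],[-14,-19,-34]]):
> Q := JordanForm(A, output='Q'):    # transition matrix: Q^(-1).A.Q is the Jordan form
> Q^(-1) . A . Q;                    # reproduces the matrix J found in Example 5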

4 The Gram-Schmidt orthogonalization process

Consider the Euclidean space $\mathbb{R}^n$, that is, the vector space $\mathbb{R}^n$ equipped with the inner product. The inner product of two vectors $X = (x_1,\dots,x_n)$, $Y = (y_1,\dots,y_n)$ is defined as follows:

\[ X \cdot Y = x_1 y_1 + \dots + x_n y_n. \]

Recall that the Euclidean norm (the length) $\|X\|$ of a vector $X$ is the following number:

\[ \|X\| = \sqrt{X \cdot X} = \sqrt{x_1^2 + \dots + x_n^2}. \]

Definition. A vector $X \in \mathbb{R}^n$ is called normalized if $\|X\| = 1$.

Definition. Two vectors $X \in \mathbb{R}^n$ and $Y \in \mathbb{R}^n$ are called orthogonal if $X \cdot Y = 0$. A system of vectors $S$ is called orthogonal if all vectors of the system are pairwise orthogonal. A system of vectors $S$ is called orthonormal if it is orthogonal and each vector in $S$ is normalized.

Note that any orthogonal system of non-trivial vectors consists of linearly independent vectors. In linear algebra the following orthogonalization problem often needs to be solved.

Orthogonalization problem. For a given system of vectors $S = \{V_1,\dots,V_k\}$, $V_i \in \mathbb{R}^n$, find an orthogonal (orthonormal) system $\Sigma = \{U_1,\dots,U_m\}$, $m \le k$, such that the linear subspaces spanned by $S$ and $\Sigma$ coincide:

\[ \langle V_1,\dots,V_k \rangle = \langle U_1,\dots,U_m \rangle. \]

This problem can be solved with the help of the Gram-Schmidt orthogonalization process.

The Gram-Schmidt orthogonalization process

The process works inductively. Without loss of generality we may assume that there is no trivial vector in $S$.

Step 1. Let $U_1 = V_1$.

Step 2. It follows from the first step that $\langle V_1 \rangle = \langle U_1 \rangle$. We seek a vector $U_2$ in the form

\[ U_2 = V_2 - a_{21} U_1. \tag{16} \]

The condition $U_2 \cdot U_1 = 0$ is required, and consequently

\[ 0 = U_2 \cdot U_1 = (V_2 - a_{21} U_1) \cdot U_1 = V_2 \cdot U_1 - a_{21}\, U_1 \cdot U_1. \]

It follows that

\[ a_{21} = \frac{V_2 \cdot U_1}{U_1 \cdot U_1}, \]

and $U_1$ and $U_2$ are orthogonal. If $U_2 \ne 0$ then it is included in the system $\Sigma$ (otherwise we omit the trivial vector). Now we go on to the next step.

Step 3. Due to (16), $\langle V_1, V_2 \rangle = \langle U_1, U_2 \rangle$. We seek a vector $U_3$ in the form

\[ U_3 = V_3 - a_{31} U_1 - a_{32} U_2. \tag{17} \]

The conditions $U_3 \cdot U_1 = 0$ and $U_3 \cdot U_2 = 0$ are required, and consequently

\[ 0 = U_3 \cdot U_1 = (V_3 - a_{31} U_1 - a_{32} U_2) \cdot U_1 = V_3 \cdot U_1 - a_{31}\, U_1 \cdot U_1, \]
\[ 0 = U_3 \cdot U_2 = (V_3 - a_{31} U_1 - a_{32} U_2) \cdot U_2 = V_3 \cdot U_2 - a_{32}\, U_2 \cdot U_2, \]

since $U_1 \cdot U_2 = U_2 \cdot U_1 = 0$. It follows that

\[ a_{31} = \frac{V_3 \cdot U_1}{U_1 \cdot U_1}, \qquad a_{32} = \frac{V_3 \cdot U_2}{U_2 \cdot U_2}, \]

and $U_1, U_2, U_3$ are pairwise orthogonal. If $U_3 \ne 0$ then it is included in the system $\Sigma$ and the process goes on.

Step l. Suppose that we have already determined pairwise orthogonal vectors $U_1, U_2, \dots, U_{s-1}$ such that

\[ \langle V_1, V_2, \dots, V_{l-1} \rangle = \langle U_1, U_2, \dots, U_{s-1} \rangle. \]

Then, as above,

\[ U_s = V_l - a_{l,1} U_1 - \dots - a_{l,s-1} U_{s-1}, \tag{18} \]

where

\[ a_{l,j} = \frac{V_l \cdot U_j}{U_j \cdot U_j}. \tag{19} \]

If $U_s \ne 0$ then it is included in $\Sigma$, and so on. Since the number of vectors is finite, this process terminates after $k$ steps and we obtain the desired orthogonal system $\Sigma$. The following transform (normalization) is to be done in order to make $\Sigma$ orthonormal:

\[ \{U_1,\dots,U_m\} \;\longrightarrow\; \left\{ \frac{U_1}{\|U_1\|},\dots,\frac{U_m}{\|U_m\|} \right\}. \]

The Gram-Schmidt orthogonalization process from a geometrical point of view

The $l$-th step of the Gram-Schmidt orthogonalization process can be viewed as follows (see Figure 1 below). Denote $L_{s-1} = \langle U_1, U_2, \dots, U_{s-1} \rangle$. A given vector $V_l$ should be represented as the (unique) sum $V_l = U_s + (V_l - U_s)$, where $U_s$ is orthogonal to the subspace $L_{s-1}$ (the vector $U_s$ is called the orthogonal component of $V_l$ with respect to $L_{s-1}$) and $V_l - U_s \in L_{s-1}$ (the vector $V_l - U_s$ is the orthogonal projection of $V_l$ onto $L_{s-1}$). It is sufficient to determine (cf. (18))

\[ V_l - U_s = a_{l,1} U_1 + \dots + a_{l,s-1} U_{s-1}, \tag{20} \]

since $U_s = V_l - (V_l - U_s)$. The coefficients $a_{l,j}$ in (20) are defined by the formula (19).

[Figure 1: the vector $V_l$, its orthogonal projection $V_l - U_s$ onto the subspace $L_{s-1} = \langle U_1, U_2, \dots, U_{s-1} \rangle$, and its orthogonal component $U_s$.]

Note that the distance between the endpoint of $V_l$ and the subspace $L_{s-1}$ is equal to the length $\|U_s\|$.
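The steps above can be written as a short Maple procedure; this is an illustrative sketch added here (the lecture itself relies on the built-in GramSchmidt command used in the examples below). Trivial vectors produced along the way are discarded, exactly as in Step l, and the coefficients are computed by formula (19).

> with(LinearAlgebra):
> GramSchmidtSketch := proc(S::list)
>   local Sigma, V, U, W;
>   Sigma := [];                              # the orthogonal system built so far
>   for V in S do
>     W := V;
>     for U in Sigma do
>       W := W - (DotProduct(V, U) / DotProduct(U, U)) * U;  # subtract the projection onto U, cf. (19)
>     end do;
>     if not Equal(W, 0*V) then               # include W only if it is non-trivial
>       Sigma := [op(Sigma), W];
>     end if;
>   end do;
>   return Sigma;
> end proc:
> GramSchmidtSketch([Vector[row]([1,2,2,-1]), Vector[row]([1,1,-5,3]), Vector[row]([4,5,-13,8])]);
>   # returns [[1,2,2,-1], [2,3,-3,2]], the same orthogonal system as in Example 6 below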

Example 6. Determine an orthogonal basis of the subspace $L \subset \mathbb{R}^4$ spanned by the vectors

\[ V_1 = (1, 2, 2, -1),\qquad V_2 = (1, 1, -5, 3),\qquad V_3 = (4, 5, -13, 8). \]

Solution. Apply the Gram-Schmidt orthogonalization process to the system $S = \{V_1, V_2, V_3\}$.

Step 1.
\[ U_1 = V_1 = (1, 2, 2, -1). \]

Step 2.
\[ U_2 = V_2 - a_{21} U_1, \]
where
\[ a_{21} = \frac{V_2 \cdot U_1}{U_1 \cdot U_1} = \frac{1\cdot 1 + 1\cdot 2 + (-5)\cdot 2 + 3\cdot(-1)}{1^2 + 2^2 + 2^2 + (-1)^2} = \frac{-10}{10} = -1. \]
Consequently,
\[ U_2 = V_2 - (-1)U_1 = (2, 3, -3, 2). \]

Step 3.
\[ U_3 = V_3 - a_{31} U_1 - a_{32} U_2, \]
where
\[ a_{31} = \frac{V_3 \cdot U_1}{U_1 \cdot U_1} = -2, \qquad a_{32} = \frac{V_3 \cdot U_2}{U_2 \cdot U_2} = 3. \]
Hence,
\[ U_3 = V_3 - (-2)U_1 - 3U_2 = 0. \]

Thus, $U_3$ is not included in the orthogonal basis and the subspace $L$ is 2-dimensional. The orthogonal basis is $\Sigma = \{U_1, U_2\}$. The normalized system $\Sigma_{\mathrm{norm}}$ is

\[ \left\{ \frac{U_1}{\|U_1\|}, \frac{U_2}{\|U_2\|} \right\} = \left\{ \frac{1}{\sqrt{10}}(1, 2, 2, -1),\; \frac{1}{\sqrt{26}}(2, 3, -3, 2) \right\}. \]

> with(LinearAlgebra):
> v := Vector[row]([1,2,2,-1]);

v := [1, 2, 2, -1]

> v2 := Vector[row]([1,1,-5,3]);

v2 := [1, 1, -5, 3]

> v3 := Vector[row]([4,5,-13,8]);

v3 := [4, 5, -13, 8]

> Basis([v,v2,v3]);  # this is to determine the dimension of the subspace <v,v2,v3>

[[1, 2, 2, -1], [1, 1, -5, 3]]

> res := GramSchmidt([v,v2,v3]);

res := [[1, 2, 2, -1], [2, 3, -3, 2]]

> A := Matrix([[res[1]],[res[2]]]);

A :=
  [ 1  2   2  -1 ]
  [ 2  3  -3   2 ]

> C := A . Transpose(A);  # this must be the diagonal matrix of inner products

C :=
  [ 10   0 ]
  [  0  26 ]

> Normalize(res[1], 2);  # 2 stands for the Euclidean norm

[ √10/10, √10/5, √10/5, -√10/10 ]

> Normalize(res[2], 2);

[ √26/13, 3√26/26, -3√26/26, √26/13 ]

> res_norm := GramSchmidt([v,v2,v3], normalized);

res_norm := [[ √10/10, √10/5, √10/5, -√10/10 ], [ √26/13, 3√26/26, -3√26/26, √26/13 ]]

Example 7. Determine the distance $d$ between the (endpoint of the) vector $X = (-1, 5, 1, -1)$ and the subspace $L \subset \mathbb{R}^4$ given as the subspace of solutions of the linear homogeneous system

\[ L:\; \begin{cases} 4x_2 - x_3 + 3x_4 = 0\\ 2x_1 + 2x_2 + x_3 + x_4 = 0 \end{cases} \]

Solution. First of all, let us find a basis of $L$. As a basis, a fundamental system of solutions of the above linear system can be taken.

> restart; with(LinearAlgebra):
> A := Matrix([[0,4,-1,3],[2,2,1,1]]);

A :=
  [ 0  4  -1  3 ]
  [ 2  2   1  1 ]

> B := Vector([0,0]);

B :=
  [ 0 ]
  [ 0 ]

> X := LinearSolve(A, B, free='t');

X :=
  [ -3 t[2] - 2 t[4] ]
  [ t[2]             ]
  [ 4 t[2] + 3 t[4]  ]
  [ t[4]             ]

> t[2] := 0;  t[4] := 1;
> v := Transpose(X);

v := [-2, 0, 3, 1]

> t[2] := 1;  t[4] := 0;
> v2 := Transpose(X);

v2 := [-3, 1, 4, 0]

Thus, the vectors $V_1 = (-2, 0, 3, 1)$ and $V_2 = (-3, 1, 4, 0)$ form a basis of $L$.

The second step is the orthogonalization of the basis $S = \{V_1, V_2\}$. We obtain the orthogonal basis $\{U_1, U_2\}$ of $L$, where

\[ U_1 = (-2, 0, 3, 1), \qquad U_2 = (-3, 7, 1, -9). \]

The last step is the orthogonalization of the system $S_1 = \{U_1, U_2, X\}$ (see Figure 1: there the vector $V_l$ should be thought of as $X$, and the vector $U_s$ corresponds to the resulting vector $Y$ below). As a result we get the orthogonal system $\{U_1, U_2, Y\}$, where

\[ Y = \left( \tfrac{3}{5}, \tfrac{13}{5}, -\tfrac{1}{5}, \tfrac{9}{5} \right). \]

Consequently,

\[ d = \|Y\| = \frac{\sqrt{260}}{5} = \frac{2\sqrt{65}}{5}. \]

> with(LinearAlgebra):
> v := Vector[row]([-2,0,3,1]);

v := [-2, 0, 3, 1]

> v2 := Vector[row]([-3,1,4,0]);

v2 := [-3, 1, 4, 0]

> U := GramSchmidt([v,v2]);
> u := U[1];

U := [[-2, 0, 3, 1], [-3/7, 1, 1/7, -9/7]]
u := [-2, 0, 3, 1]

> u2 := 7*U[2];  # we scale the vector U[2] by a non-trivial factor

u2 := [-3, 7, 1, -9]

> X := Vector[row]([-1,5,1,-1]);

X := [-1, 5, 1, -1]

> U := GramSchmidt([u,u2,X]);
> Y := U[3];

U := [[-2, 0, 3, 1], [-3, 7, 1, -9], [3/5, 13/5, -1/5, 9/5]]
Y := [3/5, 13/5, -1/5, 9/5]

> distance := Norm(Y, 2);

distance := 2√65/5

Remark. For this example we split the whole Maple worksheet into two parts according to the algorithm. It is more convenient to compute the distance using a single worksheet.

5 The matrix exponential function

Given a square matrix $A$ with real or complex entries, the matrix exponential function $\exp$ is defined as follows:

\[ \exp(A) = E + A + \frac{A^2}{2!} + \frac{A^3}{3!} + \dots + \frac{A^n}{n!} + \dots = \sum_{n=0}^{\infty} \frac{A^n}{n!}. \tag{21} \]

Here $E$ stands for the identity matrix. The matrix series converges to the matrix $\exp(A)$ for any matrix $A$. The matrix exponential plays a big role in solving linear systems of differential equations.

Example 8. Determine $\exp(A)$ if

\[ A = \begin{pmatrix} \lambda_1 & 0 & 0\\ 0 & \lambda_2 & 0\\ 0 & 0 & \lambda_3 \end{pmatrix}. \]

Solution. Since

\[ A^n = \begin{pmatrix} \lambda_1^n & 0 & 0\\ 0 & \lambda_2^n & 0\\ 0 & 0 & \lambda_3^n \end{pmatrix} \]

and we have the expansion series, for any number $z \in \mathbb{C}$,

\[ e^z = \exp z = 1 + z + \frac{z^2}{2!} + \frac{z^3}{3!} + \dots + \frac{z^n}{n!} + \dots = \sum_{n=0}^{\infty} \frac{z^n}{n!}, \]

it follows from the definition (21) that

\[ \exp(A) = \begin{pmatrix} e^{\lambda_1} & 0 & 0\\ 0 & e^{\lambda_2} & 0\\ 0 & 0 & e^{\lambda_3} \end{pmatrix}. \]
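The worksheets below call the command exponential from the older linalg package. The same matrices can be obtained from the LinearAlgebra package with MatrixExponential; the short sketch here is an addition for reference and uses an assumed numeric diagonal matrix, so that the result matches the formula of Example 8.

> with(LinearAlgebra):
> D1 := DiagonalMatrix([1, -2, 3]):    # assumed diagonal matrix, cf. Example 8 with lambda = 1, -2, 3
> MatrixExponential(D1);               # diag(exp(1), exp(-2), exp(3))
> MatrixExponential(D1, t);            # exp(t*D1) = diag(exp(t), exp(-2*t), exp(3*t))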

Example 9. Determine $\exp(tA)$ (here $t \in \mathbb{R}$) if

\[ A = \begin{pmatrix} 1 & 1\\ -1 & 1 \end{pmatrix}. \]

> with(LinearAlgebra):
> A := Matrix([[1,1],[-1,1]]);

A :=
  [  1  1 ]
  [ -1  1 ]

> with(linalg): exponential(A*t);

  [  exp(t)*cos(t)   exp(t)*sin(t) ]
  [ -exp(t)*sin(t)   exp(t)*cos(t) ]

As a result we obtain the following matrix:

\[ \exp(tA) = \begin{pmatrix} e^t \cos t & e^t \sin t\\ -e^t \sin t & e^t \cos t \end{pmatrix}. \]

Example 10. Determine $\exp(tJ)$ if

\[ J = \begin{pmatrix} \lambda & 1 & 0 & 0\\ 0 & \lambda & 1 & 0\\ 0 & 0 & \lambda & 1\\ 0 & 0 & 0 & \lambda \end{pmatrix}. \]

> with(LinearAlgebra):
> J := Matrix([[lambda,1,0,0],[0,lambda,1,0],[0,0,lambda,1],[0,0,0,lambda]]);

J :=
  [ λ  1  0  0 ]
  [ 0  λ  1  0 ]
  [ 0  0  λ  1 ]
  [ 0  0  0  λ ]

> with(linalg): exponential(J*t);

  [ exp(t λ)   t exp(t λ)   (1/2) t^2 exp(t λ)   (1/6) t^3 exp(t λ) ]
  [ 0          exp(t λ)     t exp(t λ)           (1/2) t^2 exp(t λ) ]
  [ 0          0            exp(t λ)             t exp(t λ)         ]
  [ 0          0            0                    exp(t λ)           ]

Thus,

\[ \exp(tJ) = \begin{pmatrix} e^{t\lambda} & t\,e^{t\lambda} & \dfrac{t^2}{2!}e^{t\lambda} & \dfrac{t^3}{3!}e^{t\lambda}\\ 0 & e^{t\lambda} & t\,e^{t\lambda} & \dfrac{t^2}{2!}e^{t\lambda}\\ 0 & 0 & e^{t\lambda} & t\,e^{t\lambda}\\ 0 & 0 & 0 & e^{t\lambda} \end{pmatrix}. \]
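As mentioned at the beginning of this section, the matrix exponential solves linear systems of differential equations: the solution of $X'(t) = A\,X(t)$ with $X(0) = X_0$ is $X(t) = \exp(tA)\,X_0$. The following sketch (an addition to the text) illustrates this with the matrix of Example 9 and an assumed initial vector $X_0$.

> with(LinearAlgebra):
> A := Matrix([[1,1],[-1,1]]):
> X0 := Vector([1,0]):                  # assumed initial condition X(0)
> X := MatrixExponential(A, t) . X0;    # X(t) = exp(tA).X0 = (exp(t)*cos(t), -exp(t)*sin(t))
> map(diff, X, t) - A . X;              # the zero vector: X(t) indeed satisfies X'(t) = A.X(t)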
