Tor Kjellsson, Stockholm University

Appendix A

A.1
Q. Consider the ordinary vectors in 3 dimensions ($a_x\hat{i} + a_y\hat{j} + a_z\hat{k}$), with complex components. Do the following subsets constitute a vector space? If so, what is its dimension? If not, why not?

a) The subset of all vectors with $a_z = 0$.

Let us start by looking at what a vector space is:

A vector space consists of a set of vectors $\{|\alpha\rangle, |\beta\rangle, |\gamma\rangle, \ldots\}$ together with a set of scalars $\{a, b, c, \ldots\}$ that is closed under the two operations vector addition and scalar multiplication. Below we repeat some rules:

Vector addition:
- The sum of two vectors is another vector that also lies in the space: $|\alpha\rangle + |\beta\rangle = |\gamma\rangle$.
- It is commutative: $|\alpha\rangle + |\beta\rangle = |\beta\rangle + |\alpha\rangle$.
- It is associative: $|\alpha\rangle + (|\beta\rangle + |\gamma\rangle) = (|\alpha\rangle + |\beta\rangle) + |\gamma\rangle$.
- There exists a null vector: $|\alpha\rangle + |0\rangle = |\alpha\rangle$.
- For every vector in the vector space there exists an inverse vector: $|\alpha\rangle + |-\alpha\rangle = |0\rangle$, and another (more logical) way of writing $|-\alpha\rangle$ is $-|\alpha\rangle$.

Scalar multiplication:
- The product of any scalar with any vector is within the space: $a|\alpha\rangle = |\gamma\rangle$.
- It is distributive with respect to vector addition: $a(|\beta\rangle + |\gamma\rangle) = a|\beta\rangle + a|\gamma\rangle$

and scalar addition: $(a + b)|\beta\rangle = a|\beta\rangle + b|\beta\rangle$.
- It is associative with respect to multiplication of scalars: $a(b|\beta\rangle) = (ab)|\beta\rangle$.
- Multiplication by the scalars 0 and 1 is precisely what one would expect: $0|\beta\rangle = |0\rangle$, $1|\beta\rangle = |\beta\rangle$.

Now this was an awful lot of rules. Most of them are, however, quite intuitive and some can be grouped together. Let us sum them up:

To check if a set is a vector space you check the following conditions:
i) Any linear combination of vectors in the vector space also lies in the vector space. Especially check that the zero vector exists and that every vector has an inverse.
ii) Vector addition is commutative and associative.
iii) Scalar multiplication is distributive with respect to vector addition and scalar addition. It should also be associative with respect to multiplication of scalars.

Most often the second and third criteria are a matter of course and you only need to briefly explain why they apply.

Let us finally start with the problem! In this problem we want to check whether or not the set $\{(a_x, a_y, 0)\}$ is a vector space. Now, for regular vectors the second and third criteria are a matter of course and we need not check them. The first criterion, however, we must check:

The two vectors $(a_x, a_y, 0)$ and $(b_x, b_y, 0)$ both lie in our subset. For any complex scalars $\lambda, \mu$, so do $\lambda(a_x, a_y, 0)$ and $\mu(b_x, b_y, 0)$. The sum of these,

$\lambda(a_x, a_y, 0) + \mu(b_x, b_y, 0) = (\lambda a_x + \mu b_x,\ \lambda a_y + \mu b_y,\ 0),$

is also a vector in the same set. We have thus shown that the set is a vector space! (If you wonder about the null vector and the inverse vectors, they are included in this statement. If you don't see why, pause and think about it: did we say anything about the unknowns we just used?)
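The closure argument above can also be illustrated numerically. The following sketch is an added illustration, not part of the original solution; it assumes NumPy is available, and the helper name is ours. It draws random complex vectors with vanishing z-component and checks that an arbitrary linear combination stays in the subset:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_subset_vector():
    """A random vector (a_x, a_y, 0) with complex components."""
    v = rng.standard_normal(3) + 1j * rng.standard_normal(3)
    v[2] = 0.0  # force the z-component to zero
    return v

a, b = random_subset_vector(), random_subset_vector()
lam, mu = 1.5 - 2j, -0.3 + 1j   # arbitrary complex scalars

combo = lam * a + mu * b
print(combo[2])                     # 0j: the linear combination stays in the subset
print(np.isclose(combo[2], 0))      # True
```

Note that the scalars $\lambda = 0$ and $\mu = -1$ are allowed choices, which is why the null vector and the inverse vectors are covered by the very same check.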

b) The subset of vectors whose z-component is 1.

This is not a vector space, and there are at least two holes you can aim at (but you need only state one). First of all, there is no null vector in this set! Second of all, if you add the same vector to itself the z-component becomes 2. This resulting vector does not meet the criterion for belonging to the set, thus the set cannot be a vector space.

c) The subset of vectors whose components are all equal.

Any vector in this set is of the type $\mu(1, 1, 1)$ for a complex scalar $\mu$. Any linear combination of such vectors is also of the same type:

$\mu(1, 1, 1) + \lambda(1, 1, 1) = (\mu + \lambda)(1, 1, 1)$

so the set is a vector space. Since all components are the same there is only one variable that is allowed to change, and thus the dimension of the vector space is one. (Another way of seeing it is that all the vectors point in the same direction.)

A.2
Q. Consider the collection of all polynomials (with complex coefficients) of degree less than N in x. For each of the following subsets, decide if it constitutes a vector space or not. If it does, suggest a convenient basis and give the dimension of the space. If not, explain why.

a) The whole set.

A polynomial in this set has the following appearance:

$p(x) = \sum_{i=0}^{N-1} c_i x^i$   (1)

and since different powers of x are linearly independent we can represent it in the following way:

$p(x) = \sum_{i=0}^{N-1} c_i x^i \ \longleftrightarrow\ (c_0, c_1, c_2, \ldots, c_{N-1})$   (2)

but this is the same representation as a vector! Given this representation it is not hard to convince oneself that the second and third criteria from the previous

problem (in the box) apply. We also easily see that any linear combination of two such elements is also a polynomial of the same type, so we have a vector space! (As before, the null element and the inverse elements are included in the term "any".)

An appropriate basis is of course $\{x^n\}$ for integers n such that $0 \le n \le N-1$. The dimension is the total number of values that n can take on, in this case N.

b) The subset of even polynomials.

The same reasoning as above applies, but now with the basis $\{x^{2n}\}$. The dimension is once again given by the total number of values that n can take on. For an odd value of N this is $0 \le n \le (N-1)/2$, which amounts to $(N+1)/2$. For an even N we instead have $0 \le n \le (N-2)/2$ (why? [1]) and the dimension is thus $N/2$.

c) The subset where the leading coefficient (the one in front of $x^{N-1}$) is forced to be 1.

This is not a vector space, and two reasons are:
i) The null polynomial is not in this set.
ii) The sum of two such polynomials is not in the subset.

d) The subset where we require that the polynomials have the value 0 at x = 1.

This might seem tricky at first but it really is not. Given any two polynomials that meet this requirement, any linear combination of them will still have the value 0 at x = 1. This is thus a vector space. An appropriate basis for this would be $\{(x-1)^n\}$, since the polynomials most easily can be written as linear combinations of these. The dimension is in this case N-1 and not N as in problem a). Why?

e) The subset where we require that the polynomials have the value 1 at x = 0.

This is definitely not a vector space: the sum of two such polynomials has the value 2 at x = 0.

[1] Look carefully at the problem statement.
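The coefficient-vector picture of eq. (2) is easy to play with in code. The sketch below is an added illustration, not part of the original solution; it uses NumPy, stores coefficients lowest power first, and shows that adding and scaling polynomials is literally vector addition and scalar multiplication of their coefficient arrays:

```python
import numpy as np
from numpy.polynomial import polynomial as P

N = 4  # polynomials of degree < 4, i.e. coefficient vectors of length 4

# p(x) = 1 + 2x + 3x^2 and q(x) = (1+i) - x^3, padded to length N
p = np.array([1, 2, 3, 0], dtype=complex)
q = np.array([1 + 1j, 0, 0, -1], dtype=complex)
lam = 2 - 1j

# Linear combination of coefficient vectors...
r = lam * p + q

# ...agrees with combining the polynomials pointwise at any x:
x = 0.7
lhs = P.polyval(x, r)
rhs = lam * P.polyval(x, p) + P.polyval(x, q)
print(np.isclose(lhs, rhs))  # True: the representation (2) respects the vector space operations
```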

A.3
Q. Prove that the components of a vector with respect to a given basis are unique.

Assume that the vector $|\alpha\rangle$ has two representations in the same basis:

$|\alpha\rangle = \sum_i c_i |e_i\rangle$   (3)

and

$|\alpha\rangle = \sum_i d_i |e_i\rangle.$   (4)

Now consider the difference of these:

$|0\rangle = \sum_i c_i |e_i\rangle - \sum_i d_i |e_i\rangle = \sum_i (c_i - d_i)|e_i\rangle.$   (5)

Since the elements of the basis are linearly independent, each term in the sum has to be 0. [2] Thus $c_i - d_i = 0$ for all i, and the representation of a vector in a given basis must be unique.

[2] Think carefully about this, it is important that you understand it.

A.4
Q. Use the Gram-Schmidt procedure to orthonormalize the 3-space basis:

$|e_1\rangle = (1+i,\ 1,\ i)^T$, $\quad |e_2\rangle = (i,\ 3,\ 1)^T$, $\quad |e_3\rangle = (0,\ 28,\ 0)^T$

In the Gram-Schmidt orthonormalization procedure there are as many steps as the number of basis vectors. The procedure can be a bit involved but the idea is easy to grasp:

Start by choosing one of the vectors and normalize it:

$|e_1'\rangle = \dfrac{|e_1\rangle}{\| e_1 \|} = \dfrac{1}{2}(1+i,\ 1,\ i)^T.$   (6)

Now we proceed to construct the next basis vector. Recall that the scalar product of two vectors is a measure of the amount they "share". In order to construct a vector orthogonal to $|e_1'\rangle$ we must thus subtract this shared amount:

$c_0 |e_2'\rangle = |e_2\rangle - \langle e_1'|e_2\rangle\, |e_1'\rangle = (i,\ 3,\ 1)^T - \dfrac{c_1}{2}(1+i,\ 1,\ i)^T$

where the constant $c_0$ is the norm of the right-hand side and $c_1$ is the following scalar product:

$c_1 = \langle e_1'|e_2\rangle = \dfrac{1}{2}(1-i,\ 1,\ -i)\,(i,\ 3,\ 1)^T = 2$

and we thus have:

$c_0 |e_2'\rangle = (i,\ 3,\ 1)^T - (1+i,\ 1,\ i)^T = (-1,\ 2,\ 1-i)^T.$   (7)

The norm of the right-hand side is:

$c_0 = \sqrt{1 + 4 + (1-i)(1+i)} = \sqrt{7}$

so our basis vector is:

$|e_2'\rangle = \dfrac{1}{\sqrt{7}}(-1,\ 2,\ 1-i)^T.$   (8)

Now we proceed to find the last basis vector, and we do this by repeating the process from above:

$d_0 |e_3'\rangle = |e_3\rangle - \langle e_1'|e_3\rangle\, |e_1'\rangle - \langle e_2'|e_3\rangle\, |e_2'\rangle$

but now we subtract the projection along the second basis vector as well. As before, $d_0$ is just the norm of the right-hand side that we later divide by to obtain our normalized basis vector $|e_3'\rangle$. Doing the algebra we obtain:

$d_0 |e_3'\rangle = (1-7i,\ 5,\ -8+i)^T$

and the norm of the right-hand side is $\sqrt{140}$, so $d_0 = \sqrt{140}$. This then gives:

$|e_3'\rangle = \dfrac{1}{\sqrt{140}}(1-7i,\ 5,\ -8+i)^T$

and our orthonormal basis is:

$|e_1'\rangle = \dfrac{1}{2}(1+i,\ 1,\ i)^T$
$|e_2'\rangle = \dfrac{1}{\sqrt{7}}(-1,\ 2,\ 1-i)^T$   (9)
$|e_3'\rangle = \dfrac{1}{\sqrt{140}}(1-7i,\ 5,\ -8+i)^T$
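A quick numerical cross-check of the result (9) can be done with NumPy. This sketch is an added illustration, not part of the original solution, and the helper name is ours; remember that the complex inner product conjugates the first argument:

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of complex vectors using <u|v> = u.conj() @ v."""
    basis = []
    for v in vectors:
        w = v.astype(complex)
        for u in basis:
            w = w - (u.conj() @ w) * u   # subtract the projection onto each earlier vector
        basis.append(w / np.linalg.norm(w))
    return basis

e1 = np.array([1 + 1j, 1, 1j])
e2 = np.array([1j, 3, 1])
e3 = np.array([0, 28, 0])

q1, q2, q3 = gram_schmidt([e1, e2, e3])
print(np.allclose(q1, np.array([1 + 1j, 1, 1j]) / 2))                    # True
print(np.allclose(q2, np.array([-1, 2, 1 - 1j]) / np.sqrt(7)))           # True
print(np.allclose(q3, np.array([1 - 7j, 5, -8 + 1j]) / np.sqrt(140)))    # True
print(np.allclose([q1.conj() @ q2, q1.conj() @ q3, q2.conj() @ q3], 0))  # True: mutually orthogonal
```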

A.5
Q. Prove the Schwarz inequality:

$|\langle\alpha|\beta\rangle|^2 \le \langle\alpha|\alpha\rangle\langle\beta|\beta\rangle$   (10)

Hint: Define a vector $|\gamma\rangle = |\beta\rangle - \left(\langle\alpha|\beta\rangle/\langle\alpha|\alpha\rangle\right)|\alpha\rangle$.

Following the hint we define the vector $|\gamma\rangle = |\beta\rangle - \dfrac{\langle\alpha|\beta\rangle}{\langle\alpha|\alpha\rangle}|\alpha\rangle$. Now we consider the scalar product of this vector with itself (which must be real and non-negative):

$\langle\gamma|\gamma\rangle = \left(\langle\beta| - \dfrac{\langle\beta|\alpha\rangle}{\langle\alpha|\alpha\rangle}\langle\alpha|\right)\left(|\beta\rangle - \dfrac{\langle\alpha|\beta\rangle}{\langle\alpha|\alpha\rangle}|\alpha\rangle\right) \ge 0$

$\langle\beta|\beta\rangle - \dfrac{\langle\alpha|\beta\rangle\langle\beta|\alpha\rangle}{\langle\alpha|\alpha\rangle} - \dfrac{\langle\beta|\alpha\rangle\langle\alpha|\beta\rangle}{\langle\alpha|\alpha\rangle} + \dfrac{\langle\beta|\alpha\rangle\langle\alpha|\beta\rangle}{\langle\alpha|\alpha\rangle^2}\langle\alpha|\alpha\rangle \ge 0$

$\langle\beta|\beta\rangle - \dfrac{|\langle\alpha|\beta\rangle|^2}{\langle\alpha|\alpha\rangle} \underbrace{- \dfrac{|\langle\beta|\alpha\rangle|^2}{\langle\alpha|\alpha\rangle} + \dfrac{|\langle\beta|\alpha\rangle|^2}{\langle\alpha|\alpha\rangle}}_{0} \ge 0$

$\langle\beta|\beta\rangle - \dfrac{|\langle\alpha|\beta\rangle|^2}{\langle\alpha|\alpha\rangle} \ge 0 \ \Longrightarrow\ \langle\alpha|\alpha\rangle\langle\beta|\beta\rangle \ge |\langle\alpha|\beta\rangle|^2$

where the last step multiplies through by $\langle\alpha|\alpha\rangle \ge 0$. Note also that $\langle\alpha|\beta\rangle = \langle\beta|\alpha\rangle^*$, which is what lets us write $\langle\alpha|\beta\rangle\langle\beta|\alpha\rangle = |\langle\alpha|\beta\rangle|^2$.
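As a sanity check (an added illustration, not from the original solution), the inequality and its equality case can be probed numerically with random complex vectors:

```python
import numpy as np

rng = np.random.default_rng(1)

def bra_ket(u, v):
    """Inner product <u|v> with the first argument conjugated."""
    return u.conj() @ v

a = rng.standard_normal(5) + 1j * rng.standard_normal(5)
b = rng.standard_normal(5) + 1j * rng.standard_normal(5)

lhs = abs(bra_ket(a, b)) ** 2
rhs = (bra_ket(a, a) * bra_ket(b, b)).real
print(lhs <= rhs)                      # True for any choice of a and b

# Equality holds exactly when b is parallel to a:
b_parallel = (2 - 3j) * a
print(np.isclose(abs(bra_ket(a, b_parallel)) ** 2,
                 (bra_ket(a, a) * bra_ket(b_parallel, b_parallel)).real))  # True
```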

A.14
Q. Using the standard basis $(\hat{i}, \hat{j}, \hat{k})$ for vectors in three dimensions:

a) Construct the matrix representing a rotation through angle θ (counter-clockwise, looking down the axis towards the origin) about the z-axis.

When you are asked to give the matrix that represents a transformation, it is easiest to first find what the transformation does to the basis vectors. The components of any vector then transform with the transpose of that matrix.

For a rotation about the z-axis we see that the z-axis itself is left unaltered. The x-axis and y-axis, however, are rotated in the plane.

(Figure: the x- and y-axes rotated through the angle θ in the xy-plane.)

We see that the new positions are given, in the old basis, as:

$\hat{x}' = \cos\theta\, \hat{x} + \sin\theta\, \hat{y}$
$\hat{y}' = \cos\!\left(\theta + \tfrac{\pi}{2}\right)\hat{x} + \sin\!\left(\theta + \tfrac{\pi}{2}\right)\hat{y} = -\sin\theta\, \hat{x} + \cos\theta\, \hat{y}$   (11)

so the matrix equation describing the transformation of the basis vectors is (recall that $\hat{z}' = \hat{z}$):

$\begin{pmatrix}\hat{x}'\\ \hat{y}'\\ \hat{z}'\end{pmatrix} = \begin{pmatrix}\cos\theta\,\hat{x} + \sin\theta\,\hat{y} + 0\,\hat{z}\\ -\sin\theta\,\hat{x} + \cos\theta\,\hat{y} + 0\,\hat{z}\\ 0\,\hat{x} + 0\,\hat{y} + \hat{z}\end{pmatrix} = \begin{pmatrix}\cos\theta & \sin\theta & 0\\ -\sin\theta & \cos\theta & 0\\ 0 & 0 & 1\end{pmatrix}\begin{pmatrix}\hat{x}\\ \hat{y}\\ \hat{z}\end{pmatrix} = \tilde{T}\begin{pmatrix}\hat{x}\\ \hat{y}\\ \hat{z}\end{pmatrix}$

and hence the components of any vector in this basis will transform with the transpose of $\tilde{T}$:

$T = \begin{pmatrix}\cos\theta & -\sin\theta & 0\\ \sin\theta & \cos\theta & 0\\ 0 & 0 & 1\end{pmatrix}$   (12)
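A short numerical check of eq. (12) follows; this is an added illustration, not part of the original solution, and the helper name is ours:

```python
import numpy as np

def T_z(theta):
    """Counter-clockwise rotation of vector components about the z-axis, eq. (12)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0],
                     [s,  c, 0],
                     [0,  0, 1]])

theta = np.pi / 6
print(T_z(theta) @ np.array([1, 0, 0]))        # [cos(30°), sin(30°), 0]: x-hat rotates towards y-hat
print(np.allclose(T_z(theta).T @ T_z(theta), np.eye(3)))  # True: rotation matrices are orthogonal
print(np.isclose(np.linalg.det(T_z(theta)), 1))           # True: proper rotation
```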

b) Construct the matrix representing a rotation through 120° (counter-clockwise, looking down the axis toward the origin) about an axis through the point (1, 1, 1).

(Figure: the three basis vectors $\hat{i}$, $\hat{j}$ and $\hat{k}$ together with the vector (1, 1, 1) in this basis.)

If we imagine looking down along the vector (1, 1, 1) we see that the three basis vectors are ordered symmetrically about this vector. This also means that the projections of the three basis vectors onto the plane normal to (1, 1, 1) must be situated symmetrically. In a plane, however, this means that they are 120° apart, and since a rotation about the axis is the same as a rotation in this plane, the projections of the basis vectors just change places with one turn. This means that $\hat{i}' = \hat{j}$, $\hat{j}' = \hat{k}$ and $\hat{k}' = \hat{i}$. The transformation matrix for the basis vectors is:

$\tilde{T} = \begin{pmatrix}0 & 1 & 0\\ 0 & 0 & 1\\ 1 & 0 & 0\end{pmatrix}$

and the matrix we seek, its transpose, is:

$T = \begin{pmatrix}0 & 0 & 1\\ 1 & 0 & 0\\ 0 & 1 & 0\end{pmatrix}$   (13)
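The permutation matrix (13) can be checked against the general axis-angle (Rodrigues) formula. The sketch below is an added illustration, not part of the original solution, and the helper name is ours:

```python
import numpy as np

def rotation_about_axis(axis, theta):
    """Rodrigues rotation matrix for a rotation by theta about a unit axis."""
    n = np.asarray(axis, dtype=float)
    n = n / np.linalg.norm(n)
    K = np.array([[0, -n[2], n[1]],
                  [n[2], 0, -n[0]],
                  [-n[1], n[0], 0]])
    return np.cos(theta) * np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * np.outer(n, n)

T = np.array([[0, 0, 1],
              [1, 0, 0],
              [0, 1, 0]])

R = rotation_about_axis([1, 1, 1], 2 * np.pi / 3)
print(np.allclose(R, T))                                      # True: (13) is the 120° rotation about (1,1,1)
print(np.allclose(T @ np.array([1, 1, 1]), [1, 1, 1]))        # True: the axis is left unchanged
print(np.allclose(np.linalg.matrix_power(T, 3), np.eye(3)))   # True: three turns give the identity
```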

c) Construct the matrix representing reflection through the xy-plane.

This is really easy, just think of what happens with the basis vectors: $\hat{i}' = \hat{i}$, $\hat{j}' = \hat{j}$ and $\hat{k}' = -\hat{k}$. The matrix representing this transformation of the basis vectors is:

$\tilde{T} = \begin{pmatrix}1 & 0 & 0\\ 0 & 1 & 0\\ 0 & 0 & -1\end{pmatrix}$

and the matrix we seek, the transpose of this, is the same matrix:

$T = \begin{pmatrix}1 & 0 & 0\\ 0 & 1 & 0\\ 0 & 0 & -1\end{pmatrix}$   (14)

A.15
Q. In the usual basis $(\hat{i}, \hat{j}, \hat{k})$, construct the matrices $T_x$ and $T_y$ representing a rotation through angle θ about the respective axes.

There is nothing holy about the rotation about the z-axis that we considered in problem A.14 a); the principle is exactly the same. We just have to keep track of which vectors are rotated in what direction:

About the x-axis: Now we rotate the y-axis towards the z-axis, so our transformation of basis vectors is:

$\hat{y}' = \cos\theta\, \hat{y} + \sin\theta\, \hat{z}$
$\hat{z}' = -\sin\theta\, \hat{y} + \cos\theta\, \hat{z}$   (15)

giving us the basis transformation

$\tilde{T}_x = \begin{pmatrix}1 & 0 & 0\\ 0 & \cos\theta & \sin\theta\\ 0 & -\sin\theta & \cos\theta\end{pmatrix}$   (16)

and

$T_x = \begin{pmatrix}1 & 0 & 0\\ 0 & \cos\theta & -\sin\theta\\ 0 & \sin\theta & \cos\theta\end{pmatrix}$   (17)

About the y-axis: Now we rotate the z-axis towards the x-axis, so our transformation of basis vectors is:

$\hat{z}' = \cos\theta\, \hat{z} + \sin\theta\, \hat{x}$
$\hat{x}' = -\sin\theta\, \hat{z} + \cos\theta\, \hat{x}$   (18)

giving us the basis transformation

$\tilde{T}_y = \begin{pmatrix}\cos\theta & 0 & -\sin\theta\\ 0 & 1 & 0\\ \sin\theta & 0 & \cos\theta\end{pmatrix}$   (19)

and

$T_y = \begin{pmatrix}\cos\theta & 0 & \sin\theta\\ 0 & 1 & 0\\ -\sin\theta & 0 & \cos\theta\end{pmatrix}$   (20)

Q. Suppose now we change bases to $\hat{i}' = \hat{j}$, $\hat{j}' = -\hat{i}$, $\hat{k}' = \hat{k}$. Construct the matrix S that effects this change of basis and check that $ST_xS^{-1}$ and $ST_yS^{-1}$ are what you would expect.

Once again we first define the matrix that transforms the basis vectors:

$\tilde{S} = \begin{pmatrix}0 & 1 & 0\\ -1 & 0 & 0\\ 0 & 0 & 1\end{pmatrix}$

and then transpose this to get the matrix that transforms the coordinates:

$S = \begin{pmatrix}0 & -1 & 0\\ 1 & 0 & 0\\ 0 & 0 & 1\end{pmatrix}$   (21)

Calculating the matrices $T_x$, $T_y$ in the new basis we obtain:

$ST_xS^{-1} = \begin{pmatrix}\cos\theta & 0 & \sin\theta\\ 0 & 1 & 0\\ -\sin\theta & 0 & \cos\theta\end{pmatrix} = T_y(\theta)$   (22)

and

$ST_yS^{-1} = \begin{pmatrix}1 & 0 & 0\\ 0 & \cos\theta & \sin\theta\\ 0 & -\sin\theta & \cos\theta\end{pmatrix} = T_x(-\theta).$   (23)
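The similarity relations (22) and (23) can be verified numerically. This sketch is an added illustration (not part of the original solution) and uses the matrices exactly as written above, with helper names of our own choosing:

```python
import numpy as np

def T_x(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def T_y(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

S = np.array([[0, -1, 0],
              [1,  0, 0],
              [0,  0, 1]])          # eq. (21)

theta = 0.37
print(np.allclose(S @ T_x(theta) @ np.linalg.inv(S), T_y(theta)))    # True, eq. (22)
print(np.allclose(S @ T_y(theta) @ np.linalg.inv(S), T_x(-theta)))   # True, eq. (23)
```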

A.18
Q. The 2x2 matrix representing a rotation of the xy-plane is:

$T = \begin{pmatrix}\cos\theta & -\sin\theta\\ \sin\theta & \cos\theta\end{pmatrix}.$   (24)

Show that this matrix has no real eigenvalues except for certain values of θ and state which these are. This matrix has complex eigenvalues and eigenvectors: find them! Then construct a matrix S that diagonalizes T. Explicitly show that $STS^{-1}$ is diagonal.

The eigenvalues of a matrix can be found by solving the eigenvalue matrix equation:

$T|\alpha\rangle = \lambda|\alpha\rangle \ \Longrightarrow\ (T - \lambda I)|\alpha\rangle = 0$

under the assumption that $|\alpha\rangle \neq 0$, where I is the identity matrix. Since $|\alpha\rangle \neq 0$ the equation above is only satisfied if:

$\det(T - \lambda I) = 0$

which for our matrix becomes:

$(\cos\theta - \lambda)^2 + \sin^2\theta = 0 \ \Longrightarrow\ \lambda^2 - 2\cos\theta\,\lambda + 1 = 0.$

The solutions are:

$\lambda = \cos\theta \pm \sqrt{\cos^2\theta - 1} = \cos\theta \pm i\sin\theta = e^{\pm i\theta}.$   (25)

Indeed these are in general not real numbers, but for the angles θ = nπ, for any integer n, we have λ = 1 or λ = -1.

To find the eigenvectors we now reinsert our eigenvalues into the matrix equation:

$T|\alpha\rangle = e^{\pm i\theta}|\alpha\rangle$

Define $|\alpha\rangle = \begin{pmatrix}a\\ b\end{pmatrix}$ and insert the matrix:

$\begin{pmatrix}\cos\theta & -\sin\theta\\ \sin\theta & \cos\theta\end{pmatrix}\begin{pmatrix}a\\ b\end{pmatrix} = e^{\pm i\theta}\begin{pmatrix}a\\ b\end{pmatrix}$

to get the system of equations:

$a\cos\theta - b\sin\theta = a(\cos\theta \pm i\sin\theta)$
$a\sin\theta + b\cos\theta = b(\cos\theta \pm i\sin\theta).$   (26)

From this (it actually suffices with one of the equations, check this!) we get:

$b = \mp ia.$   (27)

So the eigenvectors have the forms:

$|\alpha_1\rangle = a\begin{pmatrix}1\\ -i\end{pmatrix}$ and $|\alpha_2\rangle = a\begin{pmatrix}1\\ i\end{pmatrix}$

which we of course want to normalize:

$|\alpha_1\rangle = \dfrac{1}{\sqrt{2}}\begin{pmatrix}1\\ -i\end{pmatrix}$ and $|\alpha_2\rangle = \dfrac{1}{\sqrt{2}}\begin{pmatrix}1\\ i\end{pmatrix}.$   (28)

To construct a matrix S that diagonalizes T we just need to use the eigenvectors of T as columns in $S^{-1}$:

$S^{-1} = \dfrac{1}{\sqrt{2}}\begin{pmatrix}1 & 1\\ -i & i\end{pmatrix}$

Leaving the algebraic details [3] to the dedicated students, we get:

$STS^{-1} = \dfrac{1}{2}\begin{pmatrix}2\cos\theta + 2i\sin\theta & 0\\ 0 & 2\cos\theta - 2i\sin\theta\end{pmatrix} = \begin{pmatrix}e^{i\theta} & 0\\ 0 & e^{-i\theta}\end{pmatrix}.$   (29)

Note that the entries in the diagonalized matrix are the eigenvalues of the original matrix. What do you think would have happened if you had put the eigenvectors in the opposite order in $S^{-1}$? The answer is put in the following footnote. [4]

[3] You need to compute the inverse matrix S and then do the algebra that follows. You learn best if you do it yourselves! Hint: when computing the inverse, make sure that you include a factor so that the product $SS^{-1}$ really becomes the identity matrix.
[4] The eigenvalues switch places with each other.

A.19
Q. Find the eigenvalues and the eigenvectors of the following matrix:

$M = \begin{pmatrix}1 & 1\\ 0 & 1\end{pmatrix}.$   (30)

Can this matrix be diagonalized?

Seeking the eigenvalues:

$\begin{pmatrix}1 & 1\\ 0 & 1\end{pmatrix}\begin{pmatrix}a\\ b\end{pmatrix} = \lambda\begin{pmatrix}a\\ b\end{pmatrix}$

we obtain the characteristic equation:

$(1 - \lambda)^2 = 0 \ \Longrightarrow\ \lambda = 1$   (31)

giving the eigenvectors:

$\begin{pmatrix}1 & 1\\ 0 & 1\end{pmatrix}\begin{pmatrix}a\\ b\end{pmatrix} = \begin{pmatrix}a\\ b\end{pmatrix} \ \Longrightarrow\ \begin{cases}a + b = a\\ b = b\end{cases} \ \Longrightarrow\ b = 0.$

From the equations we only get one eigenvector: $|e_1\rangle = \begin{pmatrix}1\\ 0\end{pmatrix}$. We need two linearly independent eigenvectors to span the space, so this matrix can not be diagonalized.
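Both results can be confirmed with numpy.linalg.eig; the sketch below is an added illustration, not part of the original solution:

```python
import numpy as np

theta = 0.9
T = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# A.18: complex eigenvalues e^{±iθ} and a diagonalizing matrix built from the eigenvectors
eigvals, eigvecs = np.linalg.eig(T)
print(np.allclose(sorted(eigvals, key=np.imag),
                  sorted([np.exp(1j * theta), np.exp(-1j * theta)], key=np.imag)))  # True

S_inv = np.array([[1, 1], [-1j, 1j]]) / np.sqrt(2)
S = np.linalg.inv(S_inv)
print(np.allclose(S @ T @ S_inv,
                  np.diag([np.exp(1j * theta), np.exp(-1j * theta)])))  # True, eq. (29)

# A.19: M has only one independent eigenvector, so it cannot be diagonalized
M = np.array([[1, 1], [0, 1]])
vals, vecs = np.linalg.eig(M)
print(vals)                          # [1. 1.]
print(np.linalg.matrix_rank(vecs))   # 1: the eigenvectors do not span the space
```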

A.22
a) Q. Show that if two matrices commute in one basis then they commute in any basis.

We want to check that:

$[T_1^e, T_2^e] = 0 \ \Longrightarrow\ [T_1^f, T_2^f] = 0$

and here we exploit the similarity equation [5] $ST^eS^{-1} = T^f$. Now assume that $[T_1^e, T_2^e] = 0$. Explore what happens to this commutator in another basis:

$[T_1^f, T_2^f] = [ST_1^eS^{-1},\ ST_2^eS^{-1}] = \underbrace{ST_1^eS^{-1}}_{T_1^f}\underbrace{ST_2^eS^{-1}}_{T_2^f} - \underbrace{ST_2^eS^{-1}}_{T_2^f}\underbrace{ST_1^eS^{-1}}_{T_1^f} = ST_1^eT_2^eS^{-1} - ST_2^eT_1^eS^{-1} = S\underbrace{[T_1^e, T_2^e]}_{0}S^{-1} = 0$   (32)

b) Q. Show that if two matrices are simultaneously diagonalizable they commute.

If two matrices are simultaneously diagonalizable it means that:

$SM_1S^{-1} = \mathrm{diag}(a_1, b_1, \ldots)$ and $SM_2S^{-1} = \mathrm{diag}(a_2, b_2, \ldots)$   (33)

where the notation diag(a, b, ...) means a diagonal matrix with entries a, b and so on.

[5] Recall that a change of basis through a linear operator S gives the transformation between the bases of a given linear operator T through this equation. Two matrices connected in this way are said to be similar.

Now, from the previous problem we know that if two matrices commute in one basis they will commute in any basis. The two diagonal matrices above are the transformations of $M_1$ and $M_2$ to another basis. But diagonal matrices commute [6], so in this basis the commutator is 0. By the former problem we have thus concluded that two matrices that are simultaneously diagonalizable commute.

A.25
Q. Let:

$T = \begin{pmatrix}1 & 1-i\\ 1+i & 0\end{pmatrix}.$   (34)

a) Q. Verify that T is hermitian.

$T^\dagger = (T^*)^T = \begin{pmatrix}1 & 1+i\\ 1-i & 0\end{pmatrix}^T = \begin{pmatrix}1 & 1-i\\ 1+i & 0\end{pmatrix} = T.$   (35)

b) Q. Find its eigenvalues.

$\begin{pmatrix}1 & 1-i\\ 1+i & 0\end{pmatrix}\begin{pmatrix}a\\ b\end{pmatrix} = \lambda\begin{pmatrix}a\\ b\end{pmatrix}$

gives the characteristic equation:

$(1-\lambda)(-\lambda) - (1-i)(1+i) = 0 \ \Longrightarrow\ \lambda^2 - \lambda - 2 = 0 \ \Longrightarrow\ \lambda_1 = 2,\ \lambda_2 = -1.$

[6] If you don't see why: write out the product of two diagonal matrices. The product will be of the type $\mathrm{diag}(a_1b_1, a_2b_2, \ldots)$. The product of $M_2$ and $M_1$ will be $\mathrm{diag}(b_1a_1, b_2a_2, \ldots)$, which is the same as the former diagonal matrix.

c) Q. Find and normalize the eigenvectors (note that they are orthogonal).

Seeking the eigenvectors:

λ = 2:

$\begin{pmatrix}1 & 1-i\\ 1+i & 0\end{pmatrix}\begin{pmatrix}a\\ b\end{pmatrix} = 2\begin{pmatrix}a\\ b\end{pmatrix} \ \Longrightarrow\ \begin{cases}a + b(1-i) = 2a\\ a(1+i) = 2b\end{cases} \ \Longrightarrow\ a = b(1-i)$

which gives us an eigenvector that we can normalize:

$|e_1\rangle = \dfrac{1}{\sqrt{3}}\begin{pmatrix}1-i\\ 1\end{pmatrix}.$   (36)

λ = -1:

$\begin{pmatrix}1 & 1-i\\ 1+i & 0\end{pmatrix}\begin{pmatrix}a\\ b\end{pmatrix} = -\begin{pmatrix}a\\ b\end{pmatrix} \ \Longrightarrow\ \begin{cases}a + b(1-i) = -a\\ a(1+i) = -b\end{cases} \ \Longrightarrow\ b = -a(1+i)$

which gives us an eigenvector that we can normalize:

$|e_2\rangle = \dfrac{1}{\sqrt{3}}\begin{pmatrix}1\\ -(1+i)\end{pmatrix}$   (37)

and we note that this is orthogonal to the first eigenvector (as eigenvectors belonging to distinct eigenvalues of a hermitian matrix should be).

d) Q. Construct the unitary diagonalizing matrix S and check explicitly that it diagonalizes T.

A unitary matrix fulfils the following equation: $U^\dagger = U^{-1}$. Like in problem A.18 we can construct a matrix S that diagonalizes the matrix T by using the eigenvectors of T to first construct $S^{-1}$:

$S^{-1} = \dfrac{1}{\sqrt{3}}\begin{pmatrix}1-i & 1\\ 1 & -(1+i)\end{pmatrix}$   (38)

and then obtain (do the algebra!):

$S = \dfrac{1}{\sqrt{3}}\begin{pmatrix}1+i & 1\\ 1 & -(1-i)\end{pmatrix}$   (39)

and we indeed see that the matrix S is unitary, $S = (S^{-1})^\dagger$. Once again we do not show the algebraic steps, but indeed:

$STS^{-1} = \begin{pmatrix}2 & 0\\ 0 & -1\end{pmatrix}$   (40)

as it should.
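A compact numerical check of parts b) through d) is given below; this is an added illustration, not part of the original solution:

```python
import numpy as np

T = np.array([[1, 1 - 1j],
              [1 + 1j, 0]])

print(np.allclose(T, T.conj().T))            # True: T is hermitian

S_inv = np.array([[1 - 1j, 1],
                  [1, -(1 + 1j)]]) / np.sqrt(3)   # eigenvectors (36), (37) as columns
S = S_inv.conj().T                                # S_inv is unitary, so its inverse is its adjoint

print(np.allclose(S @ S_inv, np.eye(2)))               # True: S really is the inverse of S_inv
print(np.allclose(S @ T @ S_inv, np.diag([2, -1])))    # True, eq. (40)
print(np.allclose(np.linalg.eigvalsh(T), [-1, 2]))     # True: eigenvalues -1 and 2 (ascending order)
```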

e) Q. Check that det(T) and Tr(T) are the same for the regular and the diagonalized form (they are invariant under linear transformations).

Determinant:

$\det(T) = \begin{vmatrix}1 & 1-i\\ 1+i & 0\end{vmatrix} = 0 - (1-i)(1+i) = -2$ and $\begin{vmatrix}2 & 0\\ 0 & -1\end{vmatrix} = -2.$

Trace: The trace of a matrix is defined as the sum of the diagonal elements. A quick glance at the two matrices shows that indeed the trace is the same: 1.

A.27
Q. A unitary transformation is one for which $\hat{U}^\dagger\hat{U} = 1$, where 1 is the identity element, for instance the identity matrix if the operators are matrices.

a) Q. Show that a unitary transformation preserves inner products, in the sense that $\langle\hat{U}\alpha|\hat{U}\beta\rangle = \langle\alpha|\beta\rangle$ for all vectors $|\alpha\rangle$, $|\beta\rangle$.

$\langle\hat{U}\alpha|\hat{U}\beta\rangle = \langle\alpha|\hat{U}^\dagger\hat{U}\beta\rangle = \langle\alpha|\beta\rangle$   (41)

b) Q. Show that the eigenvalues of a unitary transformation have modulus 1.

Let $|\alpha\rangle$ be an element of a Hilbert space and let $\hat{U}$ be a unitary transformation.

Now we consider the eigenvalue equation:

$\hat{U}|\alpha\rangle = \lambda|\alpha\rangle$

If the above equation holds, so must the following:

$\langle\hat{U}\alpha|\hat{U}\alpha\rangle = \langle\lambda\alpha|\lambda\alpha\rangle = \lambda^*\lambda\langle\alpha|\alpha\rangle = |\lambda|^2\langle\alpha|\alpha\rangle$

From problem a) we found that:

$\langle\hat{U}\alpha|\hat{U}\beta\rangle = \langle\alpha|\beta\rangle$

so combining this information we see that $|\lambda|^2 = 1$.

c) Q. Show that the eigenvectors of a unitary transformation belonging to distinct eigenvalues are orthogonal.

Let $\hat{U}|\alpha\rangle = a|\alpha\rangle$ and $\hat{U}|\beta\rangle = b|\beta\rangle$ where $a \neq b$. Consider the following inner product, evaluated in two ways. Letting $\hat{U}$ act to the right:

$\langle\alpha|\hat{U}\beta\rangle = \langle\alpha|b\beta\rangle = b\langle\alpha|\beta\rangle$   (42)

Letting $\hat{U}^\dagger$ act to the left instead, and using $\hat{U}^\dagger|\alpha\rangle = \hat{U}^{-1}|\alpha\rangle = a^{-1}|\alpha\rangle = a^*|\alpha\rangle$ (the last step uses $|a| = 1$ from part b), so that $\langle\hat{U}^\dagger\alpha| = a\langle\alpha|$:

$\langle\alpha|\hat{U}\beta\rangle = \langle\hat{U}^\dagger\alpha|\beta\rangle = a\langle\alpha|\beta\rangle$   (43)

Subtracting the lower equation from the upper:

$0 = (b - a)\langle\alpha|\beta\rangle$   (44)

and since we made the assumption that $a \neq b$ we must have that $\langle\alpha|\beta\rangle = 0$.
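The three statements in A.27 are easy to see numerically on a random unitary matrix. The sketch below is an added illustration, not from the original solution; it builds a unitary matrix from the QR decomposition of a random complex matrix, whose eigenvalues are distinct in practice:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
U, _ = np.linalg.qr(A)          # the Q factor of a QR decomposition is unitary

# a) inner products are preserved
x = rng.standard_normal(4) + 1j * rng.standard_normal(4)
y = rng.standard_normal(4) + 1j * rng.standard_normal(4)
print(np.isclose((U @ x).conj() @ (U @ y), x.conj() @ y))   # True

# b) eigenvalues have modulus 1
vals, vecs = np.linalg.eig(U)
print(np.allclose(np.abs(vals), 1))                         # True

# c) eigenvectors belonging to distinct eigenvalues are orthogonal
print(np.isclose(vecs[:, 0].conj() @ vecs[:, 1], 0))        # True (the eigenvalues here are distinct)
```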
