10 Orthogonality

10.1 Orthogonal subspaces

In the plane $\mathbb{R}^2$ we think of the coordinate axes as being orthogonal (perpendicular) to each other. We can express this in terms of vectors by saying that every vector in one axis is orthogonal to every vector in the other.

Let $V$ be an inner product space and let $S$ and $T$ be subsets of $V$. We say that $S$ and $T$ are orthogonal, written $S \perp T$, if every vector in $S$ is orthogonal to every vector in $T$:
\[ S \perp T \iff s \perp t \text{ for all } s \in S,\ t \in T. \]

Example. In $\mathbb{R}^3$ let $S$ be the $x_1x_2$-plane and let $T$ be the $x_3$-axis. Show that $S \perp T$.

Solution. Let $s \in S$ and let $t \in T$. We can write $s = [x_1, x_2, 0]^T$ and $t = [0, 0, x_3]^T$, so that
\[ \langle s, t\rangle = s^T t = \begin{bmatrix} x_1 & x_2 & 0 \end{bmatrix} \begin{bmatrix} 0 \\ 0 \\ x_3 \end{bmatrix} = 0. \]
Therefore $s \perp t$, and we conclude that $S \perp T$.

Example. In $\mathbb{R}^3$ let $S$ be the $x_1x_2$-plane and let $T$ be the $x_1x_3$-plane. Is it true that $S \perp T$?

Solution. Although the planes $S$ and $T$ appear to be perpendicular in the informal sense, they are not so in the technical sense. For instance, the vector $e_1 = [1,0,0]^T$ is in both $S$ and $T$, yet
\[ \langle e_1, e_1\rangle = e_1^T e_1 = \begin{bmatrix} 1 & 0 & 0 \end{bmatrix} \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix} = 1 \neq 0, \]
so $e_1 \not\perp e_1$. Viewing the first $e_1$ as lying in $S$ and the second as lying in $T$, we see that $S \not\perp T$.
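The two examples above can be mirrored numerically: every vector in the $x_1x_2$-plane has zero dot product with every vector on the $x_3$-axis, while the $x_1x_2$- and $x_1x_3$-planes share the nonzero vector $e_1$. A minimal sketch in numpy (the random samples are our own illustration, not from the text):

```python
import numpy as np

rng = np.random.default_rng(0)

# Random vectors in the x1x2-plane (third coordinate zero) and
# on the x3-axis (first two coordinates zero).
for _ in range(5):
    s = np.array([rng.normal(), rng.normal(), 0.0])   # s in S
    t = np.array([0.0, 0.0, rng.normal()])            # t in T
    assert np.dot(s, t) == 0.0                        # s is orthogonal to t

# The x1x2- and x1x3-planes are NOT orthogonal: e1 lies in both,
# and <e1, e1> = 1, which is nonzero.
e1 = np.array([1.0, 0.0, 0.0])
assert np.dot(e1, e1) == 1.0
```

The dot products with the axis vectors vanish exactly, since every term in the sum contains a zero factor.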

Orthogonal complement. Let $S$ be a subspace of $V$. The orthogonal complement of $S$ (in $V$), written $S^\perp$, is the set of all vectors in $V$ that are orthogonal to every vector in $S$:
\[ S^\perp = \{ v \in V \mid v \perp s \text{ for all } s \in S \}. \]
For instance, in $\mathbb{R}^3$ the orthogonal complement of the $x_1x_2$-plane is the $x_3$-axis. On the other hand, the orthogonal complement of the $x_3$-axis is the $x_1x_2$-plane.

It follows from the inner product axioms that if $S$ is a subspace of $V$, then so is its orthogonal complement $S^\perp$.

The next theorem says that in order to show that a vector is orthogonal to a subspace it is enough to check that it is orthogonal to each vector in a spanning set for the subspace.

Theorem. Let $\{b_1, b_2, \dots, b_n\}$ be a set of vectors in $V$ and let $S = \operatorname{Span}\{b_1, b_2, \dots, b_n\}$. A vector $v$ in $V$ is in $S^\perp$ if and only if $v \perp b_i$ for each $i$.

Proof. For simplicity of notation, we prove only the special case $n = 2$, so that $S = \operatorname{Span}\{b_1, b_2\}$. If $v$ is in $S^\perp$, then it is orthogonal to every vector in $S$; in particular, $v \perp b_1$ and $v \perp b_2$. Now assume that $v \in V$ with $v \perp b_1$ and $v \perp b_2$. If $s \in S$, then we can write $s = \alpha_1 b_1 + \alpha_2 b_2$ for some $\alpha_1, \alpha_2 \in \mathbb{R}$.

So, using properties of the inner product, we get
\[ \begin{aligned} \langle v, s\rangle &= \langle v, \alpha_1 b_1 + \alpha_2 b_2\rangle \\ &= \alpha_1 \langle v, b_1\rangle + \alpha_2 \langle v, b_2\rangle && \text{by (ii), (iii), and (iv)} \\ &= \alpha_1 \cdot 0 + \alpha_2 \cdot 0 && v \perp b_1 \text{ and } v \perp b_2 \\ &= 0. \end{aligned} \]
Therefore $v \perp s$. This shows that $v$ is orthogonal to every vector in $S$, so that $v \in S^\perp$.

Theorem. If $A$ is a matrix, then the orthogonal complement of the row space of $A$ is the null space of $A$:
\[ (\operatorname{Row} A)^\perp = \operatorname{Null} A. \]

Proof. Let $A$ be an $m \times n$ matrix and let $b_1, b_2, \dots, b_m$ be the rows of $A$ written as columns, so that $\operatorname{Row} A = \operatorname{Span}\{b_1, b_2, \dots, b_m\}$. Let $x$ be a vector in $\mathbb{R}^n$. The product $Ax$ is the $m \times 1$ matrix obtained by taking the dot products of $x$ with the rows of $A$:
\[ Ax = \begin{bmatrix} b_1^T \\ \vdots \\ b_m^T \end{bmatrix} x = \begin{bmatrix} b_1^T x \\ \vdots \\ b_m^T x \end{bmatrix}, \]
so, in view of the previous theorem, $x$ is in $(\operatorname{Row} A)^\perp$ if and only if $b_i^T x = 0$ for each $i$, and this holds if and only if $x$ is in $\operatorname{Null} A$.

Example. Let $S$ be the subspace of $\mathbb{R}^4$ spanned by the vectors $[1,0,-2,1]^T$ and $[0,1,3,-2]^T$. Find a basis for $S^\perp$.

Solution. We first note that $S = \operatorname{Row} A$, where
\[ A = \begin{bmatrix} 1 & 0 & -2 & 1 \\ 0 & 1 & 3 & -2 \end{bmatrix}. \]
According to the theorem, $S^\perp = (\operatorname{Row} A)^\perp = \operatorname{Null} A$, so we need only find a basis for the null space of $A$. The augmented matrix we write to solve $Ax = 0$ is already in reduced row echelon form (RREF):
\[ \begin{bmatrix} 1 & 0 & -2 & 1 & 0 \\ 0 & 1 & 3 & -2 & 0 \end{bmatrix}, \]
so $\operatorname{Null} A = \{ [2t - s,\ -3t + 2s,\ t,\ s]^T \mid t, s \in \mathbb{R} \}$. We have
\[ \begin{bmatrix} 2t - s \\ -3t + 2s \\ t \\ s \end{bmatrix} = t \begin{bmatrix} 2 \\ -3 \\ 1 \\ 0 \end{bmatrix} + s \begin{bmatrix} -1 \\ 2 \\ 0 \\ 1 \end{bmatrix}, \]

and therefore $\{[2,-3,1,0]^T,\ [-1,2,0,1]^T\}$ is a basis for $\operatorname{Null} A$ (and hence for $S^\perp$).

The next theorem generalizes the notion that the shortest distance from a point to a plane is achieved by dropping a perpendicular.

Theorem. Let $S$ be a subspace of $V$, let $b$ be a vector in $V$, and assume that $b = s_0 + r$ with $s_0 \in S$ and $r \in S^\perp$. For every $s \in S$ we have
\[ \operatorname{dist}(b, s_0) \le \operatorname{dist}(b, s). \]

Proof. Let $s \in S$. We have
\[ \|b - s\|^2 = \|(b - s_0) + (s_0 - s)\|^2 = \|b - s_0\|^2 + \|s_0 - s\|^2 \ge \|b - s_0\|^2 \]
(note that the Pythagorean theorem applies in the second equality, since $b - s_0 = r \in S^\perp$ and $s_0 - s \in S$, so that $(b - s_0) \perp (s_0 - s)$). Therefore,
\[ \operatorname{dist}(b, s_0) = \|b - s_0\| \le \|b - s\| = \operatorname{dist}(b, s). \]

10.2 Least squares

Suppose that the matrix equation $Ax = b$ has no solution. In terms of distance, this means that $\operatorname{dist}(b, Ax)$ is never zero, no matter what $x$ is.
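Looking back at the orthogonal-complement example for a moment: the identity $S^\perp = (\operatorname{Row} A)^\perp = \operatorname{Null} A$ is easy to check numerically. A minimal sketch using numpy (the SVD-based null-space computation and the variable names are our own, not from the text):

```python
import numpy as np

# Matrix whose rows span S, from the example above.
A = np.array([[1.0, 0.0, -2.0, 1.0],
              [0.0, 1.0, 3.0, -2.0]])

# Null space of A via the SVD: the right singular vectors whose
# singular values are (numerically) zero span Null A.
_, sing_vals, Vt = np.linalg.svd(A)
rank = int(np.sum(sing_vals > 1e-10))
null_basis = Vt[rank:]          # each row is a basis vector of Null A

# Every null-space basis vector is orthogonal to every row of A,
# confirming Null A is contained in (Row A)^perp.
assert np.allclose(A @ null_basis.T, 0.0)

# The hand-computed basis vectors for S^perp also lie in Null A.
v1 = np.array([2.0, -3.0, 1.0, 0.0])
v2 = np.array([-1.0, 2.0, 0.0, 1.0])
assert np.allclose(A @ v1, 0.0) and np.allclose(A @ v2, 0.0)
```

The SVD basis differs from the hand-computed one (it is orthonormal), but both span the same two-dimensional subspace $S^\perp$.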

Instead of leaving the equation unsolved, it is sometimes useful to find an $x_0$ that is as close to being a solution as possible, that is, for which the distance from $b$ to $Ax_0$ is less than or equal to the distance from $b$ to $Ax$ for every other $x$. This is called a least squares solution. A least squares solution can be used, for instance, to find the line that best fits some data points (see the example below).

Least squares. Let $A$ be an $m \times n$ matrix and let $b$ be a vector in $\mathbb{R}^m$. If $x = x_0$ is a solution to
\[ A^T A x = A^T b, \]
then, for every $x \in \mathbb{R}^n$,
\[ \operatorname{dist}(b, Ax_0) \le \operatorname{dist}(b, Ax). \]
Such an $x_0$ is called a least squares solution to the equation $Ax = b$.

Proof. Let $x_0 \in \mathbb{R}^n$ and assume that $A^T A x_0 = A^T b$. Letting $s_0 = Ax_0$, we have
\[ A^T(b - s_0) = A^T(b - Ax_0) = 0, \]
so that
\[ b - s_0 \in \operatorname{Null} A^T = (\operatorname{Row} A^T)^\perp = (\operatorname{Col} A)^\perp = S^\perp, \]
where $S = \{Ax \mid x \in \mathbb{R}^n\}$. (The first equality uses the theorem in Section 10.1; for the last, note that the product $Ax$ can be interpreted as the linear combination of the columns of $A$ with the entries of $x$ as scalar factors, so $S$ is the set of all linear combinations of the columns of $A$, which is $\operatorname{Col} A$.) This shows that the vector $r = b - s_0$ is in $S^\perp$, so that $b = s_0 + r$ with $s_0 \in S$ and $r \in S^\perp$. By the last theorem of Section 10.1, for every $x \in \mathbb{R}^n$ we have
\[ \operatorname{dist}(b, Ax_0) = \operatorname{dist}(b, s_0) \le \operatorname{dist}(b, Ax). \]

Example. Use a least squares solution to find a line that best fits the data points $(1,2)$, $(3,2)$, and $(4,5)$.

Solution. Here is the graph: [figure: the three data points and the best fitting line]

If we write the desired line as $c + dx = y$, then ideally the line would go through all three points, giving the system
\[ \begin{aligned} c + d &= 2 \\ c + 3d &= 2 \\ c + 4d &= 5, \end{aligned} \]
which can be written as the matrix equation $Ax = b$, where
\[ A = \begin{bmatrix} 1 & 1 \\ 1 & 3 \\ 1 & 4 \end{bmatrix}, \qquad x = \begin{bmatrix} c \\ d \end{bmatrix}, \qquad b = \begin{bmatrix} 2 \\ 2 \\ 5 \end{bmatrix}. \]
The least squares solution is obtained by solving the equation $A^T A x = A^T b$. We have
\[ A^T A = \begin{bmatrix} 1 & 1 & 1 \\ 1 & 3 & 4 \end{bmatrix} \begin{bmatrix} 1 & 1 \\ 1 & 3 \\ 1 & 4 \end{bmatrix} = \begin{bmatrix} 3 & 8 \\ 8 & 26 \end{bmatrix} \]
and
\[ A^T b = \begin{bmatrix} 1 & 1 & 1 \\ 1 & 3 & 4 \end{bmatrix} \begin{bmatrix} 2 \\ 2 \\ 5 \end{bmatrix} = \begin{bmatrix} 9 \\ 28 \end{bmatrix}, \]
so the matrix equation has corresponding augmented matrix
\[ \begin{bmatrix} 3 & 8 & 9 \\ 8 & 26 & 28 \end{bmatrix} \;\sim\; \begin{bmatrix} 1 & 0 & 5/7 \\ 0 & 1 & 6/7 \end{bmatrix}. \]
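The arithmetic above is easy to double-check numerically. A minimal sketch using numpy (np.linalg.lstsq minimizes $\|Ax - b\|$ directly, so it should reproduce the solution of the normal equations; the variable names are ours):

```python
import numpy as np

# Data points (1,2), (3,2), (4,5) and the model y = c + d*x.
A = np.array([[1.0, 1.0],
              [1.0, 3.0],
              [1.0, 4.0]])
b = np.array([2.0, 2.0, 5.0])

# Normal equations: A^T A x = A^T b.
x_normal = np.linalg.solve(A.T @ A, A.T @ b)

# Direct least-squares solve; should agree with the normal equations.
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
assert np.allclose(x_normal, x_lstsq)

print(x_normal)  # approximately [0.714, 0.857], i.e. c = 5/7, d = 6/7
```

Both routes give the same answer because, for a matrix $A$ with linearly independent columns, the normal equations have a unique solution and it is the minimizer of $\|Ax - b\|$.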

Therefore $c = 5/7$ and $d = 6/7$, and the best fitting line is
\[ y = \tfrac{5}{7} + \tfrac{6}{7}x, \]
which is the line shown in the graph.

(We could tell in advance that the matrix equation $Ax = b$ has no solution, since the points are not collinear. In general, however, it is not necessary to first check that an equation has no solution before applying the least squares method, since, if it has a solution, then that is what the method will produce.)

We can use the last example to explain the terminology "least squares." The method gives a line that minimizes the sum of the squares of the vertical distances from the points to the line. This sum of squares is $s_1^2 + s_2^2 + s_3^2$, with $s_1$, $s_2$, and $s_3$ the vertical distances indicated here: [figure: vertical distances from the data points to the line]

We have
\[ s_1^2 + s_2^2 + s_3^2 = \big(2 - (c + d)\big)^2 + \big(2 - (c + 3d)\big)^2 + \big(5 - (c + 4d)\big)^2 = \|b - Ax\|^2 = \operatorname{dist}(b, Ax)^2. \]
A least squares solution $x = [c, d]^T$ makes $\operatorname{dist}(b, Ax)$ as small as possible and therefore makes $s_1^2 + s_2^2 + s_3^2$ as small as possible as well.

Example. Use a least squares solution to find a parabola that best fits the data points $(-1,8)$, $(0,8)$, $(1,4)$, and $(2,16)$.

Solution. Here is the graph: [figure: the four data points and the best fitting parabola]

Ideally, a parabola would go through all four points, so if we write its equation as $c + dx + ex^2 = y$, then
\[ \begin{aligned} c - d + e &= 8 \\ c &= 8 \\ c + d + e &= 4 \\ c + 2d + 4e &= 16, \end{aligned} \]
which can be written as the matrix equation $Ax = b$, where
\[ A = \begin{bmatrix} 1 & -1 & 1 \\ 1 & 0 & 0 \\ 1 & 1 & 1 \\ 1 & 2 & 4 \end{bmatrix}, \qquad x = \begin{bmatrix} c \\ d \\ e \end{bmatrix}, \qquad b = \begin{bmatrix} 8 \\ 8 \\ 4 \\ 16 \end{bmatrix}. \]
The least squares solution is obtained by solving the equation $A^T A x = A^T b$. We have
\[ A^T A = \begin{bmatrix} 1 & 1 & 1 & 1 \\ -1 & 0 & 1 & 2 \\ 1 & 0 & 1 & 4 \end{bmatrix} \begin{bmatrix} 1 & -1 & 1 \\ 1 & 0 & 0 \\ 1 & 1 & 1 \\ 1 & 2 & 4 \end{bmatrix} = \begin{bmatrix} 4 & 2 & 6 \\ 2 & 6 & 8 \\ 6 & 8 & 18 \end{bmatrix} \]
and
\[ A^T b = \begin{bmatrix} 1 & 1 & 1 & 1 \\ -1 & 0 & 1 & 2 \\ 1 & 0 & 1 & 4 \end{bmatrix} \begin{bmatrix} 8 \\ 8 \\ 4 \\ 16 \end{bmatrix} = \begin{bmatrix} 36 \\ 28 \\ 76 \end{bmatrix}, \]

so the matrix equation has corresponding augmented matrix
\[ \begin{bmatrix} 4 & 2 & 6 & 36 \\ 2 & 6 & 8 & 28 \\ 6 & 8 & 18 & 76 \end{bmatrix} \;\sim\; \begin{bmatrix} 1 & 0 & 0 & 5 \\ 0 & 1 & 0 & -1 \\ 0 & 0 & 1 & 3 \end{bmatrix}. \]
Therefore $c = 5$, $d = -1$, and $e = 3$, and the best fitting parabola has equation
\[ y = 5 - x + 3x^2, \]
which is the parabola shown in the graph.

10.3 Gram-Schmidt process

The standard basis vectors $e_1, e_2, \dots, e_n$ in $\mathbb{R}^n$ have two useful properties:

they are pairwise orthogonal (any two are orthogonal),

each is a unit vector (a vector of norm one).

If we have a basis for an inner product space $V$ that does not already have these properties, then we can change it into a basis that does by using the Gram-Schmidt process.
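Before turning to Gram-Schmidt, the parabola fit from the preceding example can be double-checked the same way as the line fit. A minimal sketch using numpy (the variable names are ours, not from the text):

```python
import numpy as np

# Data points (-1,8), (0,8), (1,4), (2,16) and the model y = c + d*x + e*x^2.
xs = np.array([-1.0, 0.0, 1.0, 2.0])
ys = np.array([8.0, 8.0, 4.0, 16.0])

# Design matrix with columns 1, x, x^2.
A = np.column_stack([np.ones_like(xs), xs, xs**2])

# Normal equations A^T A x = A^T b, exactly as in the worked example.
coeffs = np.linalg.solve(A.T @ A, A.T @ ys)
print(coeffs)  # approximately [5., -1., 3.], i.e. y = 5 - x + 3x^2
```

Building the design matrix column by column generalizes directly: fitting any model that is linear in its unknown coefficients reduces to the same normal equations.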

Orthonormal set. Let $\{b_1, b_2, \dots, b_s\}$ be a set of vectors in the inner product space $V$.

The set is orthogonal if $\langle b_i, b_j\rangle = 0$ for all $i \neq j$ (the vectors are pairwise orthogonal).

The set is orthonormal if it is orthogonal and each vector is a unit vector.

Any orthogonal set of nonzero vectors can be changed into an orthonormal set by dividing each vector by its norm. For instance, $\{[1,1]^T, [-1,1]^T\}$ is an orthogonal set in $\mathbb{R}^2$ and each of these vectors has norm $\sqrt{2}$, so
\[ \left\{ \left[\tfrac{1}{\sqrt{2}}, \tfrac{1}{\sqrt{2}}\right]^T,\ \left[-\tfrac{1}{\sqrt{2}}, \tfrac{1}{\sqrt{2}}\right]^T \right\} \]
is an orthonormal set.

Let $V$ be an inner product space. The Gram-Schmidt process requires a formula for the projection of a vector on a subspace $S$.

Theorem. Let $S$ be a subspace of $V$ and let $\{u_1, u_2, \dots, u_s\}$ be an orthonormal basis for $S$. Let $b$ be a vector in $V$ and let
\[ p = \sum_{i=1}^{s} \langle b, u_i\rangle\, u_i. \]
Then $p \in S$ and $b - p \in S^\perp$. The vector $p$ is the projection of $b$ on $S$.

Proof. We prove only the special case $s = 2$, so that $\{u_1, u_2\}$ is a basis for $S$ and $p = \langle b, u_1\rangle u_1 + \langle b, u_2\rangle u_2$. Since $p$ is a linear combination of $u_1$ and $u_2$, it is in $S$. We have
\[ \begin{aligned} \langle b - p, u_1\rangle &= \big\langle b - (\langle b, u_1\rangle u_1 + \langle b, u_2\rangle u_2),\, u_1 \big\rangle \\ &= \langle b, u_1\rangle - \langle b, u_1\rangle\langle u_1, u_1\rangle - \langle b, u_2\rangle\langle u_2, u_1\rangle && \text{by (iii) and (iv)} \\ &= \langle b, u_1\rangle - \langle b, u_1\rangle(1) - \langle b, u_2\rangle(0) && u_1 \text{ is a unit vector and } u_1 \perp u_2 \\ &= 0, \end{aligned} \]
so $b - p$ is orthogonal to $u_1$. Similarly, $b - p$ is orthogonal to $u_2$. Since $S = \operatorname{Span}\{u_1, u_2\}$, it follows from the theorem in Section 10.1 that $b - p$ is in $S^\perp$.

We now describe the Gram-Schmidt process in the case of three initial vectors. Suppose that $\{b_1, b_2, b_3\}$ is a basis for an inner product space $V$. The Gram-Schmidt process uses these vectors to produce an orthonormal basis $\{u_1, u_2, u_3\}$ for $V$.

Here are the first two steps of the process (the accompanying figure looks straight down at the plane spanned by $b_1$ and $b_2$):
\[ u_1 = \frac{b_1}{\|b_1\|}, \qquad u_2 = \frac{b_2 - p_1}{\|b_2 - p_1\|}, \quad \text{where } p_1 = \langle b_2, u_1\rangle u_1. \]
Here is the third step of the process:
\[ u_3 = \frac{b_3 - p_2}{\|b_3 - p_2\|}, \quad \text{where } p_2 = \langle b_3, u_1\rangle u_1 + \langle b_3, u_2\rangle u_2. \]
The general statement is as follows:

Gram-Schmidt process. Let $\{b_1, b_2, \dots, b_n\}$ be a basis for the inner product space $V$. Define vectors $u_1, u_2, \dots, u_n$ recursively by
\[ u_1 = \frac{b_1}{\|b_1\|}, \qquad u_k = \frac{b_k - p_{k-1}}{\|b_k - p_{k-1}\|}, \quad \text{where } p_{k-1} = \sum_{i=1}^{k-1} \langle b_k, u_i\rangle u_i \quad (k > 1). \]
Then $\{u_1, u_2, \dots, u_n\}$ is an orthonormal basis for $V$. Moreover,
\[ \operatorname{Span}\{u_1, u_2, \dots, u_k\} = \operatorname{Span}\{b_1, b_2, \dots, b_k\} \quad \text{for each } k. \]

We will not give a detailed proof. However, we note that for each $k$ the vector $b_k - p_{k-1}$ is in the orthogonal complement of $\operatorname{Span}\{u_1, u_2, \dots, u_{k-1}\}$ by the preceding theorem, so that $u_k \perp u_i$ for $1 \le i < k$.

Example. Let $b_1 = [1,2,2,4]^T$, $b_2 = [-2,0,-4,0]^T$, and $b_3 = [-1,1,2,0]^T$, and let $S$ be the span of these vectors. Apply the Gram-Schmidt process to $\{b_1, b_2, b_3\}$ to obtain an orthonormal basis $\{u_1, u_2, u_3\}$ for $S$.

Solution. Since the given vectors are in $\mathbb{R}^4$, the inner product is the dot product, and the norm of a vector $x$ is given by the formula
\[ \|x\| = \sqrt{x^T x} = \sqrt{x_1^2 + x_2^2 + x_3^2 + x_4^2}. \]
We use the formula $\|\alpha x\| = |\alpha|\,\|x\|$ to simplify computations. First,
\[ u_1 = \frac{b_1}{\|b_1\|} = \frac{[1,2,2,4]^T}{\|[1,2,2,4]^T\|} = \frac{1}{5}[1,2,2,4]^T. \]
Next,
\[ p_1 = \langle b_2, u_1\rangle u_1 = \left\langle [-2,0,-4,0]^T,\ \tfrac{1}{5}[1,2,2,4]^T \right\rangle u_1 = -2u_1 = -\tfrac{2}{5}[1,2,2,4]^T \]
and
\[ b_2 - p_1 = [-2,0,-4,0]^T + \tfrac{2}{5}[1,2,2,4]^T = \tfrac{4}{5}[-2,1,-4,2]^T, \]
so
\[ u_2 = \frac{b_2 - p_1}{\|b_2 - p_1\|} = \frac{\tfrac{4}{5}[-2,1,-4,2]^T}{\left\|\tfrac{4}{5}[-2,1,-4,2]^T\right\|} = \frac{1}{5}[-2,1,-4,2]^T. \]
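Steps like these follow the recursion mechanically, so the whole process is easy to automate. A minimal sketch of the Gram-Schmidt loop in numpy for the dot-product inner product (the function and variable names are ours), checked against the vectors of this example:

```python
import numpy as np

def gram_schmidt(vectors):
    """Return an orthonormal basis for the span of the given
    linearly independent vectors (dot-product inner product)."""
    basis = []
    for b in vectors:
        # Subtract the projection of b onto the span of the previous u_i.
        p = sum(np.dot(b, u) * u for u in basis)
        w = b - p
        basis.append(w / np.linalg.norm(w))
    return basis

b1 = np.array([1.0, 2.0, 2.0, 4.0])
b2 = np.array([-2.0, 0.0, -4.0, 0.0])
b3 = np.array([-1.0, 1.0, 2.0, 0.0])

u1, u2, u3 = gram_schmidt([b1, b2, b3])
print(5 * u1)  # approximately [ 1.  2.  2.  4.]
print(5 * u2)  # approximately [-2.  1. -4.  2.]
```

Note that this textbook version of the loop can lose orthogonality in floating point for ill-conditioned inputs; numerical libraries use variants such as modified Gram-Schmidt or Householder QR instead.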

Finally,
\[ \begin{aligned} p_2 &= \langle b_3, u_1\rangle u_1 + \langle b_3, u_2\rangle u_2 \\ &= \left\langle [-1,1,2,0]^T,\ \tfrac{1}{5}[1,2,2,4]^T \right\rangle u_1 + \left\langle [-1,1,2,0]^T,\ \tfrac{1}{5}[-2,1,-4,2]^T \right\rangle u_2 \\ &= u_1 - u_2 = \tfrac{1}{5}[1,2,2,4]^T - \tfrac{1}{5}[-2,1,-4,2]^T = \tfrac{1}{5}[3,1,6,2]^T \end{aligned} \]
and
\[ b_3 - p_2 = [-1,1,2,0]^T - \tfrac{1}{5}[3,1,6,2]^T = \tfrac{2}{5}[-4,2,2,-1]^T, \]
so
\[ u_3 = \frac{b_3 - p_2}{\|b_3 - p_2\|} = \frac{\tfrac{2}{5}[-4,2,2,-1]^T}{\left\|\tfrac{2}{5}[-4,2,2,-1]^T\right\|} = \frac{1}{5}[-4,2,2,-1]^T. \]
Therefore,
\[ \left\{ \tfrac{1}{5}[1,2,2,4]^T,\ \tfrac{1}{5}[-2,1,-4,2]^T,\ \tfrac{1}{5}[-4,2,2,-1]^T \right\} \]
is an orthonormal basis for $S$.

Example. Let $S$ be the span of the functions $1$, $x$, and $x^2$ in $C[0,1]$. Apply the Gram-Schmidt process to $\{1, x, x^2\}$ to obtain an orthonormal basis $\{u_1, u_2, u_3\}$ for $S$.

Solution. The inner product in $C[0,1]$ is given by $\langle f, g\rangle = \int_0^1 f(x)g(x)\,dx$. Let $b_1 = 1$, $b_2 = x$, and $b_3 = x^2$. First,
\[ \|b_1\| = \sqrt{\langle b_1, b_1\rangle} = \sqrt{\langle 1, 1\rangle} = \sqrt{\int_0^1 1\,dx} = 1, \]
so
\[ u_1 = \frac{b_1}{\|b_1\|} = 1. \]
Next,
\[ p_1 = \langle b_2, u_1\rangle u_1 = \langle x, 1\rangle u_1 = \left( \int_0^1 x\,dx \right) u_1 = \tfrac{1}{2}u_1 = \tfrac{1}{2} \]
and $b_2 - p_1 = x - \tfrac{1}{2}$, so
\[ u_2 = \frac{b_2 - p_1}{\|b_2 - p_1\|} = \frac{x - \tfrac{1}{2}}{\left\|x - \tfrac{1}{2}\right\|} = \frac{x - \tfrac{1}{2}}{\sqrt{\int_0^1 \left(x - \tfrac{1}{2}\right)^2 dx}} = \sqrt{3}(2x - 1). \]

Finally,
\[ \begin{aligned} p_2 &= \langle b_3, u_1\rangle u_1 + \langle b_3, u_2\rangle u_2 = \langle x^2, 1\rangle u_1 + \langle x^2, \sqrt{3}(2x-1)\rangle u_2 \\ &= \left( \int_0^1 x^2\,dx \right) u_1 + \left( \sqrt{3}\int_0^1 x^2(2x-1)\,dx \right) u_2 = \tfrac{1}{3}u_1 + \tfrac{\sqrt{3}}{6}u_2 \\ &= \tfrac{1}{3}(1) + \tfrac{\sqrt{3}}{6}\big(\sqrt{3}(2x-1)\big) = x - \tfrac{1}{6} \end{aligned} \]
and
\[ b_3 - p_2 = x^2 - \left( x - \tfrac{1}{6} \right) = x^2 - x + \tfrac{1}{6}, \]
so
\[ u_3 = \frac{b_3 - p_2}{\|b_3 - p_2\|} = \frac{x^2 - x + \tfrac{1}{6}}{\left\|x^2 - x + \tfrac{1}{6}\right\|} = \frac{x^2 - x + \tfrac{1}{6}}{\sqrt{\int_0^1 \left(x^2 - x + \tfrac{1}{6}\right)^2 dx}} = \sqrt{5}(6x^2 - 6x + 1). \]
Therefore $\{1,\ \sqrt{3}(2x-1),\ \sqrt{5}(6x^2 - 6x + 1)\}$ is an orthonormal basis for $S$.

10 Exercises

10-1 Let $x_1 = [-1,0,1]^T$, $x_2 = [0,1,-1]^T$, and $x_3 = [1,1,1]^T$. Show that $S \perp T$, where $S = \operatorname{Span}\{x_1, x_2\}$ and $T = \operatorname{Span}\{x_3\}$.

10-2 Let $S$ be the subspace of $P_3$ spanned by $x$ and $x^2$. Find $S^\perp$, where the inner product on $P_3$ is as in Section 9.5 with $x_1 = -1$, $x_2 = 0$, and $x_3 = 1$. Hint: By the first theorem of Section 10.1, a polynomial $p = ax^2 + bx + c$ is in $S^\perp$ if and only if it is orthogonal to both $x$ and $x^2$.

10-3 Let $S$ be the subspace of $\mathbb{R}^4$ spanned by the vectors $[1,3,-4,0]^T$ and $[-2,-6,8,1]^T$. (a) Find a basis for $S^\perp$. (b) Verify that the basis vectors you found in (a) are orthogonal to the given vectors.

10-4 Use a least squares solution to find a line that best fits the data points $(1,2)$, $(2,4)$, and $(3,3)$.

10-5 Use a least squares solution to find a curve of the form $y = a\cos x + b\sin x$ that best fits the data points $(0,1)$, $(\pi/2, 2)$, and $(\pi, 0)$. Answer: $y = \tfrac{1}{2}\cos x + 2\sin x$.

10-6 Let $\{u_1, u_2, u_3\}$ be an orthonormal set in an inner product space $V$. Prove that $u_1$, $u_2$, and $u_3$ are linearly independent. Hint: Suppose that $\alpha_1 u_1 + \alpha_2 u_2 + \alpha_3 u_3 = 0$. Apply $\langle \cdot, u_1\rangle$ to both sides of this equation and use properties of the inner product to conclude that $\alpha_1 = 0$. Note that similarly $\alpha_2 = 0$ and $\alpha_3 = 0$. You may use the fact that $\langle 0, v\rangle = 0$ for every vector $v \in V$.

10-7 Let $v$ and $w$ be vectors in an inner product space $V$ and assume that $w$ is nonzero. Let
\[ p = \frac{\langle v, w\rangle}{\langle w, w\rangle}\, w. \]
The vector $p$ is the projection of $v$ on $w$.

(a) Show that $(v - p) \perp w$. (b) Find $p$ when $v = [1,2]^T$ and $w = [3,1]^T$, and sketch.

10-8 Let $b_1 = [0,0,-1,1]^T$, $b_2 = [1,0,0,1]^T$, and $b_3 = [1,0,-1,0]^T$, and let $S$ be the span of these vectors in $\mathbb{R}^4$. Apply the Gram-Schmidt process to $\{b_1, b_2, b_3\}$ to obtain an orthonormal basis $\{u_1, u_2, u_3\}$ for $S$. Answer: $\left\{ \tfrac{1}{\sqrt{2}}[0,0,-1,1]^T,\ \tfrac{1}{\sqrt{6}}[2,0,1,1]^T,\ \tfrac{1}{\sqrt{3}}[1,0,-1,-1]^T \right\}$.

10-9 Let $S$ be the span of the functions $x$, $x+1$, and $x^2 - 1$ in $C[0,1]$. Apply the Gram-Schmidt process to $\{x, x+1, x^2-1\}$ to obtain an orthonormal basis $\{u_1, u_2, u_3\}$ for $S$. Answer: $\{\sqrt{3}x,\ -3x + 2,\ \sqrt{5}(6x^2 - 6x + 1)\}$.
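The function-space results in this chapter (for instance the orthonormal basis $\{1, \sqrt{3}(2x-1), \sqrt{5}(6x^2-6x+1)\}$ from Section 10.3) can be checked by numerical integration. A minimal sketch using numpy (the grid size and the helper name `ip` are our own choices):

```python
import numpy as np

# Fine grid on [0,1] for approximating <f,g> = integral of f*g over [0,1].
x = np.linspace(0.0, 1.0, 200001)

def ip(f, g):
    """Approximate the C[0,1] inner product by the trapezoid rule."""
    y = f(x) * g(x)
    return float(np.sum((y[:-1] + y[1:]) * np.diff(x) / 2.0))

u1 = lambda t: np.ones_like(t)
u2 = lambda t: np.sqrt(3.0) * (2*t - 1)
u3 = lambda t: np.sqrt(5.0) * (6*t**2 - 6*t + 1)

# Pairwise orthogonal, each of norm one (up to quadrature error).
assert abs(ip(u1, u2)) < 1e-8 and abs(ip(u1, u3)) < 1e-8 and abs(ip(u2, u3)) < 1e-8
assert abs(ip(u1, u1) - 1.0) < 1e-8
assert abs(ip(u2, u2) - 1.0) < 1e-8
assert abs(ip(u3, u3) - 1.0) < 1e-8
```

The same helper can be used to check the answer to exercise 10-9, since its functions differ from these only in the first two basis vectors.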


ICS 6N Computational Linear Algebra Vector Space

ICS 6N Computational Linear Algebra Vector Space ICS 6N Computational Linear Algebra Vector Space Xiaohui Xie University of California, Irvine xhx@uci.edu Xiaohui Xie (UCI) ICS 6N 1 / 24 Vector Space Definition: A vector space is a non empty set V of

More information

Math 308 Practice Final Exam Page and vector y =

Math 308 Practice Final Exam Page and vector y = Math 308 Practice Final Exam Page Problem : Solving a linear equation 2 0 2 5 Given matrix A = 3 7 0 0 and vector y = 8. 4 0 0 9 (a) Solve Ax = y (if the equation is consistent) and write the general solution

More information

MATH 260 LINEAR ALGEBRA EXAM III Fall 2014

MATH 260 LINEAR ALGEBRA EXAM III Fall 2014 MAH 60 LINEAR ALGEBRA EXAM III Fall 0 Instructions: the use of built-in functions of your calculator such as det( ) or RREF is permitted ) Consider the table and the vectors and matrices given below Fill

More information

(v, w) = arccos( < v, w >

(v, w) = arccos( < v, w > MA322 F all206 Notes on Inner Products Notes on Chapter 6 Inner product. Given a real vector space V, an inner product is defined to be a bilinear map F : V V R such that the following holds: Commutativity:

More information

SECTION 3.3. PROBLEM 22. The null space of a matrix A is: N(A) = {X : AX = 0}. Here are the calculations of AX for X = a,b,c,d, and e. =

SECTION 3.3. PROBLEM 22. The null space of a matrix A is: N(A) = {X : AX = 0}. Here are the calculations of AX for X = a,b,c,d, and e. = SECTION 3.3. PROBLEM. The null space of a matrix A is: N(A) {X : AX }. Here are the calculations of AX for X a,b,c,d, and e. Aa [ ][ ] 3 3 [ ][ ] Ac 3 3 [ ] 3 3 [ ] 4+4 6+6 Ae [ ], Ab [ ][ ] 3 3 3 [ ]

More information

Overview. Motivation for the inner product. Question. Definition

Overview. Motivation for the inner product. Question. Definition Overview Last time we studied the evolution of a discrete linear dynamical system, and today we begin the final topic of the course (loosely speaking) Today we ll recall the definition and properties of

More information

Lecture 20: 6.1 Inner Products

Lecture 20: 6.1 Inner Products Lecture 0: 6.1 Inner Products Wei-Ta Chu 011/1/5 Definition An inner product on a real vector space V is a function that associates a real number u, v with each pair of vectors u and v in V in such a way

More information

Solutions to Final Practice Problems Written by Victoria Kala Last updated 12/5/2015

Solutions to Final Practice Problems Written by Victoria Kala Last updated 12/5/2015 Solutions to Final Practice Problems Written by Victoria Kala vtkala@math.ucsb.edu Last updated /5/05 Answers This page contains answers only. See the following pages for detailed solutions. (. (a x. See

More information

REVIEW FOR EXAM III SIMILARITY AND DIAGONALIZATION

REVIEW FOR EXAM III SIMILARITY AND DIAGONALIZATION REVIEW FOR EXAM III The exam covers sections 4.4, the portions of 4. on systems of differential equations and on Markov chains, and..4. SIMILARITY AND DIAGONALIZATION. Two matrices A and B are similar

More information

Linear algebra II Homework #1 solutions A = This means that every eigenvector with eigenvalue λ = 1 must have the form

Linear algebra II Homework #1 solutions A = This means that every eigenvector with eigenvalue λ = 1 must have the form Linear algebra II Homework # solutions. Find the eigenvalues and the eigenvectors of the matrix 4 6 A =. 5 Since tra = 9 and deta = = 8, the characteristic polynomial is f(λ) = λ (tra)λ+deta = λ 9λ+8 =

More information

Exam in TMA4110 Calculus 3, June 2013 Solution

Exam in TMA4110 Calculus 3, June 2013 Solution Norwegian University of Science and Technology Department of Mathematical Sciences Page of 8 Exam in TMA4 Calculus 3, June 3 Solution Problem Let T : R 3 R 3 be a linear transformation such that T = 4,

More information

Linear Equations and Vectors

Linear Equations and Vectors Chapter Linear Equations and Vectors Linear Algebra, Fall 6 Matrices and Systems of Linear Equations Figure. Linear Algebra, Fall 6 Figure. Linear Algebra, Fall 6 Figure. Linear Algebra, Fall 6 Unique

More information

Designing Information Devices and Systems II

Designing Information Devices and Systems II EECS 16B Fall 2016 Designing Information Devices and Systems II Linear Algebra Notes Introduction In this set of notes, we will derive the linear least squares equation, study the properties symmetric

More information

Final Review Written by Victoria Kala SH 6432u Office Hours R 12:30 1:30pm Last Updated 11/30/2015

Final Review Written by Victoria Kala SH 6432u Office Hours R 12:30 1:30pm Last Updated 11/30/2015 Final Review Written by Victoria Kala vtkala@mathucsbedu SH 6432u Office Hours R 12:30 1:30pm Last Updated 11/30/2015 Summary This review contains notes on sections 44 47, 51 53, 61, 62, 65 For your final,

More information

Solution to Homework 8, Math 2568

Solution to Homework 8, Math 2568 Solution to Homework 8, Math 568 S 5.4: No. 0. Use property of heorem 5 to test for linear independence in P 3 for the following set of cubic polynomials S = { x 3 x, x x, x, x 3 }. Solution: If we use

More information

1. What is the determinant of the following matrix? a 1 a 2 4a 3 2a 2 b 1 b 2 4b 3 2b c 1. = 4, then det

1. What is the determinant of the following matrix? a 1 a 2 4a 3 2a 2 b 1 b 2 4b 3 2b c 1. = 4, then det What is the determinant of the following matrix? 3 4 3 4 3 4 4 3 A 0 B 8 C 55 D 0 E 60 If det a a a 3 b b b 3 c c c 3 = 4, then det a a 4a 3 a b b 4b 3 b c c c 3 c = A 8 B 6 C 4 D E 3 Let A be an n n matrix

More information

Multiple Choice Questions

Multiple Choice Questions Multiple Choice Questions There is no penalty for guessing. Three points per question, so a total of 48 points for this section.. What is the complete relationship between homogeneous linear systems of

More information

Vector space and subspace

Vector space and subspace Vector space and subspace Math 112, week 8 Goals: Vector space, subspace, span. Null space, column space. Linearly independent, bases. Suggested Textbook Readings: Sections 4.1, 4.2, 4.3 Week 8: Vector

More information

A Primer in Econometric Theory

A Primer in Econometric Theory A Primer in Econometric Theory Lecture 1: Vector Spaces John Stachurski Lectures by Akshay Shanker May 5, 2017 1/104 Overview Linear algebra is an important foundation for mathematics and, in particular,

More information

2018 Fall 2210Q Section 013 Midterm Exam II Solution

2018 Fall 2210Q Section 013 Midterm Exam II Solution 08 Fall 0Q Section 0 Midterm Exam II Solution True or False questions points 0 0 points) ) Let A be an n n matrix. If the equation Ax b has at least one solution for each b R n, then the solution is unique

More information

Solving a system by back-substitution, checking consistency of a system (no rows of the form

Solving a system by back-substitution, checking consistency of a system (no rows of the form MATH 520 LEARNING OBJECTIVES SPRING 2017 BROWN UNIVERSITY SAMUEL S. WATSON Week 1 (23 Jan through 27 Jan) Definition of a system of linear equations, definition of a solution of a linear system, elementary

More information

MATH 31 - ADDITIONAL PRACTICE PROBLEMS FOR FINAL

MATH 31 - ADDITIONAL PRACTICE PROBLEMS FOR FINAL MATH 3 - ADDITIONAL PRACTICE PROBLEMS FOR FINAL MAIN TOPICS FOR THE FINAL EXAM:. Vectors. Dot product. Cross product. Geometric applications. 2. Row reduction. Null space, column space, row space, left

More information

Solutions to Review Problems for Chapter 6 ( ), 7.1

Solutions to Review Problems for Chapter 6 ( ), 7.1 Solutions to Review Problems for Chapter (-, 7 The Final Exam is on Thursday, June,, : AM : AM at NESBITT Final Exam Breakdown Sections % -,7-9,- - % -9,-,7,-,-7 - % -, 7 - % Let u u and v Let x x x x,

More information

Worksheet for Lecture 15 (due October 23) Section 4.3 Linearly Independent Sets; Bases

Worksheet for Lecture 15 (due October 23) Section 4.3 Linearly Independent Sets; Bases Worksheet for Lecture 5 (due October 23) Name: Section 4.3 Linearly Independent Sets; Bases Definition An indexed set {v,..., v n } in a vector space V is linearly dependent if there is a linear relation

More information

MTH 362: Advanced Engineering Mathematics

MTH 362: Advanced Engineering Mathematics MTH 362: Advanced Engineering Mathematics Lecture 5 Jonathan A. Chávez Casillas 1 1 University of Rhode Island Department of Mathematics September 26, 2017 1 Linear Independence and Dependence of Vectors

More information

Math 4A Notes. Written by Victoria Kala Last updated June 11, 2017

Math 4A Notes. Written by Victoria Kala Last updated June 11, 2017 Math 4A Notes Written by Victoria Kala vtkala@math.ucsb.edu Last updated June 11, 2017 Systems of Linear Equations A linear equation is an equation that can be written in the form a 1 x 1 + a 2 x 2 +...

More information

Practice Exam. 2x 1 + 4x 2 + 2x 3 = 4 x 1 + 2x 2 + 3x 3 = 1 2x 1 + 3x 2 + 4x 3 = 5

Practice Exam. 2x 1 + 4x 2 + 2x 3 = 4 x 1 + 2x 2 + 3x 3 = 1 2x 1 + 3x 2 + 4x 3 = 5 Practice Exam. Solve the linear system using an augmented matrix. State whether the solution is unique, there are no solutions or whether there are infinitely many solutions. If the solution is unique,

More information

(v, w) = arccos( < v, w >

(v, w) = arccos( < v, w > MA322 Sathaye Notes on Inner Products Notes on Chapter 6 Inner product. Given a real vector space V, an inner product is defined to be a bilinear map F : V V R such that the following holds: For all v

More information

Lecture 3: Linear Algebra Review, Part II

Lecture 3: Linear Algebra Review, Part II Lecture 3: Linear Algebra Review, Part II Brian Borchers January 4, Linear Independence Definition The vectors v, v,..., v n are linearly independent if the system of equations c v + c v +...+ c n v n

More information

6.4 BASIS AND DIMENSION (Review) DEF 1 Vectors v 1, v 2,, v k in a vector space V are said to form a basis for V if. (a) v 1,, v k span V and

6.4 BASIS AND DIMENSION (Review) DEF 1 Vectors v 1, v 2,, v k in a vector space V are said to form a basis for V if. (a) v 1,, v k span V and 6.4 BASIS AND DIMENSION (Review) DEF 1 Vectors v 1, v 2,, v k in a vector space V are said to form a basis for V if (a) v 1,, v k span V and (b) v 1,, v k are linearly independent. HMHsueh 1 Natural Basis

More information

homogeneous 71 hyperplane 10 hyperplane 34 hyperplane 69 identity map 171 identity map 186 identity map 206 identity matrix 110 identity matrix 45

homogeneous 71 hyperplane 10 hyperplane 34 hyperplane 69 identity map 171 identity map 186 identity map 206 identity matrix 110 identity matrix 45 address 12 adjoint matrix 118 alternating 112 alternating 203 angle 159 angle 33 angle 60 area 120 associative 180 augmented matrix 11 axes 5 Axiom of Choice 153 basis 178 basis 210 basis 74 basis test

More information

Reduction to the associated homogeneous system via a particular solution

Reduction to the associated homogeneous system via a particular solution June PURDUE UNIVERSITY Study Guide for the Credit Exam in (MA 5) Linear Algebra This study guide describes briefly the course materials to be covered in MA 5. In order to be qualified for the credit, one

More information

Linear Algebra Highlights

Linear Algebra Highlights Linear Algebra Highlights Chapter 1 A linear equation in n variables is of the form a 1 x 1 + a 2 x 2 + + a n x n. We can have m equations in n variables, a system of linear equations, which we want to

More information

YORK UNIVERSITY. Faculty of Science Department of Mathematics and Statistics MATH M Test #1. July 11, 2013 Solutions

YORK UNIVERSITY. Faculty of Science Department of Mathematics and Statistics MATH M Test #1. July 11, 2013 Solutions YORK UNIVERSITY Faculty of Science Department of Mathematics and Statistics MATH 222 3. M Test # July, 23 Solutions. For each statement indicate whether it is always TRUE or sometimes FALSE. Note: For

More information

(v, w) = arccos( < v, w >

(v, w) = arccos( < v, w > MA322 F all203 Notes on Inner Products Notes on Chapter 6 Inner product. Given a real vector space V, an inner product is defined to be a bilinear map F : V V R such that the following holds: For all v,

More information

Glossary of Linear Algebra Terms. Prepared by Vince Zaccone For Campus Learning Assistance Services at UCSB

Glossary of Linear Algebra Terms. Prepared by Vince Zaccone For Campus Learning Assistance Services at UCSB Glossary of Linear Algebra Terms Basis (for a subspace) A linearly independent set of vectors that spans the space Basic Variable A variable in a linear system that corresponds to a pivot column in the

More information

MAT 242 CHAPTER 4: SUBSPACES OF R n

MAT 242 CHAPTER 4: SUBSPACES OF R n MAT 242 CHAPTER 4: SUBSPACES OF R n JOHN QUIGG 1. Subspaces Recall that R n is the set of n 1 matrices, also called vectors, and satisfies the following properties: x + y = y + x x + (y + z) = (x + y)

More information

Review Notes for Linear Algebra True or False Last Updated: February 22, 2010

Review Notes for Linear Algebra True or False Last Updated: February 22, 2010 Review Notes for Linear Algebra True or False Last Updated: February 22, 2010 Chapter 4 [ Vector Spaces 4.1 If {v 1,v 2,,v n } and {w 1,w 2,,w n } are linearly independent, then {v 1 +w 1,v 2 +w 2,,v n

More information

Announcements Wednesday, November 15

Announcements Wednesday, November 15 Announcements Wednesday, November 15 The third midterm is on this Friday, November 17. The exam covers 3.1, 3.2, 5.1, 5.2, 5.3, and 5.5. About half the problems will be conceptual, and the other half computational.

More information

IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET

IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET This is a (not quite comprehensive) list of definitions and theorems given in Math 1553. Pay particular attention to the ones in red. Study Tip For each

More information

SOLUTIONS TO EXERCISES FOR MATHEMATICS 133 Part 1. I. Topics from linear algebra

SOLUTIONS TO EXERCISES FOR MATHEMATICS 133 Part 1. I. Topics from linear algebra SOLUTIONS TO EXERCISES FOR MATHEMATICS 133 Part 1 Winter 2009 I. Topics from linear algebra I.0 : Background 1. Suppose that {x, y} is linearly dependent. Then there are scalars a, b which are not both

More information

Linear Algebra Massoud Malek

Linear Algebra Massoud Malek CSUEB Linear Algebra Massoud Malek Inner Product and Normed Space In all that follows, the n n identity matrix is denoted by I n, the n n zero matrix by Z n, and the zero vector by θ n An inner product

More information

Math 544, Exam 2 Information.

Math 544, Exam 2 Information. Math 544, Exam 2 Information. 10/12/10, LC 115, 2:00-3:15. Exam 2 will be based on: Sections 1.7, 1.9, 3.2, 3.3, 3.4; The corresponding assigned homework problems (see http://www.math.sc.edu/ boylan/sccourses/544fa10/544.html)

More information