Chapter 4 Euclidean Space
1 Inner Product Spaces

Definition 1.1. Let V be a vector space over ℝ. A real inner product on V is a real-valued function on V × V, denoted by ( , ), which satisfies

(1) (x, y) = (y, x),
(2) (kx, y) = k(x, y),
(3) (x + y, z) = (x, z) + (y, z),
(4) (x, x) ≥ 0 for all x ∈ V, and (x, x) = 0 if and only if x = 0,

where x, y, z are any vectors in V and k ∈ ℝ is a real number. The vector space V, together with the inner product, is called a real inner product space, or a Euclidean space.

Example. In ℝⁿ, for x = (ξ_1, ξ_2, ..., ξ_n) and y = (η_1, η_2, ..., η_n), define

    (x, y) = Σ_{i=1}^n ξ_i η_i.

It is easy to verify that this satisfies (1)–(4). So ℝⁿ, together with ( , ), is an inner product space.

Example. Let V = C[a, b] = {all continuous functions on [a, b]}. Define

    (f, g) = ∫_a^b f(x)g(x) dx    for f, g ∈ C[a, b].

It is an inner product space.

By (1), the inner product is symmetric, and combining (1) with (2) and (3) we also have

(2′) (α, kβ) = (kβ, α) = k(β, α) = k(α, β),
(3′) (α, β + γ) = (β + γ, α) = (β, α) + (γ, α) = (α, β) + (α, γ).
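As a quick sanity check (a plain-Python sketch, not part of the notes; the sample vectors are illustrative), the standard inner product on ℝⁿ can be spot-checked against the four axioms:

```python
def inner(x, y):
    """Standard inner product (x, y) = sum of xi * yi on R^n."""
    return sum(xi * yi for xi, yi in zip(x, y))

x, y, z = [1.0, 2.0, -1.0], [3.0, 0.0, 4.0], [-2.0, 1.0, 5.0]
k = 2.5

assert inner(x, y) == inner(y, x)                                          # (1) symmetry
assert inner([k * c for c in x], y) == k * inner(x, y)                     # (2) homogeneity
assert inner([a + b for a, b in zip(x, y)], z) == inner(x, z) + inner(y, z)  # (3) additivity
assert inner(x, x) > 0                                                     # (4) positivity, x != 0
```

Of course a handful of numeric checks is not a proof; the axioms follow directly from the algebra of the sum.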
Definition 1.2. For any x ∈ V, define

    ||x|| = √(x, x).

It is called the norm of x.

Theorem 1.1 (Cauchy–Schwarz inequality). For any x, y ∈ V,

    |(x, y)| ≤ ||x|| ||y||.

Equality holds if and only if {x, y} is linearly dependent.

Proof. When y = 0, the left side equals the right side. Suppose that y ≠ 0, and let t be a real number. Then

    (x + ty, x + ty) ≥ 0

for all t, i.e.

    (x, x) + 2(x, y)t + (y, y)t² ≥ 0.

Let t = −(x, y)/(y, y). Then

    (x, x) − (x, y)²/(y, y) ≥ 0,

so (x, y)² ≤ (x, x)(y, y), i.e. |(x, y)| ≤ ||x|| ||y||.

If {x, y} is linearly dependent, say x = ky, the equality holds. On the other hand, if the equality holds, then from the above proof we deduce that either y = 0 or

    x − ((x, y)/(y, y)) y = 0,

i.e. {x, y} is linearly dependent.

Example. Applying the Cauchy–Schwarz inequality in ℝⁿ, we get

    |Σ_{i=1}^n a_i b_i| ≤ (Σ_{i=1}^n a_i²)^{1/2} (Σ_{i=1}^n b_i²)^{1/2}.

Example. Applying the Cauchy–Schwarz inequality in C[a, b], we get

    |∫_a^b f(x)g(x) dx| ≤ (∫_a^b f²(x) dx)^{1/2} (∫_a^b g²(x) dx)^{1/2}.
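The inequality is easy to probe numerically (a sketch, with arbitrarily chosen sample vectors; the dependent pair illustrates the equality case):

```python
import math

def inner(x, y):
    """Standard dot product on R^n."""
    return sum(a * b for a, b in zip(x, y))

def norm(x):
    """Norm induced by the inner product: ||x|| = sqrt((x, x))."""
    return math.sqrt(inner(x, x))

x, y = [1.0, 2.0, 3.0], [4.0, -1.0, 2.0]
# {x, y} is linearly independent, so the inequality is strict here.
assert abs(inner(x, y)) < norm(x) * norm(y)

# For a linearly dependent pair x, kx the two sides coincide.
kx = [5.0 * c for c in x]
assert math.isclose(abs(inner(x, kx)), norm(x) * norm(kx))
```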
Definition 1.3. x and y are orthogonal (often denoted by x ⊥ y) if (x, y) = 0.

Definition 1.4. Let x, y ∈ V be nonzero vectors. The angle between x and y is defined as

    ϑ = arccos( (x, y) / (||x|| ||y||) ).

For orthogonal vectors x and y, we have

    ||x + y||² = ||x||² + ||y||².

2 Orthonormal Basis

Definition 2.1. We say that a basis {x_1, x_2, ..., x_n} is an orthogonal basis for V if the x_i are mutually orthogonal; that is, (x_i, x_j) = 0 whenever i ≠ j. If in addition each ||x_i|| = 1 (that is, each x_i is a unit vector), we say that the basis is orthonormal.

It is easy to verify that a set of nonzero mutually orthogonal vectors is linearly independent. If {x_1, x_2, ..., x_n} is an orthonormal basis, then (x_i, x_j) = δ_ij, where δ_ij is the Kronecker delta:

    δ_ij = 1 if i = j,  δ_ij = 0 if i ≠ j.

Under an orthonormal basis {x_1, x_2, ..., x_n}, any vector x ∈ V can be expressed as

    x = (x_1, x)x_1 + ... + (x_n, x)x_n.

In fact, if x = Σ_{i=1}^n a_i x_i, then (x_i, x) = a_i.

Theorem 2.1 (Gram–Schmidt Theorem). Suppose that x_1, x_2, ..., x_m are mutually orthogonal vectors in an inner product space V and the x_i are nonzero. Let y ∈ V and set

    x_{m+1} = y − Σ_{j=1}^m ((y, x_j)/(x_j, x_j)) x_j = y − Σ_{j=1}^m proj_{x_j}(y).

Then the vectors x_1, x_2, ..., x_m, x_{m+1} are mutually orthogonal and

    span{x_1, ..., x_m, y} = span{x_1, x_2, ..., x_m, x_{m+1}}.

Further, x_{m+1} = 0 if and only if y ∈ span{x_1, ..., x_m}.
Remark 2.1. proj_u(v) = ((u, v)/(u, u)) u is called the projection of the vector v onto u.

Proof. By the definition of x_{m+1}, one sees immediately that y ∈ span{x_1, ..., x_m, x_{m+1}} and x_{m+1} ∈ span{x_1, ..., x_m, y}. From this, it follows that span{x_1, ..., x_{m+1}} = span{x_1, ..., x_m, y}. We only need to prove x_{m+1} ⊥ x_i for i = 1, 2, ..., m. Note that

    (x_{m+1}, x_i) = (y − Σ_{j=1}^m ((y, x_j)/(x_j, x_j)) x_j, x_i)
                   = (y, x_i) − Σ_{j=1}^m ((y, x_j)/(x_j, x_j)) (x_j, x_i)
                   = (y, x_i) − (y, x_i) = 0.

For the final statement, if x_{m+1} = 0, then clearly y ∈ span{x_1, ..., x_m}. Conversely, suppose y ∈ span{x_1, ..., x_m}. Then the set {u_1, u_2, ..., u_m} is an orthonormal basis for span{x_1, ..., x_m}, where u_i = x_i/||x_i||. Therefore

    y = Σ_{i=1}^m (y, u_i) u_i = Σ_{i=1}^m ((y, x_i)/(x_i, x_i)) x_i,

i.e., x_{m+1} = 0. The Theorem is proved.

Theorem 2.2. Any finite dimensional inner product space V has an orthogonal basis (and hence an orthonormal basis).

Proof. Let {x_1, x_2, ..., x_n} be a basis of V. We construct an orthogonal basis {w_1, ..., w_n} for V from {x_1, ..., x_n}. First, set w_1 = x_1 (≠ 0). Obviously span{w_1} = span{x_1}. For k with 2 ≤ k ≤ n, we define w_k inductively. We set

    w_k = x_k − Σ_{j=1}^{k−1} ((x_k, w_j)/(w_j, w_j)) w_j

(the Gram–Schmidt orthogonalization). By Theorem 2.1,

    span{w_1, w_2, ..., w_k} = span{x_1, x_2, ..., x_k}

and w_1, ..., w_k are mutually orthogonal. Since {x_1, ..., x_{k−1}, x_k} is linearly independent, we deduce that w_k ≠ 0. After n steps, we get an orthogonal basis {w_1, w_2, ..., w_n} of V. We may set u_i = w_i/||w_i|| for i = 1, 2, ..., n to get an orthonormal basis {u_1, u_2, ..., u_n}.
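The inductive construction above translates almost line by line into code. The following sketch (plain Python; the three input vectors are an illustrative choice) returns the orthogonal vectors w_k and checks that they are mutually orthogonal:

```python
def inner(x, y):
    """Standard dot product on R^n."""
    return sum(a * b for a, b in zip(x, y))

def gram_schmidt(vectors):
    """Return mutually orthogonal w_k with span{w_1..w_k} = span{x_1..x_k}."""
    ws = []
    for x in vectors:
        w = list(x)
        for u in ws:
            c = inner(x, u) / inner(u, u)          # coefficient of proj_u(x)
            w = [wi - c * ui for wi, ui in zip(w, u)]
        ws.append(w)
    return ws

ws = gram_schmidt([[1.0, 1.0, 0.0, 1.0],
                   [0.0, 2.0, 1.0, 4.0],
                   [3.0, 3.0, 3.0, 0.0]])

# Every pair of outputs is orthogonal (up to floating-point error).
for i in range(len(ws)):
    for j in range(i):
        assert abs(inner(ws[i], ws[j])) < 1e-12
```

Normalizing each w_k by its norm then yields the orthonormal basis u_k of the proof.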
Example. Consider the subspace V of ℝ⁴ with basis consisting of the three vectors

    v_1 = (1, 1, 0, 1),  v_2 = (0, 2, 1, 4),  v_3 = (3, 3, 3, 0).

The vector space V has the usual inner product. We apply the process of Theorem 2.2 as follows. Since ||v_1|| = √3, we set

    u_1 = v_1/||v_1|| = (1/√3, 1/√3, 0, 1/√3).

Next,

    w_2 = v_2 − proj_{u_1}(v_2) = v_2 − (v_2, u_1)u_1
        = (0, 2, 1, 4) − (6/√3)(1/√3, 1/√3, 0, 1/√3) = (−2, 0, 1, 2).

Since ||w_2|| = 3, we obtain

    u_2 = w_2/||w_2|| = (−2/3, 0, 1/3, 2/3).

Finally, we set

    w_3 = v_3 − proj_{u_1}(v_3) − proj_{u_2}(v_3)
        = (3, 3, 3, 0) − 2√3 (1/√3, 1/√3, 0, 1/√3) − (−1)(−2/3, 0, 1/3, 2/3)
        = (1/3, 1, 10/3, −4/3).

Since ||w_3|| = √14, we set

    u_3 = w_3/||w_3|| = (1/√14)(1/3, 1, 10/3, −4/3).

Example. Consider the space P_2(ℝ) of real polynomials of degree at most 2. P_2(ℝ) has an inner product given by

    (p, q) = ∫_0^1 p(t)q(t) dt    for p, q ∈ P_2(ℝ).

The standard basis {1, t, t²} is not orthogonal for this inner product. We apply the Gram–Schmidt procedure to obtain an orthogonal basis. We take w_1 = 1, and we see easily that (1, 1) = 1 and (t, 1) = 1/2. Hence

    w_2 = t − 1/2.

To compute w_3, we calculate

    (w_2, w_2) = ∫_0^1 (t − 1/2)² dt = 1/12,  (t², w_2) = ∫_0^1 t²(t − 1/2) dt = 1/12,
and (t², 1) = 1/3. Consequently,

    w_3 = t² − 1/3 − ((1/12)/(1/12))(t − 1/2) = t² − t + 1/6.

This gives an orthogonal basis {w_1, w_2, w_3} for P_2(ℝ).

Now let us study the transition matrix from one orthonormal basis to another. Let X = (x_1, x_2, ..., x_n) and Y = (y_1, y_2, ..., y_n) be two orthonormal bases of an inner product space V, and let the transition matrix from X to Y be A = (a_ij), i.e.,

    (y_1, ..., y_n) = (x_1, ..., x_n) A,  A = [ a_11 a_12 ... a_1n ]
                                              [ a_21 a_22 ... a_2n ]
                                              [ ...               ]
                                              [ a_n1 a_n2 ... a_nn ].

Since (y_1, ..., y_n) is orthonormal, (y_i, y_j) = δ_ij. Note that the columns of the matrix A are the coordinates of y_1, ..., y_n under the orthonormal basis X = (x_1, ..., x_n). Therefore,

    (y_i, y_j) = a_1i a_1j + a_2i a_2j + ... + a_ni a_nj = δ_ij.

That is, AᵗA = E, or A⁻¹ = Aᵗ.

Definition 2.2. An n × n real matrix A is called orthogonal if AᵗA = E.

From the above analysis, we deduce that the transition matrix from one orthonormal basis to another is orthogonal. Conversely, if the first basis is orthonormal and the transition matrix from this basis to another is orthogonal, then the second basis is also orthonormal. From AᵗA = E, we deduce that AAᵗ = AᵗA = E.
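A concrete instance of an orthogonal matrix is a plane rotation; the angle below is an arbitrary illustrative choice, not one from the notes. The sketch verifies AᵗA = E numerically:

```python
import math

t = 0.7                                    # any angle
A = [[math.cos(t), -math.sin(t)],
     [math.sin(t),  math.cos(t)]]          # rotation of the plane by angle t

def matmul(X, Y):
    """Naive matrix product for small dense matrices."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

At = [list(row) for row in zip(*A)]        # transpose
AtA = matmul(At, A)

E = [[1.0, 0.0], [0.0, 1.0]]
assert all(abs(AtA[i][j] - E[i][j]) < 1e-12 for i in range(2) for j in range(2))
```

Here AᵗA = E reduces to cos²t + sin²t = 1, which is why the check passes for every angle.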
3 Isometries

Definition 3.1. Two Euclidean spaces V_1 and V_2 over ℝ are said to be isometric if there exists a bijection σ from V_1 to V_2 satisfying

(i) σ(x + y) = σ(x) + σ(y) for all x, y ∈ V_1;
(ii) σ(kx) = kσ(x) for all x ∈ V_1 and k ∈ ℝ;
(iii) (σ(x), σ(y)) = (x, y) for all x, y ∈ V_1.

It is easy to verify that being isometric is an equivalence relation. As a consequence, we have:

Theorem 3.1. Two finite dimensional Euclidean spaces are isometric if and only if they have the same dimension.

4 Orthogonal Transformations

Definition 4.1. A linear transformation A on a Euclidean space V is called an orthogonal transformation if it keeps the inner product invariant, i.e., for any x, y ∈ V,

    (Ax, Ay) = (x, y).

Orthogonal transformations can be characterized as follows.

Theorem 4.1. Let A be a linear transformation on a Euclidean space V. The following four statements are equivalent:

(i) A is an orthogonal transformation;
(ii) A keeps the length of vectors invariant, i.e., ||Ax|| = ||x|| for all x ∈ V;
(iii) if {x_1, ..., x_n} is an orthonormal basis of V, so is {Ax_1, ..., Ax_n};
(iv) the matrix A of A under any orthonormal basis is orthogonal.

Proof. We prove that (i) is equivalent to (ii). If A is an orthogonal transformation, then

    ||Ax||² = (Ax, Ax) = (x, x) = ||x||²,

therefore ||Ax|| = ||x||. Conversely, if A keeps the length of vectors invariant, then (Ax, Ax) = (x, x), (Ay, Ay) = (y, y), and

    (A(x + y), A(x + y)) = (x + y, x + y).
Expanding the last equation, we get

    (Ax, Ax) + 2(Ax, Ay) + (Ay, Ay) = (x, x) + 2(x, y) + (y, y).

Therefore (Ax, Ay) = (x, y); that is, A is an orthogonal transformation.

We prove that (i) and (iii) are equivalent. Let {x_1, x_2, ..., x_n} be an orthonormal basis and A an orthogonal transformation. Then

    (Ax_i, Ax_j) = (x_i, x_j) = δ_ij.

That is, {Ax_1, Ax_2, ..., Ax_n} is an orthonormal basis. Conversely, let {Ax_1, ..., Ax_n} be an orthonormal basis. Assume that

    x = ξ_1 x_1 + ξ_2 x_2 + ... + ξ_n x_n,
    y = η_1 x_1 + η_2 x_2 + ... + η_n x_n.

Then Ax = Σ_{i=1}^n ξ_i Ax_i and Ay = Σ_{i=1}^n η_i Ax_i, and

    (Ax, Ay) = (Σ_i ξ_i Ax_i, Σ_j η_j Ax_j) = Σ_{i,j=1}^n ξ_i η_j (Ax_i, Ax_j) = Σ_{i=1}^n ξ_i η_i = (x, y).

That is, A is an orthogonal transformation.

We prove that (iii) and (iv) are equivalent. Let X = {x_1, ..., x_n} be an orthonormal basis and A the matrix representation of the linear transformation A under the basis {x_1, ..., x_n}, i.e., A = Φ_X(A), or

    (Ax_1, ..., Ax_n) = (x_1, ..., x_n) A.

If {Ax_1, ..., Ax_n} is orthonormal, then A is the transition matrix from the basis {x_1, ..., x_n} to the basis {Ax_1, ..., Ax_n}, so it is orthogonal. Conversely, if A is an orthogonal matrix, then {Ax_1, ..., Ax_n} is an orthonormal basis.

Since an orthogonal matrix is invertible, so is an orthogonal transformation. In fact, an orthogonal transformation is an isometry on V. Therefore the product of orthogonal transformations is orthogonal, and the product of orthogonal matrices is orthogonal. If A is an orthogonal matrix, then AAᵗ = E, so |A|² = 1 and |A| = ±1. If |A| = 1, we say A is of the first type, or a rotation. If |A| = −1, we say A is of the second type.
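The two types can be seen on 2 × 2 orthogonal matrices. A sketch (the angle is an arbitrary illustrative choice): a rotation has determinant +1, while a reflection has determinant −1.

```python
import math

t = 1.2  # any angle
rotation   = [[math.cos(t), -math.sin(t)],
              [math.sin(t),  math.cos(t)]]   # first type
reflection = [[math.cos(t),  math.sin(t)],
              [math.sin(t), -math.cos(t)]]   # second type

def det2(M):
    """Determinant of a 2x2 matrix."""
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

assert math.isclose(det2(rotation), 1.0)     # |A| = +1: rotation
assert math.isclose(det2(reflection), -1.0)  # |A| = -1: second type
```

Both determinants reduce to ±(cos²t + sin²t), so the result is independent of t.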
5 Orthogonal Complement

Definition 5.1. Suppose that V_1, V_2 are two subspaces of a Euclidean space V. If (x, y) = 0 for any x ∈ V_1 and y ∈ V_2, we say V_1 is orthogonal to V_2, denoted by V_1 ⊥ V_2. If x ∈ V and (x, y) = 0 for all y ∈ V_1, we say x is orthogonal to V_1, denoted x ⊥ V_1.

Homework. If V_1 ⊥ V_2 then V_1 ∩ V_2 = {0}.

Theorem 5.1. If V_1, V_2, ..., V_s are subspaces of V and are pairwise orthogonal, then the sum V_1 + V_2 + ... + V_s is a direct sum.

Proof. Suppose that α_i ∈ V_i for i = 1, 2, ..., s and α_1 + α_2 + ... + α_s = 0. We prove that α_1 = α_2 = ... = α_s = 0. Taking the inner product of both sides of the above equation with α_i and using the orthogonality, we get (α_i, α_i) = 0, which implies α_i = 0 for all i. That is, V_1 + V_2 + ... + V_s is a direct sum.

Definition 5.2. A subspace V_2 is called the orthogonal complement of V_1 if V_1 ⊥ V_2 and V = V_1 + V_2.

Obviously, if V_2 is the orthogonal complement of V_1, then V_1 is the orthogonal complement of V_2.

Theorem 5.2. Any subspace V_1 of a Euclidean space V has a unique orthogonal complement.

Proof. If V_1 = {0}, then its orthogonal complement is V and uniqueness is obvious. Assume V_1 ≠ {0}. Since V_1 is a Euclidean space with the inner product ( , ) inherited from V, it has an orthogonal basis, denoted by {x_1, x_2, ..., x_m}. Augment it to an orthogonal basis of V, say {x_1, ..., x_m, x_{m+1}, ..., x_n}. Obviously, the orthogonal complement of V_1 is span{x_{m+1}, ..., x_n}. Let us prove the uniqueness. Let V_2 and V_3 both be orthogonal complements of V_1. Then

    V = V_1 ⊕ V_2  and  V = V_1 ⊕ V_3.
If α_2 ∈ V_2, then from the second equation above, α_2 = α_1 + α_3, where α_1 ∈ V_1 and α_3 ∈ V_3. Since α_1 ⊥ α_2 and α_3 ⊥ α_1,

    (α_1, α_1) = (α_1, α_1) + (α_3, α_1) = (α_1 + α_3, α_1) = (α_2, α_1) = 0.

We deduce that α_1 = 0. Therefore α_2 = α_3 ∈ V_3. That is, V_2 ⊆ V_3. Similarly, we can prove that V_3 ⊆ V_2, so V_2 = V_3.

The orthogonal complement of V_1 is denoted by V_1^⊥. By the definition,

    V_1^⊥ = {x ∈ V | x ⊥ V_1}.

Corollary 5.3. dim V_1 + dim V_1^⊥ = n.

From V = V_1 ⊕ V_1^⊥, for any x ∈ V there exist (unique) x_1 ∈ V_1 and x_2 ∈ V_1^⊥ such that

    x = x_1 + x_2.

Define Proj_{V_1}(x) = x_1. It is called the projection of x onto V_1.

6 Standard Form of Symmetric Matrices

Recall that an n × n matrix A is symmetric if Aᵗ = A. We will prove that for any n × n real symmetric matrix A, there is an orthogonal matrix T such that TᵗAT = T⁻¹AT is a diagonal matrix. Let us first study properties of symmetric matrices.

Lemma 6.1. Let A be a real symmetric matrix. Then all eigenvalues of A are real.

Proof. Let λ_0 be an eigenvalue of A and x = (x_1, x_2, ..., x_n)ᵗ ∈ ℂⁿ a corresponding eigenvector, i.e., Ax = λ_0 x. Let x̄ = (x̄_1, x̄_2, ..., x̄_n)ᵗ ∈ ℂⁿ, where x̄_i is the complex conjugate of x_i. Then Ax̄ = λ̄_0 x̄. Thus,

    (Ax)ᵗ x̄ = xᵗ Aᵗ x̄ = xᵗ A x̄ = xᵗ (Ax̄).

The left side of the above is λ_0 xᵗx̄ and the right side is λ̄_0 xᵗx̄. So

    λ_0 xᵗx̄ = λ̄_0 xᵗx̄.

Therefore λ_0 = λ̄_0, i.e., λ_0 is a real number, since xᵗx̄ ≠ 0.
Alternatively, note that

    (Ax, x̄) = Σ_{i,j=1}^n a_ij x_j x̄_i = Σ_{j=1}^n x_j (Σ_{i=1}^n a_ji x̄_i) = (x, Ax̄),

using a_ij = a_ji. Moreover, (Ax, x̄) = λ_0 (x, x̄) and (x, Ax̄) = λ̄_0 (x, x̄). Therefore λ_0 = λ̄_0, i.e., λ_0 is a real number.

For a real symmetric matrix A, define a linear transformation A on ℝⁿ as follows:

    Ax = Ax.

Lemma 6.2. Let A be a real symmetric matrix and A be defined as above. Then for any x, y ∈ ℝⁿ,

    (Ax, y) = (x, Ay),    (6.1)

or yᵗAx = xᵗAy.

Proof. In fact, yᵗAx = yᵗAᵗx = (Ay)ᵗx = xᵗAy.

Definition 6.1. A transformation A on ℝⁿ satisfying (6.1) is called a self-adjoint operator on ℝⁿ.

Lemma 6.3. Let A be a self-adjoint operator and V_1 an A-invariant subspace. Then V_1^⊥ is also an A-invariant subspace.

Proof. Let y ∈ V_1^⊥. We need to show that Ay ∈ V_1^⊥. For any x ∈ V_1, since Ax ∈ V_1 and y ∈ V_1^⊥, we have

    (y, Ax) = 0.

Therefore (Ay, x) = (y, Ax) = 0, i.e., Ay ⊥ V_1, or Ay ∈ V_1^⊥.

Lemma 6.4. Let A be a real symmetric matrix. Then eigenvectors in ℝⁿ associated with distinct eigenvalues are orthogonal.
Proof. Let λ and μ be two distinct eigenvalues and x, y eigenvectors associated with λ and μ, respectively, i.e., Ax = λx and Ay = μy. Since (Ax, y) = (x, Ay), we have

    λ(x, y) = (Ax, y) = (x, Ay) = μ(x, y).

This implies (x, y) = 0 since λ ≠ μ.

Now let us prove the main Theorem.

Theorem 6.5. Let A be a real symmetric n × n matrix. There exists an n × n orthogonal matrix T such that TᵗAT = T⁻¹AT is diagonal. In other words, any real symmetric matrix is orthogonally diagonalizable.

Proof. We only need to show that there is an orthonormal basis of ℝⁿ that consists of eigenvectors of A. We prove it by induction on n. The theorem is true for n = 1. Suppose the theorem holds for n − 1. For the n-dimensional space ℝⁿ, the linear transformation A has a real eigenvalue λ_1. Let x_1 ∈ ℝⁿ be an associated eigenvector. We may normalize x_1 so that ||x_1|| = 1. Let V_1 = span{x_1}. Then V_1 is an A-invariant subspace; by Lemma 6.3, V_2 = V_1^⊥ is also an A-invariant subspace, and the dimension of V_2 is n − 1. Consider the restriction A|_{V_2}. It is obvious that A|_{V_2} satisfies (6.1); that is, A|_{V_2} is self-adjoint. By the induction assumption, A|_{V_2} has an orthonormal basis {x_2, x_3, ..., x_n} consisting of eigenvectors of A|_{V_2}. Then {x_1, x_2, ..., x_n} is an orthonormal basis of ℝⁿ consisting of eigenvectors of A. The Theorem is proved.

Let A be a real symmetric matrix. How do we diagonalize A? From the above Theorem, finding the orthogonal matrix T is equivalent to finding an orthonormal basis of ℝⁿ that consists of eigenvectors of A. In fact, if

    η_1 = (t_11, t_21, ..., t_n1)ᵗ,  η_2 = (t_12, t_22, ..., t_n2)ᵗ,  ...,  η_n = (t_1n, t_2n, ..., t_nn)ᵗ

is an orthonormal basis of ℝⁿ consisting of eigenvectors of A, then the transition matrix from ε_1, ε_2, ..., ε_n to η_1, η_2, ..., η_n is

    T = [ t_11 t_12 ... t_1n ]
        [ t_21 t_22 ... t_2n ]
        [ ...               ]
        [ t_n1 t_n2 ... t_nn ]
and T⁻¹AT = TᵗAT is diagonal. Therefore, one can find the orthogonal matrix T as follows:

(1) Find all eigenvalues of A. Let λ_1, ..., λ_r be the distinct eigenvalues of A.

(2) For each λ_i, solve the homogeneous system (λ_i E − A)x = 0 to find a basis for the eigenspace V_{λ_i}. Then use the Gram–Schmidt process to find an orthonormal basis η_{i1}, η_{i2}, ..., η_{ik_i} for V_{λ_i}.

(3) Since λ_1, λ_2, ..., λ_r are distinct, the vectors {η_11, η_12, ..., η_{1k_1}, ..., η_{r1}, η_{r2}, ..., η_{rk_r}} are pairwise orthogonal and form an orthonormal basis of ℝⁿ, whose vectors are the columns of the orthogonal matrix T.

Example. Let

    A = [  0  1  1 −1 ]
        [  1  0 −1  1 ]
        [  1 −1  0  1 ]
        [ −1  1  1  0 ].

(1) Find the eigenvalues of A. We have

    |λE − A| = (λ − 1)³(λ + 3),

so the eigenvalues of A are λ = 1 (with multiplicity 3) and λ = −3.

For λ = 1, solve (E − A)X = 0 to find a basis for the eigenspace V_1:

    α_1 = (1, 1, 0, 0),  α_2 = (1, 0, 1, 0),  α_3 = (−1, 0, 0, 1).
Orthogonalizing, we get

    β_1 = α_1 = (1, 1, 0, 0),
    β_2 = α_2 − ((α_2, β_1)/(β_1, β_1)) β_1 = (1/2, −1/2, 1, 0),
    β_3 = α_3 − ((α_3, β_1)/(β_1, β_1)) β_1 − ((α_3, β_2)/(β_2, β_2)) β_2 = (−1/3, 1/3, 1/3, 1).

Normalizing, we get

    η_1 = (1/√2, 1/√2, 0, 0),
    η_2 = (1/√6, −1/√6, 2/√6, 0),
    η_3 = (−1/(2√3), 1/(2√3), 1/(2√3), 3/(2√3)).

For λ = −3, solve (−3E − A)X = 0 to find a basis for the eigenspace: V_{−3} = span{(1, −1, −1, 1)}. Normalizing it, we get

    η_4 = (1/2, −1/2, −1/2, 1/2).

Then {η_1, η_2, η_3, η_4} forms an orthonormal basis of ℝ⁴. Therefore the orthogonal matrix is

    T = [ 1/√2   1/√6  −1/(2√3)   1/2 ]
        [ 1/√2  −1/√6   1/(2√3)  −1/2 ]
        [   0    2/√6   1/(2√3)  −1/2 ]
        [   0      0    3/(2√3)   1/2 ]

and

    T⁻¹AT = diag(1, 1, 1, −3).
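As a numeric sanity check (a plain-Python sketch, not part of the original notes), one can multiply out TᵗAT for the matrix T found above and confirm that it is diag(1, 1, 1, −3):

```python
import math

A = [[ 0,  1,  1, -1],
     [ 1,  0, -1,  1],
     [ 1, -1,  0,  1],
     [-1,  1,  1,  0]]

s2, s3, s6 = math.sqrt(2), math.sqrt(3), math.sqrt(6)
cols = [
    [1/s2, 1/s2, 0, 0],                           # eta_1
    [1/s6, -1/s6, 2/s6, 0],                       # eta_2
    [-1/(2*s3), 1/(2*s3), 1/(2*s3), 3/(2*s3)],    # eta_3
    [1/2, -1/2, -1/2, 1/2],                       # eta_4
]
T = [[cols[j][i] for j in range(4)] for i in range(4)]  # eigenvectors as columns

def matmul(X, Y):
    """Naive 4x4 matrix product."""
    return [[sum(X[i][k] * Y[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

Tt = [list(row) for row in zip(*T)]
D = matmul(matmul(Tt, A), T)

expected_diag = [1.0, 1.0, 1.0, -3.0]
for i in range(4):
    for j in range(4):
        target = expected_diag[i] if i == j else 0.0
        assert abs(D[i][j] - target) < 1e-12
```

The off-diagonal entries vanish because the columns of T are orthonormal eigenvectors, exactly as in the hand computation.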
7 Orthogonal Projection and Direct Sums

Let A be an m × n matrix. We claim that ker(A) = row(A)^⊥ in ℝⁿ. Indeed, if Av = 0, then (R_i, v) = 0 for each row R_i of A. So v ∈ row(A)^⊥ if and only if v ∈ ker(A).

Corollary 7.1. Suppose A is an m × n matrix of rank n. Then AᵗA is an invertible n × n matrix.

Proof. We need to show that AᵗAX = 0 has only the zero solution. Indeed, AᵗAX = 0 implies that AX ∈ ker(Aᵗ). However, AX ∈ col(A) = row(Aᵗ), and ker(Aᵗ) = row(Aᵗ)^⊥. This implies that AX ∈ row(Aᵗ) ∩ row(Aᵗ)^⊥ = {0}. Therefore AX = 0. Since rank(A) = n, the map X ↦ AX is injective, so X = 0.

Note that AX = 0 has only the zero solution if and only if rank(A) = n.
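Corollary 7.1 can be sketched on a concrete matrix (the 3 × 2 matrix below is an illustrative choice with rank 2): AᵗA is then a 2 × 2 matrix with nonzero determinant.

```python
# A has rank 2 (its two columns are linearly independent).
A = [[1, 0],
     [1, 1],
     [0, 2]]

At = [list(row) for row in zip(*A)]   # transpose, 2x3
# AtA[i][j] = sum_k At[i][k] * A[k][j], a 2x2 matrix
AtA = [[sum(At[i][k] * A[k][j] for k in range(3)) for j in range(2)]
       for i in range(2)]

det = AtA[0][0] * AtA[1][1] - AtA[0][1] * AtA[1][0]
assert det != 0   # A^t A is invertible, as the corollary predicts
```

This fact is the reason the normal equations AᵗAX = Aᵗb of least-squares problems have a unique solution when A has full column rank.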
More information1. What is the determinant of the following matrix? a 1 a 2 4a 3 2a 2 b 1 b 2 4b 3 2b c 1. = 4, then det
What is the determinant of the following matrix? 3 4 3 4 3 4 4 3 A 0 B 8 C 55 D 0 E 60 If det a a a 3 b b b 3 c c c 3 = 4, then det a a 4a 3 a b b 4b 3 b c c c 3 c = A 8 B 6 C 4 D E 3 Let A be an n n matrix
More informationA Brief Outline of Math 355
A Brief Outline of Math 355 Lecture 1 The geometry of linear equations; elimination with matrices A system of m linear equations with n unknowns can be thought of geometrically as m hyperplanes intersecting
More information1. Select the unique answer (choice) for each problem. Write only the answer.
MATH 5 Practice Problem Set Spring 7. Select the unique answer (choice) for each problem. Write only the answer. () Determine all the values of a for which the system has infinitely many solutions: x +
More informationLINEAR ALGEBRA W W L CHEN
LINEAR ALGEBRA W W L CHEN c W W L Chen, 1997, 2008. This chapter is available free to all individuals, on the understanding that it is not to be used for financial gain, and may be downloaded and/or photocopied,
More informationEigenvalues and Eigenvectors A =
Eigenvalues and Eigenvectors Definition 0 Let A R n n be an n n real matrix A number λ R is a real eigenvalue of A if there exists a nonzero vector v R n such that A v = λ v The vector v is called an eigenvector
More informationORTHOGONALITY AND LEAST-SQUARES [CHAP. 6]
ORTHOGONALITY AND LEAST-SQUARES [CHAP. 6] Inner products and Norms Inner product or dot product of 2 vectors u and v in R n : u.v = u 1 v 1 + u 2 v 2 + + u n v n Calculate u.v when u = 1 2 2 0 v = 1 0
More informationUniversity of Colorado Denver Department of Mathematical and Statistical Sciences Applied Linear Algebra Ph.D. Preliminary Exam June 8, 2012
University of Colorado Denver Department of Mathematical and Statistical Sciences Applied Linear Algebra Ph.D. Preliminary Exam June 8, 2012 Name: Exam Rules: This is a closed book exam. Once the exam
More informationWorksheet for Lecture 25 Section 6.4 Gram-Schmidt Process
Worksheet for Lecture Name: Section.4 Gram-Schmidt Process Goal For a subspace W = Span{v,..., v n }, we want to find an orthonormal basis of W. Example Let W = Span{x, x } with x = and x =. Give an orthogonal
More informationMA 1B ANALYTIC - HOMEWORK SET 7 SOLUTIONS
MA 1B ANALYTIC - HOMEWORK SET 7 SOLUTIONS 1. (7 pts)[apostol IV.8., 13, 14] (.) Let A be an n n matrix with characteristic polynomial f(λ). Prove (by induction) that the coefficient of λ n 1 in f(λ) is
More informationFinal Exam - Take Home Portion Math 211, Summer 2017
Final Exam - Take Home Portion Math 2, Summer 207 Name: Directions: Complete a total of 5 problems. Problem must be completed. The remaining problems are categorized in four groups. Select one problem
More informationLecture notes: Applied linear algebra Part 1. Version 2
Lecture notes: Applied linear algebra Part 1. Version 2 Michael Karow Berlin University of Technology karow@math.tu-berlin.de October 2, 2008 1 Notation, basic notions and facts 1.1 Subspaces, range and
More informationExam in TMA4110 Calculus 3, June 2013 Solution
Norwegian University of Science and Technology Department of Mathematical Sciences Page of 8 Exam in TMA4 Calculus 3, June 3 Solution Problem Let T : R 3 R 3 be a linear transformation such that T = 4,
More information5 Compact linear operators
5 Compact linear operators One of the most important results of Linear Algebra is that for every selfadjoint linear map A on a finite-dimensional space, there exists a basis consisting of eigenvectors.
More informationQuantum Computing Lecture 2. Review of Linear Algebra
Quantum Computing Lecture 2 Review of Linear Algebra Maris Ozols Linear algebra States of a quantum system form a vector space and their transformations are described by linear operators Vector spaces
More informationHomework 5. (due Wednesday 8 th Nov midnight)
Homework (due Wednesday 8 th Nov midnight) Use this definition for Column Space of a Matrix Column Space of a matrix A is the set ColA of all linear combinations of the columns of A. In other words, if
More informationMath 18, Linear Algebra, Lecture C00, Spring 2017 Review and Practice Problems for Final Exam
Math 8, Linear Algebra, Lecture C, Spring 7 Review and Practice Problems for Final Exam. The augmentedmatrix of a linear system has been transformed by row operations into 5 4 8. Determine if the system
More informationDiagonalizing Matrices
Diagonalizing Matrices Massoud Malek A A Let A = A k be an n n non-singular matrix and let B = A = [B, B,, B k,, B n ] Then A n A B = A A 0 0 A k [B, B,, B k,, B n ] = 0 0 = I n 0 A n Notice that A i B
More informationIMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET
IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET This is a (not quite comprehensive) list of definitions and theorems given in Math 1553. Pay particular attention to the ones in red. Study Tip For each
More informationGQE ALGEBRA PROBLEMS
GQE ALGEBRA PROBLEMS JAKOB STREIPEL Contents. Eigenthings 2. Norms, Inner Products, Orthogonality, and Such 6 3. Determinants, Inverses, and Linear (In)dependence 4. (Invariant) Subspaces 3 Throughout
More informationThe Singular Value Decomposition
The Singular Value Decomposition Philippe B. Laval KSU Fall 2015 Philippe B. Laval (KSU) SVD Fall 2015 1 / 13 Review of Key Concepts We review some key definitions and results about matrices that will
More informationhomogeneous 71 hyperplane 10 hyperplane 34 hyperplane 69 identity map 171 identity map 186 identity map 206 identity matrix 110 identity matrix 45
address 12 adjoint matrix 118 alternating 112 alternating 203 angle 159 angle 33 angle 60 area 120 associative 180 augmented matrix 11 axes 5 Axiom of Choice 153 basis 178 basis 210 basis 74 basis test
More informationMathematics Department Stanford University Math 61CM/DM Inner products
Mathematics Department Stanford University Math 61CM/DM Inner products Recall the definition of an inner product space; see Appendix A.8 of the textbook. Definition 1 An inner product space V is a vector
More informationMATH 304 Linear Algebra Lecture 23: Diagonalization. Review for Test 2.
MATH 304 Linear Algebra Lecture 23: Diagonalization. Review for Test 2. Diagonalization Let L be a linear operator on a finite-dimensional vector space V. Then the following conditions are equivalent:
More informationQuizzes for Math 304
Quizzes for Math 304 QUIZ. A system of linear equations has augmented matrix 2 4 4 A = 2 0 2 4 3 5 2 a) Write down this system of equations; b) Find the reduced row-echelon form of A; c) What are the pivot
More informationElementary Linear Algebra Review for Exam 2 Exam is Monday, November 16th.
Elementary Linear Algebra Review for Exam Exam is Monday, November 6th. The exam will cover sections:.4,..4, 5. 5., 7., the class notes on Markov Models. You must be able to do each of the following. Section.4
More informationMath 102, Winter Final Exam Review. Chapter 1. Matrices and Gaussian Elimination
Math 0, Winter 07 Final Exam Review Chapter. Matrices and Gaussian Elimination { x + x =,. Different forms of a system of linear equations. Example: The x + 4x = 4. [ ] [ ] [ ] vector form (or the column
More information22m:033 Notes: 7.1 Diagonalization of Symmetric Matrices
m:33 Notes: 7. Diagonalization of Symmetric Matrices Dennis Roseman University of Iowa Iowa City, IA http://www.math.uiowa.edu/ roseman May 3, Symmetric matrices Definition. A symmetric matrix is a matrix
More informationMA 265 FINAL EXAM Fall 2012
MA 265 FINAL EXAM Fall 22 NAME: INSTRUCTOR S NAME:. There are a total of 25 problems. You should show work on the exam sheet, and pencil in the correct answer on the scantron. 2. No books, notes, or calculators
More informationContents. Preface for the Instructor. Preface for the Student. xvii. Acknowledgments. 1 Vector Spaces 1 1.A R n and C n 2
Contents Preface for the Instructor xi Preface for the Student xv Acknowledgments xvii 1 Vector Spaces 1 1.A R n and C n 2 Complex Numbers 2 Lists 5 F n 6 Digression on Fields 10 Exercises 1.A 11 1.B Definition
More informationApplied Linear Algebra in Geoscience Using MATLAB
Applied Linear Algebra in Geoscience Using MATLAB Contents Getting Started Creating Arrays Mathematical Operations with Arrays Using Script Files and Managing Data Two-Dimensional Plots Programming in
More informationMath 413/513 Chapter 6 (from Friedberg, Insel, & Spence)
Math 413/513 Chapter 6 (from Friedberg, Insel, & Spence) David Glickenstein December 7, 2015 1 Inner product spaces In this chapter, we will only consider the elds R and C. De nition 1 Let V be a vector
More informationSolving a system by back-substitution, checking consistency of a system (no rows of the form
MATH 520 LEARNING OBJECTIVES SPRING 2017 BROWN UNIVERSITY SAMUEL S. WATSON Week 1 (23 Jan through 27 Jan) Definition of a system of linear equations, definition of a solution of a linear system, elementary
More informationHomework 11 Solutions. Math 110, Fall 2013.
Homework 11 Solutions Math 110, Fall 2013 1 a) Suppose that T were self-adjoint Then, the Spectral Theorem tells us that there would exist an orthonormal basis of P 2 (R), (p 1, p 2, p 3 ), consisting
More informationExercise Sheet 1.
Exercise Sheet 1 You can download my lecture and exercise sheets at the address http://sami.hust.edu.vn/giang-vien/?name=huynt 1) Let A, B be sets. What does the statement "A is not a subset of B " mean?
More informationLinear Algebra: Matrix Eigenvalue Problems
CHAPTER8 Linear Algebra: Matrix Eigenvalue Problems Chapter 8 p1 A matrix eigenvalue problem considers the vector equation (1) Ax = λx. 8.0 Linear Algebra: Matrix Eigenvalue Problems Here A is a given
More informationAnswer Keys For Math 225 Final Review Problem
Answer Keys For Math Final Review Problem () For each of the following maps T, Determine whether T is a linear transformation. If T is a linear transformation, determine whether T is -, onto and/or bijective.
More informationIMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET
IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET This is a (not quite comprehensive) list of definitions and theorems given in Math 1553. Pay particular attention to the ones in red. Study Tip For each
More informationLecture 3: Review of Linear Algebra
ECE 83 Fall 2 Statistical Signal Processing instructor: R Nowak, scribe: R Nowak Lecture 3: Review of Linear Algebra Very often in this course we will represent signals as vectors and operators (eg, filters,
More informationWI1403-LR Linear Algebra. Delft University of Technology
WI1403-LR Linear Algebra Delft University of Technology Year 2013 2014 Michele Facchinelli Version 10 Last modified on February 1, 2017 Preface This summary was written for the course WI1403-LR Linear
More information