MATH 304 Linear Algebra Lecture 20: The Gram-Schmidt process (continued). Eigenvalues and eigenvectors.


1 MATH 304 Linear Algebra Lecture 20: The Gram-Schmidt process (continued). Eigenvalues and eigenvectors.
2 Orthogonal sets. Let $V$ be a vector space with an inner product. Definition. Nonzero vectors $v_1, v_2, \dots, v_k \in V$ form an orthogonal set if they are orthogonal to each other: $\langle v_i, v_j \rangle = 0$ for $i \ne j$. If, in addition, all vectors have unit norm, $\|v_i\| = 1$, then $v_1, v_2, \dots, v_k$ is called an orthonormal set. Theorem. Any orthogonal set is linearly independent.
3 Orthogonal projection. Theorem. Let $V$ be a finite-dimensional inner product space and $V_0$ a subspace of $V$. Then any vector $x \in V$ is uniquely represented as $x = p + o$, where $p \in V_0$ and $o \in V_0^{\perp}$. The component $p$ is the orthogonal projection of the vector $x$ onto the subspace $V_0$. The distance from $x$ to the subspace $V_0$ is $\|o\|$. If $v_1, v_2, \dots, v_n$ is an orthogonal basis for $V_0$, then
$$p = \frac{\langle x, v_1 \rangle}{\langle v_1, v_1 \rangle} v_1 + \frac{\langle x, v_2 \rangle}{\langle v_2, v_2 \rangle} v_2 + \dots + \frac{\langle x, v_n \rangle}{\langle v_n, v_n \rangle} v_n.$$
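The projection formula above translates directly into code. A minimal sketch in Python (exact arithmetic via `fractions`; as in the theorem, the basis handed to `project` is assumed to be orthogonal already):

```python
from fractions import Fraction

def dot(u, v):
    """Standard dot product on R^n."""
    return sum(a * b for a, b in zip(u, v))

def project(x, orthogonal_basis):
    """Orthogonal projection p = sum_i <x, v_i>/<v_i, v_i> * v_i."""
    p = [Fraction(0)] * len(x)
    for v in orthogonal_basis:
        c = Fraction(dot(x, v), dot(v, v))
        p = [pi + c * vi for pi, vi in zip(p, v)]
    return p

# Decompose x = p + o with p in V_0 = Span{(1,0,0), (0,1,0)} and o orthogonal to V_0.
x = [1, 2, 3]
basis = [[1, 0, 0], [0, 1, 0]]
p = project(x, basis)
o = [xi - pi for xi, pi in zip(x, p)]
assert p == [1, 2, 0] and o == [0, 0, 3]
assert all(dot(o, v) == 0 for v in basis)   # o is orthogonal to V_0
```

The residual `o` is exactly the vector whose norm gives the distance from `x` to the subspace.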
4 [Figure: the decomposition $x = p + o$, with $p$ lying in the subspace $V_0$ and $o$ orthogonal to it.]
5 The Gram-Schmidt orthogonalization process. Let $V$ be a vector space with an inner product. Suppose $x_1, x_2, \dots, x_n$ is a basis for $V$. Let
$$v_1 = x_1,$$
$$v_2 = x_2 - \frac{\langle x_2, v_1 \rangle}{\langle v_1, v_1 \rangle} v_1,$$
$$v_3 = x_3 - \frac{\langle x_3, v_1 \rangle}{\langle v_1, v_1 \rangle} v_1 - \frac{\langle x_3, v_2 \rangle}{\langle v_2, v_2 \rangle} v_2, \quad \dots$$
$$v_n = x_n - \frac{\langle x_n, v_1 \rangle}{\langle v_1, v_1 \rangle} v_1 - \dots - \frac{\langle x_n, v_{n-1} \rangle}{\langle v_{n-1}, v_{n-1} \rangle} v_{n-1}.$$
Then $v_1, v_2, \dots, v_n$ is an orthogonal basis for $V$.
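The recursion above can be coded directly as a sanity check. A small sketch in Python, using exact rational arithmetic so the intermediate fractions stay clean (the basis below is an illustrative choice, not one from the lecture):

```python
from fractions import Fraction

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(xs):
    """Classical Gram-Schmidt: v_k = x_k - sum_{i<k} <x_k, v_i>/<v_i, v_i> * v_i."""
    vs = []
    for x in xs:
        v = [Fraction(c) for c in x]
        for u in vs:
            c = Fraction(dot(x, u), dot(u, u))
            v = [vi - c * ui for vi, ui in zip(v, u)]
        vs.append(v)
    return vs

vs = gram_schmidt([[1, 1, 1], [1, 1, 0], [1, 0, 0]])
assert vs[0] == [1, 1, 1]                     # v_1 = x_1
assert all(dot(vs[i], vs[j]) == 0             # pairwise orthogonality
           for i in range(3) for j in range(i + 1, 3))
```

Note that subtracting the component along each earlier $v_i$ uses $\langle x_k, v_i \rangle$; since the $v_i$ are already mutually orthogonal, this equals the inner product of the partially reduced vector with $v_i$ as well.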
6 [Figure: $v_3 = x_3 - p_3$ is orthogonal to the plane $\mathrm{Span}(v_1, v_2) = \mathrm{Span}(x_1, x_2)$.]
7 Any basis $x_1, x_2, \dots, x_n$ $\longrightarrow$ Orthogonal basis $v_1, v_2, \dots, v_n$. Properties of the Gram-Schmidt process: $v_k = x_k - (\alpha_1 x_1 + \dots + \alpha_{k-1} x_{k-1})$ for some scalars $\alpha_i$, $1 \le k \le n$; the span of $v_1, \dots, v_k$ is the same as the span of $x_1, \dots, x_k$; $v_k$ is orthogonal to $x_1, \dots, x_{k-1}$; $v_k = x_k - p_k$, where $p_k$ is the orthogonal projection of the vector $x_k$ onto the subspace spanned by $x_1, \dots, x_{k-1}$; $\|v_k\|$ is the distance from $x_k$ to the subspace spanned by $x_1, \dots, x_{k-1}$.
8 Normalization. Let $V$ be a vector space with an inner product. Suppose $v_1, v_2, \dots, v_n$ is an orthogonal basis for $V$. Let $w_1 = \dfrac{v_1}{\|v_1\|}, \ w_2 = \dfrac{v_2}{\|v_2\|}, \ \dots, \ w_n = \dfrac{v_n}{\|v_n\|}$. Then $w_1, w_2, \dots, w_n$ is an orthonormal basis for $V$. Theorem. Any finite-dimensional vector space with an inner product has an orthonormal basis. Remark. An infinite-dimensional vector space with an inner product may or may not have an orthonormal basis.
9 Orthogonalization / Normalization. An alternative form of the Gram-Schmidt process combines orthogonalization with normalization. Suppose $x_1, x_2, \dots, x_n$ is a basis for an inner product space $V$. Let
$$v_1 = x_1, \quad w_1 = \frac{v_1}{\|v_1\|},$$
$$v_2 = x_2 - \langle x_2, w_1 \rangle w_1, \quad w_2 = \frac{v_2}{\|v_2\|},$$
$$v_3 = x_3 - \langle x_3, w_1 \rangle w_1 - \langle x_3, w_2 \rangle w_2, \quad w_3 = \frac{v_3}{\|v_3\|}, \quad \dots$$
$$v_n = x_n - \langle x_n, w_1 \rangle w_1 - \dots - \langle x_n, w_{n-1} \rangle w_{n-1}, \quad w_n = \frac{v_n}{\|v_n\|}.$$
Then $w_1, w_2, \dots, w_n$ is an orthonormal basis for $V$.
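This combined recursion is what one would typically implement. A sketch in Python (floating point this time, since normalization introduces square roots); applied to the vectors of the plane $\Pi$ from the worked example, it reproduces the orthonormal basis found there:

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def orthonormalize(xs):
    """v_k = x_k - sum_{i<k} <x_k, w_i> w_i,  then  w_k = v_k / ||v_k||."""
    ws = []
    for x in xs:
        v = list(map(float, x))
        for w in ws:
            c = dot(x, w)                    # <x_k, w_i>; w_i is already unit-norm
            v = [vi - c * wi for vi, wi in zip(v, w)]
        n = math.sqrt(dot(v, v))
        ws.append([vi / n for vi in v])
    return ws

w1, w2 = orthonormalize([[1, 2, 2], [-1, 0, 2]])
assert all(math.isclose(a, b) for a, b in zip(w1, [1/3, 2/3, 2/3]))
assert all(math.isclose(a, b) for a, b in zip(w2, [-2/3, -1/3, 2/3]))
```

Because each $w_i$ has unit norm, no division by $\langle w_i, w_i \rangle$ is needed when subtracting components.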
10 Problem. Let $\Pi$ be the plane in $\mathbb{R}^3$ spanned by vectors $x_1 = (1, 2, 2)$ and $x_2 = (-1, 0, 2)$. (i) Find an orthonormal basis for $\Pi$. (ii) Extend it to an orthonormal basis for $\mathbb{R}^3$. $x_1, x_2$ is a basis for the plane $\Pi$. We can extend it to a basis for $\mathbb{R}^3$ by adding one vector from the standard basis. For instance, vectors $x_1$, $x_2$, and $x_3 = (0, 0, 1)$ form a basis for $\mathbb{R}^3$ because
$$\begin{vmatrix} 1 & 2 & 2 \\ -1 & 0 & 2 \\ 0 & 0 & 1 \end{vmatrix} = \begin{vmatrix} 1 & 2 \\ -1 & 0 \end{vmatrix} = 2 \ne 0.$$
11 Using the Gram-Schmidt process, we orthogonalize the basis $x_1 = (1, 2, 2)$, $x_2 = (-1, 0, 2)$, $x_3 = (0, 0, 1)$:
$$v_1 = x_1 = (1, 2, 2),$$
$$v_2 = x_2 - \frac{\langle x_2, v_1 \rangle}{\langle v_1, v_1 \rangle} v_1 = (-1, 0, 2) - \frac{3}{9}(1, 2, 2) = (-4/3, -2/3, 4/3),$$
$$v_3 = x_3 - \frac{\langle x_3, v_1 \rangle}{\langle v_1, v_1 \rangle} v_1 - \frac{\langle x_3, v_2 \rangle}{\langle v_2, v_2 \rangle} v_2 = (0, 0, 1) - \frac{2}{9}(1, 2, 2) - \frac{4/3}{4}(-4/3, -2/3, 4/3) = (2/9, -2/9, 1/9).$$
12 Now $v_1 = (1, 2, 2)$, $v_2 = (-4/3, -2/3, 4/3)$, $v_3 = (2/9, -2/9, 1/9)$ is an orthogonal basis for $\mathbb{R}^3$, while $v_1, v_2$ is an orthogonal basis for $\Pi$. It remains to normalize these vectors.
$$\langle v_1, v_1 \rangle = 9 \implies \|v_1\| = 3, \qquad \langle v_2, v_2 \rangle = 4 \implies \|v_2\| = 2, \qquad \langle v_3, v_3 \rangle = 1/9 \implies \|v_3\| = 1/3.$$
$$w_1 = v_1/\|v_1\| = (1/3, 2/3, 2/3) = \tfrac{1}{3}(1, 2, 2),$$
$$w_2 = v_2/\|v_2\| = (-2/3, -1/3, 2/3) = \tfrac{1}{3}(-2, -1, 2),$$
$$w_3 = v_3/\|v_3\| = (2/3, -2/3, 1/3) = \tfrac{1}{3}(2, -2, 1).$$
Then $w_1, w_2$ is an orthonormal basis for $\Pi$, and $w_1, w_2, w_3$ is an orthonormal basis for $\mathbb{R}^3$.
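These answers are easy to verify numerically: the $w_i$ should be pairwise orthogonal unit vectors, and $w_3$ should be orthogonal to the plane $\Pi$, i.e. to $x_1$ and $x_2$. A quick Python check:

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

w1 = [1/3, 2/3, 2/3]
w2 = [-2/3, -1/3, 2/3]
w3 = [2/3, -2/3, 1/3]
ws = [w1, w2, w3]

# orthonormality: <w_i, w_j> = 1 if i = j, else 0
for i in range(3):
    for j in range(3):
        expected = 1.0 if i == j else 0.0
        assert math.isclose(dot(ws[i], ws[j]), expected, abs_tol=1e-12)

# w3 is orthogonal to the original spanning vectors of the plane
assert math.isclose(dot(w3, [1, 2, 2]), 0.0, abs_tol=1e-12)
assert math.isclose(dot(w3, [-1, 0, 2]), 0.0, abs_tol=1e-12)
```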
13 Problem. Find the distance from the point $y = (0, 0, 0, 1)$ to the subspace $\Pi \subset \mathbb{R}^4$ spanned by the vectors $x_1 = (1, 1, 1, 1)$, $x_2 = (1, -1, 3, 1)$, and $x_3 = (-3, -7, 1, -3)$. Let us apply the Gram-Schmidt process to the vectors $x_1, x_2, x_3, y$. We should obtain an orthogonal system $v_1, v_2, v_3, v_4$. The desired distance will be $\|v_4\|$.
14 $x_1 = (1, 1, 1, 1)$, $x_2 = (1, -1, 3, 1)$, $x_3 = (-3, -7, 1, -3)$, $y = (0, 0, 0, 1)$.
$$v_1 = x_1 = (1, 1, 1, 1),$$
$$v_2 = x_2 - \frac{\langle x_2, v_1 \rangle}{\langle v_1, v_1 \rangle} v_1 = (1, -1, 3, 1) - \frac{4}{4}(1, 1, 1, 1) = (0, -2, 2, 0),$$
$$v_3 = x_3 - \frac{\langle x_3, v_1 \rangle}{\langle v_1, v_1 \rangle} v_1 - \frac{\langle x_3, v_2 \rangle}{\langle v_2, v_2 \rangle} v_2 = (-3, -7, 1, -3) - \frac{-12}{4}(1, 1, 1, 1) - \frac{16}{8}(0, -2, 2, 0) = (0, 0, 0, 0).$$
15 The Gram-Schmidt process can be used to check linear independence of vectors! The vector $x_3$ is a linear combination of $x_1$ and $x_2$, so $\Pi$ is a plane, not a 3-dimensional subspace. We should orthogonalize the vectors $x_1, x_2, y$:
$$v_4 = y - \frac{\langle y, v_1 \rangle}{\langle v_1, v_1 \rangle} v_1 - \frac{\langle y, v_2 \rangle}{\langle v_2, v_2 \rangle} v_2 = (0, 0, 0, 1) - \frac{1}{4}(1, 1, 1, 1) - \frac{0}{8}(0, -2, 2, 0) = (-1/4, -1/4, -1/4, 3/4).$$
$$\|v_4\| = \left\| \left( -\tfrac{1}{4}, -\tfrac{1}{4}, -\tfrac{1}{4}, \tfrac{3}{4} \right) \right\| = \frac{1}{4} \|(-1, -1, -1, 3)\| = \frac{\sqrt{12}}{4} = \frac{\sqrt{3}}{2}.$$
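The distance $\sqrt{3}/2$ can be confirmed by projecting $y$ onto $\mathrm{Span}(v_1, v_2)$ and measuring the residual. A sketch in Python; the scan drops minus signs, so the vectors here use the signs as reconstructed in the computation, which is an assumption to keep in mind:

```python
import math
from fractions import Fraction

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# Orthogonal basis of the plane (signs as reconstructed) and the point y.
v1 = [1, 1, 1, 1]
v2 = [0, -2, 2, 0]
y = [0, 0, 0, 1]

# Residual of y after removing its components along v1 and v2.
r = [Fraction(c) for c in y]
for v in (v1, v2):
    c = Fraction(dot(y, v), dot(v, v))
    r = [ri - c * vi for ri, vi in zip(r, v)]

assert r == [Fraction(-1, 4), Fraction(-1, 4), Fraction(-1, 4), Fraction(3, 4)]
assert math.isclose(math.sqrt(dot(r, r)), math.sqrt(3) / 2)
```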
16 Problem. Find the distance from the point $z = (0, 0, 1, 0)$ to the plane $\Pi$ that passes through the point $x_0 = (1, 0, 0, 0)$ and is parallel to the vectors $v_1 = (1, 1, 1, 1)$ and $v_2 = (0, 2, -2, 0)$. The plane $\Pi$ is not a subspace of $\mathbb{R}^4$ as it does not pass through the origin. Let $\Pi_0 = \mathrm{Span}(v_1, v_2)$. Then $\Pi = \Pi_0 + x_0$. Hence the distance from the point $z$ to the plane $\Pi$ is the same as the distance from the point $z - x_0$ to the plane $\Pi_0$. We shall apply the Gram-Schmidt process to the vectors $v_1, v_2, z - x_0$. This will yield an orthogonal system $w_1, w_2, w_3$. The desired distance will be $\|w_3\|$.
17 $v_1 = (1, 1, 1, 1)$, $v_2 = (0, 2, -2, 0)$, $z - x_0 = (-1, 0, 1, 0)$.
$$w_1 = v_1 = (1, 1, 1, 1),$$
$$w_2 = v_2 - \frac{\langle v_2, w_1 \rangle}{\langle w_1, w_1 \rangle} w_1 = v_2 = (0, 2, -2, 0) \quad \text{as } v_2 \perp v_1,$$
$$w_3 = (z - x_0) - \frac{\langle z - x_0, w_1 \rangle}{\langle w_1, w_1 \rangle} w_1 - \frac{\langle z - x_0, w_2 \rangle}{\langle w_2, w_2 \rangle} w_2 = (-1, 0, 1, 0) - \frac{0}{4}(1, 1, 1, 1) - \frac{-2}{8}(0, 2, -2, 0) = (-1, 1/2, 1/2, 0).$$
$$\|w_3\| = \left\| \left( -1, \tfrac{1}{2}, \tfrac{1}{2}, 0 \right) \right\| = \frac{1}{2} \|(-2, 1, 1, 0)\| = \frac{\sqrt{6}}{2}.$$
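The same residual computation confirms the answer $\sqrt{6}/2$ for this problem (signs of $v_2$ and $z - x_0$ as read from the computation, since the scan renders them ambiguously):

```python
import math
from fractions import Fraction

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

v1 = [1, 1, 1, 1]
v2 = [0, 2, -2, 0]
d = [-1, 0, 1, 0]   # z - x0 = (0,0,1,0) - (1,0,0,0)

# Residual of z - x0 after removing its components along v1 and v2.
r = [Fraction(c) for c in d]
for v in (v1, v2):
    c = Fraction(dot(d, v), dot(v, v))
    r = [ri - c * vi for ri, vi in zip(r, v)]

assert r == [Fraction(-1), Fraction(1, 2), Fraction(1, 2), Fraction(0)]
assert math.isclose(math.sqrt(dot(r, r)), math.sqrt(6) / 2)
```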
18 Linear transformations of $\mathbb{R}^2$. Any linear mapping $f : \mathbb{R}^2 \to \mathbb{R}^2$ is represented as multiplication of a 2-dimensional column vector by a $2 \times 2$ matrix:
$$f(\mathbf{x}) = A\mathbf{x} \quad \text{or} \quad f \begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} a & b \\ c & d \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix}.$$
Linear transformations corresponding to particular matrices can have various geometric properties.
19 $A = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$. Rotation by $90^\circ$ (counterclockwise).
20 $A = \dfrac{1}{\sqrt{2}} \begin{pmatrix} 1 & -1 \\ 1 & 1 \end{pmatrix}$. Rotation by $45^\circ$ (counterclockwise).
21 $A = \begin{pmatrix} -1 & 0 \\ 0 & 1 \end{pmatrix}$. Reflection in the vertical axis.
22 $A = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}$. Reflection in the line $x - y = 0$.
23 $A = \begin{pmatrix} 1 & 1/2 \\ 0 & 1 \end{pmatrix}$. Horizontal shear.
24 $A = \begin{pmatrix} 1/2 & 0 \\ 0 & 1/2 \end{pmatrix}$. Scaling.
25 $A = \begin{pmatrix} 3 & 0 \\ 0 & 1/3 \end{pmatrix}$. Squeeze.
26 $A = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}$. Vertical projection on the horizontal axis.
27 $A = \begin{pmatrix} 0 & -1 \\ 0 & 1 \end{pmatrix}$. Horizontal projection on the line $x + y = 0$.
28 $A = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$. Identity.
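The geometric effect of each matrix can be checked by applying it to a few vectors. A small sketch; note that where the scan loses the matrix entries, the standard matrices are assumed (rotations taken counterclockwise):

```python
def apply(A, x):
    """Multiply a 2x2 matrix, given as a list of rows, by a vector."""
    return [A[0][0] * x[0] + A[0][1] * x[1],
            A[1][0] * x[0] + A[1][1] * x[1]]

rot90 = [[0, -1], [1, 0]]          # rotation by 90 degrees (counterclockwise)
reflect_vert = [[-1, 0], [0, 1]]   # reflection in the vertical axis
shear = [[1, 0.5], [0, 1]]         # horizontal shear
proj_horiz = [[1, 0], [0, 0]]      # vertical projection on the horizontal axis

assert apply(rot90, [1, 0]) == [0, 1]          # e1 rotates onto e2
assert apply(reflect_vert, [3, 2]) == [-3, 2]  # the x-coordinate flips
assert apply(shear, [0, 1]) == [0.5, 1]        # points slide horizontally
assert apply(proj_horiz, [3, 2]) == [3, 0]     # the y-coordinate is killed
```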
29 Eigenvalues and eigenvectors of a matrix. Definition. Let $A$ be an $n \times n$ matrix. A number $\lambda \in \mathbb{R}$ is called an eigenvalue of the matrix $A$ if $Av = \lambda v$ for some nonzero column vector $v \in \mathbb{R}^n$. The vector $v$ is called an eigenvector of $A$ belonging to (or associated with) the eigenvalue $\lambda$. Remarks. Alternative terminology: eigenvalue = characteristic value, eigenvector = characteristic vector. The zero vector is never considered an eigenvector.
30 Example. $A = \begin{pmatrix} 2 & 0 \\ 0 & 3 \end{pmatrix}$.
$$\begin{pmatrix} 2 & 0 \\ 0 & 3 \end{pmatrix} \begin{pmatrix} 1 \\ 0 \end{pmatrix} = \begin{pmatrix} 2 \\ 0 \end{pmatrix} = 2 \begin{pmatrix} 1 \\ 0 \end{pmatrix}, \qquad \begin{pmatrix} 2 & 0 \\ 0 & 3 \end{pmatrix} \begin{pmatrix} 0 \\ 2 \end{pmatrix} = \begin{pmatrix} 0 \\ 6 \end{pmatrix} = 3 \begin{pmatrix} 0 \\ 2 \end{pmatrix}.$$
Hence $(1, 0)$ is an eigenvector of $A$ belonging to the eigenvalue 2, while $(0, 2)$ is an eigenvector of $A$ belonging to the eigenvalue 3.
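The defining equation $Av = \lambda v$ is easy to test directly in code:

```python
def matvec(A, v):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

A = [[2, 0], [0, 3]]

v = [1, 0]
assert matvec(A, v) == [2 * c for c in v]   # Av = 2v: eigenvalue 2

u = [0, 2]
assert matvec(A, u) == [3 * c for c in u]   # Au = 3u: eigenvalue 3

# Any nonzero scalar multiple of an eigenvector is again an eigenvector.
w = [-5, 0]
assert matvec(A, w) == [2 * c for c in w]
```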
More information