Numerical Linear Algebra Chap. 2: Least Squares Problems


Slide 1: Numerical Linear Algebra, Chap. 2: Least Squares Problems
Heinrich Voss, Hamburg University of Technology, Institute of Numerical Simulation

Slide 2: Projection onto a line

Problem: Given a point $b \in \mathbb{R}^m$ and a line through the origin in the direction of $a \in \mathbb{R}^m$, find the point $p$ on the line closest to the point $b$.

Observations: $p$ has a representation $p = \alpha a$ for some $\alpha \in \mathbb{R}$, and the line connecting $b$ to $p$ is perpendicular to the vector $a$:
$$0 = a^T(b - p) = a^T(b - \alpha a) = a^Tb - \alpha\,a^Ta \quad\Longrightarrow\quad \alpha = \frac{a^Tb}{a^Ta}.$$

Slide 3: Projection matrix

$$p = \alpha a = a\alpha = a\,\frac{a^Tb}{a^Ta} = \frac{aa^T}{a^Ta}\,b =: Pb$$

The projection of a vector $b$ onto the direction of a vector $a$ is obtained by multiplying $b$ by the rank-one matrix
$$P = \frac{aa^T}{a^Ta}.$$
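A minimal NumPy sketch of this rank-one projection (not from the slides; the function name and test vectors are my own):

```python
import numpy as np

def project_onto_line(b, a):
    """Project b onto the line spanned by a: p = (a a^T / a^T a) b."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return a * (a @ b) / (a @ a)

b = np.array([1.0, 2.0, 2.0])
a = np.array([1.0, 0.0, 0.0])
p = project_onto_line(b, a)          # -> [1., 0., 0.]
assert abs(a @ (b - p)) < 1e-12      # the error b - p is perpendicular to a
```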

Slide 4: Projection onto a subspace

Problem: Given a point $b \in \mathbb{R}^m$ and $n$ linearly independent vectors $a^1,\dots,a^n \in \mathbb{R}^m$, find the linear combination $p = \sum_{j=1}^n x_j a^j \in \operatorname{span}\{a^1,\dots,a^n\}$ which is closest to the point $b$.

$p$ is called the projection of $b$ onto $\operatorname{span}\{a^1,\dots,a^n\}$. It is characterized by the condition that the error $b - p$ is perpendicular to the subspace, and this is equivalent to: $b - p$ is perpendicular to $a^1,\dots,a^n$.

Slide 5: Projection onto a subspace (ct.)

$b - p = b - Ax$ is perpendicular to $a^1,\dots,a^n$ if and only if
$$(a^1)^T(b - Ax) = 0,\quad (a^2)^T(b - Ax) = 0,\quad \dots,\quad (a^n)^T(b - Ax) = 0,$$
i.e.
$$A^T(b - Ax) = 0 \quad\Longleftrightarrow\quad A^TAx = A^Tb.$$

Slide 6: Theorem

$A^TA$ is nonsingular if and only if the columns of $A$ are linearly independent.

Proof: Let $x$ be in the nullspace of $A$. Then $Ax = 0$ implies $A^TAx = 0$, so $x$ is in the nullspace of $A^TA$. Conversely, if $x$ is in the nullspace of $A^TA$, then
$$A^TAx = 0 \ \Longrightarrow\ 0 = x^TA^TAx = (Ax)^T(Ax) = \|Ax\|^2 \ \Longrightarrow\ Ax = 0.$$
If the columns of $A$ are linearly independent, it follows that $x = 0$.

Slide 7: Projection onto a subspace (ct.)

If $a^1,\dots,a^n$ are linearly independent, then the projection $p$ of a vector $b$ onto $\operatorname{span}\{a^1,\dots,a^n\}$ is given by $p = Ax$, where $x$ is the unique solution of
$$A^TAx = A^Tb \quad\Longleftrightarrow\quad x = (A^TA)^{-1}A^Tb.$$
The projector is given by $P = A(A^TA)^{-1}A^T$. $P$ is symmetric, and it holds that $P^2 = P$. The distance from $b$ to the subspace is $\|b - Pb\|$.
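A short sketch of this projection in NumPy (added; the data are illustrative). Note that it solves the normal equations rather than forming $(A^TA)^{-1}$ explicitly:

```python
import numpy as np

def project_onto_subspace(b, A):
    """Projection of b onto the column span of A (columns assumed
    linearly independent): solve A^T A x = A^T b, then p = A x."""
    x = np.linalg.solve(A.T @ A, A.T @ b)
    return A @ x

A = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])
p = project_onto_subspace(b, A)
assert np.allclose(A.T @ (b - p), 0)   # b - p is perpendicular to the columns
```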

Slide 8: Least squares problem

[Figure; no text content survived the transcription.]

Slide 9: Least squares problem (ct.)

Given are measurements $(t_j, b_j)$, $j = 1,\dots,m$. Find the straight line $f(t) = x_1 + x_2 t$ which matches the measurements best, i.e. try to solve the linear system
$$x_1 + x_2 t_j = b_j,\qquad j = 1,\dots,m,$$
as well as possible. In matrix form: solve
$$\begin{pmatrix} 1 & t_1 \\ 1 & t_2 \\ \vdots & \vdots \\ 1 & t_m \end{pmatrix}\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} b_1 \\ b_2 \\ \vdots \\ b_m \end{pmatrix}$$
as well as possible.

Slide 10: Least squares problem (ct.)

Replace the problem $Ax = b$ by: find $x$ such that the error $\|Ax - b\|$ is as small as possible. The (unique) solution is called the least squares solution.

Solution: The nearest point in the space $\{Ax : x \in \mathbb{R}^n\}$ to the point $b$ is the projection of $b$ onto this space. Hence the solution of the least squares problem is
$$x = (A^TA)^{-1}A^Tb.$$

Slide 11: Fitting a straight line

Measurements $(t_j, b_j)$: [the data table did not survive the transcription; by Slide 29 the times are $t = (-1, 0, 1, 2, 3, 4, 5)^T$, so $m = 7$, $\sum t_j = 14$, $\sum t_j^2 = 56$, and $\sum b_j = 3.8$.]

Normal equations:
$$A^TAx = \begin{pmatrix} 7 & 14 \\ 14 & 56 \end{pmatrix}x = A^Tb = \begin{pmatrix} 3.8 \\ \cdot \end{pmatrix}$$
[the remaining entries and the solution $x$ were lost in the transcription].
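Since the slide's data table was lost, here is a sketch of the same computation on hypothetical measurement values (the times match Slide 29; the $b_j$ are made up for illustration):

```python
import numpy as np

t = np.array([-1.0, 0.0, 1.0, 2.0, 3.0, 4.0, 5.0])  # times as on Slide 29
b = np.array([0.0, 0.5, 0.4, 1.1, 1.3, 1.6, 2.2])   # hypothetical values

A = np.column_stack([np.ones_like(t), t])            # columns: 1, t
x = np.linalg.solve(A.T @ A, A.T @ b)                # normal equations
print(f"f(t) = {x[0]:.3f} + {x[1]:.3f} t")
```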

Slide 12: Least squares problem

[Figure; no text content survived the transcription.]

Slide 13: Fitting a straight line

Given $m$ points $(t_j, b_j)$, $j = 1,\dots,m$, in the plane, find a straight line $x_1 + x_2 t$, $t \in \mathbb{R}$, fitting the data in the least squares sense.

$$A = \begin{pmatrix} 1 & t_1 \\ 1 & t_2 \\ \vdots & \vdots \\ 1 & t_m \end{pmatrix},\qquad b = \begin{pmatrix} b_1 \\ b_2 \\ \vdots \\ b_m \end{pmatrix},\qquad A^Tb = \begin{pmatrix} 1 & 1 & \dots & 1 \\ t_1 & t_2 & \dots & t_m \end{pmatrix}b = \begin{pmatrix} \sum_{j=1}^m b_j \\ \sum_{j=1}^m t_jb_j \end{pmatrix},$$

$$A^TA = \begin{pmatrix} 1 & 1 & \dots & 1 \\ t_1 & t_2 & \dots & t_m \end{pmatrix}\begin{pmatrix} 1 & t_1 \\ \vdots & \vdots \\ 1 & t_m \end{pmatrix} = \begin{pmatrix} m & \sum_{j=1}^m t_j \\ \sum_{j=1}^m t_j & \sum_{j=1}^m t_j^2 \end{pmatrix}.$$

The normal equations are
$$A^TAx = \begin{pmatrix} m & \sum_{j=1}^m t_j \\ \sum_{j=1}^m t_j & \sum_{j=1}^m t_j^2 \end{pmatrix}x = \begin{pmatrix} \sum_{j=1}^m b_j \\ \sum_{j=1}^m t_jb_j \end{pmatrix} = A^Tb.$$

Slide 14: Trigonometric polynomial

[Figure; no text content survived the transcription.]

Slide 15: Trigonometric polynomial (ct.)

Given measurements $(t_j, b_j)$, $j = 1,\dots,n$, find a trigonometric polynomial $f(t) = x_1 + x_2\sin(t) + x_3\cos(t)$ such that the measurements are closest to the graph of $f$:
$$A = \begin{pmatrix} 1 & \sin(t_1) & \cos(t_1) \\ 1 & \sin(t_2) & \cos(t_2) \\ \vdots & \vdots & \vdots \\ 1 & \sin(t_n) & \cos(t_n) \end{pmatrix}.$$
Solve $\|Ax - b\| = \min!$, i.e. $A^TAx = A^Tb$.
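A sketch of this fit in NumPy (added; synthetic noisy data, and NumPy's `lstsq` stands in for solving the normal equations):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 2 * np.pi, 20)                  # hypothetical times
b = 1.0 + 0.5 * np.sin(t) - 0.3 * np.cos(t) + 0.05 * rng.standard_normal(t.size)

A = np.column_stack([np.ones_like(t), np.sin(t), np.cos(t)])
x, *_ = np.linalg.lstsq(A, b, rcond=None)            # min ||Ax - b||
print(x)                                             # roughly [1.0, 0.5, -0.3]
```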

Slide 16: Trigonometric polynomial (ct.)

[Figure; no text content survived the transcription.]

Slide 17: Orthonormal vectors

$q^1, q^2,\dots,q^n \in \mathbb{R}^m$ are orthonormal if
$$(q^j)^Tq^k = \begin{cases} 0 & \text{when } j \ne k \\ 1 & \text{when } j = k. \end{cases}$$

Orthonormal vectors are linearly independent:
$$\sum_{j=1}^n \alpha_j q^j = 0 \ \Longrightarrow\ 0 = (q^k)^T\sum_{j=1}^n \alpha_j q^j = \sum_{j=1}^n \alpha_j (q^k)^Tq^j = \alpha_k,\qquad k = 1,\dots,n.$$

If $n = m$, then $q^1,\dots,q^n$ is an orthonormal basis. If $q^1,\dots,q^n$ is an orthonormal basis of $\mathbb{R}^n$, then every $x \in \mathbb{R}^n$ has the representation
$$x = \sum_{j=1}^n \alpha_j q^j,\qquad (q^k)^Tx = \sum_{j=1}^n \alpha_j (q^k)^Tq^j = \alpha_k \qquad\Longrightarrow\qquad x = \sum_{j=1}^n (x^Tq^j)\,q^j.$$

Slide 18: Example

$$q^1 = \begin{pmatrix} \cos\theta \\ \sin\theta \end{pmatrix},\qquad q^2 = \begin{pmatrix} -\sin\theta \\ \cos\theta \end{pmatrix},\qquad x = \begin{pmatrix} \cos\alpha \\ \sin\alpha \end{pmatrix}$$

$$x = (x^Tq^1)q^1 + (x^Tq^2)q^2 = (\cos\alpha\cos\theta + \sin\alpha\sin\theta)\,q^1 + (-\cos\alpha\sin\theta + \sin\alpha\cos\theta)\,q^2$$
$$= \cos(\alpha - \theta)\begin{pmatrix} \cos\theta \\ \sin\theta \end{pmatrix} + \sin(\alpha - \theta)\begin{pmatrix} -\sin\theta \\ \cos\theta \end{pmatrix}.$$

Slide 19: Orthogonal matrix

An $n \times n$ matrix with orthonormal columns is called an orthogonal matrix. It is usually denoted by $Q$.

Every orthogonal matrix $Q$ is nonsingular:
$$Q^TQ = I \quad\Longrightarrow\quad Q^{-1} = Q^T.$$
If $Q$ is orthogonal, then $Q^T$ is orthogonal as well:
$$(Q^T)^TQ^T = QQ^T = QQ^{-1} = I.$$

Slide 20: Rotation

$$Q = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}$$
rotates every vector in the plane through the angle $\theta$.
$$Q^T = \begin{pmatrix} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{pmatrix} = \begin{pmatrix} \cos(-\theta) & -\sin(-\theta) \\ \sin(-\theta) & \cos(-\theta) \end{pmatrix} = Q^{-1}$$
rotates every vector in the plane through the angle $-\theta$.

Slide 21: Permutation

$$P = \begin{pmatrix} 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 1 \\ 1 & 0 & 0 & 0 \\ 0 & 0 & 1 & 0 \end{pmatrix}$$
obviously is an orthogonal matrix:
$$Px = P\begin{pmatrix} x_1 \\ x_2 \\ x_3 \\ x_4 \end{pmatrix} = \begin{pmatrix} x_2 \\ x_4 \\ x_1 \\ x_3 \end{pmatrix}.$$
$P^{-1} = P^T$ puts the components back into their original order:
$$P^T\begin{pmatrix} x_2 \\ x_4 \\ x_1 \\ x_3 \end{pmatrix} = \begin{pmatrix} x_1 \\ x_2 \\ x_3 \\ x_4 \end{pmatrix}.$$

Slide 22: Reflection

If $u$ is any unit vector, then $Q = I - 2uu^T$ is orthogonal: since $u^Tu = 1$,
$$Q^T = I - 2(uu^T)^T = I - 2(u^T)^Tu^T = I - 2uu^T = Q,$$
$$Q^TQ = (I - 2uu^T)(I - 2uu^T) = I - 4uu^T + 4uu^Tuu^T = I.$$

$Q$ defines a reflection at the hyperplane $u^\perp$: let $x = (u^Tx)\,u + v$ with $u^Tv = 0$. Then it follows that
$$Qx = (u^Tx)(I - 2uu^T)u + (I - 2uu^T)v = (u^Tx)(u - 2u\,u^Tu) + (v - 2u\,u^Tv) = -(u^Tx)\,u + v.$$

Now it is obvious that $Q^2 = I$: reflecting twice through a mirror brings back the original.
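A quick numerical check of these two properties (added; random unit vector):

```python
import numpy as np

rng = np.random.default_rng(1)
u = rng.standard_normal(4)
u /= np.linalg.norm(u)                  # unit vector
Q = np.eye(4) - 2.0 * np.outer(u, u)    # reflector at the hyperplane u-perp

assert np.allclose(Q, Q.T)              # Q is symmetric
assert np.allclose(Q @ Q, np.eye(4))    # Q^2 = I: reflecting twice restores x
```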

Slide 23: Orthogonal matrices

If $Q$ is orthogonal, then it leaves lengths unchanged:
$$\|Qx\|^2 = (Qx)^T(Qx) = x^TQ^TQx = x^Tx = \|x\|^2.$$
It also leaves angles unchanged:
$$\cos\angle(x, y) = \frac{x^Ty}{\|x\|\,\|y\|} = \frac{x^TQ^TQy}{\|Qx\|\,\|Qy\|} = \frac{(Qx)^T(Qy)}{\|Qx\|\,\|Qy\|} = \cos\angle(Qx, Qy).$$
Orthogonal matrices are excellent for computations: numbers never grow too large when lengths are fixed.

Slide 24

Let $q^1,\dots,q^n \in \mathbb{R}^m$ be orthonormal vectors (how do we know that $m \ge n$?). Then with $Q := [q^1,\dots,q^n] \in \mathbb{R}^{m\times n}$ the projector onto $V := \operatorname{span}\{q^1,\dots,q^n\}$ is
$$P = Q(Q^TQ)^{-1}Q^T = QQ^T,$$
and the projection of $x \in \mathbb{R}^m$ onto $V$ is
$$Px = QQ^Tx = \sum_{j=1}^n (x^Tq^j)\,q^j,$$
the truncation of the Fourier expansion of $x$ with respect to an orthonormal basis containing $q^1,\dots,q^n$.
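A tiny sketch of the $QQ^T$ projector (added; the orthonormal columns here are just the first two unit vectors of $\mathbb{R}^3$):

```python
import numpy as np

Q = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])            # orthonormal columns, m = 3, n = 2
x = np.array([3.0, 4.0, 5.0])

Px = Q @ (Q.T @ x)                    # = sum_j (x^T q^j) q^j
assert np.allclose(Px, [3.0, 4.0, 0.0])
```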

Slide 25: Least squares problem

Consider the least squares problem $\|Qx - b\| = \min!$, where the system matrix $Q$ has orthonormal columns. Since $Q^TQ = I$, the normal equations $Q^TQx = Q^Tb$ reduce to the solution
$$x = Q^Tb.$$

Slide 26: Example

$Q$ is an orthogonal matrix [its entries did not survive the transcription]. Represent the vector $x = (1, 1, 1)^T$ as a linear combination of the columns $q^j$ of $Q$:
$$x^Tq^1 = \frac{19}{13},\qquad x^Tq^2 = \frac{11}{13},\qquad x^Tq^3 = \frac{5}{13},$$
$$x = \frac{19}{13}\,q^1 + \frac{11}{13}\,q^2 + \frac{5}{13}\,q^3.$$

Slide 27: Fitting a straight line

Given measurements $(t_j, b_j)$, $j = 1,\dots,m$, assume that the measurement times $t_j$ add up to zero. Since the scalar product of $(t_j)_{j=1,\dots,m}$ and $(1, 1,\dots,1)^T$ is zero, the normal equations
$$A^TAx = \begin{pmatrix} m & \sum_{j=1}^m t_j \\ \sum_{j=1}^m t_j & \sum_{j=1}^m t_j^2 \end{pmatrix}x = \begin{pmatrix} \sum_{j=1}^m b_j \\ \sum_{j=1}^m t_jb_j \end{pmatrix} = A^Tb$$
take the form
$$\begin{pmatrix} m & 0 \\ 0 & \sum_{j=1}^m t_j^2 \end{pmatrix}x = \begin{pmatrix} \sum_{j=1}^m b_j \\ \sum_{j=1}^m t_jb_j \end{pmatrix},$$
and the solution is
$$x = \begin{pmatrix} \sum_{j=1}^m b_j \big/ m \\ \sum_{j=1}^m t_jb_j \big/ \sum_{j=1}^m t_j^2 \end{pmatrix}.$$

Slide 28: Fitting a straight line

Orthogonal columns are so helpful that it is worth moving the time origin to produce them. To this end, subtract the average time $\bar t = \frac1m\sum_{j=1}^m t_j$. Then the shifted measurement times $\tilde t_j := t_j - \bar t$ add up to zero. The solution of the least squares problem is
$$\tilde x_1 + \tilde x_2\tilde t = \tilde x_1 + \tilde x_2(t - \bar t) = (\tilde x_1 - \tilde x_2\bar t) + \tilde x_2 t,$$
where
$$\tilde x_1 = \frac1m\sum_{j=1}^m b_j \qquad\text{and}\qquad \tilde x_2 = \frac{\sum_{j=1}^m b_j\tilde t_j}{\sum_{j=1}^m \tilde t_j^2}.$$

Slide 29: Example (ct.)

Measurements $(t_j, b_j)$: [table lost in transcription]. The average time is $\bar t = 2$, and $\tilde t = (-3, -2, -1, 0, 1, 2, 3)^T$, and
$$\tilde x_1 = \frac17\sum_{j=1}^7 b_j = \frac{3.8}{7},\qquad \tilde x_2 = \frac{\sum_{j=1}^7 b_j\tilde t_j}{\sum_{j=1}^7 \tilde t_j^2} = \frac{\sum_{j=1}^7 b_j\tilde t_j}{28}$$
[the numerical value of $\tilde x_2$ was lost]. Therefore, the solution of the least squares problem is
$$(\tilde x_1 - \tilde x_2\bar t) + \tilde x_2 t$$
[numerical coefficients lost in the transcription].
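The centered fit in code (added; same hypothetical $b_j$ values as in the sketch after Slide 11, since the slide's table was lost):

```python
import numpy as np

t = np.array([-1.0, 0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
b = np.array([0.0, 0.5, 0.4, 1.1, 1.3, 1.6, 2.2])   # hypothetical values

t_bar = t.mean()                     # = 2, as on the slide
tt = t - t_bar                       # shifted times add up to zero
x1 = b.mean()                        # decoupled normal equations
x2 = (b @ tt) / (tt @ tt)            # denominator = 28 here
print(f"f(t) = {x1 - x2 * t_bar:.3f} + {x2:.3f} t")
```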

Slide 30: Gram-Schmidt process

Assume that $a^1,\dots,a^n \in \mathbb{R}^m$ are given linearly independent vectors.

Problem: Determine an orthogonal basis $q^1,\dots,q^n$ of $V := \operatorname{span}\{a^1,\dots,a^n\}$.

Advantage: For $Q := [q^1,\dots,q^n]$ it holds that $Q^TQ = \operatorname{diag}\{\|q^1\|^2,\dots,\|q^n\|^2\}$ is a diagonal matrix. Hence the orthogonal projection onto $V$ decouples:
$$Px := Q(Q^TQ)^{-1}Q^Tx = Q\operatorname{diag}\{\|q^1\|^2,\dots,\|q^n\|^2\}^{-1}\big(x^Tq^1,\dots,x^Tq^n\big)^T = \sum_{j=1}^n \frac{x^Tq^j}{\|q^j\|^2}\,q^j.$$

Slide 31: Example

Project $x$ onto the subspace spanned by two orthogonal vectors $q^1$ and $q^2$ [the vector entries did not survive the transcription]. Since $q^1$ and $q^2$ are orthogonal, the projection is
$$p = \frac{x^Tq^1}{\|q^1\|^2}\,q^1 + \frac{x^Tq^2}{\|q^2\|^2}\,q^2.$$
Indeed, $p - x$ is orthogonal to $q^1$ and $q^2$.

Slide 32: Gram-Schmidt process

Assume that $a^1,\dots,a^n \in \mathbb{R}^m$ are given linearly independent vectors.

Problem: Determine an orthogonal basis $q^1,\dots,q^n$ of $V := \operatorname{span}\{a^1,\dots,a^n\}$.

Idea: Set $q^1 = a^1$, and for $j = 2, 3,\dots,n$ subtract from $a^j$ its projection onto the subspace $\operatorname{span}\{a^1,\dots,a^{j-1}\}$. Then this difference $q^j$ is orthogonal to $\operatorname{span}\{a^1,\dots,a^{j-1}\}$, and therefore to the previously determined vectors $q^i$. Since $q^1,\dots,q^{j-1}$ is an orthogonal basis of $\operatorname{span}\{a^1,\dots,a^{j-1}\}$, the projections are decoupled. Hence,
$$q^j = a^j - \sum_{k=1}^{j-1} \frac{(a^j)^Tq^k}{\|q^k\|^2}\,q^k,\qquad j = 2,\dots,n.$$

Slide 33: Example

Apply the Gram-Schmidt process to three vectors $a^1, a^2, a^3$ [the entries did not survive the transcription]:
$$q^1 = a^1,\qquad q^2 = a^2 - \frac{(a^2)^Tq^1}{\|q^1\|^2}\,q^1,\qquad q^3 = a^3 - \frac{(a^3)^Tq^1}{\|q^1\|^2}\,q^1 - \frac{(a^3)^Tq^2}{\|q^2\|^2}\,q^2.$$
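A sketch of the process from Slide 32 in NumPy (added; illustrative input since the slide's vectors were lost):

```python
import numpy as np

def gram_schmidt(A):
    """Classical Gram-Schmidt: returns an orthogonal (not normalized)
    basis of the column span of A, columns assumed independent."""
    A = np.asarray(A, dtype=float)
    Q = A.copy()
    for j in range(1, A.shape[1]):
        for k in range(j):
            # subtract the projection of a^j onto q^k
            Q[:, j] -= (A[:, j] @ Q[:, k]) / (Q[:, k] @ Q[:, k]) * Q[:, k]
    return Q

A = np.array([[1.0, 1.0], [1.0, 0.0], [0.0, 1.0]])
Q = gram_schmidt(A)
assert abs(Q[:, 0] @ Q[:, 1]) < 1e-12
```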

Slide 34: Factorization

Remember
$$q^j = a^j - \sum_{k=1}^{j-1} \frac{(a^j)^Tq^k}{\|q^k\|^2}\,q^k,\qquad j = 2,\dots,n.$$
It is more convenient to normalize the $q$ vectors in the course of the algorithm and to get rid of the denominators. The Gram-Schmidt algorithm then reads
$$q^1 = \frac{1}{\|a^1\|}\,a^1,\qquad q^j = \left(a^j - \sum_{k=1}^{j-1}\big((a^j)^Tq^k\big)q^k\right)\bigg/\left\|a^j - \sum_{k=1}^{j-1}\big((a^j)^Tq^k\big)q^k\right\|,\qquad j = 2,\dots,n.$$

Slide 35: QR factorization

The following form is more convenient:
$$a^1 = \|a^1\|\,q^1,\qquad a^j = \sum_{k=1}^{j-1}\big((a^j)^Tq^k\big)q^k + \left\|a^j - \sum_{k=1}^{j-1}\big((a^j)^Tq^k\big)q^k\right\|\,q^j,\qquad j = 2,\dots,n.$$
Since at every step $a^j$ is a linear combination of $q^1,\dots,q^j$, and later $q^i$'s are not involved,
$$A = (a^1,\dots,a^n) = (q^1,\dots,q^n)\begin{pmatrix} r_{11} & r_{12} & r_{13} & \dots & r_{1n} \\ & r_{22} & r_{23} & \dots & r_{2n} \\ & & \ddots & & \vdots \\ & & & & r_{nn} \end{pmatrix} = QR,$$
where $r_{ij} = (a^j)^Tq^i$ for $i = 1,\dots,j-1$ and $r_{jj} = \left\|a^j - \sum_{k=1}^{j-1}\big((a^j)^Tq^k\big)q^k\right\|$.

Slide 36: Example

Determine the QR factorization of $A = [a^1, a^2, a^3]$ [the matrix entries did not survive the transcription]:
$$q^1 = \frac{1}{\|a^1\|}\,a^1,\qquad \tilde q^2 = a^2 - \big((a^2)^Tq^1\big)q^1,\qquad q^2 = \frac{\tilde q^2}{\|\tilde q^2\|}.$$

Slide 37: Example (ct.)

$$\tilde q^3 = a^3 - \big((a^3)^Tq^1\big)q^1 - \big((a^3)^Tq^2\big)q^2,\qquad q^3 = \frac{\tilde q^3}{\|\tilde q^3\|}.$$
Hence the QR factorization of $A$ is $A = QR$ [most entries were lost in the transcription; only fragments such as $1/\sqrt2$ and $2/3$ survive from the original $Q$].

Slide 38: Least squares problems

Any matrix $A \in \mathbb{R}^{m\times n}$ with linearly independent columns can be factored as $A = QR$, where $Q \in \mathbb{R}^{m\times n}$ has orthonormal columns and $R \in \mathbb{R}^{n\times n}$ is upper triangular with positive diagonal.

Given the QR factorization of $A$, a least squares problem with system matrix $A$ can be solved easily:
$$A^TAx = A^Tb \iff (QR)^T(QR)x = (QR)^Tb \iff R^TQ^TQRx = R^TQ^Tb \iff R^TRx = R^TQ^Tb \iff Rx = Q^Tb.$$
Instead of solving the normal equations $A^TAx = A^Tb$ by Gaussian elimination, one just solves $Rx = Q^Tb$ by back substitution.
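A sketch of this route in NumPy (added; illustrative data, and SciPy's triangular solver stands in for hand-written back substitution):

```python
import numpy as np
from scipy.linalg import solve_triangular

A = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])   # hypothetical data
b = np.array([6.0, 0.0, 0.0])

Q, R = np.linalg.qr(A)                     # reduced QR, Q is m x n
x = solve_triangular(R, Q.T @ b)           # back substitution on Rx = Q^T b
assert np.allclose(A.T @ (A @ x - b), 0)   # the normal equations hold
```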

Slide 39: Example

Solve the least squares problem $\|Ax - b\| = \min!$ [the matrix $A$ and the vector $b$ of this example did not survive the transcription]. Normal equations: $A^TAx = A^Tb$, solved by Gaussian elimination [numerical values lost].

Slide 40: Example (ct.)

Taking advantage of the known QR factorization from Slide 37, we instead solve $Rx = Q^Tb$ by back substitution [the numerical values of this example were lost in the transcription].

Slide 41: Gram-Schmidt algorithm

1: $q^1 = a^1/\|a^1\|$
2: for $j = 2,\dots,n$ do
3:   $q^j = a^j$
4:   for $k = 1,\dots,j-1$ do
5:     $r_{kj} = (a^j)^Tq^k$
6:     $q^j = q^j - r_{kj}q^k$
7:   end for
8:   $q^j = q^j/\|q^j\|$
9: end for

In steps 5 and 6 the projection of $a^j$ onto $\operatorname{span}\{q^k\}$ is subtracted from $q^j$ for $k = 1,\dots,j-1$. At that time
$$q^j = a^j - \sum_{l=1}^{k-1}\big((a^j)^Tq^l\big)q^l,$$
and from $(q^l)^Tq^k = 0$ for $l = 1,\dots,k-1$ it follows that we can replace step 5 by
$$r_{kj} = (q^j)^Tq^k.$$
This version is called the modified Gram-Schmidt method. It is numerically more stable than the original Gram-Schmidt factorization.
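A runnable version of the modified variant (added; it projects the *current*, partially orthogonalized vector, which is exactly the replacement of step 5 described above):

```python
import numpy as np

def mgs(A):
    """Modified Gram-Schmidt QR factorization, A = Q R."""
    A = np.asarray(A, dtype=float)
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        q = A[:, j].copy()
        for k in range(j):
            R[k, j] = Q[:, k] @ q        # step 5': uses the current q^j
            q -= R[k, j] * Q[:, k]       # step 6
        R[j, j] = np.linalg.norm(q)
        Q[:, j] = q / R[j, j]            # step 8
    return Q, R

A = np.random.default_rng(2).standard_normal((5, 3))
Q, R = mgs(A)
assert np.allclose(Q @ R, A) and np.allclose(Q.T @ Q, np.eye(3))
```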

Slide 42: Householder transformation

Although the modified Gram-Schmidt method is known to be more stable than the original Gram-Schmidt method, there is a better way to solve least squares problems: Householder transformations.

Problem: Given a vector $a \in \mathbb{R}^m$, find a reflection $H = I - 2uu^T$, $\|u\| = 1$, such that $Ha$ is a multiple of the first unit vector $e^1 := (1, 0,\dots,0)^T$.

From the orthogonality of $H$ it follows that $\|Ha\| = \|a\|$. Hence,
$$Ha = a - 2(u^Ta)\,u = \pm\|a\|\,e^1,$$
and therefore $u$ and $a \mp \|a\|\,e^1$ are parallel. If $a$ is not a multiple of $e^1$, then there are two solutions of our problem:
$$u = \frac{a + \|a\|e^1}{\|a + \|a\|e^1\|}\qquad\text{and}\qquad u = \frac{a - \|a\|e^1}{\|a - \|a\|e^1\|};$$
for $a = \|a\|e^1$ only the first solution is defined, for $a = -\|a\|e^1$ only the second one.

Slide 43: Householder transformation (ct.)

In any case the Householder transformation
$$H = I - 2\,\frac{ww^T}{w^Tw}\qquad\text{with}\qquad w = a + \operatorname{sign}(a_1)\,\|a\|\,e^1$$
exists. For stability reasons (cancellation!) we always choose this one.

Let $a = (2, 2, 1)^T$. Then $\|a\| = 3$, and $w = (5, 2, 1)^T$, and
$$H = I - \frac{2}{\|w\|^2}\,ww^T = I - \frac{1}{15}\begin{pmatrix} 25 & 10 & 5 \\ 10 & 4 & 2 \\ 5 & 2 & 1 \end{pmatrix},\qquad Ha = \begin{pmatrix} -3 \\ 0 \\ 0 \end{pmatrix}.$$
A second solution is defined by $\tilde w = (-1, 2, 1)^T$:
$$\tilde H = I - \frac{2}{\|\tilde w\|^2}\,\tilde w\tilde w^T = I - \frac{1}{3}\begin{pmatrix} 1 & -2 & -1 \\ -2 & 4 & 2 \\ -1 & 2 & 1 \end{pmatrix},\qquad \tilde Ha = \begin{pmatrix} 3 \\ 0 \\ 0 \end{pmatrix}.$$
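The same construction in code (added), reproducing the slide's example and applying $H$ without ever forming the matrix:

```python
import numpy as np

def householder_vector(a):
    """w = a + sign(a_1) ||a|| e^1, the cancellation-free choice;
    H = I - 2 w w^T / (w^T w) then maps a onto -sign(a_1) ||a|| e^1."""
    w = a.astype(float)
    w[0] += np.copysign(np.linalg.norm(a), a[0])
    return w

a = np.array([2.0, 2.0, 1.0])
w = householder_vector(a)                     # -> [5., 2., 1.]
Ha = a - 2.0 * (w @ a) / (w @ w) * w          # apply H without forming it
assert np.allclose(Ha, [-3.0, 0.0, 0.0])
```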

Slide 44: Least squares problem

Consider the least squares problem $\|Ax - b\| = \min!$ where $A \in \mathbb{R}^{m\times n}$, $b \in \mathbb{R}^m$. Since orthogonal matrices do not change the length of a vector, it follows that
$$\|Ax - b\| = \|Q(Ax - b)\|$$
for every orthogonal matrix $Q \in \mathbb{R}^{m\times m}$. Assume that
$$QA = \begin{pmatrix} R \\ O \end{pmatrix},\qquad Q^TQ = I,$$
where $R \in \mathbb{R}^{n\times n}$ is upper triangular and $O \in \mathbb{R}^{(m-n)\times n}$ denotes the null matrix. Then we obtain
$$\|Ax - b\|^2 = \|QAx - Qb\|^2 = \left\|\begin{pmatrix} R \\ O \end{pmatrix}x - Qb\right\|^2 = \|Rx - \bar b_1\|^2 + \|\bar b_2\|^2,$$
where $\bar b_1 \in \mathbb{R}^n$ contains the leading $n$ components of $Qb$, and $\bar b_2$ the remaining ones.

Slide 45: Least squares problem (ct.)

From $\|Ax - b\|^2 = \|Rx - \bar b_1\|^2 + \|\bar b_2\|^2$ it is obvious that $x = R^{-1}\bar b_1$ is the solution of the least squares problem, and that $\bar b_2$ is the residual.

$Q$ is constructed step by step using Householder transformations. Let $H_1$ be a Householder transformation such that $H_1a^1 = \pm\|a^1\|\,e^1$, where $a^1$ denotes the first column of $A$. Then
$$A_1 := H_1A = \begin{pmatrix} r_{11} & r_{12} & \cdots & r_{1n} \\ 0 & & & \\ \vdots & & \tilde A_1 & \\ 0 & & & \end{pmatrix}.$$

Slide 46: Least squares problem (ct.)

Next we annihilate the subdiagonal elements of the second column. Let $\tilde H_2$ be the Householder matrix which maps the first column of $\tilde A_1$ to a multiple of the first unit vector (in $\mathbb{R}^{m-1}$), and let
$$H_2 = \begin{pmatrix} 1 & 0 \\ 0 & \tilde H_2 \end{pmatrix}.$$
Then we obtain
$$H_2H_1A = H_2A_1 =: A_2 = \begin{pmatrix} r_{11} & r_{12} & r_{13} & \cdots & r_{1n} \\ 0 & r_{22} & r_{23} & \cdots & r_{2n} \\ 0 & 0 & & & \\ \vdots & \vdots & & \tilde A_2 & \\ 0 & 0 & & & \end{pmatrix}.$$

Slide 47: Least squares problem (ct.)

Continuing that way we finally arrive at
$$H_nH_{n-1}\cdots H_1A =: A_n = \begin{pmatrix} R \\ O \end{pmatrix},\qquad R = \begin{pmatrix} r_{11} & \cdots & r_{1n} \\ & \ddots & \vdots \\ & & r_{nn} \end{pmatrix}.$$
With $\bar b := H_nH_{n-1}\cdots H_1b =: \begin{pmatrix} \bar b_1 \\ \bar b_2 \end{pmatrix}$, $\bar b_1 \in \mathbb{R}^n$, the least squares problem takes the form
$$\|Ax - b\| = \|A_nx - \bar b\|$$
with the unique solution $x = R^{-1}\bar b_1$.
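A compact sketch of the whole procedure (added; it applies the reflections in place to both $A$ and $b$, then back-substitutes, and checks against NumPy's reference solver):

```python
import numpy as np

def lstsq_householder(A, b):
    """Solve min ||Ax - b|| by applying Householder reflections H_1..H_n."""
    A = np.array(A, dtype=float)
    b = np.array(b, dtype=float)
    m, n = A.shape
    for j in range(n):
        a = A[j:, j]
        w = a.copy()
        w[0] += np.copysign(np.linalg.norm(a), a[0])     # Slide 43 choice
        beta = 2.0 / (w @ w)
        A[j:, j:] -= beta * np.outer(w, w @ A[j:, j:])   # H_j applied to A
        b[j:] -= beta * (w @ b[j:]) * w                  # ... and to b
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):                       # back substitution
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x, np.linalg.norm(b[n:])                      # solution, residual

A = np.random.default_rng(3).standard_normal((6, 3))
b = np.random.default_rng(4).standard_normal(6)
x, res = lstsq_householder(A, b)
assert np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0])
```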

Slide 48: Example

$A = [a^1, a^2]$, $b$ [the entries of $A$ and $b$ did not survive the transcription]. From $\|a^1\| = 2$ it follows that the Householder matrix mapping the first column $a^1$ of $A$ to a multiple of the first unit vector is
$$H_1 = I - 2\,\frac{ww^T}{w^Tw},\qquad w = a^1 + \operatorname{sign}(a_1^1)\,\|a^1\|\,e^1$$
[the entries of $w$, $A_1 = H_1A$, and $H_1b$ were lost].

Slide 49: Example (ct.)

Obviously, this problem is equivalent to a small triangular least squares problem in the leading rows of $A_1$, which is solved by back substitution [the numerical values of this example were lost in the transcription].

Slide 50: Example (ct.)

For didactic reasons we solve the transformed problem $\|A_1x - H_1b\| = \min!$ using a second Householder transformation:
$$\tilde H_2 = I - 2\,\frac{vv^T}{v^Tv}$$
maps the first column of $\tilde A_1$ to a multiple of the first unit vector in $\mathbb{R}^3$ [the entries of $v$ were lost]. Hence
$$H_2 = \begin{pmatrix} 1 & 0 \\ 0 & \tilde H_2 \end{pmatrix},\qquad H_2A_1 = A_2$$
[entries lost in the transcription].

Slide 51: Example (ct.)

We obtain the equivalent problem $\|A_2x - H_2H_1b\| = \min!$, which is solved from the triangular system
$$\begin{pmatrix} 2 & 5 \\ 0 & 3 \end{pmatrix}x = \begin{pmatrix} 6.75 \\ \cdot \end{pmatrix}$$
by back substitution [surviving fragments; signs, the remaining entries, and the solution $x$ are uncertain or lost in the transcription].
