Chapter 5 Orthogonality


Matrix Methods for Computational Modeling and Data Analytics
Virginia Tech · Mark Embree · embree@vt.edu

Chapter 5: Orthogonality

We need one more tool from basic linear algebra to tackle the applications that lie ahead. In the last chapter we saw that basis vectors provide the most economical way to describe a subspace, but not all bases are created equal. Clearly the following three pairs of vectors all form bases for ℝ^2:

    [1, 0]^T, [0, 1]^T;      [√2/2, √2/2]^T, [-√2/2, √2/2]^T;      [1, 0]^T, [1, 0.001]^T.

For example, the vector x = [1, 1]^T can be written as a linear combination of all three sets of vectors. In the first two cases,

    [1, 1]^T = 1·[1, 0]^T + 1·[0, 1]^T = √2·[√2/2, √2/2]^T + 0·[-√2/2, √2/2]^T,

the coefficients multiplying against the basis vectors are no bigger than ‖x‖ = √2. In the third case,

    [1, 1]^T = -999·[1, 0]^T + 1000·[1, 0.001]^T,

the coefficients -999 and 1000 are much bigger than ‖x‖. Small changes in x require significant swings in these coefficients. For example, if we change the second entry of x from 1 to 0.9,

    [1, 0.9]^T = -899·[1, 0]^T + 900·[1, 0.001]^T,

the coefficients change by 100, which is 1000 times the change in x!
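You can reproduce these coefficients numerically: expanding x in a basis means solving Bc = x, where the basis vectors form the columns of B. A minimal MATLAB sketch (the variable names are ours, and the basis vectors are the ones listed above):

    x  = [1; 1];
    B1 = [1 0; 0 1];                                     % standard basis
    B2 = [sqrt(2)/2, -sqrt(2)/2; sqrt(2)/2, sqrt(2)/2];  % orthonormal basis
    B3 = [1 1; 0 0.001];                                 % nearly parallel basis
    c1 = B1\x, c2 = B2\x, c3 = B3\x                      % c3 = [-999; 1000]
    B3\[1; 0.9]                                          % perturb x: coefficients swing to [-899; 900]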

The first two bases above are special, for the norm of each vector is one, and the vectors are orthogonal. The vectors in the third basis form a small angle. Indeed, all these bases are linearly independent, but, extrapolating from Animal Farm, some bases are more linearly independent than others. This chapter is devoted to the art of orthogonalizing a set of linearly independent vectors. The main tool of this craft is the orthogonal projector.

5.1 Projectors

Among the most fundamental operations in linear algebra is the projection of a vector into a particular subspace. Such actions are encoded in a special class of matrices.

Definition 8. A matrix P ∈ ℝ^{n×n} is a projector provided P^2 = P. If additionally P = P^T, then P is called an orthogonal projector.

The simplest projectors send all vectors into a one-dimensional subspace. Suppose that w^T v = 1. Then P := v w^T is a projector, since

    P^2 = (v w^T)(v w^T) = v (w^T v) w^T = v w^T = P.

We say that P projects onto span{v}, since

    R(P) = span{v},

and along span{w}^⊥, since

    N(P) = span{w}^⊥ = {x ∈ ℝ^n : x^T w = 0}.

Example. Consider the vectors

    v = [1, 0]^T,    w = [1, 1]^T.

Confirm that w^T v = 1, so

    P = v w^T = [1 1; 0 0]

is a projector. It sends the vector x = [1, 2]^T to

    Px = [1 1; 0 0][1, 2]^T = [3, 0]^T,

which is contained in R(P) = span{v}. The remainder

    x - Px = [1, 2]^T - [3, 0]^T = [-2, 2]^T

is indeed orthogonal to span{w}.

[Margin figure: The projector P sends the vector x to Px ∈ R(P) = span{v}, while the remainder x - Px lies in N(P) = span{w}^⊥, i.e., x - Px is orthogonal to w.]

The remainder x - Px plays a key role in our development. Notice that P annihilates this vector,

    P(x - Px) = Px - P^2 x = Px - Px = 0,

which means that x - Px ∈ N(P).

For the projector in the last example, P ≠ P^T, so we call P an oblique projector. For such projectors, the remainder x - Px generally forms an oblique angle (that is, not a 90° angle) with R(P). In this course we shall most often work with orthogonal projectors, for which P = P^T. Of course this implies N(P) = N(P^T), and so the Fundamental Theorem of Linear Algebra's N(P^T) ⊥ R(P) becomes N(P) ⊥ R(P). Since x - Px ∈ N(P), the remainder is always orthogonal to R(P). This explains why we call such a P an orthogonal projector.

To build an orthogonal projector onto a one-dimensional subspace, we need only a single vector v. Set P = v v^T, and, to give P^2 = P, require that v^T v = ‖v‖^2 = 1, i.e., v must be a unit vector. This P projects onto and along span{v}, since

    R(P) = span{v},    N(P) = span{v}^⊥.

The notation span{v}^⊥ denotes the set of all vectors orthogonal to span{v}.
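These defining properties are easy to check numerically. A minimal MATLAB sketch, reusing the v, w, and x from the example above:

    v = [1; 0];  w = [1; 1];            % w'*v = 1
    P  = v*w';                          % oblique projector onto span{v}
    Po = v*v';                          % orthogonal projector onto span{v}
    norm(P*P - P)                       % 0, so P^2 = P
    norm(Po*Po - Po), norm(Po - Po')    % both 0: a projector, and symmetric
    x = [1; 2];
    w'*(x - P*x)                        % 0: the oblique remainder is orthogonal to w
    v'*(x - Po*x)                       % 0: the orthogonal remainder is orthogonal to v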

We illustrate these ideas in the following example.

Example. Use the unit vector v from the last example to build

    P = v v^T = [1 0; 0 0],

which projects vectors onto the x_1 axis. For the same x used before, we have

    Px = [1 0; 0 0][1, 2]^T = [1, 0]^T.

Now observe the orthogonality of the remainder

    x - Px = [0, 2]^T

with the projection subspace R(P) = span{v}. Did you notice that, in contrast to the first example, the projection Px has smaller norm than x, ‖Px‖ ≤ ‖x‖? Such reduction is a general trait of orthogonal projectors.

[Margin figure: The orthogonal projector P sends the vector x to Px ∈ R(P) = span{v}, while the remainder x - Px is contained in N(P) = span{v}^⊥, i.e., x - Px is orthogonal to v and Px.]

Proposition. If P is an orthogonal projector, then ‖Px‖ ≤ ‖x‖ for all x ∈ ℝ^n.

Proof. Since P is an orthogonal projector, P^T = P and P^2 = P. Hence Px is orthogonal to x - Px:

    (Px)^T(x - Px) = x^T P^T x - x^T P^T P x = x^T P x - x^T P^2 x = x^T P x - x^T P x = 0.

Now apply the Pythagorean Theorem to the orthogonal pieces Px ∈ R(P) and x - Px ∈ N(P),

    ‖x‖^2 = ‖Px + (x - Px)‖^2 = ‖Px‖^2 + ‖x - Px‖^2,

and conclude the result by noting that ‖x - Px‖^2 ≥ 0.

student experiments

5.8. To get an appreciation for projectors, in MATLAB construct the orthogonal projector P onto the span of a single nonzero vector in ℝ^2. Now construct 500 random vectors x = randn(2, 1). Produce a figure showing all 500 of these vectors in the plane (plotted as blue dots), then superimpose on the plot the values of Px (plotted as red dots). Use axis equal to scale the axes in the same manner. Repeat the experiment, but now in three dimensions, with P the orthogonal projector onto the span of a single vector in ℝ^3. Use MATLAB's plot3 command to plot x and Px in three-dimensional space.
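A possible starting point for experiment 5.8 (a sketch only; the projection direction below is a placeholder, so choose your own):

    v = [1; 2];  v = v/norm(v);    % a unit vector; any nonzero direction works
    P = v*v';                      % orthogonal projector onto span{v}
    X  = randn(2, 500);            % 500 random vectors, one per column
    PX = P*X;
    plot(X(1,:),  X(2,:),  'b.'), hold on
    plot(PX(1,:), PX(2,:), 'r.')
    axis equal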

5.2 Orthogonalization

Now we arrive at the main point of this chapter: the transformation of a basis into an orthonormal basis for the same subspace.

Definition 9. A basis {q_1, ..., q_n} is orthonormal provided

    q_j^T q_k = 0 for all j ≠ k    (the vectors are orthogonal);
    ‖q_j‖ = 1 for j = 1, ..., n    (the vectors are normalized).

We can summarize orthonormality quite neatly if we arrange the basis in the matrix

    Q = [q_1 q_2 ⋯ q_n].

Then Q^T Q contains all possible inner products between q_j and q_k:

    Q^T Q = [ q_1^T q_1   q_1^T q_2   ⋯   q_1^T q_n
              q_2^T q_1   q_2^T q_2   ⋯   q_2^T q_n
                  ⋮           ⋮                ⋮
              q_n^T q_1   q_n^T q_2   ⋯   q_n^T q_n ] = I,

since q_j^T q_j = ‖q_j‖^2 = 1 and the off-diagonal entries vanish by orthogonality.

Definition 10. A matrix Q ∈ ℝ^{n×n} is unitary if Q^T Q = I. If Q ∈ ℝ^{m×n} with m > n and Q^T Q = I, then Q is subunitary.

student experiments

5.9. Suppose Q ∈ ℝ^{m×n} with m < n. Is it possible that Q^T Q = I? Explain.

5.10. Suppose that Q ∈ ℝ^{m×n} is subunitary. Show that QQ^T is a projector. What is R(Q) in terms of q_1, ..., q_n?
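Both definitions, and experiment 5.10, are easy to poke at numerically. A minimal MATLAB sketch (orth is just one convenient way to manufacture a subunitary matrix):

    Q = orth(randn(6, 3));          % 6-by-3 with orthonormal columns: subunitary
    norm(Q'*Q - eye(3))             % ~ 1e-16, so Q'Q = I
    P = Q*Q';
    norm(P*P - P), norm(P - P')     % both ~ 1e-16: an orthogonal projector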

We are now prepared to orthogonalize a set of vectors. Suppose we have a basis a_1, ..., a_n for a subspace S ⊆ ℝ^m. We seek an orthonormal basis q_1, ..., q_n for the same subspace, which we shall build one vector at a time. The goal is to first construct a unit vector q_1 so that

    span{q_1} = span{a_1},

and then to build a unit vector q_2 orthogonal to q_1 so that

    span{q_1, q_2} = span{a_1, a_2},

and then a unit vector q_3 orthogonal to q_1 and q_2 so that

    span{q_1, q_2, q_3} = span{a_1, a_2, a_3},

and so on, one q_j vector at a time, until we have the whole new basis:

    span{q_1, q_2, ..., q_n} = span{a_1, a_2, ..., a_n} = S.

[Margin figure: The setting: basis vectors a_1 and a_2 that are neither unit length, nor orthogonal.]

The first of these steps is easy, for we get q_1 by normalizing a_1:

    q_1 = a_1/‖a_1‖.

[Margin figure: First normalize a_1 to obtain q_1 with span{q_1} = span{a_1}.]

The next step is decisive, and here we use insight gained from our quick study of orthogonal projectors. Figuratively speaking, we must remove the part of a_2 in the direction q_1, leaving the portion of a_2 orthogonal to q_1. Since q_1 is a unit vector,

    P_1 := q_1 q_1^T

is an orthogonal projector onto span{q_1} = span{a_1}, so

    P_1 a_2 = q_1 (q_1^T a_2)

is the part of a_2 in the direction q_1. Remove that from a_2 to get

    q̂_2 := a_2 - P_1 a_2.

And since

    P_1 q̂_2 = P_1(a_2 - P_1 a_2) = P_1 a_2 - P_1^2 a_2 = 0,

spot that q̂_2 ∈ N(P_1), which is orthogonal to R(P_1) = span{q_1}, so q̂_2 ⊥ q_1. If you prefer a less high-falutin explanation, just compute

    q_1^T q̂_2 = q_1^T a_2 - q_1^T q_1 q_1^T a_2 = q_1^T a_2 - q_1^T a_2 = 0.

[Margin figure: Compute q̂_2 = a_2 - P_1 a_2, the portion of a_2 that is orthogonal to q_1.]

So we have constructed a vector q̂_2 orthogonal to q_1 such that span{q_1, q̂_2} = span{a_1, a_2}. Since we want not just an orthogonal basis, but an orthonormal basis, we adjust q̂_2 by scaling it to become the unit vector

    q_2 := q̂_2/‖q̂_2‖.

[Margin figure: Normalize q_2 = q̂_2/‖q̂_2‖ to get a unit vector in the q̂_2 direction.]

The orthogonalization process for subsequent vectors follows the same template. To construct q_3, we first remove from a_3 its components in the q_1 and q_2 directions:

    q̂_3 := a_3 - P_1 a_3 - P_2 a_3,    where P_2 := q_2 q_2^T,

where for j = 1, 2,

    q_j^T q̂_3 = q_j^T a_3 - q_j^T q_1 q_1^T a_3 - q_j^T q_2 q_2^T a_3 = q_j^T a_3 - q_j^T a_3 = 0,

using the orthonormality of q_1 and q_2. Normalize q̂_3 to get

    q_3 := q̂_3/‖q̂_3‖.

Now the general pattern should be evident. For future vectors, construct

    q̂_{k+1} = a_{k+1} - Σ_{j=1}^{k} P_j a_{k+1},    with P_j := q_j q_j^T,

then normalize, giving

    q_{k+1} = q̂_{k+1}/‖q̂_{k+1}‖,

which extends the orthonormal basis in one more direction: q_{k+1} ⊥ span{q_1, ..., q_k} and

    span{q_1, ..., q_k, q_{k+1}} = span{a_1, ..., a_k, a_{k+1}}.

This algorithm is known as the Gram–Schmidt process.

student experiments

5.11. Under what circumstances will this procedure break down? That is, will you ever divide by zero when trying to construct q_{k+1} = q̂_{k+1}/‖q̂_{k+1}‖?

5.12. Show that Π_k := P_1 + ⋯ + P_k is an orthogonal projector. With this notation, explain how we can compactly summarize each Gram–Schmidt step as

    q_{k+1} = (I - Π_k) a_{k+1} / ‖(I - Π_k) a_{k+1}‖.
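The whole procedure fits in a few lines of MATLAB. A sketch of classical Gram–Schmidt (the function name is ours, and it assumes the columns of A are linearly independent):

    function Q = gramschmidt(A)
    % Orthonormalize the columns of A by the Gram-Schmidt process.
    [m, n] = size(A);
    Q = zeros(m, n);
    for k = 1:n
        qhat = A(:, k);
        for j = 1:k-1
            qhat = qhat - Q(:, j)*(Q(:, j)'*A(:, k));   % subtract P_j a_k
        end
        Q(:, k) = qhat/norm(qhat);   % divides by zero only when experiment 5.11 bites
    end
    end

Running Q = gramschmidt(randn(8, 4)) and checking norm(Q'*Q - eye(4)) is a quick sanity test.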

5.3 Gram–Schmidt is QR factorization

The Gram–Schmidt process is a classical way to orthogonalize a basis, and you can execute the process by hand when the vectors are short (m is small) and there are few of them (n is small). As with Gaussian elimination, we want to automate the process for larger problems, and just as Gaussian elimination gave us the A = LU factorization of a square matrix, so Gram–Schmidt will lead to a factorization of a general rectangular matrix.

Let us work through the arithmetic behind the orthogonalization of three linearly independent vectors a_1, a_2, and a_3, and along the way define some quantities r_{j,k} that arise in the process:

    q̂_1 := a_1,                                                        q_1 := q̂_1/‖q̂_1‖ = q̂_1/r_{1,1};
    q̂_2 := a_2 - q_1 q_1^T a_2 = a_2 - r_{1,2} q_1,                    q_2 := q̂_2/‖q̂_2‖ = q̂_2/r_{2,2};
    q̂_3 := a_3 - q_1 q_1^T a_3 - q_2 q_2^T a_3 = a_3 - r_{1,3} q_1 - r_{2,3} q_2,    q_3 := q̂_3/‖q̂_3‖ = q̂_3/r_{3,3}.

To summarize, we have defined

    r_{j,k} = q_j^T a_k   if j < k;
    r_{j,k} = ‖q̂_j‖       if j = k;
    r_{j,k} = 0            if j > k.

With this notation, three steps of the Gram–Schmidt process become:

    r_{1,1} q_1 = a_1
    r_{2,2} q_2 = a_2 - r_{1,2} q_1
    r_{3,3} q_3 = a_3 - r_{1,3} q_1 - r_{2,3} q_2,

or, collecting the a_j vectors on the left hand side:

    a_1 = r_{1,1} q_1 + 0 q_2 + 0 q_3
    a_2 = r_{1,2} q_1 + r_{2,2} q_2 + 0 q_3
    a_3 = r_{1,3} q_1 + r_{2,3} q_2 + r_{3,3} q_3.

Stack the orthonormal basis vectors as the columns of the subunitary matrix Q ∈ ℝ^{m×3},

    Q = [q_1 q_2 q_3],

and note that

    a_1 = [q_1 q_2 q_3] [r_{1,1}; 0; 0],
    a_2 = [q_1 q_2 q_3] [r_{1,2}; r_{2,2}; 0],

    a_3 = [q_1 q_2 q_3] [r_{1,3}; r_{2,3}; r_{3,3}],

which we organize in matrix form as

    [a_1 a_2 a_3] = [q_1 q_2 q_3] [r_{1,1} r_{1,2} r_{1,3}; 0 r_{2,2} r_{2,3}; 0 0 r_{3,3}].

We summarize the entire process as:

    A = QR, where Q ∈ ℝ^{m×n} is subunitary and R ∈ ℝ^{n×n} is upper triangular.

Now change your perspective: let A ∈ ℝ^{m×n} be any matrix with linearly independent columns, m ≥ n. Performing the Gram–Schmidt process on those columns yields the decomposition A = QR, which is known as the QR factorization.

But wait, there's more! Suppose b ∈ R(A), so there exists some x ∈ ℝ^n such that Ax = b. Substitute the QR factorization to obtain QRx = b. Since we have written b = Q(Rx), we see that b ∈ R(Q): since b was any vector in the column space of A, we conclude that R(A) ⊆ R(Q). Similarly, if b ∈ R(Q), then there exists x ∈ ℝ^n such that Qx = b. Since R is invertible, splice I = R R^{-1} into this last equation to obtain QR(R^{-1} x) = b: so A(R^{-1} x) = b, hence b ∈ R(A), and R(Q) ⊆ R(A). Since R(Q) and R(A) each contain the other, they must be equal.

Finally, notice that P = QQ^T ∈ ℝ^{m×m} is an orthogonal projector:

    P^2 = QQ^T QQ^T = QQ^T = P,

since the columns of Q are orthonormal, Q^T Q = I, and

    (QQ^T)^T = (Q^T)^T Q^T = QQ^T.

In fact, one can confirm that

    QQ^T = [q_1 ⋯ q_n] [q_1^T; ⋮; q_n^T] = q_1 q_1^T + ⋯ + q_n q_n^T.

Notice that R(P) = R(Q), so P projects onto R(Q) = R(A).

Theorem 7. Let A = QR be a QR factorization of a matrix A ∈ ℝ^{m×n} with linearly independent columns, with Q ∈ ℝ^{m×n} and R ∈ ℝ^{n×n}. Then R(Q) = R(A) and QQ^T is an orthogonal projector onto R(A).
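Theorem 7 is easy to see in action with MATLAB's built-in qr (a quick sketch; the random test matrix is ours):

    A = randn(8, 3);                % tall, with (almost surely) independent columns
    [Q, R] = qr(A, 0);              % economy-size QR: Q is 8-by-3, R is 3-by-3 upper triangular
    norm(A - Q*R)                   % ~ 1e-15
    norm(Q'*Q - eye(3))             % ~ 1e-15: Q is subunitary
    b = A*randn(3, 1);              % some vector in R(A)
    norm(Q*(Q'*b) - b)              % ~ 1e-15: QQ^T leaves vectors in R(A) untouched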

5.4 QR solves systems

Suppose we wish to solve Ax = b for a square matrix A with linearly independent columns. The linear independence implies that A is invertible. Rather than perform Gaussian elimination, we could alternatively replace A with its QR factorization,

    QRx = b.

Since Q is unitary, Q^T Q = I, so the Q term on the left can be cleared by premultiplying by Q^T:

    Rx = Q^T b.    (5.1)

Since R is upper triangular, this Rx = Q^T b system can be solved for x by back-substitution, just as with the last step in Gaussian elimination. We can formally write the solution as x = R^{-1} Q^T b. (In general, the R factor in a QR factorization will be different from the U factor in an LU factorization, though both are upper triangular matrices.)

But is R invertible? Yes: since the columns of A are linearly independent, the diagonal entries of R obtained via the Gram–Schmidt process are always nonzero.
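In MATLAB the whole recipe takes two lines (a sketch; backslash applied to a triangular R performs the back-substitution automatically):

    A = randn(5);  b = randn(5, 1);
    [Q, R] = qr(A);
    x = R\(Q'*b);                   % solve the triangular system (5.1)
    norm(A*x - b)                   % ~ 1e-15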

student experiments

5.13. The discussion above assumed A was a square matrix. Suppose A ∈ ℝ^{m×n} with m > n, and A has n linearly independent columns. If b ∈ R(A), write down a solution x to Ax = b in terms of the QR factorization of A. Is this solution unique?

5.5 Least squares, take two

Suppose that, instead of solving Ax = b, we instead wish to find x to minimize

    min_{x ∈ ℝ^n} ‖b - Ax‖,

the least squares problem discussed in the last chapter. We saw there that x minimized the misfit b - Ax when A^T A x = A^T b, and that this x was unique provided N(A) = {0}. Throughout this chapter we have assumed that a_1, ..., a_n are linearly independent, giving the matrix A = [a_1 a_2 ⋯ a_n] linearly independent columns: hence N(A) = {0}, and the least squares solution is the unique x that satisfies

    A^T A x = A^T b.

Substitute A = QR for A to get

    (QR)^T (QR) x = (QR)^T b,

which is equivalent to

    R^T Q^T Q R x = R^T Q^T b.

Since Q is subunitary (its columns are orthonormal), Q^T Q = I, so our x solves

    R^T R x = R^T Q^T b.    (5.2)

Note that R^T is a lower-triangular matrix with nonzero entries on the main diagonal, so it is invertible. Multiply both sides of (5.2) by (R^T)^{-1} to obtain

    R x = Q^T b,    (5.3)

which, remarkably enough, is the same equation we obtained in (5.1) for solving Ax = b when b ∈ R(A).

But is the agreement of equations (5.1) and (5.3) really so remarkable? Insert I = Q^T Q on the right-hand side of (5.3) to obtain

    R x = Q^T Q Q^T b = Q^T (Q Q^T) b = Q^T b_R,

noting that QQ^T is the orthogonal projector onto R(A), and b_R = QQ^T b is the vector given by the FTLA decomposition:

    b = (QQ^T b) + (I - QQ^T) b = b_R + b_N.

So implicitly, equation (5.3) projects b ∈ ℝ^m to b_R, to give an equation R x = Q^T b_R that has a solution x.

(MATLAB captures this idea in its backslash command, \. When you type A\b, you will obtain x = A^{-1} b when A is a square invertible matrix, and the x that minimizes ‖b - Ax‖ when A ∈ ℝ^{m×n} with m > n has linearly independent columns.)
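The same two lines from the previous section thus solve the least squares problem when A is tall. A closing sketch (compare with backslash, as described in the note above):

    A = randn(20, 3);  b = randn(20, 1);
    [Q, R] = qr(A, 0);              % economy-size QR of the tall matrix
    x = R\(Q'*b);                   % the least squares solution, via (5.3)
    norm(x - A\b)                   % ~ 1e-15: agrees with backslash
    bR = Q*(Q'*b);                  % b_R, the projection of b onto R(A)
    norm(A*x - bR)                  % ~ 1e-15: Ax reproduces b_R exactly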
