Finite Dimensional Space Orthogonal Projection Operators


1. Finite Dimensional Space Orthogonal Projection Operators

Robert M. Haralick
Computer Science, Graduate Center, City University of New York

2. Spaces

Spaces have points, called vectors.
Spaces have sets of points; some of these sets are called subspaces.
Spaces have directions, and sets of directions.
Spaces have a language for representing points in terms of traveling different lengths in different directions.

3. Vector Spaces

Definition. A space $V$ is a vector space over a field of scalars $S$ if and only if
- $x \in V$ and $y \in V$ implies $x + y \in V$
- $x + y = y + x$
- $x + (y + z) = (x + y) + z$
- there exists $0 \in V$ satisfying $x + 0 = x$ for every $x \in V$
- $x \in V$ implies there exists a unique $y \in V$ such that $x + y = 0$
- if $x \in V$ and $\alpha \in S$, then $\alpha x \in V$
- if $\alpha, \beta \in S$ and $x \in V$, then $\alpha(\beta x) = (\alpha \beta)x$
- there exists a scalar $1 \in S$ such that $1x = x$ for every $x \in V$
- $\alpha(x + y) = \alpha x + \alpha y$
- $(\alpha + \beta)x = \alpha x + \beta x$

4. Language of Spaces

The words of the language are the basis elements $b_1, b_2, \ldots, b_N$, with $\|b_n\| = 1$, $n = 1, \ldots, N$. The basis elements specify independent directions. A sentence takes the form $\sum_{n=1}^N \alpha_n b_n$. The meaning of the sentence is:
- begin at the origin
- go $\alpha_1$ in direction $b_1$
- go $\alpha_2$ in direction $b_2$
- ...
- go $\alpha_N$ in direction $b_N$
and you arrive at the point represented by $(\alpha_1, \ldots, \alpha_N)$. The set of all places that can be reached by such a sentence is called the space spanned by the directions $b_1, \ldots, b_N$. The interesting sentences are the minimal ones.

5. Linear Independence

A minimal sentence uses independent directions.

Definition. $b_1, \ldots, b_N$ are independent directions (linearly independent) when
$$\sum_{n=1}^N \alpha_n b_n = 0 \quad \text{if and only if} \quad \alpha_n = 0,\ n = 1, \ldots, N$$

If you travel $\alpha_1$ in direction $b_1$, then $\alpha_2$ in direction $b_2$, ..., then $\alpha_N$ in direction $b_N$, and you return to the origin even though the $\alpha_n$ are not all zero, then the directions are dependent.
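
As a quick numerical check, a set of vectors is linearly independent exactly when the matrix having them as columns has full column rank. A minimal numpy sketch (the specific vectors are made up for illustration):

```python
import numpy as np

b1 = np.array([1.0, 2.0, 0.0])
b2 = np.array([0.0, 1.0, 1.0])
b3 = b1 - 2.0 * b2          # deliberately dependent on b1 and b2

B = np.column_stack([b1, b2, b3])
# Independent iff rank equals the number of columns.
print(np.linalg.matrix_rank(B))          # 2, not 3: the set is dependent
print(np.linalg.matrix_rank(B[:, :2]))   # 2: b1, b2 alone are independent
```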

6. Linear Dependence

Definition. $b_1, \ldots, b_N$ are linearly dependent if and only if for some $\alpha_1, \ldots, \alpha_N$, not all $0$,
$$\sum_{n=1}^N \alpha_n b_n = 0$$

If you travel $\alpha_1$ in direction $b_1$, then $\alpha_2$ in direction $b_2$, ..., then $\alpha_N$ in direction $b_N$, and you return to the origin, then the directions are dependent.

7. Finite Dimensional Vectors and Transpose

$$x = \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_N \end{pmatrix} \qquad y = \begin{pmatrix} y_1 \\ y_2 \\ \vdots \\ y_N \end{pmatrix}$$

$$x' y = \sum_{n=1}^N x_n y_n \qquad
x y' = \begin{pmatrix}
x_1 y_1 & x_1 y_2 & \cdots & x_1 y_N \\
x_2 y_1 & x_2 y_2 & \cdots & x_2 y_N \\
\vdots & & & \vdots \\
x_N y_1 & x_N y_2 & \cdots & x_N y_N
\end{pmatrix}$$
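
In numpy the two products on this slide are `x @ y` (the scalar $x'y$) and `np.outer(x, y)` (the $N \times N$ matrix $xy'$). A small sketch with made-up vectors:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0, 6.0])

inner = x @ y            # x'y = sum_n x_n y_n  -> 32.0
outer = np.outer(x, y)   # xy' : (i, j) entry is x_i * y_j, a 3x3 matrix

print(inner)
print(outer)
```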

8. Angles

- $b_i' b_j$ is the cosine of the angle between directions $b_i$ and $b_j$ (directions have unit length)
- $b_i' b_j = b_j' b_i$
- $b_i' b_i$ is the squared length: $b_i' b_i = \|b_i\|^2$
- $b'(c + d) = b'c + b'd$
- $(\alpha b)' c = \alpha (b' c)$
- $b_i \perp b_j$ means geometrically orthogonal; $b_i \perp b_j$ if and only if $b_i' b_j = 0$
- $b_1, \ldots, b_N$ is orthonormal if and only if $b_i' b_j = 0$ when $i \neq j$, and $\|b_i\| = 1$

9. Lengths

Definition. The length of a vector $x$ is its distance from the origin: $\|x\| = \sqrt{x' x}$.

Let $x = \sum_{n=1}^N \alpha_n b_n$ with $b_1, \ldots, b_N$ orthonormal. Then
$$\|x\|^2 = \Big\|\sum_{i=1}^N \alpha_i b_i\Big\|^2
= \Big(\sum_{i=1}^N \alpha_i b_i\Big)' \Big(\sum_{j=1}^N \alpha_j b_j\Big)
= \sum_{i=1}^N \sum_{j=1}^N \alpha_i \alpha_j\, b_i' b_j
= \sum_{i=1}^N \alpha_i^2\, b_i' b_i + \sum_{i=1}^N \sum_{\substack{j=1 \\ j \neq i}}^N \alpha_i \alpha_j\, b_i' b_j
= \sum_{i=1}^N \alpha_i^2$$

10. Coordinate Representation

Let $b_1, \ldots, b_N$ be an orthonormal basis and let $x$ be a vector. We seek $\alpha_1, \ldots, \alpha_N$ such that $x = \sum_{n=1}^N \alpha_n b_n$. Suppose $x = \sum_{n=1}^N \alpha_n b_n$. Then
$$b_i' x = b_i' \Big(\sum_{n=1}^N \alpha_n b_n\Big) = \sum_{n=1}^N \alpha_n\, b_i' b_n = \alpha_i\, b_i' b_i = \alpha_i$$
So $x = (\alpha_1, \ldots, \alpha_N)$ with respect to the basis $b_1, \ldots, b_N$. Change the basis and you change the coordinate representation.
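
Putting slides 9 and 10 together numerically: with an orthonormal basis stacked as the columns of $B$, the coordinate vector is $\alpha = B'x$, and $\|x\|^2 = \sum_i \alpha_i^2$. A sketch, manufacturing an orthonormal basis from a QR factorization of a random matrix (an arbitrary choice for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 4
# Columns of B form a random orthonormal basis of R^4.
B, _ = np.linalg.qr(rng.standard_normal((N, N)))

x = rng.standard_normal(N)
alpha = B.T @ x                    # alpha_i = b_i' x

print(np.allclose(B @ alpha, x))             # x = sum_n alpha_n b_n
print(np.allclose(np.sum(alpha**2), x @ x))  # ||x||^2 = sum_i alpha_i^2
```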

11. Inner Product

Let $x = (\alpha_1, \ldots, \alpha_N)$ and $y = (\beta_1, \ldots, \beta_N)$ be coordinates with respect to the orthonormal basis $b_1, \ldots, b_N$. Then
$$x' y = \Big(\sum_{i=1}^N \alpha_i b_i\Big)' \Big(\sum_{j=1}^N \beta_j b_j\Big)
= \sum_{i=1}^N \sum_{j=1}^N \alpha_i \beta_j\, b_i' b_j
= \sum_{i=1}^N \alpha_i \beta_i\, b_i' b_i + \sum_{i=1}^N \sum_{\substack{j=1 \\ j \neq i}}^N \alpha_i \beta_j\, b_i' b_j
= \sum_{i=1}^N \alpha_i \beta_i$$

12. Dimensionality

$N$: dimension of the space
- the number of directions required in a minimal sentence to specify (reach) any point in the space
- the number of degrees of freedom needed to represent a point in the space
- $\{x \mid x = \sum_{n=1}^N \alpha_n b_n\}$

$M$: dimension of a subspace
- the number of directions required in a minimal sentence to specify (reach) any point in the subspace
- the number of degrees of freedom needed to represent a point in the subspace
- $\{x \mid x = \sum_{m=1}^M \beta_m b_m\} = \{x \mid x = \sum_{m=1}^M \beta_m b_m + \sum_{m=M+1}^N 0\, b_m\}$

$M$ degrees of freedom; $N - M$ degrees of constraint.

13. Constraints

The subspace $\{x \mid x = \sum_{m=1}^M \beta_m b_m\}$ of the previous slide has $M$ degrees of freedom and $N - M$ degrees of constraint.

Let $i \in \{M+1, \ldots, N\}$ and let $b_1, \ldots, b_N$ be orthonormal. For $x = \sum_{m=1}^M \beta_m b_m$, consider $b_i' x$:
$$b_i' x = b_i' \sum_{m=1}^M \beta_m b_m = \sum_{m=1}^M \beta_m\, b_i' b_m = \sum_{m=1}^M \beta_m \cdot 0 = 0$$

14. Co-Dimension

As before, the $M$-dimensional subspace has $M$ degrees of freedom and $N - M$ degrees of constraint. $N - M$ is the co-dimension: the number of constraints.

Let $i \in \{M+1, \ldots, N\}$ and let $b_1, \ldots, b_N$ be orthonormal. Then every $x$ in the subspace satisfies
$$b_i' x = 0, \quad i \in \{M+1, \ldots, N\}$$

15. Basis Vectors

Let $b_1, \ldots, b_N$ be an orthonormal basis for a space $S$.
- Each $b_n$ is a direction.
- The length of each $b_n$ is one.
- A direction and a length together represent a point or vector in the space.
- $b_i \perp b_j$, $i \neq j$.
- Each $b_n$ is itself a point or vector in the space.

16. Representing Subspaces

A 1-dimensional subspace of a 2-dimensional space (spanned by $b_1$, with $b_2$ orthogonal to it) can be written two ways:
- by its degree of freedom: $\{x \mid \text{for some } \alpha_1,\ x = \alpha_1 b_1\}$
- by its constraint: $\{x \mid b_2' x = 0\}$

17. Representing Subspaces

Let $S$ be an $N$-dimensional space, $T$ an $M$-dimensional subspace, and $b_1, \ldots, b_N$ an orthonormal basis.
- $S = \{x \mid \text{for some } \alpha_1, \ldots, \alpha_N,\ x = \sum_{n=1}^N \alpha_n b_n\}$
- $M$ degrees of freedom: $T = \{x \mid \text{for some } \alpha_1, \ldots, \alpha_M,\ x = \sum_{m=1}^M \alpha_m b_m\}$
- $N - M$ degrees of constraint: $T = \{x \mid b_i' x = 0,\ i \in \{M+1, \ldots, N\}\}$

18. Orthogonal Subspaces

Definition. A subspace $T$ is orthogonal to a subspace $U$ if and only if $t \in T$ and $u \in U$ implies $t' u = 0$.

Definition. Let $T$ be a subspace of $S$. The orthogonal complement of $T$, denoted $T^\perp$, is defined by
$$T^\perp = \{x \in S \mid \text{for every } t \in T,\ x' t = 0\}$$

19. Orthogonal Subspaces

Proposition. Let $b_1, \ldots, b_N$ be an orthonormal basis of $S$ and let $V$ be the subspace of $S$ spanned by $b_1, \ldots, b_M$. Then $V^\perp$ is the subspace spanned by $b_{M+1}, \ldots, b_N$.

Proof. $V^\perp = \{x \in S \mid v \in V \text{ implies } x' v = 0\}$. Write $x = \sum_{n=1}^N \beta_n b_n$ and $v = \sum_{m=1}^M \alpha_m b_m$. Then
$$x' v = \Big(\sum_{n=1}^N \beta_n b_n\Big)' \Big(\sum_{m=1}^M \alpha_m b_m\Big) = \sum_{m=1}^M \alpha_m \beta_m$$
Now $\sum_{m=1}^M \alpha_m \beta_m = 0$ for all $\alpha_1, \ldots, \alpha_M$ implies $\beta_1 = \cdots = \beta_M = 0$. Therefore
$$V^\perp = \Big\{x \mid x = \sum_{i=M+1}^N \beta_i b_i\Big\}$$

20. Orthogonal Representations

Proposition. Let $V$ be a subspace of $S$ and let $x \in S$. Then there exist $v \in V$ and $w \in V^\perp$ such that $x = v + w$.

Proof. Let $b_1, \ldots, b_N$ be an orthonormal basis for $S$ such that $b_1, \ldots, b_M$ is an orthonormal basis for $V$ and $b_{M+1}, \ldots, b_N$ is an orthonormal basis for $V^\perp$. Then for some $\alpha_1, \ldots, \alpha_N$,
$$x = \sum_{n=1}^N \alpha_n b_n = \sum_{n=1}^M \alpha_n b_n + \sum_{i=M+1}^N \alpha_i b_i$$
But $v = \sum_{n=1}^M \alpha_n b_n \in V$ and $w = \sum_{i=M+1}^N \alpha_i b_i \in V^\perp$. Therefore $x = v + w$ with $v \in V$ and $w \in V^\perp$.
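
A numeric sketch of the decomposition: take the first $M$ columns of an orthonormal basis as a basis for $V$ and the rest for $V^\perp$; the two pieces of $x$ recombine to $x$ and are orthogonal to each other. (The dimensions and the random basis are illustrative assumptions.)

```python
import numpy as np

rng = np.random.default_rng(1)
N, M = 5, 2
Q, _ = np.linalg.qr(rng.standard_normal((N, N)))   # orthonormal basis of S
BV, BW = Q[:, :M], Q[:, M:]                        # bases for V and V-perp

x = rng.standard_normal(N)
v = BV @ (BV.T @ x)      # component in V
w = BW @ (BW.T @ x)      # component in V-perp

print(np.allclose(v + w, x))      # x = v + w
print(np.isclose(v @ w, 0.0))     # v' w = 0
```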

21. Orthogonal Projection

Definition. Let $V$ be a subspace of $S$. Let $x \in S$ and $x = v + w$ where $v \in V$ and $w \in V^\perp$. Then $v$ is called the orthogonal projection of $x$ onto $V$.

22. Orthogonal Projections are Unique

Proposition. Let $V$ be a subspace of $S$. Let $x \in S$ and $x = v_1 + w_1 = v_2 + w_2$ where $v_1, v_2 \in V$ and $w_1, w_2 \in V^\perp$. Then $v_1 = v_2$.

Proof. Let $b_1, \ldots, b_M$ be an orthonormal basis for $V$. Then $v_1 = \sum_{m=1}^M \alpha_m b_m$ and $v_2 = \sum_{m=1}^M \beta_m b_m$. Since $w_1, w_2 \in V^\perp$,
$$b_i' x = b_i' (v_1 + w_1) = b_i' \sum_{m=1}^M \alpha_m b_m = \alpha_i$$
$$b_i' x = b_i' (v_2 + w_2) = b_i' \sum_{m=1}^M \beta_m b_m = \beta_i$$
Therefore $\alpha_i = \beta_i$, $i = 1, \ldots, M$, so $v_1 = v_2$.

23. Orthogonal Projection Operator

Proposition. Let $V$ be an $M$-dimensional subspace of $S$. Let $x \in S$ and $x = v + w$ where $v \in V$ and $w \in V^\perp$. Let $b_1, \ldots, b_N$ be an orthonormal basis of $S$ and $b_1, \ldots, b_M$ an orthonormal basis of $V$. Then $v = Px$ where $P = \sum_{m=1}^M b_m b_m'$.

Proof. $x \in S$ implies $x = \sum_{n=1}^N \beta_n b_n = \sum_{m=1}^M \beta_m b_m + \sum_{n=M+1}^N \beta_n b_n$. Then
$$b_m' x = b_m' \sum_{n=1}^N \beta_n b_n = \sum_{n=1}^N \beta_n\, b_m' b_n = \beta_m$$
Now,
$$v = \sum_{m=1}^M \beta_m b_m = \sum_{m=1}^M (b_m' x)\, b_m = \Big(\sum_{m=1}^M b_m b_m'\Big) x = Px$$
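
In matrix form, $P = \sum_{m=1}^M b_m b_m' = B_V B_V'$ where $B_V$ has the $b_m$ as columns. A sketch (random orthonormal basis as before):

```python
import numpy as np

rng = np.random.default_rng(2)
N, M = 5, 2
Q, _ = np.linalg.qr(rng.standard_normal((N, N)))
BV = Q[:, :M]                     # orthonormal basis of V, as columns

# P = sum_m b_m b_m'  ==  BV BV'
P = sum(np.outer(BV[:, m], BV[:, m]) for m in range(M))
print(np.allclose(P, BV @ BV.T))

x = rng.standard_normal(N)
v = BV @ (BV.T @ x)               # coordinate-wise projection of x onto V
print(np.allclose(P @ x, v))      # Px is that same projection
```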

24. Orthogonal Projection Operator

Proposition. Let $b_1, \ldots, b_N$ be an orthonormal basis of $S$ and $b_1, \ldots, b_M$ an orthonormal basis for the subspace $V$ of $S$. Then $P = \sum_{m=1}^M b_m b_m'$ is the orthogonal projection operator onto $V$.

25. Orthogonal Projection Operators

Proposition. If $P$ is the orthogonal projection operator onto the subspace $V$ of $S$, then
$$P^2 = P \qquad P' = P$$

Proof. Let $b_1, \ldots, b_M$ be an orthonormal basis for $V$. Then $P = \sum_{m=1}^M b_m b_m'$ and
$$P^2 = \Big(\sum_{m=1}^M b_m b_m'\Big)\Big(\sum_{i=1}^M b_i b_i'\Big) = \sum_{m=1}^M \sum_{i=1}^M b_m (b_m' b_i)\, b_i' = \sum_{m=1}^M b_m b_m' = P$$
$$P' = \Big(\sum_{m=1}^M b_m b_m'\Big)' = \sum_{m=1}^M (b_m b_m')' = \sum_{m=1}^M b_m b_m' = P$$
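
The two algebraic properties are easy to confirm numerically (a sketch reusing the $P = B_V B_V'$ construction):

```python
import numpy as np

rng = np.random.default_rng(3)
Q, _ = np.linalg.qr(rng.standard_normal((6, 6)))
P = Q[:, :3] @ Q[:, :3].T         # orthogonal projector onto a 3-dim subspace

print(np.allclose(P @ P, P))      # P^2 = P (idempotent)
print(np.allclose(P.T, P))        # P'  = P (symmetric)
```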

26. Uniqueness

Proposition. Suppose $P = P^2$, $P' = P$, $Q = Q^2$, and $Q' = Q$. If $PQ = Q$ and $QP = P$, then $Q = P$.

Proof.
$$Q = PQ = (PQ)' = Q' P' = QP = P$$

27. Orthogonal Projection Operators are Unique

Proposition. Let $V$ be an $M$-dimensional subspace of $S$. Let $b_1, \ldots, b_M$ be one orthonormal basis for $V$ and let $c_1, \ldots, c_M$ be another orthonormal basis for $V$. Define $P = \sum_{m=1}^M b_m b_m'$ and $Q = \sum_{m=1}^M c_m c_m'$. Then $Q = P$.

Proof. By the definition of orthogonal projection operators, both $P$ and $Q$ are orthogonal projection operators onto $V$. Hence $P = P^2$ and $P' = P$; likewise $Q = Q^2$ and $Q' = Q$. Since the columns of $P$ and $Q$ are in $V$, $PQ = Q$ and $QP = P$. By the uniqueness proposition, $Q = P$.
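
The basis-invariance is also easy to see numerically: rotating one orthonormal basis of $V$ by any $M \times M$ orthogonal matrix gives another orthonormal basis of the same $V$, and the two projectors coincide. (The rotation trick is just an illustrative way to manufacture a second basis.)

```python
import numpy as np

rng = np.random.default_rng(4)
N, M = 6, 3
Qfull, _ = np.linalg.qr(rng.standard_normal((N, N)))
B = Qfull[:, :M]                        # one orthonormal basis of V

R, _ = np.linalg.qr(rng.standard_normal((M, M)))
C = B @ R                               # another orthonormal basis of the same V

P = B @ B.T
Qproj = C @ C.T                         # = B R R' B' = B B'
print(np.allclose(P, Qproj))            # True: the projector is unique
```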

28. Orthogonal Projection Operator Characterization Theorem

Theorem. If $P = P^2$ and $P' = P$, then $P$ is the orthogonal projection operator onto $\mathrm{Col}(P)$.

Proof. Let $b_1, \ldots, b_M$ be an orthonormal basis for $\mathrm{Col}(P)$. Define $Q = \sum_{m=1}^M b_m b_m'$. Then $Q = Q^2$ and $Q' = Q$. Clearly $\mathrm{Col}(Q) = \mathrm{Col}(P)$, so that $QP = P$ and $PQ = Q$. By the uniqueness proposition, $P = Q$. And since $Q$ is the orthogonal projection operator onto $\mathrm{Col}(P)$, $P$ must also be the orthogonal projection operator onto $\mathrm{Col}(P)$.

29. Projection Operators

Definition. $P$ is called a projection operator if and only if $P^2 = P$.

[The slide's worked matrix examples were lost in transcription.]
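
A projection operator need not be orthogonal: it only has to satisfy $P^2 = P$. A minimal sketch of an oblique (non-orthogonal) projector, a made-up $2 \times 2$ example:

```python
import numpy as np

# Projects onto the x-axis along the direction (1, -1): idempotent but not symmetric.
P = np.array([[1.0, 1.0],
              [0.0, 0.0]])

print(np.allclose(P @ P, P))   # True:  P^2 = P, so P is a projection operator
print(np.allclose(P.T, P))     # False: P' != P, so it is not an orthogonal one
```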

30. Trace

Definition. Let $A = (a_{ij})$ be a square $N \times N$ matrix. Then
$$\mathrm{Trace}(A) = \sum_{n=1}^N a_{nn}$$

Proposition. $\mathrm{Trace}\big(\sum_{n=1}^N \alpha_n A_n\big) = \sum_{n=1}^N \alpha_n \mathrm{Trace}(A_n)$

31. Trace

Proposition. $\mathrm{Trace}(AB) = \mathrm{Trace}(BA)$

Proof. Let $C_{N \times N} = (c_{ij}) = A_{N \times K}\, B_{K \times N}$ and $D_{K \times K} = (d_{mn}) = B_{K \times N}\, A_{N \times K}$. Then
$$c_{ij} = \sum_{k=1}^K a_{ik} b_{kj} \qquad d_{mn} = \sum_{i=1}^N b_{mi} a_{in}$$
$$\mathrm{Trace}(C) = \sum_{i=1}^N c_{ii} = \sum_{i=1}^N \sum_{k=1}^K a_{ik} b_{ki} = \sum_{k=1}^K \sum_{i=1}^N b_{ki} a_{ik} = \sum_{k=1}^K d_{kk} = \mathrm{Trace}(D) = \mathrm{Trace}(BA)$$
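
A quick numerical confirmation with random rectangular matrices (shapes chosen arbitrarily):

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((4, 7))   # N x K
B = rng.standard_normal((7, 4))   # K x N

print(np.isclose(np.trace(A @ B), np.trace(B @ A)))   # True
```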

32. Trace

Corollary. $x' A x = \mathrm{Trace}(A x x')$

Proof.
$$x' A x = \mathrm{Trace}(x' A x) = \mathrm{Trace}(x' (A x)) = \mathrm{Trace}((A x)\, x') = \mathrm{Trace}(A x x')$$

33. Trace

Proposition. Let $A = (a_{ij})$ be an $M \times N$ matrix. Then
$$\sum_{m=1}^M \sum_{n=1}^N a_{mn}^2 = \mathrm{Trace}(AA')$$

Proof. Let $B = (b_{ij}) = AA'$. Then $b_{ij} = \sum_{n=1}^N a_{in} a_{jn}$. Hence $b_{ii} = \sum_{n=1}^N a_{in} a_{in} = \sum_{n=1}^N a_{in}^2$. Therefore
$$\mathrm{Trace}(B) = \mathrm{Trace}(AA') = \sum_{m=1}^M b_{mm} = \sum_{m=1}^M \sum_{n=1}^N a_{mn}^2$$

34. Trace

Proposition. Let $P$ be the orthogonal projection operator onto the $M$-dimensional subspace $V$. Then $\mathrm{Trace}(P) = M$.

Proof. Let $b_1, \ldots, b_M$ be an orthonormal basis for $V$. Then $P = \sum_{m=1}^M b_m b_m'$ and
$$\mathrm{Trace}(P) = \mathrm{Trace}\Big(\sum_{m=1}^M b_m b_m'\Big) = \sum_{m=1}^M \mathrm{Trace}(b_m b_m') = \sum_{m=1}^M \mathrm{Trace}(b_m' b_m) = \sum_{m=1}^M \mathrm{Trace}(1) = \sum_{m=1}^M 1 = M$$

35. Trace

Proposition. Let $P$ be an orthogonal projection operator onto an $M$-dimensional subspace. Then
$$\sum_{i=1}^N \sum_{j=1}^N p_{ij}^2 = M$$

Proof.
$$\sum_{i=1}^N \sum_{j=1}^N p_{ij}^2 = \mathrm{Trace}(PP') = \mathrm{Trace}(PP) = \mathrm{Trace}(P) = M$$
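
Both trace facts give a practical way to read the dimension of the subspace directly off the projector. A sketch:

```python
import numpy as np

rng = np.random.default_rng(6)
N, M = 7, 3
Q, _ = np.linalg.qr(rng.standard_normal((N, N)))
P = Q[:, :M] @ Q[:, :M].T          # orthogonal projector onto an M-dim subspace

print(np.isclose(np.trace(P), M))        # Trace(P) = M
print(np.isclose(np.sum(P**2), M))       # sum_ij p_ij^2 = M
```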

36. Kernel and Range

Definition. The kernel of a matrix operator $A$ is
$$\mathrm{Kernel}(A) = \{x \mid Ax = 0\}$$
The range of a matrix operator $A$ is
$$\mathrm{Range}(A) = \{y \mid \text{for some } x,\ y = Ax\}$$

37. Kernel and Range

Proposition. Let $P$ be a projection operator onto a subspace $V$ of $S$. Then
$$\mathrm{Range}(P) + \mathrm{Kernel}(P) = S$$

Proof. Let $x \in S$. Then $Px + (I - P)x = Px + x - Px = x$. Certainly $Px \in \mathrm{Range}(P)$. Consider $(I - P)x$:
$$P[(I - P)x] = Px - PPx = Px - Px = 0$$
Therefore, by the definition of $\mathrm{Kernel}(P)$, $(I - P)x \in \mathrm{Kernel}(P)$.

38. Kernel and Range

Proposition. Let $P$ be an orthogonal projection operator. Then $\mathrm{Range}(P) \perp \mathrm{Kernel}(P)$.

Proof. Let $x \in \mathrm{Range}(P)$ and $y \in \mathrm{Kernel}(P)$. Then for some $u$, $x = Pu$. Consider $x' y$:
$$x' y = (Pu)' y = u' P' y = u' P y$$
But $y \in \mathrm{Kernel}(P)$, so $Py = 0$. Therefore $x' y = 0$.
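
Combining slides 37 and 38 numerically: $Px$ and $(I - P)x$ reassemble $x$, and for an orthogonal projector they are perpendicular. A sketch:

```python
import numpy as np

rng = np.random.default_rng(7)
N, M = 6, 2
Q, _ = np.linalg.qr(rng.standard_normal((N, N)))
P = Q[:, :M] @ Q[:, :M].T
I = np.eye(N)

x = rng.standard_normal(N)
r = P @ x                 # in Range(P)
k = (I - P) @ x           # in Kernel(P), since P(I - P)x = 0

print(np.allclose(r + k, x))      # Range(P) + Kernel(P) recovers x
print(np.allclose(P @ k, 0.0))    # k really is in the kernel
print(np.isclose(r @ k, 0.0))     # Range(P) is orthogonal to Kernel(P)
```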

39. Projecting

[Figure: a vector $x$ and its projection $Px \in \mathrm{Range}(P)$, with $\mathrm{Kernel}(P)$ orthogonal to $\mathrm{Range}(P)$, drawn in the $b_1$, $b_2$ frame; the numeric matrix $P$ shown on the slide was lost in transcription.]

40. Proposition. Let $P$ be the orthogonal projection operator onto the subspace $V$. Then $I - P$ is the orthogonal projection operator onto the subspace $V^\perp$.

Proof.
$$(I - P)(I - P) = I - P - P + P^2 = I - 2P + P = I - P$$
$$(I - P)' = I' - P' = I - P$$
$V^\perp = \mathrm{Kernel}(P)$. Let $x \in V^\perp$. Then $Px = 0$ and
$$(I - P)x = x - Px = x$$
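
Numerically, $I - P$ behaves exactly as the proposition says (a sketch, again with a random orthonormal basis split between $V$ and $V^\perp$):

```python
import numpy as np

rng = np.random.default_rng(8)
N, M = 6, 2
Qfull, _ = np.linalg.qr(rng.standard_normal((N, N)))
BV, BW = Qfull[:, :M], Qfull[:, M:]      # bases of V and V-perp

P = BV @ BV.T
C = np.eye(N) - P                        # claimed projector onto V-perp

print(np.allclose(C, BW @ BW.T))         # same operator built from a basis of V-perp
print(np.allclose(C @ C, C), np.allclose(C.T, C))   # idempotent and symmetric
```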

41. Orthogonal Projection Minimizes Error

Theorem. Let $V$ be a subspace of $S$, let $f : S \to V$, and let $x \in S$. Then
$$\min_f\ (x - f(x))' (x - f(x))$$
is achieved when $f$ is the orthogonal projection operator from $S$ to $V$.

Proof. Let $x \in S$. Then there exist $v \in V$ and $w \in V^\perp$ such that $x = v + w$. Since $f(x) \in V$, $w' f(x) = f(x)' w = 0$. Consider
$$\epsilon^2 = (x - f(x))' (x - f(x)) = x' x - (v + w)' f(x) - f(x)' (v + w) + f(x)' f(x)$$
$$= x' x - v' f(x) - f(x)' v + f(x)' f(x)$$
$$= (v + w)' (v + w) - v' f(x) - f(x)' v + f(x)' f(x)$$
$$= v' v - v' f(x) - f(x)' v + f(x)' f(x) + w' w$$
$$= (v - f(x))' (v - f(x)) + w' w$$
$\epsilon^2$ is minimized by making $f(x) = v$, the orthogonal projection of $x$ onto $V$.
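
The optimality claim can be sanity-checked by comparing the error of the projection against the error of random competitors in $V$ (a Monte Carlo sketch, not a proof):

```python
import numpy as np

rng = np.random.default_rng(9)
N, M = 6, 2
Q, _ = np.linalg.qr(rng.standard_normal((N, N)))
BV = Q[:, :M]
P = BV @ BV.T

x = rng.standard_normal(N)
best = np.linalg.norm(x - P @ x)         # error of the orthogonal projection

# No random point of V beats the projection.
others = [np.linalg.norm(x - BV @ rng.standard_normal(M)) for _ in range(1000)]
print(best <= min(others))               # True
```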

42. Constructing an Orthogonal Projection Operator

Proposition. Let $X \in \mathbb{R}^{N \times K}$ have full column rank. Then the orthogonal projection operator onto the subspace $\mathrm{col}(X)$ is given by $X(X'X)^{-1}X'$.

Proof. By definition $\mathrm{col}(X)$ is spanned by the columns of $X$. Notice that $X(X'X)^{-1}X'$ operating on $X$ produces $X$:
$$X(X'X)^{-1}X'X = X[(X'X)^{-1}(X'X)] = X$$
Hence anything in $\mathrm{col}(X)$ is a fixed point of $X(X'X)^{-1}X'$. Furthermore, for any $z \in \mathbb{R}^N$, $X(X'X)^{-1}X'z = X[(X'X)^{-1}X'z] \in \mathrm{col}(X)$. Therefore $X(X'X)^{-1}X'$ maps onto $\mathrm{col}(X)$. Now we check that $X(X'X)^{-1}X'$ satisfies the definition of an orthogonal projection operator:
$$[X(X'X)^{-1}X'][X(X'X)^{-1}X'] = X[(X'X)^{-1}X'X](X'X)^{-1}X' = X(X'X)^{-1}X'$$
$$[X(X'X)^{-1}X']' = X(X'X)^{-1}X'$$
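
A sketch of the construction: build the projector from an arbitrary full-rank (not necessarily orthonormal) $X$ and confirm it matches the projector built from an orthonormal basis of the same column space, obtained here via QR (an illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(10)
N, K = 6, 3
X = rng.standard_normal((N, K))          # full column rank almost surely

# P = X (X'X)^{-1} X' ; solve() is used rather than forming the inverse explicitly.
P = X @ np.linalg.solve(X.T @ X, X.T)

Qx, _ = np.linalg.qr(X)                  # orthonormal basis of col(X)
print(np.allclose(P, Qx @ Qx.T))         # same projector
print(np.allclose(P @ X, X))             # columns of X are fixed points
print(np.allclose(P @ P, P), np.allclose(P.T, P))   # idempotent and symmetric
```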
