Linear Algebra. Session 12


Dr. Marco A. Roque Sol
08/01/2017

Example 12.1

Find the constant function that is the least squares fit to the following data:

x    | 0  1  2  3
f(x) | 1  0  1  2

Solution

With f(x) = c, fitting each data point gives the overdetermined system

c = 1,  c = 0,  c = 1,  c = 2

that is, (1, 1, 1, 1)^T c = (1, 0, 1, 2)^T.

Then, the normal system is

(1 1 1 1) (1 1 1 1)^T c = (1 1 1 1) (1, 0, 1, 2)^T

4c = 4,   c = (1/4)(1 + 0 + 1 + 2) = 1   (the arithmetic mean value)

Thus, the constant function is f(x) = 1.
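This can be checked numerically. A minimal sketch in plain Python, assuming the data points (0, 1), (1, 0), (2, 1), (3, 2), which are inferred from the equations in Examples 12.2 and 12.3:

```python
# Constant least-squares fit: minimizing sum_i (c - f_i)^2 gives the single
# normal equation n*c = sum_i f_i, so c is the arithmetic mean of the data.
data = [(0, 1), (1, 0), (2, 1), (3, 2)]  # (x, f(x)) pairs (inferred)
c = sum(f for _, f in data) / len(data)
print(c)  # 1.0
```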

Example 12.2

Find the linear polynomial that is the least squares fit to the following data:

x    | 0  1  2  3
f(x) | 1  0  1  2

Solution

With f(x) = c1 + c2 x, fitting each data point gives

c1 = 1
c1 + c2 = 0
c1 + 2c2 = 1
c1 + 3c2 = 2

that is, A (c1, c2)^T = b, where A is the 4 x 2 matrix with rows (1, x_i) and b = (1, 0, 1, 2)^T.

Then, the normal system A^T A (c1, c2)^T = A^T b is

[ 4   6 ] (c1)   ( 4 )
[ 6  14 ] (c2) = ( 8 )

with solution c1 = 0.4, c2 = 0.4.

Thus, the linear function is f(x) = 0.4 + 0.4 x.
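As a sanity check, the normal system can be assembled and solved directly. A sketch in plain Python (data points inferred from the example's equations), using Cramer's rule on the 2 x 2 system:

```python
data = [(0, 1), (1, 0), (2, 1), (3, 2)]   # (x, f(x)) pairs (inferred)

# Entries of A^T A and A^T b for the model f(x) = c1 + c2*x
n   = len(data)
sx  = sum(x for x, _ in data)             # sum of x_i
sxx = sum(x * x for x, _ in data)         # sum of x_i^2
sf  = sum(f for _, f in data)             # sum of f_i
sxf = sum(x * f for x, f in data)         # sum of x_i * f_i

# Solve [n sx; sx sxx] (c1, c2)^T = (sf, sxf)^T by Cramer's rule
det = n * sxx - sx * sx
c1 = (sf * sxx - sx * sxf) / det
c2 = (n * sxf - sx * sf) / det
print(c1, c2)  # 0.4 0.4
```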

Example 12.3

Find the quadratic polynomial that is the least squares fit to the following data:

x    | 0  1  2  3
f(x) | 1  0  1  2

Solution

With f(x) = c1 + c2 x + c3 x^2, fitting each data point gives

c1 = 1
c1 + c2 + c3 = 0
c1 + 2c2 + 4c3 = 1
c1 + 3c2 + 9c3 = 2

Then, the normal system is

[  4   6  14 ] (c1)   (  4 )
[  6  14  36 ] (c2) = (  8 )
[ 14  36  98 ] (c3)   ( 22 )

with solution

c1 = 0.9,   c2 = -1.1,   c3 = 0.5

Thus, the quadratic function is f(x) = 0.9 - 1.1 x + 0.5 x^2.
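For the 3 x 3 normal system a small Gaussian elimination routine suffices. A sketch in plain Python (the normal-system entries are as reconstructed above; `solve` is my own helper):

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]        # augmented matrix
    for k in range(n):
        piv = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[piv] = M[piv], M[k]                     # pivot swap
        for i in range(k + 1, n):
            f = M[i][k] / M[k][k]
            M[i] = [mij - f * mkj for mij, mkj in zip(M[i], M[k])]
    x = [0.0] * n
    for i in reversed(range(n)):                        # back substitution
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

AtA = [[4, 6, 14], [6, 14, 36], [14, 36, 98]]  # A^T A from Example 12.3
Atb = [4, 8, 22]                               # A^T b
c1, c2, c3 = solve(AtA, Atb)                   # approximately 0.9, -1.1, 0.5
```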

Orthogonal sets

Let < , > denote the scalar product in R^n.

Definition. Nonzero vectors v1, v2, ..., vk in R^n form an orthogonal set if they are orthogonal to each other: <v_i, v_j> = 0 for all i != j. If, in addition, all vectors are of unit length, then v1, v2, ..., vk is called an orthonormal set.

For instance, the standard basis e1 = (1, 0, 0, ..., 0), e2 = (0, 1, 0, ..., 0), ..., en = (0, 0, 0, ..., 1) is an orthonormal set.

Orthonormal bases

Suppose v1, v2, ..., vn is an orthonormal basis for R^n (i.e., it is a basis and an orthonormal set).

Theorem. Let x = x1 v1 + x2 v2 + ... + xn vn and y = y1 v1 + y2 v2 + ... + yn vn, where x_i, y_i in R. Then

i) <x, y> = Σ_{i=1}^n x_i y_i

ii) ||x|| = ( Σ_{i=1}^n x_i^2 )^{1/2}

Proof

i) <x, y> = < Σ_{i=1}^n x_i v_i , Σ_{j=1}^n y_j v_j > = Σ_{i=1}^n Σ_{j=1}^n x_i y_j <v_i, v_j> = Σ_{i=1}^n x_i y_i,

since <v_i, v_j> = 0 for i != j and <v_i, v_i> = 1.

ii) follows from i) when y = x.

Suppose V is a subspace of R^n. Let p be the orthogonal projection of a vector x in R^n onto V.

If V is a one-dimensional subspace spanned by a vector v, then

p = ( <x, v> / <v, v> ) v

If V admits an orthogonal basis v1, v2, ..., vk, then

p = ( <x, v1> / <v1, v1> ) v1 + ( <x, v2> / <v2, v2> ) v2 + ... + ( <x, vk> / <vk, vk> ) vk

Indeed,

<p, v_i> = Σ_{j=1}^k ( <x, v_j> / <v_j, v_j> ) <v_j, v_i> = ( <x, v_i> / <v_i, v_i> ) <v_i, v_i> = <x, v_i>

so that <x - p, v_i> = 0, i.e. (x - p) ⊥ v_i for every i, and hence (x - p) ⊥ V.

Coordinates relative to an orthogonal basis

Theorem. If v1, v2, ..., vn is an orthogonal basis for R^n, then

x = ( <x, v1> / <v1, v1> ) v1 + ( <x, v2> / <v2, v2> ) v2 + ... + ( <x, vn> / <vn, vn> ) vn

for any vector x in R^n.

Corollary. If v1, v2, ..., vn is an orthonormal basis for R^n, then

x = <x, v1> v1 + <x, v2> v2 + ... + <x, vn> vn

for any vector x in R^n.

The Gram-Schmidt orthogonalization process

Let V be a subspace of R^n. Suppose x1, x2, ..., xk is a basis for V. Let

v1 = x1

v2 = x2 - ( <x2, v1> / <v1, v1> ) v1

v3 = x3 - ( <x3, v1> / <v1, v1> ) v1 - ( <x3, v2> / <v2, v2> ) v2

...

vk = xk - ( <xk, v1> / <v1, v1> ) v1 - ( <xk, v2> / <v2, v2> ) v2 - ... - ( <xk, v_{k-1}> / <v_{k-1}, v_{k-1}> ) v_{k-1}

Then v1, v2, ..., vk is an orthogonal basis for V.
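The process translates directly into code. A minimal sketch in plain Python with list-based vectors (helper names are my own):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(vectors, tol=1e-12):
    """Orthogonalize `vectors`; a (near-)zero v_j signals linear dependence."""
    basis = []
    for x in vectors:
        v = list(x)
        for b in basis:                       # subtract the projections onto
            c = dot(x, b) / dot(b, b)         # the vectors found so far
            v = [vi - c * bi for vi, bi in zip(v, b)]
        if any(abs(vi) > tol for vi in v):
            basis.append(v)
    return basis

ortho = gram_schmidt([[1, 1, 0], [0, 1, 1]])
# ortho[0] = (1, 1, 0), ortho[1] = (-1/2, 1/2, 1), and <ortho[0], ortho[1]> = 0
```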

Properties of the Gram-Schmidt process:

Any basis x1, x2, ..., xk is turned into an orthogonal basis v1, v2, ..., vk.

v_j = x_j - (α1 x1 + α2 x2 + ... + α_{j-1} x_{j-1}),   1 <= j <= k

the span of v1, v2, ..., vj is the same as the span of x1, x2, ..., xj

v_j is orthogonal to x1, x2, ..., x_{j-1}

v_j = x_j - p_j, where p_j is the orthogonal projection of the vector x_j on the subspace spanned by x1, x2, ..., x_{j-1}

||v_j|| is the distance from x_j to the subspace spanned by x1, x2, ..., x_{j-1}

Normalization

Let V be a subspace of R^n. Suppose v1, v2, ..., vk is an orthogonal basis for V. Let

w1 = v1 / ||v1||,   w2 = v2 / ||v2||,   ...,   wk = vk / ||vk||

Then w1, w2, ..., wk is an orthonormal basis for V.

Theorem. Any non-trivial subspace of R^n admits an orthonormal basis.

Example 12.4

Let Π be the plane spanned by the vectors x1 = (1, 1, 0) and x2 = (0, 1, 1).

i) Find the orthogonal projection of the vector y = (4, 0, -1) onto the plane Π.

ii) Find the distance from y to Π.

Solution

First we apply the Gram-Schmidt process to the basis x1, x2:

v1 = x1 = (1, 1, 0)

v2 = x2 - ( <x2, v1> / <v1, v1> ) v1 = (0, 1, 1) - (1/2)(1, 1, 0) = (-1/2, 1/2, 1)

Now that v1, v2 is an orthogonal basis for Π, the orthogonal projection of y onto Π is

p = ( <y, v1> / <v1, v1> ) v1 + ( <y, v2> / <v2, v2> ) v2 = (4/2)(1, 1, 0) + (-3/(3/2))(-1/2, 1/2, 1)

= (2, 2, 0) + (1, -1, -2) = (3, 1, -2)

The distance from y to Π is

||y - p|| = ||(1, -1, 1)|| = √3
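A numeric check of this example in plain Python, assuming y = (4, 0, -1) (the sign is inferred from the arithmetic of the projection):

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

x1, x2, y = [1, 1, 0], [0, 1, 1], [4, 0, -1]

v1 = x1                                                   # (1, 1, 0)
c = dot(x2, v1) / dot(v1, v1)
v2 = [b - c * a for a, b in zip(v1, x2)]                  # (-1/2, 1/2, 1)

c1, c2 = dot(y, v1) / dot(v1, v1), dot(y, v2) / dot(v2, v2)
p = [c1 * a + c2 * b for a, b in zip(v1, v2)]             # (3, 1, -2)
dist = math.sqrt(sum((yi - pi) ** 2 for yi, pi in zip(y, p)))
# dist is sqrt(3)
```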

Example 12.5

Find the distance from the point y = (0, 0, 0, 1) to the subspace V ⊂ R^4 spanned by the vectors x1 = (1, -1, 1, -1), x2 = (1, 1, 3, -1), x3 = (-3, 7, 1, 3).

Solution

First we apply the Gram-Schmidt process to the basis x1, x2, x3 and obtain an orthogonal basis v1, v2, v3 for the subspace V. Next we compute the orthogonal projection p of the vector y onto V:

p = ( <y, v1> / <v1, v1> ) v1 + ( <y, v2> / <v2, v2> ) v2 + ( <y, v3> / <v3, v3> ) v3

Then the distance from y to V equals ||y - p||.

Alternatively, we can apply the Gram-Schmidt process to the vectors x1, x2, x3, y. We should obtain an orthogonal system v1, v2, v3, v4; then the desired distance will be ||v4||.

v1 = x1 = (1, -1, 1, -1)

v2 = x2 - ( <x2, v1> / <v1, v1> ) v1 = (1, 1, 3, -1) - (4/4)(1, -1, 1, -1) = (0, 2, 2, 0)

v3 = x3 - ( <x3, v1> / <v1, v1> ) v1 - ( <x3, v2> / <v2, v2> ) v2 = (-3, 7, 1, 3) - (-12/4)(1, -1, 1, -1) - (16/8)(0, 2, 2, 0) = (0, 0, 0, 0)

The Gram-Schmidt process can be used to check linear independence of vectors! It failed here because the vector x3 is a linear combination of x1 and x2: V is a plane, not a 3-dimensional subspace. To fix things, it is enough to drop x3, i.e., we orthogonalize the vectors x1, x2, y:

v̂3 = y - ( <y, v1> / <v1, v1> ) v1 - ( <y, v2> / <v2, v2> ) v2 = (0, 0, 0, 1) - (-1/4)(1, -1, 1, -1) - (0/8)(0, 2, 2, 0) = (1/4, -1/4, 1/4, 3/4)

Then the distance from y to V equals

||v̂3|| = ||(1/4, -1/4, 1/4, 3/4)|| = √12 / 4 = √3 / 2
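A numeric check of the fixed computation in plain Python, assuming the vectors x1 = (1, -1, 1, -1) and x2 = (1, 1, 3, -1) (signs inferred from the arithmetic of the example):

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

x1 = [1, -1, 1, -1]
x2 = [1, 1, 3, -1]
y  = [0, 0, 0, 1]

v1 = x1
t = dot(x2, v1) / dot(v1, v1)
v2 = [b - t * a for a, b in zip(v1, x2)]          # (0, 2, 2, 0)
t1, t2 = dot(y, v1) / dot(v1, v1), dot(y, v2) / dot(v2, v2)
v3 = [c - t1 * a - t2 * b for a, b, c in zip(v1, v2, y)]
# v3 = (1/4, -1/4, 1/4, 3/4); the distance is |v3| = sqrt(3)/2
dist = math.sqrt(dot(v3, v3))
```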

Norm

The notion of norm generalizes the notion of length of a vector in R^n.

Definition. Let V be a vector space. A function α : V → R is called a norm on V if it has the following properties:

α(x) >= 0, and α(x) = 0 only for x = 0 (Positivity)

α(rx) = |r| α(x) for all r in R (Homogeneity)

α(x + y) <= α(x) + α(y) (Triangle Inequality)

Notation. The norm of a vector x in V is usually denoted ||x||. Different norms on V are distinguished by subscripts, e.g., ||x||_1 and ||x||_2.

Examples. Let V = R^n and let x = (x1, x2, ..., xn) be a vector in V.

||x||_∞ = max{ |x1|, |x2|, ..., |xn| }

Positivity and homogeneity are obvious. For the triangle inequality, let y = (y1, y2, ..., yn). Then x + y = (x1 + y1, x2 + y2, ..., xn + yn), and

|x_i + y_i| <= |x_i| + |y_i| <= max_j |x_j| + max_j |y_j|

hence

max_i |x_i + y_i| <= max_j |x_j| + max_j |y_j|,   i.e.   ||x + y||_∞ <= ||x||_∞ + ||y||_∞

||x||_1 = |x1| + |x2| + ... + |xn|

Positivity and homogeneity are obvious. For the triangle inequality, let y = (y1, y2, ..., yn). Then x + y = (x1 + y1, x2 + y2, ..., xn + yn), and

|x_i + y_i| <= |x_i| + |y_i|

Σ_i |x_i + y_i| <= Σ_i |x_i| + Σ_i |y_i|,   i.e.   ||x + y||_1 <= ||x||_1 + ||y||_1

||x||_p = ( |x1|^p + |x2|^p + ... + |xn|^p )^{1/p}

Positivity and homogeneity are obvious. For the triangle inequality, let y = (y1, y2, ..., yn), so that x + y = (x1 + y1, x2 + y2, ..., xn + yn). Now, using the Minkowski inequality

( |x1 + y1|^p + |x2 + y2|^p + ... + |xn + yn|^p )^{1/p} <= ( |x1|^p + ... + |xn|^p )^{1/p} + ( |y1|^p + ... + |yn|^p )^{1/p}

valid for p >= 1, it follows that ||x + y||_p <= ||x||_p + ||y||_p.
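These norms are easy to experiment with. A small sketch in plain Python checking the triangle inequality for several p (helper names are my own):

```python
def p_norm(x, p):
    """The p-norm (|x1|^p + ... + |xn|^p)^(1/p), p >= 1."""
    return sum(abs(xi) ** p for xi in x) ** (1.0 / p)

def max_norm(x):
    """The infinity-norm max_i |x_i|."""
    return max(abs(xi) for xi in x)

x, y = [1.0, -2.0, 0.5], [3.0, 0.5, -1.0]
xy = [a + b for a, b in zip(x, y)]

for p in (1, 1.5, 2, 3, 6):
    # Minkowski inequality for p >= 1
    assert p_norm(xy, p) <= p_norm(x, p) + p_norm(y, p)
assert max_norm(xy) <= max_norm(x) + max_norm(y)
```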

Normed vector space

Definition. A normed vector space is a vector space endowed with a norm.

The norm defines a distance function on the normed vector space: dist(x, y) = ||y - x||. Then we say that a vector x is a good approximation of a vector x0 if dist(x, x0) is small. Also, we say that a sequence of vectors x1, x2, ..., xn, ... converges to a vector x if dist(x, xn) → 0 as n → ∞.

Unit circle in the normed vector space V = R^2: {x in V : ||x|| = 1}

||x||_1 = |x1| + |x2|
||x||_{3/2} = ( |x1|^{3/2} + |x2|^{3/2} )^{2/3}
||x||_2 = ( x1^2 + x2^2 )^{1/2}
||x||_3 = ( |x1|^3 + |x2|^3 )^{1/3}
||x||_6 = ( |x1|^6 + |x2|^6 )^{1/6}
||x||_∞ = max{ |x1|, |x2| }

[Figure: unit circles of the norms above, growing from the diamond (p = 1) toward the square (p = ∞)]

V = C[a, b], the space of continuous functions f : [a, b] → R.

||f||_∞ = max_{a <= x <= b} |f(x)|

||f||_1 = ∫_a^b |f(x)| dx

||f||_p = ( ∫_a^b |f(x)|^p dx )^{1/p},   p >= 1

Theorem. ||f||_p is a norm on C[a, b] for any p >= 1.
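These function-space norms can be approximated numerically. A sketch in plain Python using the midpoint rule (the function name and tolerance are my own choices):

```python
import math

def f_p_norm(f, a, b, p, n=20000):
    """Midpoint-rule approximation of ( integral_a^b |f(x)|^p dx )^(1/p)."""
    h = (b - a) / n
    s = sum(abs(f(a + (i + 0.5) * h)) ** p for i in range(n)) * h
    return s ** (1.0 / p)

# ||sin||_2 on [0, pi]: the integral of sin^2 is pi/2, so the norm is sqrt(pi/2)
val = f_p_norm(math.sin, 0.0, math.pi, 2)
```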

Abstract Linear Algebra

The notion of inner product generalizes the notion of dot product of vectors in R^n.

Definition. Let V be a vector space. A function β : V × V → R, usually denoted β(x, y) = <x, y>, is called an inner product on V if it is positive, symmetric, and bilinear. That is:

i) <x, x> >= 0, and <x, x> = 0 only for x = 0 (Positivity)

ii) <x, y> = <y, x> (Symmetry)

iii) <rx, y> = r <x, y> (Homogeneity)

iv) <x + y, z> = <x, z> + <y, z> (Distributive Law)

An inner product space is a vector space endowed with an inner product.

Examples. V = R^n.

<x, y> = x · y = x1 y1 + x2 y2 + ... + xn yn

<x, y> = d1 x1 y1 + d2 x2 y2 + ... + dn xn yn, where d1, d2, ..., dn > 0

<x, y> = (Dx) · (Dy), where D is an invertible n × n matrix

Remarks

a) Invertibility of D is necessary to ensure that <x, x> = 0 implies x = 0.

b) The second example is a particular case of the third one, with D = diag(d1^{1/2}, d2^{1/2}, ..., dn^{1/2}).

Example 12.6

Find an inner product on R^2 such that <e1, e1> = 2, <e2, e2> = 3, and <e1, e2> = <e2, e1> = -1, where e1 = (1, 0), e2 = (0, 1).

Solution

Let x = (x1, x2), y = (y1, y2) in R^2. Then, using bilinearity, we obtain

<x, y> = <x1 e1 + x2 e2, y1 e1 + y2 e2> = x1 y1 <e1, e1> + x1 y2 <e1, e2> + x2 y1 <e2, e1> + x2 y2 <e2, e2>

<x, y> = 2 x1 y1 - x1 y2 - x2 y1 + 3 x2 y2

It remains to check that <x, x> > 0 for x != 0. Indeed,

<x, x> = 2 x1^2 - 2 x1 x2 + 3 x2^2 = (x1 - x2)^2 + x1^2 + 2 x2^2 > 0 for x != 0
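A direct spot check of this inner product in plain Python (the function name is my own; the prescribed values <e1, e1> = 2, <e2, e2> = 3, <e1, e2> = -1 are inferred from the formula above):

```python
def ip(x, y):
    """The inner product constructed in Example 12.6 on R^2."""
    return 2 * x[0] * y[0] - x[0] * y[1] - x[1] * y[0] + 3 * x[1] * y[1]

e1, e2 = (1, 0), (0, 1)
assert ip(e1, e1) == 2 and ip(e2, e2) == 3
assert ip(e1, e2) == ip(e2, e1) == -1          # symmetry
# positivity: <x, x> = (x1 - x2)^2 + x1^2 + 2*x2^2 > 0 for x != 0
for x in [(1, 0), (0, 1), (1, 1), (-2, 3), (0.3, -0.7)]:
    assert ip(x, x) > 0
```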

More examples.

V = M_{m,n}(R), the space of m × n matrices: <A, B> = trace(A B^T). If A = (a_ij) and B = (b_ij), then <A, B> = Σ_{i=1}^m Σ_{j=1}^n a_ij b_ij.

V = C[a, b]:

<f, g> = ∫_a^b f(x) g(x) dx

<f, g> = ∫_a^b w(x) f(x) g(x) dx, where w is bounded, piecewise continuous, and w > 0 everywhere on [a, b]; w is called the weight function.

Theorem. Suppose <x, y> is an inner product on a vector space V. Then, for all x, y in V,

<x, y>^2 <= <x, x> <y, y>

Proof. For any t in R, let v_t = x + t y. Then

<v_t, v_t> = <x + t y, x + t y> = <x, x> + 2t <x, y> + t^2 <y, y>

Now assume that y != 0 and let t = -<x, y> / <y, y>. Then

<v_t, v_t> = <x, x> - <x, y>^2 / <y, y>

Since <v_t, v_t> >= 0, the desired inequality follows. In the case y = 0, we have <x, y> = <y, y> = 0, and the inequality holds trivially.

Cauchy-Schwarz Inequality

|<x, y>| <= √(<x, x>) √(<y, y>)

Corollary. |<x, y>| <= ||x|| ||y||

Corollary. For any f, g in C[a, b],

( ∫_a^b f(x) g(x) dx )^2 <= ∫_a^b f(x)^2 dx · ∫_a^b g(x)^2 dx
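The inequality is easy to spot-check on random vectors with the standard dot product. A sketch in plain Python:

```python
import random

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

random.seed(0)
for _ in range(200):
    x = [random.uniform(-1, 1) for _ in range(5)]
    y = [random.uniform(-1, 1) for _ in range(5)]
    # <x, y>^2 <= <x, x> <y, y>  (small slack for rounding)
    assert dot(x, y) ** 2 <= dot(x, x) * dot(y, y) + 1e-12
```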

Norms induced by inner products

Theorem. Suppose <x, y> is an inner product on a vector space V. Then ||x|| = √(<x, x>) is a norm.

Proof. Positivity is obvious.

Homogeneity: ||rx|| = √(<rx, rx>) = √(r^2 <x, x>) = |r| ||x||.

Triangle inequality (follows from Cauchy-Schwarz):

||x + y||^2 = <x + y, x + y> = <x, x> + <x, y> + <y, x> + <y, y>

<x, x> + <x, y> + <y, x> + <y, y> <= ||x||^2 + 2 ||x|| ||y|| + ||y||^2 = ( ||x|| + ||y|| )^2

Examples.

The length of a vector in R^n, ||x|| = √(x1^2 + x2^2 + ... + xn^2), is induced by the dot product.

The norm ||f||_2 = ( ∫_a^b f(x)^2 dx )^{1/2} on the vector space C[a, b] is induced by the inner product <f, g> = ∫_a^b f(x) g(x) dx.

Angle

Let V be an inner product space with an inner product < , > and the induced norm ||·||. Then

|<x, y>| <= ||x|| ||y|| for all x, y in V (the Cauchy-Schwarz inequality)

Therefore we can define the angle between nonzero vectors in V by

∠(x, y) = arccos( <x, y> / ( ||x|| ||y|| ) )

Then <x, y> = ||x|| ||y|| cos ∠(x, y). In particular, vectors x and y are orthogonal (denoted x ⊥ y) if <x, y> = 0.
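With the induced norm, the angle formula can be coded directly. A sketch in plain Python for the standard dot product on R^n:

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def angle(x, y):
    """Angle between nonzero vectors, via arccos(<x,y> / (|x| |y|))."""
    return math.acos(dot(x, y) / math.sqrt(dot(x, x) * dot(y, y)))

assert abs(angle([1, 0], [1, 1]) - math.pi / 4) < 1e-12
assert abs(angle([1, 0], [0, 1]) - math.pi / 2) < 1e-12   # orthogonal vectors
```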

Orthogonal sets

Let V be an inner product space with an inner product < , > and the induced norm ||·||.

Definition. A nonempty set S ⊂ V of nonzero vectors is called an orthogonal set if all vectors in S are mutually orthogonal. That is, 0 ∉ S and <x, y> = 0 for any x, y in S, x != y. An orthogonal set S ⊂ V is called orthonormal if ||x|| = 1 for any x in S.

Remark. Vectors v1, v2, ..., vn in V form an orthonormal set if and only if

<v_i, v_j> = 1 if i = j, and 0 if i != j

Singular Value Decomposition

In this section, we assume throughout that A is an m × n matrix with m >= n. (This assumption is made for convenience only; all the results also hold if m < n.) We will present a method for determining how close A is to a matrix of smaller rank. The method involves factoring A into a product UΣV^T, where U is an m × m orthogonal matrix, V is an n × n orthogonal matrix, and Σ is an m × n matrix whose off-diagonal entries are all 0 and whose diagonal elements satisfy

σ1 >= σ2 >= ... >= σn >= 0

Σ is the m × n matrix with σ1, σ2, ..., σn on its diagonal and zeros everywhere else.

The σ's determined by this factorization are unique and are called the singular values of A. The factorization UΣV^T is called the singular value decomposition of A or, for short, the SVD of A.

The SVD Theorem. If A is an m × n matrix, then A has a singular value decomposition.

Sketch of the proof. A^T A is a symmetric n × n matrix. The eigenvalues of A^T A are all real, and it has an orthogonal diagonalizing matrix V. Furthermore, its eigenvalues must all be nonnegative. To see this, let λ be an eigenvalue of A^T A and x an eigenvector belonging to λ. It follows that

||Ax||^2 = x^T A^T A x = x^T λ x = λ x^T x = λ ||x||^2,   so   λ = ||Ax||^2 / ||x||^2 >= 0

We may assume that the columns of V have been ordered so that the corresponding eigenvalues satisfy λ1 >= λ2 >= ... >= λn >= 0. The singular values are given by

σ_j = √(λ_j),   j = 1, 2, ..., n

Let r denote the rank of A. The matrix A^T A will also have rank r. Since A^T A is symmetric, its rank equals the number of nonzero eigenvalues. Thus,

σ1 >= σ2 >= ... >= σr > 0,   σ_{r+1} = σ_{r+2} = ... = σn = 0

Now let V1 = (v1, v2, ..., vr) and V2 = (v_{r+1}, v_{r+2}, ..., vn). The column vectors of V1 are eigenvectors of A^T A belonging to λ_i, i = 1, 2, ..., r. The column vectors of V2 are eigenvectors of A^T A belonging to λ_j = 0, j = r + 1, r + 2, ..., n.

Now let Σ1 be the r × r diagonal matrix

Σ1 = diag(σ1, σ2, ..., σr)

The m × n matrix Σ is then given in block form by

Σ = [ Σ1  0 ]
    [ 0   0 ]

To complete the proof, we must show how to construct an m × m orthogonal matrix U such that

A = UΣV^T,   i.e.   AV = UΣ

Comparing the first r columns of each side of the last equation, we see that

A v_i = σ_i u_i,   i = 1, 2, ..., r

Thus, if we define

u_i = (1/σ_i) A v_i,   i = 1, 2, ..., r

and

U1 = (u1, u2, ..., ur)

then it follows that

A V1 = U1 Σ1

The column vectors of U1 form an orthonormal set; thus u1, u2, ..., ur form an orthonormal basis for R(A). The vector space R(A)^⊥ = N(A^T) has dimension m - r. Let {u_{r+1}, u_{r+2}, ..., u_m} be an orthonormal basis for N(A^T) and set

U2 = (u_{r+1}, u_{r+2}, ..., u_m)

Set the m × m matrix

U = (U1, U2)

Then the matrices U, Σ, and V satisfy

A = UΣV^T

Observations

Let A be an m × n matrix with a singular value decomposition A = UΣV^T.

The singular values σ1, ..., σn of A are unique; however, the matrices U and V are not unique.

Since V diagonalizes A^T A, it follows that the v_j's are eigenvectors of A^T A. Since A A^T = U Σ Σ^T U^T, it follows that U diagonalizes A A^T and that the u_j's are eigenvectors of A A^T.

The v_j's are called the right singular vectors of A, and the u_j's are called the left singular vectors of A.

If A has rank r, then

(i) v1, v2, ..., vr form an orthonormal basis for R(A^T).
(ii) v_{r+1}, v_{r+2}, ..., vn form an orthonormal basis for N(A).
(iii) u1, u2, ..., ur form an orthonormal basis for R(A).
(iv) u_{r+1}, u_{r+2}, ..., u_m form an orthonormal basis for N(A^T).

The rank of the matrix A is equal to the number of its nonzero singular values (where singular values are counted according to multiplicity).

In the case that A has rank r < n, if we set

V1 = (v1, v2, ..., vr),   U1 = (u1, u2, ..., ur)

and define Σ1 as before, then

A = U1 Σ1 V1^T

This factorization is called the compact form of the singular value decomposition of A. This form is useful in many applications.

Example 12.7

Let

A = [ 1  1 ]
    [ 1  1 ]
    [ 0  0 ]

Compute the singular values and the singular value decomposition of A.

Solution

The matrix

A^T A = [ 2  2 ]
        [ 2  2 ]

has eigenvalues λ1 = 4 and λ2 = 0. Consequently, the singular values of A are σ1 = √4 = 2 and σ2 = 0.

The eigenvalue λ1 has eigenvectors of the form α(1, 1)^T, and λ2 has eigenvectors of the form β(1, -1)^T. Therefore, the orthogonal matrix

V = (1/√2) [ 1   1 ]
           [ 1  -1 ]

diagonalizes A^T A. From what we discussed before, it follows that

u1 = (1/σ1) A v1 = (1/2) A (1/√2, 1/√2)^T = (1/√2, 1/√2, 0)^T

The remaining column vectors of U must form an orthonormal basis for N(A^T). We can compute a basis {x2, x3} for N(A^T) in the usual way:

x2 = (1, -1, 0)^T,   x3 = (0, 0, 1)^T

Since these vectors are already orthogonal, it is not necessary to use the Gram-Schmidt process to obtain an orthonormal basis.

We need only set

u2 = x2 / ||x2|| = (1/√2, -1/√2, 0)^T,   u3 = x3 / ||x3|| = (0, 0, 1)^T

It then follows that A = UΣV^T, where

U = [ 1/√2   1/√2   0 ]
    [ 1/√2  -1/√2   0 ]
    [ 0      0      1 ]

Σ = [ 2  0 ]
    [ 0  0 ]
    [ 0  0 ]

V^T = (1/√2) [ 1   1 ]
             [ 1  -1 ]
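The factorization can be verified by multiplying the three matrices back together. A sketch in plain Python, with the matrices as reconstructed here (A itself is recovered from the decomposition, since the original entries were lost in transcription):

```python
import math

def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

s = 1 / math.sqrt(2)
U  = [[s, s, 0], [s, -s, 0], [0, 0, 1]]   # columns u1, u2, u3
S  = [[2, 0], [0, 0], [0, 0]]             # Sigma, 3 x 2
Vt = [[s, s], [s, -s]]                    # V^T

A = matmul(matmul(U, S), Vt)
# A should equal [[1, 1], [1, 1], [0, 0]] up to rounding
```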

Observation

If A has singular value decomposition UΣV^T, then A can be represented by the outer product expansion

A = σ1 u1 v1^T + σ2 u2 v2^T + ... + σn un vn^T

The closest matrix of rank k is obtained by truncating this sum after the first k terms:

A_k = σ1 u1 v1^T + σ2 u2 v2^T + ... + σk uk vk^T,   k < n
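For the matrix of Example 12.7 the rank-1 truncation already reproduces A, since σ2 = 0. A sketch in plain Python, with u1, v1, σ1 taken from that example as reconstructed here (`scaled_outer` is my own helper):

```python
import math

def scaled_outer(sigma, u, v):
    """sigma * u v^T as a list-of-rows matrix."""
    return [[sigma * ui * vj for vj in v] for ui in u]

s = 1 / math.sqrt(2)
u1, v1 = [s, s, 0], [s, s]
A1 = scaled_outer(2.0, u1, v1)    # the rank-1 term sigma1 u1 v1^T
# A1 approximates [[1, 1], [1, 1], [0, 0]]; here it equals A because sigma2 = 0
```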


More information

Contents. Appendix D (Inner Product Spaces) W-51. Index W-63

Contents. Appendix D (Inner Product Spaces) W-51. Index W-63 Contents Appendix D (Inner Product Spaces W-5 Index W-63 Inner city space W-49 W-5 Chapter : Appendix D Inner Product Spaces The inner product, taken of any two vectors in an arbitrary vector space, generalizes

More information

Quadratic forms. Here. Thus symmetric matrices are diagonalizable, and the diagonalization can be performed by means of an orthogonal matrix.

Quadratic forms. Here. Thus symmetric matrices are diagonalizable, and the diagonalization can be performed by means of an orthogonal matrix. Quadratic forms 1. Symmetric matrices An n n matrix (a ij ) n ij=1 with entries on R is called symmetric if A T, that is, if a ij = a ji for all 1 i, j n. We denote by S n (R) the set of all n n symmetric

More information

Section 7.5 Inner Product Spaces

Section 7.5 Inner Product Spaces Section 7.5 Inner Product Spaces With the dot product defined in Chapter 6, we were able to study the following properties of vectors in R n. ) Length or norm of a vector u. ( u = p u u ) 2) Distance of

More information

Chapter 4 Euclid Space

Chapter 4 Euclid Space Chapter 4 Euclid Space Inner Product Spaces Definition.. Let V be a real vector space over IR. A real inner product on V is a real valued function on V V, denoted by (, ), which satisfies () (x, y) = (y,

More information

7. Symmetric Matrices and Quadratic Forms

7. Symmetric Matrices and Quadratic Forms Linear Algebra 7. Symmetric Matrices and Quadratic Forms CSIE NCU 1 7. Symmetric Matrices and Quadratic Forms 7.1 Diagonalization of symmetric matrices 2 7.2 Quadratic forms.. 9 7.4 The singular value

More information

Review problems for MA 54, Fall 2004.

Review problems for MA 54, Fall 2004. Review problems for MA 54, Fall 2004. Below are the review problems for the final. They are mostly homework problems, or very similar. If you are comfortable doing these problems, you should be fine on

More information

Algebra II. Paulius Drungilas and Jonas Jankauskas

Algebra II. Paulius Drungilas and Jonas Jankauskas Algebra II Paulius Drungilas and Jonas Jankauskas Contents 1. Quadratic forms 3 What is quadratic form? 3 Change of variables. 3 Equivalence of quadratic forms. 4 Canonical form. 4 Normal form. 7 Positive

More information

Chapter 6: Orthogonality

Chapter 6: Orthogonality Chapter 6: Orthogonality (Last Updated: November 7, 7) These notes are derived primarily from Linear Algebra and its applications by David Lay (4ed). A few theorems have been moved around.. Inner products

More information

MATH 1120 (LINEAR ALGEBRA 1), FINAL EXAM FALL 2011 SOLUTIONS TO PRACTICE VERSION

MATH 1120 (LINEAR ALGEBRA 1), FINAL EXAM FALL 2011 SOLUTIONS TO PRACTICE VERSION MATH (LINEAR ALGEBRA ) FINAL EXAM FALL SOLUTIONS TO PRACTICE VERSION Problem (a) For each matrix below (i) find a basis for its column space (ii) find a basis for its row space (iii) determine whether

More information

The following definition is fundamental.

The following definition is fundamental. 1. Some Basics from Linear Algebra With these notes, I will try and clarify certain topics that I only quickly mention in class. First and foremost, I will assume that you are familiar with many basic

More information

Math Camp Lecture 4: Linear Algebra. Xiao Yu Wang. Aug 2010 MIT. Xiao Yu Wang (MIT) Math Camp /10 1 / 88

Math Camp Lecture 4: Linear Algebra. Xiao Yu Wang. Aug 2010 MIT. Xiao Yu Wang (MIT) Math Camp /10 1 / 88 Math Camp 2010 Lecture 4: Linear Algebra Xiao Yu Wang MIT Aug 2010 Xiao Yu Wang (MIT) Math Camp 2010 08/10 1 / 88 Linear Algebra Game Plan Vector Spaces Linear Transformations and Matrices Determinant

More information

The Singular Value Decomposition (SVD) and Principal Component Analysis (PCA)

The Singular Value Decomposition (SVD) and Principal Component Analysis (PCA) Chapter 5 The Singular Value Decomposition (SVD) and Principal Component Analysis (PCA) 5.1 Basics of SVD 5.1.1 Review of Key Concepts We review some key definitions and results about matrices that will

More information

STA141C: Big Data & High Performance Statistical Computing

STA141C: Big Data & High Performance Statistical Computing STA141C: Big Data & High Performance Statistical Computing Numerical Linear Algebra Background Cho-Jui Hsieh UC Davis May 15, 2018 Linear Algebra Background Vectors A vector has a direction and a magnitude

More information

MATH 423 Linear Algebra II Lecture 28: Inner product spaces.

MATH 423 Linear Algebra II Lecture 28: Inner product spaces. MATH 423 Liner Algebr II Lecture 28: Inner product spces. Norm The notion of norm generlizes the notion of length of vector in R 3. Definition. Let V be vector spce over F, where F = R or C. A function

More information

Lecture 23: 6.1 Inner Products

Lecture 23: 6.1 Inner Products Lecture 23: 6.1 Inner Products Wei-Ta Chu 2008/12/17 Definition An inner product on a real vector space V is a function that associates a real number u, vwith each pair of vectors u and v in V in such

More information

1. What is the determinant of the following matrix? a 1 a 2 4a 3 2a 2 b 1 b 2 4b 3 2b c 1. = 4, then det

1. What is the determinant of the following matrix? a 1 a 2 4a 3 2a 2 b 1 b 2 4b 3 2b c 1. = 4, then det What is the determinant of the following matrix? 3 4 3 4 3 4 4 3 A 0 B 8 C 55 D 0 E 60 If det a a a 3 b b b 3 c c c 3 = 4, then det a a 4a 3 a b b 4b 3 b c c c 3 c = A 8 B 6 C 4 D E 3 Let A be an n n matrix

More information

linearly indepedent eigenvectors as the multiplicity of the root, but in general there may be no more than one. For further discussion, assume matrice

linearly indepedent eigenvectors as the multiplicity of the root, but in general there may be no more than one. For further discussion, assume matrice 3. Eigenvalues and Eigenvectors, Spectral Representation 3.. Eigenvalues and Eigenvectors A vector ' is eigenvector of a matrix K, if K' is parallel to ' and ' 6, i.e., K' k' k is the eigenvalue. If is

More information

Finite-dimensional spaces. C n is the space of n-tuples x = (x 1,..., x n ) of complex numbers. It is a Hilbert space with the inner product

Finite-dimensional spaces. C n is the space of n-tuples x = (x 1,..., x n ) of complex numbers. It is a Hilbert space with the inner product Chapter 4 Hilbert Spaces 4.1 Inner Product Spaces Inner Product Space. A complex vector space E is called an inner product space (or a pre-hilbert space, or a unitary space) if there is a mapping (, )

More information

MTH 2032 SemesterII

MTH 2032 SemesterII MTH 202 SemesterII 2010-11 Linear Algebra Worked Examples Dr. Tony Yee Department of Mathematics and Information Technology The Hong Kong Institute of Education December 28, 2011 ii Contents Table of Contents

More information

Linear Algebra (Review) Volker Tresp 2017

Linear Algebra (Review) Volker Tresp 2017 Linear Algebra (Review) Volker Tresp 2017 1 Vectors k is a scalar (a number) c is a column vector. Thus in two dimensions, c = ( c1 c 2 ) (Advanced: More precisely, a vector is defined in a vector space.

More information

MAT 610: Numerical Linear Algebra. James V. Lambers

MAT 610: Numerical Linear Algebra. James V. Lambers MAT 610: Numerical Linear Algebra James V Lambers January 16, 2017 2 Contents 1 Matrix Multiplication Problems 7 11 Introduction 7 111 Systems of Linear Equations 7 112 The Eigenvalue Problem 8 12 Basic

More information

MATH 167: APPLIED LINEAR ALGEBRA Chapter 3

MATH 167: APPLIED LINEAR ALGEBRA Chapter 3 MATH 167: APPLIED LINEAR ALGEBRA Chapter 3 Jesús De Loera, UC Davis February 18, 2012 Orthogonal Vectors and Subspaces (3.1). In real life vector spaces come with additional METRIC properties!! We have

More information

Review of some mathematical tools

Review of some mathematical tools MATHEMATICAL FOUNDATIONS OF SIGNAL PROCESSING Fall 2016 Benjamín Béjar Haro, Mihailo Kolundžija, Reza Parhizkar, Adam Scholefield Teaching assistants: Golnoosh Elhami, Hanjie Pan Review of some mathematical

More information

Mathematics Department Stanford University Math 61CM/DM Inner products

Mathematics Department Stanford University Math 61CM/DM Inner products Mathematics Department Stanford University Math 61CM/DM Inner products Recall the definition of an inner product space; see Appendix A.8 of the textbook. Definition 1 An inner product space V is a vector

More information

NORMS ON SPACE OF MATRICES

NORMS ON SPACE OF MATRICES NORMS ON SPACE OF MATRICES. Operator Norms on Space of linear maps Let A be an n n real matrix and x 0 be a vector in R n. We would like to use the Picard iteration method to solve for the following system

More information

LINEAR ALGEBRA REVIEW

LINEAR ALGEBRA REVIEW LINEAR ALGEBRA REVIEW JC Stuff you should know for the exam. 1. Basics on vector spaces (1) F n is the set of all n-tuples (a 1,... a n ) with a i F. It forms a VS with the operations of + and scalar multiplication

More information

MATH 31 - ADDITIONAL PRACTICE PROBLEMS FOR FINAL

MATH 31 - ADDITIONAL PRACTICE PROBLEMS FOR FINAL MATH 3 - ADDITIONAL PRACTICE PROBLEMS FOR FINAL MAIN TOPICS FOR THE FINAL EXAM:. Vectors. Dot product. Cross product. Geometric applications. 2. Row reduction. Null space, column space, row space, left

More information

. = V c = V [x]v (5.1) c 1. c k

. = V c = V [x]v (5.1) c 1. c k Chapter 5 Linear Algebra It can be argued that all of linear algebra can be understood using the four fundamental subspaces associated with a matrix Because they form the foundation on which we later work,

More information

Conceptual Questions for Review

Conceptual Questions for Review Conceptual Questions for Review Chapter 1 1.1 Which vectors are linear combinations of v = (3, 1) and w = (4, 3)? 1.2 Compare the dot product of v = (3, 1) and w = (4, 3) to the product of their lengths.

More information

CS 246 Review of Linear Algebra 01/17/19

CS 246 Review of Linear Algebra 01/17/19 1 Linear algebra In this section we will discuss vectors and matrices. We denote the (i, j)th entry of a matrix A as A ij, and the ith entry of a vector as v i. 1.1 Vectors and vector operations A vector

More information

Lecture 3: Review of Linear Algebra

Lecture 3: Review of Linear Algebra ECE 83 Fall 2 Statistical Signal Processing instructor: R Nowak Lecture 3: Review of Linear Algebra Very often in this course we will represent signals as vectors and operators (eg, filters, transforms,

More information

2. Linear algebra. matrices and vectors. linear equations. range and nullspace of matrices. function of vectors, gradient and Hessian

2. Linear algebra. matrices and vectors. linear equations. range and nullspace of matrices. function of vectors, gradient and Hessian FE661 - Statistical Methods for Financial Engineering 2. Linear algebra Jitkomut Songsiri matrices and vectors linear equations range and nullspace of matrices function of vectors, gradient and Hessian

More information

LINEAR ALGEBRA 1, 2012-I PARTIAL EXAM 3 SOLUTIONS TO PRACTICE PROBLEMS

LINEAR ALGEBRA 1, 2012-I PARTIAL EXAM 3 SOLUTIONS TO PRACTICE PROBLEMS LINEAR ALGEBRA, -I PARTIAL EXAM SOLUTIONS TO PRACTICE PROBLEMS Problem (a) For each of the two matrices below, (i) determine whether it is diagonalizable, (ii) determine whether it is orthogonally diagonalizable,

More information

Definitions for Quizzes

Definitions for Quizzes Definitions for Quizzes Italicized text (or something close to it) will be given to you. Plain text is (an example of) what you should write as a definition. [Bracketed text will not be given, nor does

More information

Lecture 3: Review of Linear Algebra

Lecture 3: Review of Linear Algebra ECE 83 Fall 2 Statistical Signal Processing instructor: R Nowak, scribe: R Nowak Lecture 3: Review of Linear Algebra Very often in this course we will represent signals as vectors and operators (eg, filters,

More information

Basic Elements of Linear Algebra

Basic Elements of Linear Algebra A Basic Review of Linear Algebra Nick West nickwest@stanfordedu September 16, 2010 Part I Basic Elements of Linear Algebra Although the subject of linear algebra is much broader than just vectors and matrices,

More information

Stat 159/259: Linear Algebra Notes

Stat 159/259: Linear Algebra Notes Stat 159/259: Linear Algebra Notes Jarrod Millman November 16, 2015 Abstract These notes assume you ve taken a semester of undergraduate linear algebra. In particular, I assume you are familiar with the

More information

Linear Algebra (Review) Volker Tresp 2018

Linear Algebra (Review) Volker Tresp 2018 Linear Algebra (Review) Volker Tresp 2018 1 Vectors k, M, N are scalars A one-dimensional array c is a column vector. Thus in two dimensions, ( ) c1 c = c 2 c i is the i-th component of c c T = (c 1, c

More information

Economics 204 Summer/Fall 2010 Lecture 10 Friday August 6, 2010

Economics 204 Summer/Fall 2010 Lecture 10 Friday August 6, 2010 Economics 204 Summer/Fall 2010 Lecture 10 Friday August 6, 2010 Diagonalization of Symmetric Real Matrices (from Handout Definition 1 Let δ ij = { 1 if i = j 0 if i j A basis V = {v 1,..., v n } of R n

More information

IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET

IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET This is a (not quite comprehensive) list of definitions and theorems given in Math 1553. Pay particular attention to the ones in red. Study Tip For each

More information

BASIC ALGORITHMS IN LINEAR ALGEBRA. Matrices and Applications of Gaussian Elimination. A 2 x. A T m x. A 1 x A T 1. A m x

BASIC ALGORITHMS IN LINEAR ALGEBRA. Matrices and Applications of Gaussian Elimination. A 2 x. A T m x. A 1 x A T 1. A m x BASIC ALGORITHMS IN LINEAR ALGEBRA STEVEN DALE CUTKOSKY Matrices and Applications of Gaussian Elimination Systems of Equations Suppose that A is an n n matrix with coefficents in a field F, and x = (x,,

More information

Computational math: Assignment 1

Computational math: Assignment 1 Computational math: Assignment 1 Thanks Ting Gao for her Latex file 11 Let B be a 4 4 matrix to which we apply the following operations: 1double column 1, halve row 3, 3add row 3 to row 1, 4interchange

More information

MATH 431: FIRST MIDTERM. Thursday, October 3, 2013.

MATH 431: FIRST MIDTERM. Thursday, October 3, 2013. MATH 431: FIRST MIDTERM Thursday, October 3, 213. (1) An inner product on the space of matrices. Let V be the vector space of 2 2 real matrices (that is, the algebra Mat 2 (R), but without the mulitiplicative

More information

The Singular Value Decomposition and Least Squares Problems

The Singular Value Decomposition and Least Squares Problems The Singular Value Decomposition and Least Squares Problems Tom Lyche Centre of Mathematics for Applications, Department of Informatics, University of Oslo September 27, 2009 Applications of SVD solving

More information

1 Last time: least-squares problems

1 Last time: least-squares problems MATH Linear algebra (Fall 07) Lecture Last time: least-squares problems Definition. If A is an m n matrix and b R m, then a least-squares solution to the linear system Ax = b is a vector x R n such that

More information

Chapter 1. Preliminaries. The purpose of this chapter is to provide some basic background information. Linear Space. Hilbert Space.

Chapter 1. Preliminaries. The purpose of this chapter is to provide some basic background information. Linear Space. Hilbert Space. Chapter 1 Preliminaries The purpose of this chapter is to provide some basic background information. Linear Space Hilbert Space Basic Principles 1 2 Preliminaries Linear Space The notion of linear space

More information

Glossary of Linear Algebra Terms. Prepared by Vince Zaccone For Campus Learning Assistance Services at UCSB

Glossary of Linear Algebra Terms. Prepared by Vince Zaccone For Campus Learning Assistance Services at UCSB Glossary of Linear Algebra Terms Basis (for a subspace) A linearly independent set of vectors that spans the space Basic Variable A variable in a linear system that corresponds to a pivot column in the

More information

1. Foundations of Numerics from Advanced Mathematics. Linear Algebra

1. Foundations of Numerics from Advanced Mathematics. Linear Algebra Foundations of Numerics from Advanced Mathematics Linear Algebra Linear Algebra, October 23, 22 Linear Algebra Mathematical Structures a mathematical structure consists of one or several sets and one or

More information

235 Final exam review questions

235 Final exam review questions 5 Final exam review questions Paul Hacking December 4, 0 () Let A be an n n matrix and T : R n R n, T (x) = Ax the linear transformation with matrix A. What does it mean to say that a vector v R n is an

More information

Math 24 Spring 2012 Sample Homework Solutions Week 8

Math 24 Spring 2012 Sample Homework Solutions Week 8 Math 4 Spring Sample Homework Solutions Week 8 Section 5. (.) Test A M (R) for diagonalizability, and if possible find an invertible matrix Q and a diagonal matrix D such that Q AQ = D. ( ) 4 (c) A =.

More information