Spring, 2012 CIS 515. Fundamentals of Linear Algebra and Optimization Jean Gallier


Homework 5 & 6 + Project 3 & 4

April 7, 2012; Due May 7, 2012

Note: Problems B2 and B6 are for extra credit.

Problem B1. Let A be any real or complex n × n matrix and let ‖·‖ be any operator norm. Prove that for every m ≥ 1,
\[
\Bigl\| I + \sum_{k=1}^{m} \frac{A^k}{k!} \Bigr\| \le e^{\|A\|}.
\]
Deduce from the above that the sequence (E_m) of matrices
\[
E_m = I + \sum_{k=1}^{m} \frac{A^k}{k!}
\]
converges to a limit denoted e^A and called the exponential of A.
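
The following is a small numerical illustration of Problem B1, not part of the assignment: the partial sums E_m converge quickly to the matrix exponential. It is a Python/NumPy sketch under my own naming; scipy.linalg.expm is used only as a reference value, and the 2-norm stands in for an arbitrary operator norm.

    import numpy as np
    from scipy.linalg import expm

    def exp_partial_sum(A, m):
        """Return E_m = I + sum_{k=1}^m A^k / k!  (partial sums of the exponential series)."""
        n = A.shape[0]
        E = np.eye(n)
        term = np.eye(n)
        for k in range(1, m + 1):
            term = term @ A / k          # builds A^k / k! incrementally
            E = E + term
        return E

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        A = rng.standard_normal((4, 4))
        for m in (2, 5, 10, 20):
            Em = exp_partial_sum(A, m)
            # The bound of Problem B1: ||E_m|| <= e^{||A||} (here with the 2-norm).
            print(m,
                  np.linalg.norm(Em, 2) <= np.exp(np.linalg.norm(A, 2)),
                  np.linalg.norm(Em - expm(A), 2))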

Problem B2 (Extra Credit, 60 pts). Recall that the affine maps ρ: R^2 → R^2 defined such that
\[
\rho\begin{pmatrix} x \\ y \end{pmatrix}
= \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}
  \begin{pmatrix} x \\ y \end{pmatrix}
+ \begin{pmatrix} w_1 \\ w_2 \end{pmatrix},
\]
where θ, w_1, w_2 ∈ R, are rigid motions (or direct affine isometries), and that they form the group SE(2). Given any map ρ as above, if we let
\[
R = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix},
\qquad
X = \begin{pmatrix} x \\ y \end{pmatrix},
\qquad
W = \begin{pmatrix} w_1 \\ w_2 \end{pmatrix},
\]
then ρ can be represented by the 3 × 3 matrix
\[
A = \begin{pmatrix} R & W \\ 0 & 1 \end{pmatrix}
  = \begin{pmatrix} \cos\theta & -\sin\theta & w_1 \\ \sin\theta & \cos\theta & w_2 \\ 0 & 0 & 1 \end{pmatrix}
\]
in the sense that
\[
\begin{pmatrix} \rho(X) \\ 1 \end{pmatrix}
= \begin{pmatrix} R & W \\ 0 & 1 \end{pmatrix}
  \begin{pmatrix} X \\ 1 \end{pmatrix}
\]
iff
\[
\rho(X) = RX + W.
\]

(a) Consider the set of matrices of the form
\[
\begin{pmatrix} 0 & -\theta & u \\ \theta & 0 & v \\ 0 & 0 & 0 \end{pmatrix},
\]
where θ, u, v ∈ R. Verify that this set of matrices is a vector space isomorphic to (R^3, +). This vector space is denoted by se(2). Show that, in general, BC ≠ CB if B, C ∈ se(2).

(b) Given a matrix
\[
B = \begin{pmatrix} 0 & -\theta & u \\ \theta & 0 & v \\ 0 & 0 & 0 \end{pmatrix},
\]
prove that if θ = 0, then
\[
e^B = \begin{pmatrix} 1 & 0 & u \\ 0 & 1 & v \\ 0 & 0 & 1 \end{pmatrix},
\]
and that if θ ≠ 0, then
\[
e^B = \begin{pmatrix}
\cos\theta & -\sin\theta & \dfrac{u\sin\theta + v(\cos\theta - 1)}{\theta} \\
\sin\theta & \cos\theta & \dfrac{u(1 - \cos\theta) + v\sin\theta}{\theta} \\
0 & 0 & 1
\end{pmatrix}.
\]
Hint: Prove that B^3 = -θ^2 B and that
\[
e^B = I_3 + \frac{\sin\theta}{\theta}\, B + \frac{1 - \cos\theta}{\theta^2}\, B^2.
\]

(c) Check that e^B is a direct affine isometry in SE(2). Prove that the exponential map exp: se(2) → SE(2) is surjective. If θ ≠ k2π (k ∈ Z), how do you need to restrict θ to get an injective map?
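
As a sanity check on the closed form of Problem B2(b), one can compare it against a general-purpose matrix exponential. This is only a sketch (the helper name exp_se2 is mine, and the formula is the one stated above):

    import numpy as np
    from scipy.linalg import expm

    def exp_se2(theta, u, v):
        """Closed-form exponential of B = [[0, -theta, u], [theta, 0, v], [0, 0, 0]]."""
        if np.isclose(theta, 0.0):
            return np.array([[1.0, 0.0, u],
                             [0.0, 1.0, v],
                             [0.0, 0.0, 1.0]])
        c, s = np.cos(theta), np.sin(theta)
        return np.array([[c, -s, (u * s + v * (c - 1.0)) / theta],
                         [s,  c, (u * (1.0 - c) + v * s) / theta],
                         [0.0, 0.0, 1.0]])

    if __name__ == "__main__":
        theta, u, v = 0.7, 1.0, -2.0
        B = np.array([[0.0, -theta, u], [theta, 0.0, v], [0.0, 0.0, 0.0]])
        print(np.allclose(exp_se2(theta, u, v), expm(B)))   # expected: True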

Problem B3. Consider the affine maps ρ: R^2 → R^2 defined such that
\[
\rho\begin{pmatrix} x \\ y \end{pmatrix}
= \alpha \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}
  \begin{pmatrix} x \\ y \end{pmatrix}
+ \begin{pmatrix} w_1 \\ w_2 \end{pmatrix},
\]
where θ, w_1, w_2, α ∈ R, with α > 0. These maps are called (direct) affine similitudes (for short, similitudes). The number α > 0 is the scale factor of the similitude. These affine maps are the composition of a rotation of angle θ, a rescaling by α > 0, and a translation.

(a) Prove that these maps form a group that we denote by SIM(2). Given any map ρ as above, if we let
\[
R = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix},
\qquad
X = \begin{pmatrix} x \\ y \end{pmatrix},
\qquad
W = \begin{pmatrix} w_1 \\ w_2 \end{pmatrix},
\]
then ρ can be represented by the 3 × 3 matrix
\[
A = \begin{pmatrix} \alpha R & W \\ 0 & 1 \end{pmatrix}
  = \begin{pmatrix} \alpha\cos\theta & -\alpha\sin\theta & w_1 \\ \alpha\sin\theta & \alpha\cos\theta & w_2 \\ 0 & 0 & 1 \end{pmatrix}
\]
in the sense that
\[
\begin{pmatrix} \rho(X) \\ 1 \end{pmatrix}
= \begin{pmatrix} \alpha R & W \\ 0 & 1 \end{pmatrix}
  \begin{pmatrix} X \\ 1 \end{pmatrix}
\]
iff
\[
\rho(X) = \alpha R X + W.
\]

(b) Consider the set of matrices of the form
\[
\begin{pmatrix} \lambda & -\theta & u \\ \theta & \lambda & v \\ 0 & 0 & 0 \end{pmatrix},
\]
where θ, λ, u, v ∈ R. Verify that this set of matrices is a vector space isomorphic to (R^4, +). This vector space is denoted by sim(2).

(c) Given a matrix
\[
\Omega = \begin{pmatrix} \lambda & -\theta \\ \theta & \lambda \end{pmatrix},
\]
prove that
\[
e^{\Omega} = e^{\lambda}\begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}.
\]
Hint: Write Ω = λI_2 + θJ, with
\[
J = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}.
\]

Observe that J^2 = -I_2 and prove by induction on k that
\[
\Omega^k = \frac{(\lambda + i\theta)^k + (\lambda - i\theta)^k}{2}\, I
         + \frac{(\lambda + i\theta)^k - (\lambda - i\theta)^k}{2i}\, J.
\]

(d) As in (c), write
\[
\Omega = \begin{pmatrix} \lambda & -\theta \\ \theta & \lambda \end{pmatrix},
\]
let
\[
U = \begin{pmatrix} u \\ v \end{pmatrix},
\]
and let
\[
B = \begin{pmatrix} \Omega & U \\ 0 & 0 \end{pmatrix}.
\]
Prove that
\[
B^n = \begin{pmatrix} \Omega^n & \Omega^{n-1} U \\ 0 & 0 \end{pmatrix},
\]
where Ω^0 = I. Prove that
\[
e^B = \begin{pmatrix} e^{\Omega} & V U \\ 0 & 1 \end{pmatrix},
\quad\text{where}\quad
V = I + \sum_{k \ge 1} \frac{\Omega^k}{(k+1)!}.
\]

(e) Prove that
\[
V = I + \sum_{k \ge 1} \frac{\Omega^k}{(k+1)!} = \int_0^1 e^{\Omega t}\, dt.
\]
Use this formula to prove that if λ = θ = 0, then V = I; else
\[
V = \frac{1}{\lambda^2 + \theta^2}
\begin{pmatrix}
\lambda(e^{\lambda}\cos\theta - 1) + e^{\lambda}\theta\sin\theta &
\theta(e^{\lambda}\cos\theta - 1) - e^{\lambda}\lambda\sin\theta \\
\theta(1 - e^{\lambda}\cos\theta) + e^{\lambda}\lambda\sin\theta &
\lambda(e^{\lambda}\cos\theta - 1) + e^{\lambda}\theta\sin\theta
\end{pmatrix}.
\]
Observe that for λ = 0, the above gives back the expression in B2(b) for θ ≠ 0. Conclude that if λ = θ = 0, then
\[
e^B = \begin{pmatrix} I & U \\ 0 & 1 \end{pmatrix};
\]

else
\[
e^B = \begin{pmatrix} e^{\Omega} & V U \\ 0 & 1 \end{pmatrix},
\]
with
\[
e^{\Omega} = e^{\lambda}\begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}
\]
and
\[
V = \frac{1}{\lambda^2 + \theta^2}
\begin{pmatrix}
\lambda(e^{\lambda}\cos\theta - 1) + e^{\lambda}\theta\sin\theta &
\theta(e^{\lambda}\cos\theta - 1) - e^{\lambda}\lambda\sin\theta \\
\theta(1 - e^{\lambda}\cos\theta) + e^{\lambda}\lambda\sin\theta &
\lambda(e^{\lambda}\cos\theta - 1) + e^{\lambda}\theta\sin\theta
\end{pmatrix},
\]
and that e^B ∈ SIM(2), with scale factor e^λ.

(f) Prove that the exponential map exp: sim(2) → SIM(2) is surjective.

(g) Similitudes can be used to describe certain deformations (or flows) of a deformable body B_t in the plane. Given some initial shape B in the plane (for example, a circle), a deformation of B is given by a piecewise differentiable curve D: [0, T] → SIM(2), where each D(t) is a similitude (for some T > 0). The deformed body B_t at time t is given by B_t = D(t)(B).

The surjectivity of the exponential map exp: sim(2) → SIM(2) implies that there is a map log: SIM(2) → sim(2), although it is multivalued. The exponential map and the log "function" allow us to work in the simpler (noncurved) Euclidean space sim(2). For instance, given two similitudes A_1, A_2 ∈ SIM(2) specifying the shape of B at two different times, we can compute log(A_1) and log(A_2), which are just elements of the Euclidean space sim(2), form the linear interpolant (1 - t) log(A_1) + t log(A_2), and then apply the exponential map to get an interpolating deformation
\[
t \mapsto e^{(1-t)\log(A_1) + t\log(A_2)}, \qquad t \in [0, 1].
\]
Also, given a sequence of "snapshots" of the deformable body B, say A_0, A_1, ..., A_m, where each A_i is a similitude, we can try to find an interpolating deformation (a curve in SIM(2)) by finding a simpler curve t ↦ C(t) in sim(2) (say, a B-spline) interpolating log A_0, log A_1, ..., log A_m. Then the curve t ↦ e^{C(t)} yields a deformation in SIM(2) interpolating A_0, A_1, ..., A_m.

(1) (75 pts). Write a program interpolating between two deformations.

(2) Write a program using your cubic spline interpolation program from the first project to interpolate a sequence of deformations given by similitudes A_0, A_1, ..., A_m.
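
For the programming tasks of Problem B3(g), one possible starting point is sketched below. It interpolates in sim(2) with a generic matrix logarithm and exponential (scipy.linalg.logm/expm); a full solution would presumably use the closed-form exponential of part (e) and a hand-written log, so treat this only as an illustration with hypothetical helper names.

    import numpy as np
    from scipy.linalg import expm, logm

    def similitude(alpha, theta, w):
        """3x3 matrix of the similitude with scale alpha > 0, angle theta, translation w."""
        c, s = np.cos(theta), np.sin(theta)
        A = np.eye(3)
        A[:2, :2] = alpha * np.array([[c, -s], [s, c]])
        A[:2, 2] = w
        return A

    def interpolate_deformation(A1, A2, ts):
        """Return exp((1 - t) log A1 + t log A2) for each t in ts, as in B3(g)."""
        L1, L2 = np.real(logm(A1)), np.real(logm(A2))   # principal logs, real for moderate angles
        return [expm((1.0 - t) * L1 + t * L2) for t in ts]

    if __name__ == "__main__":
        A1 = similitude(1.0, 0.0, np.array([0.0, 0.0]))
        A2 = similitude(2.0, np.pi / 3, np.array([3.0, 1.0]))
        for D in interpolate_deformation(A1, A2, np.linspace(0.0, 1.0, 5)):
            print(np.round(D, 3))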

Problem B4 (40 pts). Recall that the coordinates of the cross product u × v of two vectors u = (u_1, u_2, u_3) and v = (v_1, v_2, v_3) in R^3 are
(u_2 v_3 - u_3 v_2, u_3 v_1 - u_1 v_3, u_1 v_2 - u_2 v_1).

(a) If we let U be the matrix
\[
U = \begin{pmatrix} 0 & -u_3 & u_2 \\ u_3 & 0 & -u_1 \\ -u_2 & u_1 & 0 \end{pmatrix},
\]
check that the coordinates of the cross product u × v are given by
\[
\begin{pmatrix} 0 & -u_3 & u_2 \\ u_3 & 0 & -u_1 \\ -u_2 & u_1 & 0 \end{pmatrix}
\begin{pmatrix} v_1 \\ v_2 \\ v_3 \end{pmatrix}
= \begin{pmatrix} u_2 v_3 - u_3 v_2 \\ u_3 v_1 - u_1 v_3 \\ u_1 v_2 - u_2 v_1 \end{pmatrix}.
\]

(b) Show that the set of matrices of the form
\[
U = \begin{pmatrix} 0 & -u_3 & u_2 \\ u_3 & 0 & -u_1 \\ -u_2 & u_1 & 0 \end{pmatrix}
\]
is a vector space isomorphic to (R^3, +). This vector space is denoted by so(3). Show that such matrices are never invertible. Find the kernel of the linear map associated with a matrix U. Describe geometrically the action of the linear map defined by a matrix U. Show that, when restricted to the plane orthogonal to u = (u_1, u_2, u_3), if u is a unit vector, then U behaves like a rotation by π/2.
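
Problem B4(a) is the identity u × v = Uv, which is easy to check numerically. A minimal sketch (the names are mine):

    import numpy as np

    def skew(u):
        """Skew-symmetric matrix U of u = (u1, u2, u3), so that U @ v equals np.cross(u, v)."""
        u1, u2, u3 = u
        return np.array([[0.0, -u3,  u2],
                         [ u3, 0.0, -u1],
                         [-u2,  u1, 0.0]])

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        u, v = rng.standard_normal(3), rng.standard_normal(3)
        print(np.allclose(skew(u) @ v, np.cross(u, v)))   # expected: True
        print(np.linalg.matrix_rank(skew(u)))             # 2: never invertible, kernel spanned by u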

Problem B5 (100 pts). (a) For any matrix
\[
A = \begin{pmatrix} 0 & -c & b \\ c & 0 & -a \\ -b & a & 0 \end{pmatrix},
\]
if we let θ = \sqrt{a^2 + b^2 + c^2} and
\[
B = \begin{pmatrix} a^2 & ab & ac \\ ab & b^2 & bc \\ ac & bc & c^2 \end{pmatrix},
\]
prove that
\[
A^2 = -\theta^2 I + B, \qquad AB = BA = 0.
\]
From the above, deduce that A^3 = -θ^2 A.

(b) Prove that the exponential map exp: so(3) → SO(3) is given by
\[
\exp A = e^A = \cos\theta\, I_3 + \frac{\sin\theta}{\theta}\, A + \frac{1 - \cos\theta}{\theta^2}\, B,
\]
or equivalently by
\[
e^A = I_3 + \frac{\sin\theta}{\theta}\, A + \frac{1 - \cos\theta}{\theta^2}\, A^2,
\]
if θ ≠ 0, with exp(0_3) = I_3.

(c) Prove that e^A is an orthogonal matrix of determinant +1, i.e., a rotation matrix.

(d) Prove that the exponential map exp: so(3) → SO(3) is surjective. For this, proceed as follows: pick any rotation matrix R ∈ SO(3).

(1) The case R = I is trivial.

(2) If R ≠ I and tr(R) ≠ -1, then
\[
\exp^{-1}(R) = \Bigl\{ \frac{\theta}{2\sin\theta}\,(R - R^T) \;\Bigm|\; 1 + 2\cos\theta = \mathrm{tr}(R) \Bigr\}.
\]
(Recall that tr(R) = r_{11} + r_{22} + r_{33}, the trace of the matrix R.) Note that both θ and 2π - θ yield the same matrix exp^{-1}(R).

(3) If R ≠ I and tr(R) = -1, then prove that the eigenvalues of R are 1, -1, -1, that R = R^T, and that R^2 = I. Prove that the matrix
\[
S = \frac{1}{2}(R - I)
\]
is a symmetric matrix whose eigenvalues are -1, -1, 0. Thus, S can be diagonalized with respect to an orthogonal matrix Q as
\[
S = Q \begin{pmatrix} -1 & 0 & 0 \\ 0 & -1 & 0 \\ 0 & 0 & 0 \end{pmatrix} Q^T.
\]
Prove that there exists a skew symmetric matrix
\[
U = \begin{pmatrix} 0 & -d & c \\ d & 0 & -b \\ -c & b & 0 \end{pmatrix}
\]

so that U^2 = S = \frac{1}{2}(R - I). Observe that
\[
U^2 = \begin{pmatrix}
-(c^2 + d^2) & bc & bd \\
bc & -(b^2 + d^2) & cd \\
bd & cd & -(b^2 + c^2)
\end{pmatrix}
\]
and use this to conclude that if U^2 = S, then b^2 + c^2 + d^2 = 1. Then show that
\[
\exp^{-1}(R) = \Bigl\{ (2k+1)\pi \begin{pmatrix} 0 & -d & c \\ d & 0 & -b \\ -c & b & 0 \end{pmatrix} \;\Bigm|\; k \in \mathbb{Z} \Bigr\},
\]
where (b, c, d) is any unit vector such that, for the corresponding skew symmetric matrix U, we have U^2 = S.

(e) To find a skew symmetric matrix U so that U^2 = S = \frac{1}{2}(R - I) as in (d), we can solve the system
\[
\begin{pmatrix}
b^2 - 1 & bc & bd \\
bc & c^2 - 1 & cd \\
bd & cd & d^2 - 1
\end{pmatrix} = S.
\]
We immediately get b^2, c^2, d^2, and then, since one of b, c, d is nonzero, say b, if we choose the positive square root of b^2, we can determine c and d from bc and bd.

Implement a computer program to solve the above system.
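
The sketch below is one possible implementation of the exponential of Problem B5(b) and of the inverse procedure of parts (d) and (e), including the system that has to be solved when tr(R) = -1. The helper names are mine and this is not a reference solution.

    import numpy as np

    def exp_so3(A):
        """Rodrigues-style formula of B5(b): e^A for A = [[0,-c,b],[c,0,-a],[-b,a,0]]."""
        a, b, c = A[2, 1], A[0, 2], A[1, 0]
        theta = np.sqrt(a * a + b * b + c * c)
        if np.isclose(theta, 0.0):
            return np.eye(3)
        return (np.eye(3) + (np.sin(theta) / theta) * A
                + ((1.0 - np.cos(theta)) / theta**2) * (A @ A))

    def log_so3(R):
        """One logarithm of R in SO(3), following the cases of B5(d)."""
        t = np.trace(R)
        if np.isclose(t, 3.0):                       # case (1): R = I
            return np.zeros((3, 3))
        if not np.isclose(t, -1.0):                  # case (2): generic rotation
            theta = np.arccos((t - 1.0) / 2.0)
            return (theta / (2.0 * np.sin(theta))) * (R - R.T)
        # case (3): tr(R) = -1.  Solve U^2 = S = (R - I)/2 as in part (e).
        S = 0.5 * (R - np.eye(3))
        sq = np.clip(np.diag(S) + 1.0, 0.0, None)    # b^2, c^2, d^2
        i = int(np.argmax(sq))                       # a nonzero component; take its positive root
        w = np.zeros(3)
        w[i] = np.sqrt(sq[i])
        for j in range(3):
            if j != i:
                w[j] = S[i, j] / w[i]                # recover the others from bc, bd, cd
        b, c, d = w
        U = np.array([[0.0, -d, c], [d, 0.0, -b], [-c, b, 0.0]])
        return np.pi * U

    if __name__ == "__main__":
        w = np.array([0.3, -0.4, 0.5])
        A = np.array([[0.0, -w[2], w[1]], [w[2], 0.0, -w[0]], [-w[1], w[0], 0.0]])
        R = exp_so3(A)
        print(np.allclose(exp_so3(log_so3(R)), R))        # generic case (2)
        Rpi = np.diag([-1.0, -1.0, 1.0])                  # rotation by pi about the z-axis, tr = -1
        print(np.allclose(exp_so3(log_so3(Rpi)), Rpi))    # case (3)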

Problem B6 (Extra Credit, 100 pts). (a) Consider the set of affine maps ρ of R^3 defined such that
\[
\rho(X) = RX + W,
\]
where R is a rotation matrix (an orthogonal matrix of determinant +1) and W is some vector in R^3. Every such map can be represented by the 4 × 4 matrix
\[
\begin{pmatrix} R & W \\ 0 & 1 \end{pmatrix}
\]
in the sense that
\[
\begin{pmatrix} \rho(X) \\ 1 \end{pmatrix}
= \begin{pmatrix} R & W \\ 0 & 1 \end{pmatrix}
  \begin{pmatrix} X \\ 1 \end{pmatrix}
\]
iff
\[
\rho(X) = RX + W.
\]
Prove that these maps are affine bijections and that they form a group, denoted by SE(3) (the direct affine isometries, or rigid motions, of R^3).

(b) Let us now consider the set of 4 × 4 matrices of the form
\[
B = \begin{pmatrix} \Omega & W \\ 0 & 0 \end{pmatrix},
\]
where Ω is a skew-symmetric matrix
\[
\Omega = \begin{pmatrix} 0 & -c & b \\ c & 0 & -a \\ -b & a & 0 \end{pmatrix}
\]
and W is a vector in R^3. Verify that this set of matrices is a vector space isomorphic to (R^6, +). This vector space is denoted by se(3). Show that, in general, BC ≠ CB.

(c) Given a matrix
\[
B = \begin{pmatrix} \Omega & W \\ 0 & 0 \end{pmatrix},
\]
prove that
\[
B^n = \begin{pmatrix} \Omega^n & \Omega^{n-1} W \\ 0 & 0 \end{pmatrix},
\]
where Ω^0 = I_3. Given
\[
\Omega = \begin{pmatrix} 0 & -c & b \\ c & 0 & -a \\ -b & a & 0 \end{pmatrix},
\]
let θ = \sqrt{a^2 + b^2 + c^2}. Prove that if θ = 0, then
\[
e^B = \begin{pmatrix} I_3 & W \\ 0 & 1 \end{pmatrix},
\]
and that if θ ≠ 0, then
\[
e^B = \begin{pmatrix} e^{\Omega} & V W \\ 0 & 1 \end{pmatrix},
\quad\text{where}\quad
V = I_3 + \sum_{k \ge 1} \frac{\Omega^k}{(k+1)!}.
\]

(d) Prove that
\[
e^{\Omega} = I_3 + \frac{\sin\theta}{\theta}\, \Omega + \frac{1 - \cos\theta}{\theta^2}\, \Omega^2
\]
and
\[
V = I_3 + \frac{1 - \cos\theta}{\theta^2}\, \Omega + \frac{\theta - \sin\theta}{\theta^3}\, \Omega^2.
\]

Hint: Use the fact that Ω^3 = -θ^2 Ω.

(e) Prove that e^B is a direct affine isometry in SE(3). Prove that V is invertible and that its inverse is
\[
Z = I_3 - \frac{1}{2}\Omega + \frac{1}{\theta^2}\Bigl(1 - \frac{\theta\sin\theta}{2(1 - \cos\theta)}\Bigr)\Omega^2
\]
for θ ≠ 0.

Hint: Assume that the inverse of V is of the form Z = I_3 + aΩ + bΩ^2, and show that a, b are given by a system of linear equations that always has a unique solution.

Prove that the exponential map exp: se(3) → SE(3) is surjective.

Remark: As in the case of the plane, rigid motions in SE(3) can be used to describe certain deformations of bodies in R^3.
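
A numerical check of the closed-form exponential of Problem B6(c)-(d) (a sketch only; the names are mine, and scipy.linalg.expm is used purely for comparison):

    import numpy as np
    from scipy.linalg import expm

    def exp_se3(Omega, W):
        """e^B for B = [[Omega, W], [0, 0]] using the formulas of B6(c)-(d)."""
        a, b, c = Omega[2, 1], Omega[0, 2], Omega[1, 0]
        theta = np.sqrt(a * a + b * b + c * c)
        E = np.eye(4)
        if np.isclose(theta, 0.0):
            E[:3, 3] = W
            return E
        O2 = Omega @ Omega
        eO = np.eye(3) + (np.sin(theta) / theta) * Omega + ((1 - np.cos(theta)) / theta**2) * O2
        V = np.eye(3) + ((1 - np.cos(theta)) / theta**2) * Omega + ((theta - np.sin(theta)) / theta**3) * O2
        E[:3, :3] = eO
        E[:3, 3] = V @ W
        return E

    if __name__ == "__main__":
        a, b, c = 0.2, -0.5, 0.7
        Omega = np.array([[0.0, -c, b], [c, 0.0, -a], [-b, a, 0.0]])
        W = np.array([1.0, 2.0, 3.0])
        B = np.zeros((4, 4))
        B[:3, :3], B[:3, 3] = Omega, W
        print(np.allclose(exp_se3(Omega, W), expm(B)))   # expected: True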

Problem B7 (80 pts). Let A be a real 2 × 2 matrix
\[
A = \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix}.
\]

(1) Prove that the squares of the singular values σ_1 ≥ σ_2 of A are the roots of the quadratic equation
\[
X^2 - \mathrm{tr}(A^T A)\,X + (\det(A))^2 = 0.
\]

(2) If we let
\[
\mu(A) = \frac{a_{11}^2 + a_{12}^2 + a_{21}^2 + a_{22}^2}{2\,|a_{11}a_{22} - a_{12}a_{21}|},
\]
prove that
\[
\mathrm{cond}_2(A) = \frac{\sigma_1}{\sigma_2} = \mu(A) + (\mu(A)^2 - 1)^{1/2}.
\]

(3) Consider the subset S of invertible 2 × 2 matrices whose entries a_{ij} are integers such that 0 ≤ a_{ij} ≤ 100. Prove that the functions cond_2(A) and μ(A) reach a maximum on the set S for the same values of A. Check that for the matrix
\[
A_m = \begin{pmatrix} 100 & 99 \\ 99 & 98 \end{pmatrix}
\]
we have μ(A_m) = 19603, det(A_m) = -1, and cond_2(A_m) ≈ 39206.

(4) Prove that for all A ∈ S, if |det(A)| ≥ 2, then μ(A) ≤ 10^4. Conclude that the maximum of μ(A) on S is achieved for matrices such that det(A) = ±1. Prove that finding matrices that maximize μ on S is equivalent to finding some integers n_1, n_2, n_3, n_4 such that
\[
0 \le n_4 \le n_3 \le n_2 \le n_1 \le 100,
\qquad
n_1^2 + n_2^2 + n_3^2 + n_4^2 = 39206,
\qquad
n_1 n_4 - n_2 n_3 = \pm 1.
\]
You may use without proof the fact that the only solution to the above constraints is the multiset {98, 99, 99, 100}.

(5) Deduce from part (4) that the matrices in S for which μ has a maximum value are
\[
A_m = \begin{pmatrix} 100 & 99 \\ 99 & 98 \end{pmatrix},
\quad
\begin{pmatrix} 98 & 99 \\ 99 & 100 \end{pmatrix},
\quad
\begin{pmatrix} 99 & 100 \\ 98 & 99 \end{pmatrix},
\quad
\begin{pmatrix} 99 & 98 \\ 100 & 99 \end{pmatrix},
\]
and check that μ has the same value for these matrices. Conclude that
\[
\max_{A \in S} \mathrm{cond}_2(A) = \mathrm{cond}_2(A_m).
\]

(6) Solve the system
\[
\begin{pmatrix} 100 & 99 \\ 99 & 98 \end{pmatrix}
\begin{pmatrix} x_1 \\ x_2 \end{pmatrix}
= \begin{pmatrix} 199 \\ 197 \end{pmatrix}.
\]
Perturb the right-hand side b by
\[
\delta b = \begin{pmatrix} -0.0097 \\ 0.0106 \end{pmatrix}
\]
and solve the new system
\[
A_m y = b + \delta b,
\]
where y = (y_1, y_2). Check that
\[
\delta x = y - x = \begin{pmatrix} 2 \\ -2.0203 \end{pmatrix}.
\]
Compute ‖x‖, ‖δx‖, ‖b‖, ‖δb‖, and estimate
\[
c = \frac{\|\delta x\|}{\|x\|} \Bigl( \frac{\|\delta b\|}{\|b\|} \Bigr)^{-1}.
\]
Check that c ≈ cond_2(A_m).
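
The computation of part (6) is easy to script. The sketch below simply reruns the experiment with the data stated in part (6); treat it as an illustration only.

    import numpy as np

    A = np.array([[100.0, 99.0], [99.0, 98.0]])
    b = np.array([199.0, 197.0])
    db = np.array([-0.0097, 0.0106])

    x = np.linalg.solve(A, b)            # exact data
    y = np.linalg.solve(A, b + db)       # perturbed right-hand side
    dx = y - x

    c = (np.linalg.norm(dx) / np.linalg.norm(x)) / (np.linalg.norm(db) / np.linalg.norm(b))
    print("x =", x, " y =", y, " dx =", dx)
    print("relative amplification c =", c)
    print("cond_2(A) =", np.linalg.cond(A, 2))   # c is of the same order as the condition number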

Problem B8. Let E be a real vector space of finite dimension n. Say that two bases (u_1, ..., u_n) and (v_1, ..., v_n) of E have the same orientation iff det(P) > 0, where P is the change of basis matrix from (u_1, ..., u_n) to (v_1, ..., v_n), namely the matrix whose jth column consists of the coordinates of v_j over the basis (u_1, ..., u_n).

(a) Prove that having the same orientation is an equivalence relation with two equivalence classes.

An orientation of a vector space E is the choice of any fixed basis, say (e_1, ..., e_n), of E. Any other basis (v_1, ..., v_n) has the same orientation as (e_1, ..., e_n) (and is said to be positive or direct) iff det(P) > 0; else it is said to have the opposite orientation of (e_1, ..., e_n) (or to be negative or indirect), where P is the change of basis matrix from (e_1, ..., e_n) to (v_1, ..., v_n). An oriented vector space is a vector space with some chosen orientation (a positive basis).

(b) Let B_1 = (u_1, ..., u_n) and B_2 = (v_1, ..., v_n) be two orthonormal bases. For any sequence of vectors (w_1, ..., w_n) in E, let det_{B_1}(w_1, ..., w_n) be the determinant of the matrix whose columns are the coordinates of the w_j's over the basis B_1, and similarly for det_{B_2}(w_1, ..., w_n). Prove that if B_1 and B_2 have the same orientation, then
\[
\det_{B_1}(w_1, \ldots, w_n) = \det_{B_2}(w_1, \ldots, w_n).
\]
Given any oriented vector space E, for any sequence of vectors (w_1, ..., w_n) in E, the common value det_B(w_1, ..., w_n) for all positive orthonormal bases B of E is denoted λ_E(w_1, ..., w_n) and called a volume form of (w_1, ..., w_n).

(c) Given any Euclidean oriented vector space E of dimension n, for any n - 1 vectors w_1, ..., w_{n-1} in E, check that the map
\[
x \mapsto \lambda_E(w_1, \ldots, w_{n-1}, x)
\]
is a linear form. Then prove that there is a unique vector, denoted w_1 × ··· × w_{n-1}, such that
\[
\lambda_E(w_1, \ldots, w_{n-1}, x) = (w_1 \times \cdots \times w_{n-1}) \cdot x
\]
for all x ∈ E. The vector w_1 × ··· × w_{n-1} is called the cross-product of (w_1, ..., w_{n-1}). It is a generalization of the cross-product in R^3 (when n = 3).
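
In coordinates over a positive orthonormal basis, the generalized cross product of Problem B8(c) can be computed componentwise, since (w_1 × ··· × w_{n-1}) · e_i = λ_E(w_1, ..., w_{n-1}, e_i) is just a determinant. A small illustrative sketch (my own code, not part of the assignment):

    import numpy as np

    def generalized_cross(W):
        """Cross product of the n-1 columns of the n x (n-1) matrix W (coordinates over a
        positive orthonormal basis), defined by (w1 x ... x w_{n-1}) . x = det[W | x]."""
        n = W.shape[0]
        assert W.shape == (n, n - 1)
        x = np.empty(n)
        for i in range(n):
            e = np.zeros(n)
            e[i] = 1.0
            x[i] = np.linalg.det(np.column_stack([W, e]))   # lambda_E(w_1, ..., w_{n-1}, e_i)
        return x

    if __name__ == "__main__":
        u, v = np.array([1.0, 2.0, 3.0]), np.array([-1.0, 0.5, 2.0])
        print(generalized_cross(np.column_stack([u, v])), np.cross(u, v))   # agree for n = 3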

Problem B9 (40 pts). Given p vectors (u_1, ..., u_p) in a Euclidean space E of dimension n ≥ p, the Gram determinant (or Gramian) of the vectors (u_1, ..., u_p) is the determinant
\[
\mathrm{Gram}(u_1, \ldots, u_p) =
\begin{vmatrix}
u_1 \cdot u_1 & u_1 \cdot u_2 & \cdots & u_1 \cdot u_p \\
u_2 \cdot u_1 & u_2 \cdot u_2 & \cdots & u_2 \cdot u_p \\
\vdots & \vdots & \ddots & \vdots \\
u_p \cdot u_1 & u_p \cdot u_2 & \cdots & u_p \cdot u_p
\end{vmatrix}.
\]

(1) Prove that
\[
\mathrm{Gram}(u_1, \ldots, u_n) = \lambda_E(u_1, \ldots, u_n)^2.
\]
Hint: If (e_1, ..., e_n) is an orthonormal basis and A is the matrix of the vectors (u_1, ..., u_n) over this basis, then
\[
\det(A)^2 = \det(A^T A) = \det(A_i \cdot A_j),
\]
where A_i denotes the ith column of the matrix A and (A_i · A_j) denotes the n × n matrix with entries A_i · A_j.

(2) Prove that
\[
\| u_1 \times \cdots \times u_{n-1} \|^2 = \mathrm{Gram}(u_1, \ldots, u_{n-1}).
\]
Hint: Letting w = u_1 × ··· × u_{n-1}, observe that
\[
\lambda_E(u_1, \ldots, u_{n-1}, w) = w \cdot w = \|w\|^2,
\]
and show that
\[
\|w\|^4 = \lambda_E(u_1, \ldots, u_{n-1}, w)^2
        = \mathrm{Gram}(u_1, \ldots, u_{n-1}, w)
        = \mathrm{Gram}(u_1, \ldots, u_{n-1})\, \|w\|^2.
\]
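
A quick numerical illustration of Problem B9 (sketch only, with my own helper name): for p = n the Gram determinant equals det(A)^2, and for n - 1 vectors in R^3 it equals the squared norm of the cross product.

    import numpy as np

    def gram(U):
        """Gram determinant of the columns of U."""
        return np.linalg.det(U.T @ U)

    if __name__ == "__main__":
        rng = np.random.default_rng(3)
        A = rng.standard_normal((4, 4))
        print(np.isclose(gram(A), np.linalg.det(A) ** 2))                      # part (1)
        u, v = rng.standard_normal(3), rng.standard_normal(3)
        W = np.column_stack([u, v])
        print(np.isclose(gram(W), np.linalg.norm(np.cross(u, v)) ** 2))        # part (2), n = 3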

Problem B10 (60 pts). (1) Implement the Gram-Schmidt orthonormalization procedure and the modified Gram-Schmidt procedure. You may use the pseudo-code shown in (2).

(2) Implement the method to compute the QR decomposition of an invertible matrix. You may use the following pseudo-code:

    function qr(A : matrix) : [Q, R] pair of matrices
    begin
      n = dim(A);
      R = 0;   (the zero matrix)
      Q(:, 1) = A(:, 1);
      R(1, 1) = sqrt(Q(:, 1) · Q(:, 1));
      Q(:, 1) = Q(:, 1)/R(1, 1);
      for k := 1 to n - 1 do
        w = A(:, k + 1);
        for i := 1 to k do
          R(i, k + 1) = A(:, k + 1) · Q(:, i);
          w = w - R(i, k + 1) Q(:, i);
        endfor;
        Q(:, k + 1) = w;
        R(k + 1, k + 1) = sqrt(Q(:, k + 1) · Q(:, k + 1));
        Q(:, k + 1) = Q(:, k + 1)/R(k + 1, k + 1);
      endfor;
    end

Test it on various matrices, including those involved in Project 1.

(3) Given any invertible matrix A, define the sequences A_k, Q_k, R_k as follows:
\[
A_1 = A, \qquad Q_k R_k = A_k, \qquad A_{k+1} = R_k Q_k,
\]
for all k ≥ 1, where in the second equation, Q_k R_k is the QR decomposition of A_k given by part (2).

Run the above procedure for various values of k (50, 100, ...) and various real matrices A, in particular some symmetric matrices; also run the Matlab command eig on A_k and compare the diagonal entries of A_k with the eigenvalues given by eig(A_k). What do you observe? How do you explain this behavior?

TOTAL: points + 160 extra.
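
One possible Python/NumPy rendering of the pseudo-code of part (2), together with the iteration of part (3), is sketched below; numpy.linalg.eigvals plays the role of Matlab's eig here, and this is not a reference solution.

    import numpy as np

    def qr_gs(A):
        """QR factorization of an invertible A following the Gram-Schmidt pseudo-code of part (2)."""
        A = np.asarray(A, dtype=float)
        n = A.shape[0]
        Q = np.zeros_like(A)
        R = np.zeros_like(A)
        Q[:, 0] = A[:, 0]
        R[0, 0] = np.sqrt(Q[:, 0] @ Q[:, 0])
        Q[:, 0] /= R[0, 0]
        for k in range(n - 1):
            w = A[:, k + 1].copy()
            for i in range(k + 1):
                R[i, k + 1] = A[:, k + 1] @ Q[:, i]
                w -= R[i, k + 1] * Q[:, i]
            Q[:, k + 1] = w
            R[k + 1, k + 1] = np.sqrt(w @ w)
            Q[:, k + 1] /= R[k + 1, k + 1]
        return Q, R

    def qr_iteration(A, steps=100):
        """The sequence A_{k+1} = R_k Q_k of part (3); returns the last iterate."""
        Ak = np.asarray(A, dtype=float)
        for _ in range(steps):
            Q, R = qr_gs(Ak)
            Ak = R @ Q
        return Ak

    if __name__ == "__main__":
        rng = np.random.default_rng(4)
        M = rng.standard_normal((5, 5))
        S = M + M.T                               # a symmetric test matrix
        Ak = qr_iteration(S, 300)
        print(np.round(np.sort(np.diag(Ak)), 3))
        print(np.round(np.sort(np.linalg.eigvals(S).real), 3))   # the diagonal approaches the eigenvalues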
