Reconstruction and Higher Dimensional Geometry

Hongyu He
Department of Mathematics
Louisiana State University

Abstract

Tutte proved that, if two graphs, both with more than two vertices, have the same collection of vertex-deleted subgraphs, then the determinants of the two corresponding adjacency matrices are the same. In this paper, we give a geometric proof of Tutte's theorem using vectors and angles. We further study the lowest eigenspaces of these adjacency matrices.

1 Introduction

Given a graph G = {V, E}, let G_i be the graph obtained by deleting the i-th vertex v_i. Fix n ≥ 3 from now on. Let G and H be two graphs on n vertices. The main conjecture in reconstruction theory states that if G_i is isomorphic to H_i for every i, then G and H are isomorphic (up to a reordering of V). This conjecture is also known as Ulam's conjecture.

The reconstruction conjecture can be formulated in purely algebraic terms. Consider two n × n real symmetric matrices A and B. Let A_i and B_i be the matrices obtained by deleting the i-th row and i-th column of A and B, respectively.

Definition 1 Let A and B be two n × n real symmetric matrices. We say that A and B are hypomorphic if there exists a set of (n − 1) × (n − 1) permutation matrices {σ_1, σ_2, ..., σ_n} such that B_i = σ_i A_i σ_i^t for every i. Put Σ = {σ_1, σ_2, ..., σ_n}. We write B = Σ(A). Σ is called a hypomorphism.

The algebraic version of the reconstruction conjecture can be stated as follows.

Conjecture 1 Let A and B be two n × n symmetric matrices. If there exists a hypomorphism Σ such that B = Σ(A), then there exists an n × n permutation matrix τ such that B = τ A τ^t.

We start by fixing some notation. If M is a real symmetric matrix, then the eigenvalues of M are real. We write eigen(M) = (λ_1(M) ≥ λ_2(M) ≥ ... ≥ λ_n(M)). If α is an eigenvalue of M, we denote the corresponding eigenspace by eigen_α(M). Let 1_n be the n-dimensional row vector (1, 1, ..., 1). We may drop the subscript n if it is implicit.
Put J = 1^t 1, the n × n matrix of all ones. If A and B are hypomorphic, so are A + tJ and B + tJ.
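This invariance can be checked numerically: deleting the i-th row and column of A + tJ yields A_i + t J_{n−1}, and every permutation matrix σ satisfies σ J_{n−1} σ^t = J_{n−1}. The following Python/numpy sketch (an illustration added here, not part of the paper) verifies the identity on random data.

```python
import numpy as np

rng = np.random.default_rng(0)
n, t = 5, 0.7

M = rng.standard_normal((n, n))
A = M + M.T                          # a random symmetric matrix
J = np.ones((n, n))                  # J = 1^t 1

def delete(M, i):
    """Delete the i-th row and i-th column of M."""
    keep = [k for k in range(M.shape[0]) if k != i]
    return M[np.ix_(keep, keep)]

# A random (n-1) x (n-1) permutation matrix sigma.
sigma = np.eye(n - 1)[rng.permutation(n - 1)]

for i in range(n):
    # If B_i = sigma A_i sigma^t, then (B + tJ)_i = sigma (A + tJ)_i sigma^t,
    # because sigma J_{n-1} sigma^t = J_{n-1}.
    lhs = sigma @ delete(A + t * J, i) @ sigma.T
    rhs = sigma @ delete(A, i) @ sigma.T + t * np.ones((n - 1, n - 1))
    assert np.allclose(lhs, rhs)
```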

Theorem 1 (Tutte) Let B and A be two real n × n symmetric matrices. If B and A are hypomorphic, then det(B − λI + tJ) = det(A − λI + tJ) for all t, λ ∈ R.

In this paper, we will study the geometry related to Conjecture 1. Our main result can be stated as follows.

Theorem 2 (Main Theorem) Let B and A be two real n × n symmetric matrices. Let Σ be a hypomorphism such that B = Σ(A). Then there exists a nonempty open interval T such that for t ∈ T we have

1. λ_n(A + tJ) = λ_n(B + tJ);
2. eigen_{λ_n}(A + tJ) and eigen_{λ_n}(B + tJ) are both one dimensional;
3. eigen_{λ_n}(A + tJ) = eigen_{λ_n}(B + tJ).

A similar statement holds for the highest eigenspaces.

Let us see how Theorem 2 implies Theorem 1. Since (B + tJ)_i = σ_i (A + tJ)_i σ_i^t for every i, and since for any symmetric M

d/dλ det(M − λI) = − Σ_{i=1}^n det(M_i − λI),

the difference of the two characteristic polynomials has vanishing λ-derivative and is therefore constant in λ. Evaluating at λ = 0, for every t and λ ∈ R,

det(A + tJ − λI) − det(B + tJ − λI) = det(A + tJ) − det(B + tJ).   (1)

If t ∈ T, by taking λ = λ_n(A + tJ) = λ_n(B + tJ), the left hand side vanishes, and we obtain

det(A + tJ) − det(B + tJ) = 0.

Since this is true for every t in the open interval T, and both sides are polynomials in t, det(A + tJ) = det(B + tJ) for every t. By Equation (1), we obtain det(B − λI + tJ) = det(A − λI + tJ) for all t, λ ∈ R. This is Tutte's theorem, which was proved using rank polynomials and Hamiltonian circuits. I should also mention that Kocay [1] found a simpler way to deduce the reconstructibility of characteristic polynomials.

Here is the content of this paper. We begin by presenting a positive semidefinite matrix A + λI by n vectors in R^n. We then interpret the reconstruction conjecture as a generalization of a congruence theorem in Euclidean geometry. Next we study the presentations of A + λI under the perturbation by tJ. We define a norm of angles in higher dimensions and establish a comparison theorem. Our comparison theorem then forces hypomorphic matrices to have the same lowest eigenvalue and eigenvector.

I would like to thank the referee for his valuable comments.

2 Notations

Unless stated otherwise,

1. all linear spaces in this paper will be finite dimensional real Euclidean spaces;
2. all linear subspaces will be equipped with the induced Euclidean metric;
3. all vectors will be column vectors;
4. vectors are sometimes regarded as points in R^n.

Let U = {u_1, u_2, ..., u_m} be an ordered set of m vectors in R^n. U is also interpreted as an n × m matrix.

1. Let conv U be the convex hull spanned by U, namely,

   { Σ_{i=1}^m α_i u_i : α_i ≥ 0, Σ_{i=1}^m α_i = 1 }.

2. Let aff U be the affine space spanned by U, namely,

   { Σ_{i=1}^m α_i u_i : Σ_{i=1}^m α_i = 1 }.

3. Let span U be the linear span of U, namely,

   { Σ_{i=1}^m α_i u_i : α_i ∈ R }.

Then conv U ⊂ aff U ⊂ span U.

Let A be a matrix. We denote the (i, j)-th entry of A by a_ij. We denote the transpose of A by A^t. Let (R^+)^n be the set of vectors with only positive coordinates.

3 Geometric Interpretation

Fix a standard Euclidean space (R^n, (,)).

Definition 2 Let A be a symmetric positive semidefinite real matrix. An ordered set of vectors V = {v_1, v_2, ..., v_n} is said to be a presentation of A if and only if (v_i, v_j) = a_ij. Regarding the v_i as column vectors and V as an n × n matrix, V is a presentation of A if and only if V^t V = A.

Every positive semidefinite real matrix A has a presentation. In addition, the presentation V is unique up to a left multiplication by an orthogonal matrix.

Definition 3 Let S and T be two sets of vectors in R^n. S and T are said to be congruent if there exists an orthogonal linear transformation in R^n that maps S onto T.

So A = σBσ^t for some permutation σ if and only if A and B are presented by two congruent subsets in R^n.

Now consider two hypomorphic matrices B = Σ(A). Observe that B + λI = Σ(A + λI). Without loss of generality, assume A and B are both positive semidefinite. Let U and V be their presentations respectively. Since B_i = σ_i A_i σ_i^t, U \ {u_i} is congruent to V \ {v_i}. Then the reconstruction conjecture can be stated as follows.

Conjecture 2 (Geometric reconstruction) Let

S = {u_1, u_2, ..., u_n} and T = {v_1, v_2, ..., v_n}

be two finite sets of vectors in R^m. Assume that S \ {u_i} is congruent to T \ {v_i} for every i. Then S and T are congruent.

Generically, m = n.

Definition 4 We say that U = {u_i}_{i=1}^n is in good position if the point 0 is in the interior of the convex hull of U (relative to its affine span) and the convex hull of U is of dimension n − 1.

Lemma 1 Let A be a symmetric positive semidefinite matrix. The following are equivalent.

1. A has a presentation in good position.

2. Every presentation of A is in good position.

3. rank(A) = n − 1 and eigen_0(A) = Rα for some α ∈ (R^+)^n.

Proof: Since A is symmetric positive semidefinite, A has a presentation. Let U be a presentation of A. If U is in good position, then every presentation obtained from U by an orthogonal linear transformation is also in good position. Since a presentation is unique up to an orthogonal linear transformation, (1) ⟺ (2).

Suppose U is in good position. Then rank(U) = n − 1. So rank(A) = n − 1. Since 0 is in the interior of the convex hull of U, there exists α = (α_1, α_2, ..., α_n)^t such that

0 = Σ_i α_i u_i;  Σ_i α_i = 1;  α_i > 0 for all i.

Since rank(U) = n − 1, α is unique. Now Uα = 0 implies Aα = U^t Uα = U^t 0 = 0. Since rank(A) = rank(U) = n − 1, eigen_0(A) = Rα. So (2) ⟹ (3).

Conversely, suppose rank(A) = n − 1 and eigen_0(A) = Rα with α ∈ (R^+)^n. Then Σ_i α_i u_i = 0 and the linear span span U is of dimension n − 1. Thus, 0 is in the interior of conv U. It follows that aff U = span U. So dim(conv U) = dim(aff U) = dim(span U) = n − 1. So (3) ⟹ (1). Q.E.D.

Lemma 2 Let U be a presentation of A. Suppose that U is in good position. Let α_i be the volume of the convex hull of {0, u_1, u_2, ..., û_i, ..., u_n}. Then

α = (α_1, α_2, ..., α_n)^t

is a lowest eigenvector.

The proof can be found in many places. For the sake of completeness, I will give a proof using the language of exterior products.

Proof: Choosing an orthonormal basis properly, we may assume that every u_i ∈ R^{n−1}. U becomes an (n − 1) × n matrix. Let x_1, x_2, ..., x_{n−1} be the row vectors of U. Consider the exterior product x_1 ∧ x_2 ∧ ... ∧ x_{n−1}. Let β_i be its i-th coordinate in terms of the standard basis

{ (−1)^{i−1} e_1 ∧ e_2 ∧ ... ∧ ê_i ∧ ... ∧ e_n : i ∈ [1, n] }.

Put β = (β_1, β_2, ..., β_n)^t. Notice that x_i ∧ (x_1 ∧ x_2 ∧ ... ∧ x_{n−1}) = 0 for 1 ≤ i ≤ n − 1. Therefore, (x_i, β) = 0 for every i. So Uβ = 0. It follows that Σ_{i=1}^n β_i u_i = 0. Since 0 is in the convex hull of {u_i}_{i=1}^n, the β_i must be either all negative or all positive. Clearly,

|β_i| = |u_1 ∧ ... ∧ û_i ∧ ... ∧ u_n| = (n − 1)! α_i.

Therefore, we have Uα = 0. Then Aα = U^t Uα = 0, and α is a lowest eigenvector. Q.E.D.
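Both halves of this section are easy to exercise numerically. The sketch below (Python/numpy; an illustration added here, not part of the paper) first builds a presentation V with V^t V = A from the spectral decomposition, then checks Lemma 2 on a concrete triangle in R^2: for three vectors whose convex hull contains the origin, the vector of opposite-simplex areas spans eigen_0(A).

```python
import numpy as np

def presentation(A):
    """Return V with V^t V = A, for A symmetric positive semidefinite.
    From A = Q diag(w) Q^t take V = diag(sqrt(w)) Q^t; left-multiplying
    V by any orthogonal matrix gives another presentation."""
    w, Q = np.linalg.eigh(A)
    return np.sqrt(np.clip(w, 0.0, None))[:, None] * Q.T

# Columns u_1, u_2, u_3 in R^2: a triangle with 0 in its interior.
U = np.array([[2.0, -1.0, -1.0],
              [0.0,  2.0, -2.0]])
A = U.T @ U                              # positive semidefinite, rank 2

V = presentation(A)
assert np.allclose(V.T @ V, A)           # V is a presentation of A

# Lemma 2: alpha_i = volume (area) of conv{0, u_j, u_k}, j, k != i.
alpha = np.array([abs(np.linalg.det(U[:, [j for j in range(3) if j != i]])) / 2.0
                  for i in range(3)])
assert np.all(alpha > 0)                 # all in good position: positive entries
assert np.allclose(A @ alpha, 0.0)       # alpha spans the lowest eigenspace
```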

Theorem 3 Suppose that B = Σ(A). Suppose that A and B have presentations in good position. Then eigen_0(A) = eigen_0(B) = Rα, with α as in Lemma 2.

Proof: Let U and V be presentations of A and B respectively. Then U and V are in good position. Since U \ {u_i} is congruent to V \ {v_i} under an orthogonal transformation, which fixes 0, the volume of the convex hull of

{0, u_1, u_2, ..., û_i, ..., u_n}

equals the volume of the convex hull of

{0, v_1, v_2, ..., v̂_i, ..., v_n}.

By Lemma 1 and Lemma 2, eigen_0(A) = eigen_0(B) = Rα. So the lowest eigenspace of A is equal to the lowest eigenspace of B. Q.E.D.

4 Perturbation by J

Recall that J = 1_n^t 1_n. We know that B = Σ(A) if and only if B + tJ = Σ(A + tJ). Let us see how presentations of A + tJ depend on t.

Let A be a positive definite matrix. Let U = {u_i}_{i=1}^n be a presentation of A. Let aff U be the affine space spanned by U. Then the {u_i} are affinely independent. Let u_0 be the orthogonal projection of the origin onto aff U. Then (u_0, u_i − u_0) = 0 for every i. We obtain U^t u_0 = |u_0|^2 1^t. It follows that u_0 = |u_0|^2 (U^t)^{−1} 1^t. Consequently,

|u_0|^2 = (u_0, u_0) = |u_0|^4 1 U^{−1} (U^t)^{−1} 1^t = |u_0|^4 1 A^{−1} 1^t.

Clearly, |u_0|^{−2} = 1 A^{−1} 1^t. We obtain the following lemma.

Lemma 3 Let A be a positive definite matrix. Let U = {u_i}_{i=1}^n be a presentation of A. Let u_0 be the orthogonal projection of the origin onto aff U. Then |u_0|^2 = (1 A^{−1} 1^t)^{−1} and

u_0 = |u_0|^2 (U^t)^{−1} 1^t.

Consider {u_i − s u_0}_{i=1}^n. Notice that

(u_i − s u_0, u_j − s u_0) = (u_i − u_0 + (1 − s) u_0, u_j − u_0 + (1 − s) u_0) = (u_i − u_0, u_j − u_0) + (1 − s)^2 (u_0, u_0).

Taking s = 0, we have

(u_i, u_j) = (u_i − u_0, u_j − u_0) + (u_0, u_0).

Therefore

(u_i − s u_0, u_j − s u_0) = (u_i, u_j) − (u_0, u_0) + (1 − s)^2 (u_0, u_0) = (u_i, u_j) + (s^2 − 2s) |u_0|^2.

We see clearly that A + (s^2 − 2s)|u_0|^2 J is presented by {u_i − s u_0}_{i=1}^n. Observe that span(u_1 − s u_0, u_2 − s u_0, ..., u_n − s u_0) is of dimension n for all s ≠ 1. So A + (s^2 − 2s)|u_0|^2 J is positive definite for all s ≠ 1. If s = 1, we see that A − |u_0|^2 J is presented by {u_i − u_0}_{i=1}^n, whose linear span is of dimension n − 1. We obtain the following lemma.
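Before stating the lemma, the computation above can be checked numerically. In the sketch below (Python/numpy; an illustration added here, not part of the paper), U is a Cholesky-type presentation of a random positive definite A; we verify the formulas of Lemma 3 and that {u_i − s u_0} presents A + (s^2 − 2s)|u_0|^2 J.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
M = rng.standard_normal((n, n))
A = M.T @ M + n * np.eye(n)             # a random positive definite matrix
U = np.linalg.cholesky(A).T             # one presentation: U^t U = A
ones = np.ones(n)
J = np.ones((n, n))

# Lemma 3: |u_0|^2 = (1 A^{-1} 1^t)^{-1} and u_0 = |u_0|^2 (U^t)^{-1} 1^t.
u0_sq = 1.0 / (ones @ np.linalg.solve(A, ones))
u0 = u0_sq * np.linalg.solve(U.T, ones)
assert np.allclose(U.T @ u0, u0_sq * ones)   # (u_0, u_i) = |u_0|^2 for all i

# {u_i - s u_0} presents A + (s^2 - 2s) |u_0|^2 J.
s = 0.3
Us = U - s * u0[:, None]
assert np.allclose(Us.T @ Us, A + (s * s - 2 * s) * u0_sq * J)

# At t = -|u_0|^2, A + tJ is positive semidefinite of rank n - 1.
w = np.linalg.eigvalsh(A - u0_sq * J)
assert abs(w[0]) < 1e-8 and w[1] > 1e-8
```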

Lemma 4 Let A be a symmetric positive definite matrix. Let U be a presentation of A. Let u_0 be the orthogonal projection of the origin onto aff U. Then {u_i − s u_0}_{i=1}^n is a presentation of A + (s^2 − 2s)|u_0|^2 J. Let t = (s^2 − 2s)|u_0|^2. Then A + tJ is positive definite for all t > −|u_0|^2 and positive semidefinite for t = −|u_0|^2.

Notice that

u_0 = |u_0|^2 (U^t)^{−1} 1^t = |u_0|^2 U (U^{−1} (U^t)^{−1}) 1^t = |u_0|^2 U A^{−1} 1^t.

Theorem 4 Let A be a symmetric positive definite matrix. Let U be a presentation of A. Let u_0 be the orthogonal projection of the origin onto aff U. Then u_0 = |u_0|^2 U A^{−1} 1^t and the following are equivalent.

1. A − |u_0|^2 J has a presentation in good position;
2. u_0 is in the interior of conv U;
3. A^{−1} 1^t ∈ (R^+)^n.

Corollary 1 Let A be a real symmetric matrix. There exists λ_0 such that for every λ ≥ λ_0 there exists a real number t such that A + λI + tJ has a presentation in good position.

Proof: Instead, consider I + sA with s = λ^{−1}. I + sA is related to A + λI by a constant multiplication: λ(I + sA) = λI + A. Let s_0 = (‖A‖ + 1)^{−1}, where ‖A‖ denotes the operator norm. Suppose that 0 ≤ s ≤ s_0. Then I + sA is positive definite. For s = 0, (I + sA)^{−1} 1^t = 1^t ∈ (R^+)^n. Since s ↦ (I + sA)^{−1} 1^t is continuous on [0, s_0), there exists an s_1 ∈ (0, s_0) such that (I + sA)^{−1} 1^t ∈ (R^+)^n for every s ∈ (0, s_1]. So for every λ ∈ [s_1^{−1}, ∞),

(A + λI)^{−1} 1^t = λ^{−1} (sA + I)^{−1} 1^t ∈ (R^+)^n.

Let λ_0 = s_1^{−1}. So for every λ ≥ λ_0, (A + λI)^{−1} 1^t ∈ (R^+)^n. By Theorem 4, for every λ ≥ λ_0 there exists a t such that A + λI + tJ has a presentation in good position. Q.E.D.

5 Higher Dimensional Angle and Comparison Theorem

Definition 5 Let U = {u_1, u_2, ..., u_{n−1}} be a subset of R^{n−1} (here R^{n−1} may be contained in some other Euclidean space). Let u be a point in R^{n−1}. The angle ∠(u, U) is defined to be the region

{ Σ_i α_i (u_i − u) : α_i ≥ 0 }.

Two angles are congruent if there exists an isometry that maps one angle to the other. Let B be the unit ball in R^{n−1}. The norm of ∠(u, U) is defined to be the volume of ∠(u, U) ∩ B; denote it by |∠(u, U)|.

Let me make a few remarks.

1. Firstly, if two angles are congruent, their norms are the same. But, unlike the two dimensional case, if the norms of two angles are the same, these two angles may not be congruent.

2. Secondly, if {u_i − u}_{i=1}^{n−1} are linearly dependent, then |∠(u, U)| = 0. In particular, if u happens to be in aff U, then |∠(u, U)| = 0.

3. According to our definition, |∠(u, U)| is always less than half of the volume of B.

4. More generally, one can allow the {α_i}_{i=1}^{n−1} to range over a collection of other sign patterns, which correspond to the quadrants in the two dimensional case. Then the norm of an angle can be greater than half of the volume of B.

Lemma 5 If ∠(u, U) ⊆ ∠(u, V), then |∠(u, U)| ≤ |∠(u, V)|. If |∠(u, U)| > 0 and ∠(u, U) is a proper subset of ∠(u, V), then |∠(u, U)| < |∠(u, V)|.

Theorem 5 (Comparison Theorem) Let ∠(u, U) be an angle with |∠(u, U)| ≠ 0. Suppose that v is contained in the interior of the convex hull of {u} ∪ U. Then |∠(u, U)| < |∠(v, U)|.

Proof: Without loss of generality, assume u = 0. Suppose |∠(0, U)| > 0. Let U = {u_1, u_2, ..., u_{n−1}}. Then U is linearly independent. Since v is in the interior of conv({0} ∪ U), v can be written as

v = Σ_{i=1}^{n−1} α_i u_i,  with α_i ∈ R^+ and Σ_i α_i < 1.

Let U′ = {u_i − v}_{i=1}^{n−1}. It suffices to prove that ∠(0, U) is a proper subset of ∠(0, U′). Let x be a point in ∠(0, U) with x ≠ 0. Then x = Σ_i x_i u_i for some x_i ≥ 0 with Σ_i x_i > 0. Define for each i

y_i = x_i + α_i (Σ_j x_j) / (1 − Σ_j α_j).

The reader can easily verify that Σ_i y_i (u_i − v) = x. Observe that y_i > x_i ≥ 0. So ∠(0, U) is a proper subset of ∠(0, U′). It follows that

|∠(0, U)| < |∠(0, U′)| = |∠(v, U)|.

Q.E.D.

Theorem 6 Let U = {u_1, u_2, ..., u_{n−1}} ⊂ R^m for some m ≥ n − 1. Suppose that |∠(u, U)| > 0. Suppose that the orthogonal projection of u onto aff U is in the interior of conv U. Let v be a vector such that (u − v, u − u_i) = 0 for every i. If u ≠ v, then |∠(u, U)| > |∠(v, U)|.

Proof: Without loss of generality, assume that u = 0. Then U is linearly independent and (v, u_i) = 0 for every i. Let u_0 be the orthogonal projection of u = 0 onto aff U. By our assumption, u_0 is in the interior of conv U and u_0 ≠ 0. Let

v′ = (1 − (|v|^2/|u_0|^2 + 1)^{1/2}) u_0.

Then

v′ − u_0 = −(|v|^2/|u_0|^2 + 1)^{1/2} u_0,  so  |v′ − u_0|^2 = |v|^2 + |u_0|^2.

Notice that (v, u_i) = 0, that (u_0, u_i − u_0) = 0, and that u_0 − v′ is a multiple of u_0. We obtain

(u_i − v′, u_j − v′) = (u_i − u_0 + (u_0 − v′), u_j − u_0 + (u_0 − v′))
                     = (u_i − u_0, u_j − u_0) + (|v|^2/|u_0|^2 + 1) |u_0|^2
                     = (u_i − u_0, u_j − u_0) + (u_0, u_0) + (v, v)
                     = (u_i, u_j) + (v, v)
                     = (u_i − v, u_j − v).   (2)

Hence {u_i − v′} is congruent to {u_i − v}, and |∠(v′, U)| = |∠(v, U)|. Notice that 1 − (|v|^2/|u_0|^2 + 1)^{1/2} < 0. So the origin sits between v′ and u_0, and u_0 is in the interior of conv U. Therefore, 0 is in the interior of conv({v′} ∪ U). By the Comparison Theorem, |∠(0, U)| > |∠(v′, U)|. Consequently, |∠(0, U)| > |∠(v, U)|. Q.E.D.

Theorem 7 Suppose that B = Σ(A). Let λ_0 be as in Corollary 1 for both B and A. Fix λ ≥ λ_0. Let t_1 and t_2 be two real numbers such that A + λI + t_1 J and B + λI + t_2 J have presentations in good position. Then t_1 = t_2.

Proof: We prove this by contradiction. Without loss of generality, suppose that t_1 > t_2. Let U be a presentation of A + λI + t_1 J. Then U is in good position. So 0 is in the interior of conv U. Let V be a presentation of B + λI + t_2 J. Then V is in good position. So 0 is in the interior of conv V and dim(span V) = n − 1. Let v_0 ⊥ span V with |v_0|^2 = t_1 − t_2. Let V′ = {v_i + v_0}_{i=1}^n. Clearly, V′ is a presentation of B + λI + t_1 J. By Theorem 6, for every i,

|∠(0, V \ {v_i})| > |∠(−v_0, V \ {v_i})| = |∠(0, V′ \ {v_i + v_0})|.

Since B + λI + t_1 J = Σ(A + λI + t_1 J), V′ \ {v_i + v_0} is congruent to U \ {u_i} for every i. Therefore

|∠(0, V′ \ {v_i + v_0})| = |∠(0, U \ {u_i})|.

Since 0 is in the interiors of the convex hulls of U and of V, we have

Vol(B) = Σ_{i=1}^n |∠(0, V \ {v_i})| > Σ_{i=1}^n |∠(0, V′ \ {v_i + v_0})| = Σ_{i=1}^n |∠(0, U \ {u_i})| = Vol(B).

This is a contradiction. Therefore, t_1 = t_2. Q.E.D.

6 Proof of the Main Theorem

Suppose B = Σ(A). Suppose λ_0 satisfies Corollary 1 for both A and B. So for every λ ≥ λ_0 there exist real numbers t_1 and t_2 such that A + λI + t_1 J has a presentation in good position and B + λI + t_2 J has a presentation in good position. By Theorem 7, t_1 = t_2. Because of the dependence on λ, put t(λ) = t_1 = t_2. By Theorem 3,

eigen_0(A + λI + t(λ)J) = eigen_0(B + λI + t(λ)J) = Rα.

Since 0 is the lowest eigenvalue of A + λI + t(λ)J and of B + λI + t(λ)J, −λ is the lowest eigenvalue of A + t(λ)J and of B + t(λ)J. In addition,

eigen_{−λ}(A + t(λ)J) = eigen_{−λ}(B + t(λ)J) = Rα.
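These eigenvalue statements can be illustrated numerically (Python/numpy sketch, not part of the paper; the choice of λ below is an ad hoc bound, via a Neumann-series estimate, that forces (A + λI)^{−1} 1^t to be positive, and may differ from the λ_0 of Corollary 1): with t = −(1 (A + λI)^{−1} 1^t)^{−1}, the matrix A + λI + tJ is positive semidefinite with a one-dimensional kernel spanned by a positive vector.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 5
M = rng.standard_normal((n, n))
A = (M + M.T) / 2                       # a random real symmetric matrix
ones = np.ones(n)
J = np.ones((n, n))

# An ad hoc lambda, large enough that (A + lam I)^{-1} 1^t has positive
# entries; the lambda_0 of Corollary 1 may be different.
lam = 4.0 * np.abs(A).sum(axis=1).max() + 1.0

t = -1.0 / (ones @ np.linalg.solve(A + lam * np.eye(n), ones))   # t(lambda)

w, Q = np.linalg.eigh(A + lam * np.eye(n) + t * J)
assert abs(w[0]) < 1e-8                 # lowest eigenvalue of the shifted
assert w[1] > 1e-8                      # matrix is 0, and it is simple
alpha = Q[:, 0] * np.sign(Q[0, 0])
assert np.all(alpha > 0)                # kernel spanned by a positive vector
# equivalently, -lam is the lowest eigenvalue of A + t(lam) J
```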

Now it suffices to show that t([λ_0, ∞)) covers a nonempty open interval. By Lemma 4 and Lemma 3,

t(λ) = −|u_0|^2 = −(1 (A + λI)^{−1} 1^t)^{−1}.

So t(λ) is a nonconstant rational function of λ. Clearly, t([λ_0, ∞)) contains a nonempty open interval T. For t ∈ T, we have

λ_n(A + tJ) = λ_n(B + tJ)  and  eigen_{λ_n}(A + tJ) = eigen_{λ_n}(B + tJ) = Rα.

This finishes the proof of Theorem 2. Q.E.D.

Tutte's proof involves certain polynomials associated with a graph. It is algebraic in nature. The main instrument in our proof is the comparison theorem. Presumably, there is a connection between the geometry in this paper and the polynomials defined in Tutte's paper. In particular, given n unit vectors u_1, u_2, ..., u_n, can we compute the function |∠(0, U)| explicitly in terms of U^t U = A? This question turns out to be hard to answer. The norm |∠(0, U)| as a function of A may be closely related to the functions studied in Tutte's paper [2].

References

[1] W. L. Kocay, An extension of Kelly's Lemma to Spanning Graphs, Congr. Numer. 31 (1981), 109-120.

[2] W. T. Tutte, All the King's Horses (A Guide to Reconstruction), Graph Theory and Related Topics, Academic Press, 1979, 15-33.


Study Guide for Linear Algebra Exam 2 Study Guide for Linear Algebra Exam 2 Term Vector Space Definition A Vector Space is a nonempty set V of objects, on which are defined two operations, called addition and multiplication by scalars (real

More information

Topic 2 Quiz 2. choice C implies B and B implies C. correct-choice C implies B, but B does not imply C

Topic 2 Quiz 2. choice C implies B and B implies C. correct-choice C implies B, but B does not imply C Topic 1 Quiz 1 text A reduced row-echelon form of a 3 by 4 matrix can have how many leading one s? choice must have 3 choice may have 1, 2, or 3 correct-choice may have 0, 1, 2, or 3 choice may have 0,

More information

IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET

IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET This is a (not quite comprehensive) list of definitions and theorems given in Math 1553. Pay particular attention to the ones in red. Study Tip For each

More information

Foundations of Matrix Analysis

Foundations of Matrix Analysis 1 Foundations of Matrix Analysis In this chapter we recall the basic elements of linear algebra which will be employed in the remainder of the text For most of the proofs as well as for the details, the

More information

The spectra of super line multigraphs

The spectra of super line multigraphs The spectra of super line multigraphs Jay Bagga Department of Computer Science Ball State University Muncie, IN jbagga@bsuedu Robert B Ellis Department of Applied Mathematics Illinois Institute of Technology

More information

Jordan normal form notes (version date: 11/21/07)

Jordan normal form notes (version date: 11/21/07) Jordan normal form notes (version date: /2/7) If A has an eigenbasis {u,, u n }, ie a basis made up of eigenvectors, so that Au j = λ j u j, then A is diagonal with respect to that basis To see this, let

More information

No books, no notes, no calculators. You must show work, unless the question is a true/false, yes/no, or fill-in-the-blank question.

No books, no notes, no calculators. You must show work, unless the question is a true/false, yes/no, or fill-in-the-blank question. Math 304 Final Exam (May 8) Spring 206 No books, no notes, no calculators. You must show work, unless the question is a true/false, yes/no, or fill-in-the-blank question. Name: Section: Question Points

More information

SOLUTION KEY TO THE LINEAR ALGEBRA FINAL EXAM 1 2 ( 2) ( 1) c a = 1 0

SOLUTION KEY TO THE LINEAR ALGEBRA FINAL EXAM 1 2 ( 2) ( 1) c a = 1 0 SOLUTION KEY TO THE LINEAR ALGEBRA FINAL EXAM () We find a least squares solution to ( ) ( ) A x = y or 0 0 a b = c 4 0 0. 0 The normal equation is A T A x = A T y = y or 5 0 0 0 0 0 a b = 5 9. 0 0 4 7

More information

Vector Spaces, Affine Spaces, and Metric Spaces

Vector Spaces, Affine Spaces, and Metric Spaces Vector Spaces, Affine Spaces, and Metric Spaces 2 This chapter is only meant to give a short overview of the most important concepts in linear algebra, affine spaces, and metric spaces and is not intended

More information

Linear Algebra Practice Problems

Linear Algebra Practice Problems Linear Algebra Practice Problems Page of 7 Linear Algebra Practice Problems These problems cover Chapters 4, 5, 6, and 7 of Elementary Linear Algebra, 6th ed, by Ron Larson and David Falvo (ISBN-3 = 978--68-78376-2,

More information

and let s calculate the image of some vectors under the transformation T.

and let s calculate the image of some vectors under the transformation T. Chapter 5 Eigenvalues and Eigenvectors 5. Eigenvalues and Eigenvectors Let T : R n R n be a linear transformation. Then T can be represented by a matrix (the standard matrix), and we can write T ( v) =

More information

Math 315: Linear Algebra Solutions to Assignment 7

Math 315: Linear Algebra Solutions to Assignment 7 Math 5: Linear Algebra s to Assignment 7 # Find the eigenvalues of the following matrices. (a.) 4 0 0 0 (b.) 0 0 9 5 4. (a.) The characteristic polynomial det(λi A) = (λ )(λ )(λ ), so the eigenvalues are

More information

Review of Linear Algebra

Review of Linear Algebra Review of Linear Algebra Definitions An m n (read "m by n") matrix, is a rectangular array of entries, where m is the number of rows and n the number of columns. 2 Definitions (Con t) A is square if m=

More information

Chapter 7. Canonical Forms. 7.1 Eigenvalues and Eigenvectors

Chapter 7. Canonical Forms. 7.1 Eigenvalues and Eigenvectors Chapter 7 Canonical Forms 7.1 Eigenvalues and Eigenvectors Definition 7.1.1. Let V be a vector space over the field F and let T be a linear operator on V. An eigenvalue of T is a scalar λ F such that there

More information

LINEAR ALGEBRA REVIEW

LINEAR ALGEBRA REVIEW LINEAR ALGEBRA REVIEW JC Stuff you should know for the exam. 1. Basics on vector spaces (1) F n is the set of all n-tuples (a 1,... a n ) with a i F. It forms a VS with the operations of + and scalar multiplication

More information

Chapters 5 & 6: Theory Review: Solutions Math 308 F Spring 2015

Chapters 5 & 6: Theory Review: Solutions Math 308 F Spring 2015 Chapters 5 & 6: Theory Review: Solutions Math 308 F Spring 205. If A is a 3 3 triangular matrix, explain why det(a) is equal to the product of entries on the diagonal. If A is a lower triangular or diagonal

More information

Linear Algebra in Actuarial Science: Slides to the lecture

Linear Algebra in Actuarial Science: Slides to the lecture Linear Algebra in Actuarial Science: Slides to the lecture Fall Semester 2010/2011 Linear Algebra is a Tool-Box Linear Equation Systems Discretization of differential equations: solving linear equations

More information

(a) If A is a 3 by 4 matrix, what does this tell us about its nullspace? Solution: dim N(A) 1, since rank(a) 3. Ax =

(a) If A is a 3 by 4 matrix, what does this tell us about its nullspace? Solution: dim N(A) 1, since rank(a) 3. Ax = . (5 points) (a) If A is a 3 by 4 matrix, what does this tell us about its nullspace? dim N(A), since rank(a) 3. (b) If we also know that Ax = has no solution, what do we know about the rank of A? C(A)

More information

PRACTICE PROBLEMS FOR THE FINAL

PRACTICE PROBLEMS FOR THE FINAL PRACTICE PROBLEMS FOR THE FINAL Here are a slew of practice problems for the final culled from old exams:. Let P be the vector space of polynomials of degree at most. Let B = {, (t ), t + t }. (a) Show

More information

4.2. ORTHOGONALITY 161

4.2. ORTHOGONALITY 161 4.2. ORTHOGONALITY 161 Definition 4.2.9 An affine space (E, E ) is a Euclidean affine space iff its underlying vector space E is a Euclidean vector space. Given any two points a, b E, we define the distance

More information

Matrix Vector Products

Matrix Vector Products We covered these notes in the tutorial sessions I strongly recommend that you further read the presented materials in classical books on linear algebra Please make sure that you understand the proofs and

More information

IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET

IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET This is a (not quite comprehensive) list of definitions and theorems given in Math 1553. Pay particular attention to the ones in red. Study Tip For each

More information

DIAGONALIZATION. In order to see the implications of this definition, let us consider the following example Example 1. Consider the matrix

DIAGONALIZATION. In order to see the implications of this definition, let us consider the following example Example 1. Consider the matrix DIAGONALIZATION Definition We say that a matrix A of size n n is diagonalizable if there is a basis of R n consisting of eigenvectors of A ie if there are n linearly independent vectors v v n such that

More information

Knowledge Discovery and Data Mining 1 (VO) ( )

Knowledge Discovery and Data Mining 1 (VO) ( ) Knowledge Discovery and Data Mining 1 (VO) (707.003) Review of Linear Algebra Denis Helic KTI, TU Graz Oct 9, 2014 Denis Helic (KTI, TU Graz) KDDM1 Oct 9, 2014 1 / 74 Big picture: KDDM Probability Theory

More information

Matrices and Vectors. Definition of Matrix. An MxN matrix A is a two-dimensional array of numbers A =

Matrices and Vectors. Definition of Matrix. An MxN matrix A is a two-dimensional array of numbers A = 30 MATHEMATICS REVIEW G A.1.1 Matrices and Vectors Definition of Matrix. An MxN matrix A is a two-dimensional array of numbers A = a 11 a 12... a 1N a 21 a 22... a 2N...... a M1 a M2... a MN A matrix can

More information

Topics in linear algebra

Topics in linear algebra Chapter 6 Topics in linear algebra 6.1 Change of basis I want to remind you of one of the basic ideas in linear algebra: change of basis. Let F be a field, V and W be finite dimensional vector spaces over

More information

2018 Fall 2210Q Section 013 Midterm Exam II Solution

2018 Fall 2210Q Section 013 Midterm Exam II Solution 08 Fall 0Q Section 0 Midterm Exam II Solution True or False questions points 0 0 points) ) Let A be an n n matrix. If the equation Ax b has at least one solution for each b R n, then the solution is unique

More information

Question: Given an n x n matrix A, how do we find its eigenvalues? Idea: Suppose c is an eigenvalue of A, then what is the determinant of A-cI?

Question: Given an n x n matrix A, how do we find its eigenvalues? Idea: Suppose c is an eigenvalue of A, then what is the determinant of A-cI? Section 5. The Characteristic Polynomial Question: Given an n x n matrix A, how do we find its eigenvalues? Idea: Suppose c is an eigenvalue of A, then what is the determinant of A-cI? Property The eigenvalues

More information

Linear Algebra- Final Exam Review

Linear Algebra- Final Exam Review Linear Algebra- Final Exam Review. Let A be invertible. Show that, if v, v, v 3 are linearly independent vectors, so are Av, Av, Av 3. NOTE: It should be clear from your answer that you know the definition.

More information

a 11 a 12 a 11 a 12 a 13 a 21 a 22 a 23 . a 31 a 32 a 33 a 12 a 21 a 23 a 31 a = = = = 12

a 11 a 12 a 11 a 12 a 13 a 21 a 22 a 23 . a 31 a 32 a 33 a 12 a 21 a 23 a 31 a = = = = 12 24 8 Matrices Determinant of 2 2 matrix Given a 2 2 matrix [ ] a a A = 2 a 2 a 22 the real number a a 22 a 2 a 2 is determinant and denoted by det(a) = a a 2 a 2 a 22 Example 8 Find determinant of 2 2

More information

MATH 149, FALL 2008 DISCRETE GEOMETRY LECTURE NOTES

MATH 149, FALL 2008 DISCRETE GEOMETRY LECTURE NOTES MATH 149, FALL 2008 DISCRETE GEOMETRY LECTURE NOTES LENNY FUKSHANSKY Contents 1. Introduction 2 2. Norms, sets, and volumes 5 3. Lattices 12 4. Quadratic forms 23 5. Theorems of Blichfeldt and Minkowski

More information

Problem # Max points possible Actual score Total 120

Problem # Max points possible Actual score Total 120 FINAL EXAMINATION - MATH 2121, FALL 2017. Name: ID#: Email: Lecture & Tutorial: Problem # Max points possible Actual score 1 15 2 15 3 10 4 15 5 15 6 15 7 10 8 10 9 15 Total 120 You have 180 minutes to

More information

Chapter 6: Orthogonality

Chapter 6: Orthogonality Chapter 6: Orthogonality (Last Updated: November 7, 7) These notes are derived primarily from Linear Algebra and its applications by David Lay (4ed). A few theorems have been moved around.. Inner products

More information

Linear Algebra Review

Linear Algebra Review Chapter 1 Linear Algebra Review It is assumed that you have had a course in linear algebra, and are familiar with matrix multiplication, eigenvectors, etc. I will review some of these terms here, but quite

More information

Chapter 4 Euclid Space

Chapter 4 Euclid Space Chapter 4 Euclid Space Inner Product Spaces Definition.. Let V be a real vector space over IR. A real inner product on V is a real valued function on V V, denoted by (, ), which satisfies () (x, y) = (y,

More information

Preliminary Linear Algebra 1. Copyright c 2012 Dan Nettleton (Iowa State University) Statistics / 100

Preliminary Linear Algebra 1. Copyright c 2012 Dan Nettleton (Iowa State University) Statistics / 100 Preliminary Linear Algebra 1 Copyright c 2012 Dan Nettleton (Iowa State University) Statistics 611 1 / 100 Notation for all there exists such that therefore because end of proof (QED) Copyright c 2012

More information

DAVIS WIELANDT SHELLS OF NORMAL OPERATORS

DAVIS WIELANDT SHELLS OF NORMAL OPERATORS DAVIS WIELANDT SHELLS OF NORMAL OPERATORS CHI-KWONG LI AND YIU-TUNG POON Dedicated to Professor Hans Schneider for his 80th birthday. Abstract. For a finite-dimensional operator A with spectrum σ(a), the

More information

Math 205, Summer I, Week 4b: Continued. Chapter 5, Section 8

Math 205, Summer I, Week 4b: Continued. Chapter 5, Section 8 Math 205, Summer I, 2016 Week 4b: Continued Chapter 5, Section 8 2 5.8 Diagonalization [reprint, week04: Eigenvalues and Eigenvectors] + diagonaliization 1. 5.8 Eigenspaces, Diagonalization A vector v

More information

Lec 2: Mathematical Economics

Lec 2: Mathematical Economics Lec 2: Mathematical Economics to Spectral Theory Sugata Bag Delhi School of Economics 24th August 2012 [SB] (Delhi School of Economics) Introductory Math Econ 24th August 2012 1 / 17 Definition: Eigen

More information

(v, w) = arccos( < v, w >

(v, w) = arccos( < v, w > MA322 Sathaye Notes on Inner Products Notes on Chapter 6 Inner product. Given a real vector space V, an inner product is defined to be a bilinear map F : V V R such that the following holds: For all v

More information

ELE/MCE 503 Linear Algebra Facts Fall 2018

ELE/MCE 503 Linear Algebra Facts Fall 2018 ELE/MCE 503 Linear Algebra Facts Fall 2018 Fact N.1 A set of vectors is linearly independent if and only if none of the vectors in the set can be written as a linear combination of the others. Fact N.2

More information

Solutions to Final Practice Problems Written by Victoria Kala Last updated 12/5/2015

Solutions to Final Practice Problems Written by Victoria Kala Last updated 12/5/2015 Solutions to Final Practice Problems Written by Victoria Kala vtkala@math.ucsb.edu Last updated /5/05 Answers This page contains answers only. See the following pages for detailed solutions. (. (a x. See

More information

Distance Geometry-Matrix Completion

Distance Geometry-Matrix Completion Christos Konaxis Algs in Struct BioInfo 2010 Outline Outline Tertiary structure Incomplete data Tertiary structure Measure diffrence of matched sets Def. Root Mean Square Deviation RMSD = 1 n x i y i 2,

More information

MATH 221, Spring Homework 10 Solutions

MATH 221, Spring Homework 10 Solutions MATH 22, Spring 28 - Homework Solutions Due Tuesday, May Section 52 Page 279, Problem 2: 4 λ A λi = and the characteristic polynomial is det(a λi) = ( 4 λ)( λ) ( )(6) = λ 6 λ 2 +λ+2 The solutions to the

More information

MATRIX ANALYSIS HOMEWORK

MATRIX ANALYSIS HOMEWORK MATRIX ANALYSIS HOMEWORK VERN I. PAULSEN 1. Due 9/25 We let S n denote the group of all permutations of {1,..., n}. 1. Compute the determinants of the elementary matrices: U(k, l), D(k, λ), and S(k, l;

More information

Math Camp Lecture 4: Linear Algebra. Xiao Yu Wang. Aug 2010 MIT. Xiao Yu Wang (MIT) Math Camp /10 1 / 88

Math Camp Lecture 4: Linear Algebra. Xiao Yu Wang. Aug 2010 MIT. Xiao Yu Wang (MIT) Math Camp /10 1 / 88 Math Camp 2010 Lecture 4: Linear Algebra Xiao Yu Wang MIT Aug 2010 Xiao Yu Wang (MIT) Math Camp 2010 08/10 1 / 88 Linear Algebra Game Plan Vector Spaces Linear Transformations and Matrices Determinant

More information

Math 323 Exam 2 Sample Problems Solution Guide October 31, 2013

Math 323 Exam 2 Sample Problems Solution Guide October 31, 2013 Math Exam Sample Problems Solution Guide October, Note that the following provides a guide to the solutions on the sample problems, but in some cases the complete solution would require more work or justification

More information

MATH 304 Linear Algebra Lecture 23: Diagonalization. Review for Test 2.

MATH 304 Linear Algebra Lecture 23: Diagonalization. Review for Test 2. MATH 304 Linear Algebra Lecture 23: Diagonalization. Review for Test 2. Diagonalization Let L be a linear operator on a finite-dimensional vector space V. Then the following conditions are equivalent:

More information

Lecture 4 Orthonormal vectors and QR factorization

Lecture 4 Orthonormal vectors and QR factorization Orthonormal vectors and QR factorization 4 1 Lecture 4 Orthonormal vectors and QR factorization EE263 Autumn 2004 orthonormal vectors Gram-Schmidt procedure, QR factorization orthogonal decomposition induced

More information

2. Every linear system with the same number of equations as unknowns has a unique solution.

2. Every linear system with the same number of equations as unknowns has a unique solution. 1. For matrices A, B, C, A + B = A + C if and only if A = B. 2. Every linear system with the same number of equations as unknowns has a unique solution. 3. Every linear system with the same number of equations

More information

The Matrix-Tree Theorem

The Matrix-Tree Theorem The Matrix-Tree Theorem Christopher Eur March 22, 2015 Abstract: We give a brief introduction to graph theory in light of linear algebra. Our results culminates in the proof of Matrix-Tree Theorem. 1 Preliminaries

More information

City Suburbs. : population distribution after m years

City Suburbs. : population distribution after m years Section 5.3 Diagonalization of Matrices Definition Example: stochastic matrix To City Suburbs From City Suburbs.85.03 = A.15.97 City.15.85 Suburbs.97.03 probability matrix of a sample person s residence

More information