Eigenvectors and Reconstruction

Hongyu He
Department of Mathematics
Louisiana State University, Baton Rouge, USA

Submitted: Jul 6, 2006; Accepted: Jun 14, 2007; Published: Jul 5, 2007
Mathematics Subject Classification: 05C88

Abstract

In this paper, we study the simple eigenvectors of two hypomorphic matrices using linear algebra. We also give new proofs of results of Godsil and McKay.

1 Introduction

We start by fixing some notation ([HE1]). Let $A$ be an $n \times n$ real symmetric matrix. Let $A_i$ be the matrix obtained by deleting the $i$-th row and $i$-th column of $A$. We say that two symmetric matrices $A$ and $B$ are hypomorphic if, for each $i$, $B_i$ can be obtained by simultaneously permuting the rows and columns of $A_i$. Let $\Sigma$ denote this set of permutations; we write $B = \Sigma(A)$ and call $\Sigma$ a hypomorphism. If $M$ is a symmetric real matrix, then the eigenvalues of $M$ are real; we write $\mathrm{eigen}(M) = (\lambda_1(M) \geq \lambda_2(M) \geq \cdots \geq \lambda_n(M))$. If $\alpha$ is an eigenvalue of $M$, we denote the corresponding eigenspace by $\mathrm{eigen}_\alpha(M)$. Let $\mathbf{1}$ be the $n$-dimensional column vector $(1, 1, \ldots, 1)^t$, and put $J = \mathbf{1}\mathbf{1}^t$.

In [HE1], we proved the following theorem.

Theorem 1 ([HE1]). Let $B$ and $A$ be two real $n \times n$ symmetric matrices. Let $\Sigma$ be a hypomorphism such that $B = \Sigma(A)$. Then there exists an open interval $T$ such that for every $t \in T$:

1. $\lambda_n(A + tJ) = \lambda_n(B + tJ)$;
2. $\mathrm{eigen}_{\lambda_n}(A + tJ)$ and $\mathrm{eigen}_{\lambda_n}(B + tJ)$ are both one-dimensional;
3. $\mathrm{eigen}_{\lambda_n}(A + tJ) = \mathrm{eigen}_{\lambda_n}(B + tJ)$.

I would like to thank the referee for his valuable comments.

the electronic journal of combinatorics 14 (2007), #N14

As proved in [HE1], our result implies Tutte's theorem, which says that $\mathrm{eigen}(A + tJ) = \mathrm{eigen}(B + tJ)$, so $\det(A + tJ - \lambda I) = \det(B + tJ - \lambda I)$.

In this paper, we shall study the eigenvectors of $A$ and $B$. Most of the results in this paper are not new; our approach is. We apply Theorem 1 to derive several well-known results. We first prove that the squares of the entries of simple unit eigenvectors of $A$ can be reconstructed as functions of $\mathrm{eigen}(A)$ and $\mathrm{eigen}(A_i)$. This yields a proof of a theorem of Godsil–McKay. We then study how the eigenvectors of $A$ change under a perturbation by a rank-one symmetric matrix. Combined with Theorem 1, this proves another result of Godsil–McKay, which states that the simple eigenvectors that are not perpendicular to $\mathbf{1}$ are reconstructible. We further show that the orthogonal projection of $\mathbf{1}$ onto higher-dimensional eigenspaces is reconstructible. Our investigation indicates that the following conjecture could be true.

Conjecture 1. Let $A$ be a real $n \times n$ symmetric matrix. Then there exists a subgroup $G(A) \subseteq O(n)$ such that a real symmetric matrix $B$ satisfies $\mathrm{eigen}(B) = \mathrm{eigen}(A)$ and $\mathrm{eigen}(B_i) = \mathrm{eigen}(A_i)$ for each $i$ if and only if $B = UAU^t$ for some $U \in G(A)$.

This conjecture is clearly true if $\mathrm{rank}(A) = 1$: in that case the group $G(A)$ can be chosen to be $\mathbb{Z}_2^n$, realized as diagonal matrices. In some other cases, $G(A)$ can be a subgroup of the permutation group $S_n$.

2 Reconstruction of Square Functions

Theorem 2. Let $A$ be an $n \times n$ real symmetric matrix. Let $(\lambda_1 \geq \lambda_2 \geq \cdots \geq \lambda_n)$ be the eigenvalues of $A$. Suppose $\lambda_i$ is a simple eigenvalue of $A$, and let $p_i = (p_{1,i}, p_{2,i}, \ldots, p_{n,i})^t$ be a unit vector in $\mathrm{eigen}_{\lambda_i}(A)$. Then for every $m$, $p_{m,i}^2$ can be expressed as a function of $\mathrm{eigen}(A)$ and $\mathrm{eigen}(A_m)$.

Proof: Let $\lambda_i$ be a simple eigenvalue of $A$ and let $p_i = (p_{1,i}, p_{2,i}, \ldots, p_{n,i})^t$ be a unit vector in $\mathrm{eigen}_{\lambda_i}(A)$. There exists an orthogonal matrix $P = (p_1, p_2, \ldots, p_n)$ such that $A = PDP^t$, where $D = \mathrm{diag}(\lambda_1, \lambda_2, \ldots, \lambda_n)$. Then

$$A - \lambda_i I = PDP^t - \lambda_i I = P(D - \lambda_i I)P^t = \sum_{j \neq i} (\lambda_j - \lambda_i)\, p_j p_j^t.$$

Deleting the $m$-th row and $m$-th column of $P(D - \lambda_i I)P^t$, we obtain $\hat{P}(D - \lambda_i I)\hat{P}^t$, where $\hat{P}$ is the $(n-1) \times n$ matrix obtained from $P$ by deleting the $m$-th row. This is $A_m - \lambda_i I_{n-1}$. Notice that $P$ is orthogonal. Let $P_{m,i}$ be the matrix obtained by deleting the $m$-th row and $i$-th column of $P$; then $(\det P_{m,i})^2 = p_{m,i}^2$, where $p_{m,i}$ is the $(m,i)$-th entry of $P$. Taking the determinant (by the Cauchy–Binet formula, only the term omitting the $i$-th column of $\hat{P}$ survives, since the $i$-th diagonal entry of $D - \lambda_i I$ vanishes), we have

$$\det(A_m - \lambda_i I_{n-1}) = p_{m,i}^2 \prod_{j \neq i} (\lambda_j - \lambda_i).$$

It follows that

$$p_{m,i}^2 = \frac{\prod_{j=1}^{n-1} \big(\lambda_j(A_m) - \lambda_i\big)}{\prod_{j \neq i} (\lambda_j - \lambda_i)}. \qquad \mathrm{QED}$$

Corollary 1. Let $A$ and $B$ be two $n \times n$ real symmetric matrices. Suppose that $\mathrm{eigen}(A) = \mathrm{eigen}(B)$ and $\mathrm{eigen}(A_i) = \mathrm{eigen}(B_i)$ for each $i$. Let $\lambda_i$ be a simple eigenvalue of $A$ and $B$. Let $p_i = (p_{1,i}, p_{2,i}, \ldots, p_{n,i})^t$ be a unit vector in $\mathrm{eigen}_{\lambda_i}(A)$ and $q_i = (q_{1,i}, q_{2,i}, \ldots, q_{n,i})^t$ be a unit vector in $\mathrm{eigen}_{\lambda_i}(B)$. Then $p_{j,i}^2 = q_{j,i}^2$ for all $j \in [1, n]$.

Corollary 2 (Godsil–McKay, see Theorem 3.2, [GM]). Let $A$ and $B$ be two $n \times n$ real symmetric matrices. Suppose that $A$ and $B$ are hypomorphic. Let $\lambda_i$ be a simple eigenvalue of $A$ and $B$. Let $p_i = (p_{1,i}, p_{2,i}, \ldots, p_{n,i})^t$ be a unit vector in $\mathrm{eigen}_{\lambda_i}(A)$ and $q_i = (q_{1,i}, q_{2,i}, \ldots, q_{n,i})^t$ be a unit vector in $\mathrm{eigen}_{\lambda_i}(B)$. Then $p_{j,i}^2 = q_{j,i}^2$ for all $j \in [1, n]$.

3 Eigenvalues and Eigenvectors under the Perturbation of a Rank One Symmetric Matrix

Let $A$ be an $n \times n$ real symmetric matrix and let $x$ be an $n$-dimensional column vector. Let $M = xx^t$, and consider $A + tM$. We have

$$A + tM = PDP^t + tM = P(D + tP^tMP)P^t = P(D + tP^txx^tP)P^t.$$

Let $q = P^t x$, so $q_i = (p_i, x)$ for each $i \in [1, n]$. Put $D(t) = D + tqq^t$. Then $A + tM = P(D + tqq^t)P^t$.

Lemma 1. $\det(D + tqq^t - \lambda I) = \det(A - \lambda I)\Big(1 + \sum_i \frac{tq_i^2}{\lambda_i - \lambda}\Big)$.

Proof: $\det(D - \lambda I + tqq^t)$ can be written as a sum of products of the $\lambda_i - \lambda$ and the $q_j$. For each subset $S$ of $[1, n]$, combine the terms containing exactly $\prod_{i \in S}(\lambda_i - \lambda)$. Since the rank of $qq^t$ is one, the coefficients can be nonzero only for $|S| = n$ or $n - 1$. We obtain

$$\det(D + tqq^t - \lambda I) = \prod_{i=1}^n (\lambda_i - \lambda) + \sum_{i=1}^n tq_i^2 \prod_{j \neq i} (\lambda_j - \lambda).$$

The lemma follows. Put

$$P_t(\lambda) = 1 + \sum_i \frac{tq_i^2}{\lambda_i - \lambda}.$$

Lemma 2. Fix $t < 0$. Suppose that $\lambda_1, \lambda_2, \ldots, \lambda_n$ are distinct and $q_i \neq 0$ for every $i$. Then $P_t(\lambda)$ has exactly $n$ roots $(\mu_1, \mu_2, \ldots, \mu_n)$ satisfying the interlacing relation

$$\lambda_1 > \mu_1 > \lambda_2 > \mu_2 > \cdots > \mu_{n-1} > \lambda_n > \mu_n.$$
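Before turning to the proof, Lemmas 1 and 2 are easy to check numerically. The following NumPy sketch is an added illustration, not part of the paper; the specific values of $\lambda_i$, $q$, and $t$ are arbitrary choices. It verifies the determinant factorization of Lemma 1 at a few sample points and the interlacing of Lemma 2.

```python
import numpy as np

# Arbitrary illustrative data: distinct lambda_1 > ... > lambda_n, all q_i nonzero, t < 0.
lam = np.array([3.0, 1.0, 0.5, -2.0])
q = np.array([1.0, 2.0, 1.0, 0.5])
t = -0.8
D = np.diag(lam)
Dt = D + t * np.outer(q, q)                 # the rank-one perturbation D + t q q^t

# Lemma 1: det(D + t q q^t - z I) = prod_i (lam_i - z) * P_t(z)
def P_t(z):
    return 1 + np.sum(t * q**2 / (lam - z))

for z in [-5.0, 0.2, 2.0, 7.0]:             # sample points away from the lam_i
    lhs = np.linalg.det(Dt - z * np.eye(4))
    rhs = np.prod(lam - z) * P_t(z)
    assert np.isclose(lhs, rhs)

# Lemma 2 (t < 0): the roots mu_i of P_t, i.e. the eigenvalues of D + t q q^t,
# strictly interlace the lam_i from below: lam_1 > mu_1 > lam_2 > ... > lam_n > mu_n.
mu = np.linalg.eigvalsh(Dt)[::-1]           # eigenvalues in decreasing order
assert np.all(lam > mu) and np.all(mu[:-1] > lam[1:])
```

The two assertions together reproduce the full interlacing chain of Lemma 2 for this example.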
Proof: Clearly, $\frac{dP_t(\lambda)}{d\lambda} = \sum_i \frac{tq_i^2}{(\lambda_i - \lambda)^2} < 0$, so $P_t(\lambda)$ is decreasing on each interval where it is defined. On the interval $(-\infty, \lambda_n)$, $\lim_{\lambda \to -\infty} P_t(\lambda) = 1$ and $\lim_{\lambda \to \lambda_n^-} P_t(\lambda) = -\infty$, so $P_t(\lambda)$ has a unique root $\mu_n \in (-\infty, \lambda_n)$. A similar statement holds for each interval $(\lambda_i, \lambda_{i-1})$. On $(\lambda_1, \infty)$, $\lim_{\lambda \to \infty} P_t(\lambda) = 1$ and $\lim_{\lambda \to \lambda_1^+} P_t(\lambda) = +\infty$, so $P_t(\lambda)$ does not have any roots in $(\lambda_1, \infty)$. QED

Theorem 3. Fix $t < 0$ and $x \in \mathbb{R}^n$. Let $M = xx^t$. Let $l$ be the number of distinct eigenvalues $\lambda$ of $A$ satisfying $(x, \mathrm{eigen}_\lambda(A)) \neq 0$. Choose an orthonormal basis of each eigenspace of $A$ so that one of the basis vectors is a multiple of the orthogonal projection of $x$ onto that eigenspace whenever this projection is nonzero. Denote this basis by $\{p_i\}$ and let $P = (p_1, p_2, \ldots, p_n)$. Let $S = \{i_1, i_2, \ldots, i_l\}$ be such that $(x, p_i) \neq 0$ for every $i \in S$ and $(x, p_i) = 0$ for every $i \notin S$. Then there exist $(\mu_1, \ldots, \mu_l)$ such that

$$\lambda_{i_1} > \mu_1 > \lambda_{i_2} > \mu_2 > \cdots > \lambda_{i_l} > \mu_l$$

and

$$\mathrm{eigen}(A + tM) = \{\lambda_i(A) \mid i \notin S\} \cup \{\mu_1, \mu_2, \ldots, \mu_l\}.$$

Furthermore, $\mathrm{eigen}_{\mu_j}(A + tM)$ contains $\sum_{i \in S} \frac{q_i}{\lambda_i - \mu_j}\, p_i$. Here the index set $\{i_1, i_2, \ldots, i_l\}$ may not be unique. I shall also point out that a similar statement holds for $t > 0$, with

$$\mu_1 > \lambda_{i_1} > \mu_2 > \lambda_{i_2} > \cdots > \mu_l > \lambda_{i_l}.$$

Proof: Recall that $q_i = (p_i, x)$. Since $(x, \mathrm{eigen}_{\lambda_{i_j}}(A)) \neq 0$, we have $q_{i_j} \neq 0$; for $i \notin S$, $q_i = 0$. Notice that

$$P_t(\lambda) = 1 + \sum_{j=1}^{l} \frac{tq_{i_j}^2}{\lambda_{i_j} - \lambda}.$$

Applying Lemma 2 to $S$, we obtain the roots of $P_t(\lambda)$, $\{\mu_1, \mu_2, \ldots, \mu_l\}$, satisfying

$$\lambda_{i_1} > \mu_1 > \lambda_{i_2} > \mu_2 > \cdots > \lambda_{i_l} > \mu_l.$$

It follows that the roots of $\det(A + tM - \lambda I) = P_t(\lambda) \prod_{i=1}^n (\lambda_i - \lambda)$ can be obtained from $\mathrm{eigen}(A)$ by changing $\{\lambda_{i_1}, \ldots, \lambda_{i_l}\}$ to $\{\mu_1, \ldots, \mu_l\}$. Therefore,

$$\mathrm{eigen}(A + tM) = \{\lambda_i(A) \mid i \notin S\} \cup \{\mu_1, \mu_2, \ldots, \mu_l\}.$$
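The eigenvalue statement just established can be illustrated with a small deterministic example, before turning to the eigenvectors. The following NumPy sketch is an added illustration, not from the paper; the eigenvalues 3, 2, 1, 0, the direction $x$, and $t = -1$ are arbitrary choices. Here $x$ is orthogonal to two of the eigenvectors, so those two eigenvalues survive unchanged, while the other two move to the roots $1 \pm \sqrt{2}$ of $P_t$.

```python
import numpy as np

# Arbitrary illustrative data for Theorem 3.
lam = np.array([3.0, 2.0, 1.0, 0.0])   # distinct eigenvalues, decreasing
A = np.diag(lam)                       # work in the eigenbasis of A, so P = I and q = x
x = np.array([1.0, 0.0, 1.0, 0.0])     # orthogonal to p_2 and p_4, so S = {1, 3}
t = -1.0

new = np.sort(np.linalg.eigvalsh(A + t * np.outer(x, x)))[::-1]

# Here P_t(z) = 1 - 1/(3 - z) - 1/(1 - z); its roots solve z^2 - 2z - 1 = 0.
mu = np.array([1 + np.sqrt(2), 1 - np.sqrt(2)])

# The eigenvalues with zero projection (2 and 0) survive unchanged, the other
# two are replaced by the roots of P_t, and lam_{i_1} > mu_1 > lam_{i_2} > mu_2.
assert np.allclose(new, np.sort(np.concatenate([[2.0, 0.0], mu]))[::-1])
assert lam[0] > mu[0] > lam[2] > mu[1]
```

The roots of $P_t$ can be found by hand here: clearing denominators in $1 - \frac{1}{3-z} - \frac{1}{1-z} = 0$ gives $z^2 - 2z - 1 = 0$.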
Fix a $\mu_j$ and let $\{e_i\}$ be the standard basis for $\mathbb{R}^n$. Notice that

$$(A + tM) \sum_{i \in S} \frac{q_i}{\lambda_i - \mu_j}\, p_i = P(D + tqq^t)P^t \sum_{i \in S} \frac{q_i}{\lambda_i - \mu_j}\, p_i = P(D + tqq^t) \sum_{i \in S} \frac{q_i}{\lambda_i - \mu_j}\, e_i$$
$$= P\Big( \sum_{i \in S} \frac{\lambda_i q_i}{\lambda_i - \mu_j}\, e_i + t\Big( \sum_{i \in S} \frac{q_i^2}{\lambda_i - \mu_j} \Big) q \Big) = P\Big( \sum_{i \in S} \frac{\lambda_i q_i}{\lambda_i - \mu_j}\, e_i - \sum_{i \in S} q_i e_i \Big) = \mu_j \sum_{i \in S} \frac{q_i}{\lambda_i - \mu_j}\, p_i. \quad (1)$$

Notice that here we use the fact that $P_t(\mu_j) = t\sum_{i \in S} \frac{q_i^2}{\lambda_i - \mu_j} + 1 = 0$, so that $t\big(\sum_{i \in S} \frac{q_i^2}{\lambda_i - \mu_j}\big) q = -q = -\sum_{i \in S} q_i e_i$. We have obtained that

$$(A + tM) \sum_{i \in S} \frac{q_i}{\lambda_i - \mu_j}\, p_i = \mu_j \sum_{i \in S} \frac{q_i}{\lambda_i - \mu_j}\, p_i.$$

Therefore, $\mathrm{eigen}_{\mu_j}(A + tM)$ contains $\sum_{i \in S} \frac{q_i}{\lambda_i - \mu_j}\, p_i$. QED

4 Reconstruction of Simple Eigenvectors Not Perpendicular to 1

Now let $M = J = \mathbf{1}\mathbf{1}^t$. Theorem 3 applies to $A + tJ$ and $B + tJ$.

Theorem 4 (Godsil–McKay, [GM]). Let $B$ and $A$ be two real $n \times n$ symmetric matrices. Let $\Sigma$ be a hypomorphism such that $B = \Sigma(A)$. Let $S \subseteq [1, n]$, $A = PDP^t$ and $B = UDU^t$ be as in Theorem 3. For $i \in S$, we have $p_i = u_i$ or $p_i = -u_i$. In particular, if $\lambda_i$ is a simple eigenvalue of $A$ and $(\mathrm{eigen}_{\lambda_i}(A), \mathbf{1}) \neq 0$, then $\mathrm{eigen}_{\lambda_i}(A) = \mathrm{eigen}_{\lambda_i}(B)$.

Proof: By Tutte's theorem, $\mathrm{eigen}(A) = \mathrm{eigen}(B)$. Let $A = PDP^t$ and $B = UDU^t$. Since $\det(A + tJ - \lambda I) = \det(B + tJ - \lambda I)$, by Lemma 1,

$$\det(A - \lambda I)\Big(1 + \sum_i \frac{t(\mathbf{1}, p_i)^2}{\lambda_i - \lambda}\Big) = \det(B - \lambda I)\Big(1 + \sum_i \frac{t(\mathbf{1}, u_i)^2}{\lambda_i - \lambda}\Big).$$

It follows that for every $\lambda_i$,

$$\sum_{\lambda_j = \lambda_i} (\mathbf{1}, p_j)^2 = \sum_{\lambda_j = \lambda_i} (\mathbf{1}, u_j)^2.$$

Consequently, the $l$ for $A$ is the same as the $l$ for $B$. Let $S$ be as in Theorem 3 for both $A$ and $B$. Without loss of generality, suppose that $A = PDP^t$ and $B = UDU^t$ are as in Theorem 3. In particular, for every $i \in [1, n]$, we have

$$(p_i, \mathbf{1})^2 = (u_i, \mathbf{1})^2. \quad (2)$$

Let $T$ be as in the proof of Theorem 1 in [HE1] for $A$ and $B$; without loss of generality, suppose $T = (t_1, t_2) \subset \mathbb{R}^-$. Let $t \in T$ and let $\mu_l(t)$ be the $\mu_l$ in Theorem 3 for $A$ and $B$. Notice that the lowest eigenvectors of $A + tJ$ and $B + tJ$ lie in $\mathbb{R}_+^n$ (see Lemma 1, Theorem 7 and the proof of Theorem 2 in [HE1]), so they are not perpendicular to $\mathbf{1}$. By Theorem 3, $\mu_l(t) = \lambda_n(A + tJ) = \lambda_n(B + tJ)$. By Theorem 1,

$$\mathrm{eigen}_{\mu_l(t)}(A + tJ) = \mathrm{eigen}_{\mu_l(t)}(B + tJ).$$

So $\sum_{i \in S} \frac{(p_i, \mathbf{1})}{\lambda_i - \mu_l(t)}\, p_i$ is parallel to $\sum_{i \in S} \frac{(u_i, \mathbf{1})}{\lambda_i - \mu_l(t)}\, u_i$. Since $\{p_i\}$ and $\{u_i\}$ are orthonormal, by Equation (2),

$$\Big\| \sum_{i \in S} \frac{(p_i, \mathbf{1})}{\lambda_i - \mu_l(t)}\, p_i \Big\|^2 = \Big\| \sum_{i \in S} \frac{(u_i, \mathbf{1})}{\lambda_i - \mu_l(t)}\, u_i \Big\|^2.$$

It follows that for every $t \in T$,

$$\sum_{i \in S} \frac{(p_i, \mathbf{1})}{\lambda_i - \mu_l(t)}\, p_i = \pm \sum_{i \in S} \frac{(u_i, \mathbf{1})}{\lambda_i - \mu_l(t)}\, u_i.$$

Recall that $-\frac{1}{t} = \sum_i \frac{q_i^2}{\lambda_i - \mu_l(t)}$. Notice that the function $\rho \mapsto \sum_i \frac{q_i^2}{\lambda_i - \rho}$ is a continuous and one-to-one mapping from $(-\infty, \lambda_n)$ onto $(0, \infty)$. There exists a nonempty interval $T_0 \subseteq (-\infty, \lambda_n)$ such that if $\rho \in T_0$, then $\sum_i \frac{q_i^2}{\lambda_i - \rho} \in \big(-\frac{1}{t_1}, -\frac{1}{t_2}\big)$. So every $\rho \in T_0$ is a $\mu_l(t)$ for some $t \in (t_1, t_2)$. It follows that for every $\rho \in T_0$,

$$\sum_{i \in S} \frac{(p_i, \mathbf{1})}{\lambda_i - \rho}\, p_i = \pm \sum_{i \in S} \frac{(u_i, \mathbf{1})}{\lambda_i - \rho}\, u_i.$$

Notice that both vectors are nonzero and depend continuously on $\rho$. So either

$$\sum_{i \in S} \frac{(p_i, \mathbf{1})}{\lambda_i - \rho}\, p_i = \sum_{i \in S} \frac{(u_i, \mathbf{1})}{\lambda_i - \rho}\, u_i \quad (\rho \in T_0),$$

or

$$\sum_{i \in S} \frac{(p_i, \mathbf{1})}{\lambda_i - \rho}\, p_i = -\sum_{i \in S} \frac{(u_i, \mathbf{1})}{\lambda_i - \rho}\, u_i \quad (\rho \in T_0).$$

Notice that the functions $\{\rho \mapsto \frac{1}{\lambda_{i_j} - \rho}\}_{i_j \in S}$ are linearly independent. Hence for every $i \in S$, we have $(p_i, \mathbf{1})\, p_i = \pm (u_i, \mathbf{1})\, u_i$. Because $p_i$ and $u_i$ are both unit vectors, $p_i = \pm u_i$. In particular, for every simple $\lambda_i$ with $(p_i, \mathbf{1}) \neq 0$ we have $\mathrm{eigen}_{\lambda_i}(A) = \mathrm{eigen}_{\lambda_i}(B)$. QED
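The eigenvector formula of Theorem 3, in the case $M = J$ used throughout this section, can also be checked numerically. The following NumPy sketch is an added illustration, not from the paper; the diagonal matrix and $t = -1$ are arbitrary choices. Since $q = P^t\mathbf{1} = \mathbf{1}$ has no zero entry here, all $n$ eigenvalues move, and for each root $\mu_j$ of $P_t$ the vector $\sum_i \frac{q_i}{\lambda_i - \mu_j}\, p_i$ is an eigenvector of $A + tJ$.

```python
import numpy as np

# Arbitrary illustrative data: A diagonal, so we work in the eigenbasis (P = I).
lam = np.array([3.0, 2.0, 1.0, 0.0])
A = np.diag(lam)
one = np.ones(4)                    # here q = P^t 1 = 1, so every q_i is nonzero and l = n
t = -1.0
J = np.outer(one, one)

mu = np.linalg.eigvalsh(A + t * J)  # for t < 0 these are exactly the n roots of P_t
for m in mu:
    v = one / (lam - m)             # candidate eigenvector sum_i q_i/(lam_i - mu_j) p_i
    assert np.allclose((A + t * J) @ v, m * v)
```

The check works because $(Jv)_i = \sum_j \frac{1}{\lambda_j - \mu_j'} = -\frac{1}{t}$ whenever $P_t(\mu_j') = 0$, which is exactly the cancellation used in Equation (1).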
Corollary 3. Let $B$ and $A$ be two real $n \times n$ symmetric matrices, and suppose that $B = \Sigma(A)$ for a hypomorphism $\Sigma$. Let $\lambda_i$ be an eigenvalue of $A$ such that $(\mathrm{eigen}_{\lambda_i}(A), \mathbf{1}) \neq 0$. Then the orthogonal projection of $\mathbf{1}$ onto $\mathrm{eigen}_{\lambda_i}(A)$ equals the orthogonal projection of $\mathbf{1}$ onto $\mathrm{eigen}_{\lambda_i}(B)$.

Proof: Notice that the projections are $(p_i, \mathbf{1})\, p_i$ and $(u_i, \mathbf{1})\, u_i$. Whether $p_i = u_i$ or $p_i = -u_i$, we always have $(p_i, \mathbf{1})\, p_i = (u_i, \mathbf{1})\, u_i$. QED

Conjecture 2. Let $A$ and $B$ be two hypomorphic matrices, and let $\lambda_i$ be a simple eigenvalue of $A$. Then there exists a permutation matrix $\tau$ such that $\tau\, \mathrm{eigen}_{\lambda_i}(A) = \mathrm{eigen}_{\lambda_i}(B)$. This conjecture is apparently true if $\mathrm{eigen}_{\lambda_i}(A)$ is not perpendicular to $\mathbf{1}$.

References

[Tutte] W. T. Tutte, All the King's Horses (A Guide to Reconstruction), Graph Theory and Related Topics, Academic Press, 1979, 15-33.

[GM] C. D. Godsil and B. D. McKay, Spectral Conditions for the Reconstructibility of a Graph, J. Combin. Theory Ser. B, No. 3.

[HE1] H. He, Reconstruction and Higher Dimensional Geometry, J. Combin. Theory Ser. B 97, No. 3.

[Ko] W. Kocay, Some New Methods in Reconstruction Theory, Combinatorial Mathematics IX (Brisbane, 1981), LNM 952, 89-114.
More informationvibrations, light transmission, tuning guitar, design buildings and bridges, washing machine, Partial differential problems, water flow,...
6 Eigenvalues Eigenvalues are a common part of our life: vibrations, light transmission, tuning guitar, design buildings and bridges, washing machine, Partial differential problems, water flow, The simplest
More informationThe Matrix-Tree Theorem
The Matrix-Tree Theorem Christopher Eur March 22, 2015 Abstract: We give a brief introduction to graph theory in light of linear algebra. Our results culminates in the proof of Matrix-Tree Theorem. 1 Preliminaries
More information(b) If a multiple of one row of A is added to another row to produce B then det(b) =det(a).
.(5pts) Let B = 5 5. Compute det(b). (a) (b) (c) 6 (d) (e) 6.(5pts) Determine which statement is not always true for n n matrices A and B. (a) If two rows of A are interchanged to produce B, then det(b)
More informationLecture 1 and 2: Random Spanning Trees
Recent Advances in Approximation Algorithms Spring 2015 Lecture 1 and 2: Random Spanning Trees Lecturer: Shayan Oveis Gharan March 31st Disclaimer: These notes have not been subjected to the usual scrutiny
More informationPRACTICE PROBLEMS FOR THE FINAL
PRACTICE PROBLEMS FOR THE FINAL Here are a slew of practice problems for the final culled from old exams:. Let P be the vector space of polynomials of degree at most. Let B = {, (t ), t + t }. (a) Show
More information4. Determinants.
4. Determinants 4.1. Determinants; Cofactor Expansion Determinants of 2 2 and 3 3 Matrices 2 2 determinant 4.1. Determinants; Cofactor Expansion Determinants of 2 2 and 3 3 Matrices 3 3 determinant 4.1.
More informationa 11 a 12 a 11 a 12 a 13 a 21 a 22 a 23 . a 31 a 32 a 33 a 12 a 21 a 23 a 31 a = = = = 12
24 8 Matrices Determinant of 2 2 matrix Given a 2 2 matrix [ ] a a A = 2 a 2 a 22 the real number a a 22 a 2 a 2 is determinant and denoted by det(a) = a a 2 a 2 a 22 Example 8 Find determinant of 2 2
More informationDimension. Eigenvalue and eigenvector
Dimension. Eigenvalue and eigenvector Math 112, week 9 Goals: Bases, dimension, rank-nullity theorem. Eigenvalue and eigenvector. Suggested Textbook Readings: Sections 4.5, 4.6, 5.1, 5.2 Week 9: Dimension,
More informationLINEAR ALGEBRA 1, 2012-I PARTIAL EXAM 3 SOLUTIONS TO PRACTICE PROBLEMS
LINEAR ALGEBRA, -I PARTIAL EXAM SOLUTIONS TO PRACTICE PROBLEMS Problem (a) For each of the two matrices below, (i) determine whether it is diagonalizable, (ii) determine whether it is orthogonally diagonalizable,
More information4. Linear transformations as a vector space 17
4 Linear transformations as a vector space 17 d) 1 2 0 0 1 2 0 0 1 0 0 0 1 2 3 4 32 Let a linear transformation in R 2 be the reflection in the line = x 2 Find its matrix 33 For each linear transformation
More informationMath 240 Calculus III
The Calculus III Summer 2015, Session II Wednesday, July 8, 2015 Agenda 1. of the determinant 2. determinants 3. of determinants What is the determinant? Yesterday: Ax = b has a unique solution when A
More informationEcon Slides from Lecture 7
Econ 205 Sobel Econ 205 - Slides from Lecture 7 Joel Sobel August 31, 2010 Linear Algebra: Main Theory A linear combination of a collection of vectors {x 1,..., x k } is a vector of the form k λ ix i for
More informationMATH 423 Linear Algebra II Lecture 20: Geometry of linear transformations. Eigenvalues and eigenvectors. Characteristic polynomial.
MATH 423 Linear Algebra II Lecture 20: Geometry of linear transformations. Eigenvalues and eigenvectors. Characteristic polynomial. Geometric properties of determinants 2 2 determinants and plane geometry
More informationCS 246 Review of Linear Algebra 01/17/19
1 Linear algebra In this section we will discuss vectors and matrices. We denote the (i, j)th entry of a matrix A as A ij, and the ith entry of a vector as v i. 1.1 Vectors and vector operations A vector
More informationChapter 6: Orthogonality
Chapter 6: Orthogonality (Last Updated: November 7, 7) These notes are derived primarily from Linear Algebra and its applications by David Lay (4ed). A few theorems have been moved around.. Inner products
More informationChapter 7: Symmetric Matrices and Quadratic Forms
Chapter 7: Symmetric Matrices and Quadratic Forms (Last Updated: December, 06) These notes are derived primarily from Linear Algebra and its applications by David Lay (4ed). A few theorems have been moved
More informationMa/CS 6b Class 23: Eigenvalues in Regular Graphs
Ma/CS 6b Class 3: Eigenvalues in Regular Graphs By Adam Sheffer Recall: The Spectrum of a Graph Consider a graph G = V, E and let A be the adjacency matrix of G. The eigenvalues of G are the eigenvalues
More informationStudy Notes on Matrices & Determinants for GATE 2017
Study Notes on Matrices & Determinants for GATE 2017 Matrices and Determinates are undoubtedly one of the most scoring and high yielding topics in GATE. At least 3-4 questions are always anticipated from
More informationOnline Exercises for Linear Algebra XM511
This document lists the online exercises for XM511. The section ( ) numbers refer to the textbook. TYPE I are True/False. Lecture 02 ( 1.1) Online Exercises for Linear Algebra XM511 1) The matrix [3 2
More information