Synopsis of Numerical Linear Algebra


1 Synopsis of Numerical Linear Algebra
Eric de Sturler, Department of Mathematics, Virginia Tech
Iterative Methods for Linear Systems: Basics to Research
Numerical Analysis and Software I

2 Systems of Linear Equations 1 [handwritten notes; text not extracted]

3 Systems of Linear Equations 2 [handwritten notes; text not extracted]

5 Norms
A norm on a vector space V is any function f : V → ℝ such that
1. f(x) ≥ 0, and f(x) = 0 ⇔ x = 0,
2. f(αx) = |α| f(x),
3. f(x + y) ≤ f(x) + f(y),
where x, y ∈ V and α ∈ ℂ (or ℝ).
Important vector spaces in this course: ℝⁿ, ℂⁿ, and the matrix spaces ℝ^(m×n), ℂ^(m×n). Note that the set of all m-by-n matrices (real or complex) is a vector space.
Many matrix norms possess the submultiplicative or consistency property:
f(AB) ≤ f(A) f(B) for all A ∈ ℂ^(m×k) and B ∈ ℂ^(k×n) (or real matrices).
Note that strictly speaking this is a property of a family of norms, because in general each f is defined on a different vector space.
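As a quick numerical sanity check (an illustrative sketch with NumPy; the random matrices are my own example, not from the slides), the consistency property f(AB) ≤ f(A)f(B) can be verified for several standard matrix norms:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))
B = rng.standard_normal((3, 5))

# Consistency (submultiplicativity): f(AB) <= f(A) f(B)
for ord_ in (1, 2, np.inf, 'fro'):
    lhs = np.linalg.norm(A @ B, ord_)
    rhs = np.linalg.norm(A, ord_) * np.linalg.norm(B, ord_)
    assert lhs <= rhs + 1e-12
print("consistency holds for the 1-, 2-, inf-, and Frobenius norms")
```

Note that A and B here have compatible but different shapes, matching the remark that consistency really concerns a family of norms on different spaces.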

6 Norms
We can define a matrix norm using a vector norm (an induced matrix norm):
‖A‖_α = max_{x≠0} ‖Ax‖_α / ‖x‖_α = max_{‖x‖_α=1} ‖Ax‖_α.
Induced norms are always consistent (satisfy the consistency property).
Two norms ‖·‖_α and ‖·‖_β are equivalent if there exist positive, real constants a and b such that ∀x: a‖x‖_α ≤ ‖x‖_β ≤ b‖x‖_α. The constants depend on the two norms but not on x.
All norms on a finite-dimensional vector space are equivalent.
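A small sketch (my own illustration, not from the slides) of both facts: the induced 1-norm dominates the ratio ‖Ax‖₁/‖x‖₁ for every sampled x, and the equivalence ‖x‖₂ ≤ ‖x‖₁ ≤ √n ‖x‖₂ holds on ℝⁿ:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 5))

# The induced 1-norm dominates ||Ax||_1 / ||x||_1 for every x != 0
A1 = np.linalg.norm(A, 1)
xs = rng.standard_normal((5, 1000))
ratios = np.linalg.norm(A @ xs, 1, axis=0) / np.linalg.norm(xs, 1, axis=0)
assert ratios.max() <= A1 + 1e-12

# Norm equivalence on R^n: ||x||_2 <= ||x||_1 <= sqrt(n) ||x||_2
n = 5
for x in xs.T:
    n1, n2 = np.linalg.norm(x, 1), np.linalg.norm(x, 2)
    assert n2 <= n1 + 1e-12 and n1 <= np.sqrt(n) * n2 + 1e-12
```

The sampled maximum approaches, but never exceeds, the induced norm; the maximum is attained at a specific unit vector (a coordinate vector, for the 1-norm).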

7 Norms
Some useful norms on ℝⁿ, ℂⁿ, ℝ^(m×n), ℂ^(m×n):
p-norms: ‖x‖_p = (Σᵢ₌₁ⁿ |xᵢ|^p)^(1/p), especially p = 1, 2, ∞, where ‖x‖_∞ = maxᵢ |xᵢ|.
The induced matrix p-norms are:
‖A‖₁ = maxⱼ Σᵢ |aᵢⱼ| (max absolute column sum)
‖A‖₂ = σ_max(A) (max singular value; harder to compute than the others)
‖A‖_∞ = maxᵢ Σⱼ |aᵢⱼ| (max absolute row sum)
Matrix Frobenius norm: ‖A‖_F = (Σᵢⱼ |aᵢⱼ|²)^(1/2) (similar to the vector 2-norm, applied to a matrix).
All these norms are consistent (satisfy the submultiplicative property).
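These formulas can be checked directly against NumPy's built-in norms (a sketch; the matrix is a random example):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 6))

col_sum = np.abs(A).sum(axis=0).max()              # max absolute column sum
row_sum = np.abs(A).sum(axis=1).max()              # max absolute row sum
sigma_max = np.linalg.svd(A, compute_uv=False)[0]  # largest singular value
frob = np.sqrt((np.abs(A) ** 2).sum())

assert np.isclose(col_sum, np.linalg.norm(A, 1))
assert np.isclose(row_sum, np.linalg.norm(A, np.inf))
assert np.isclose(sigma_max, np.linalg.norm(A, 2))
assert np.isclose(frob, np.linalg.norm(A, 'fro'))
```

The 2-norm check also illustrates why it is the expensive one: it requires a singular value computation, while the others are simple row/column sums.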

8 Norms & Inner Products 1 [handwritten notes; text not extracted]

9 Norms & Inner Products 2 [handwritten notes; text not extracted]

10 Norms & Inner Products 3 [handwritten notes; text not extracted]

12 Norms & Inner Products 4 [handwritten notes; text not extracted]

14 Norms & Inner Products 5 [handwritten notes; text not extracted]

16 Norms & Inner Products 6 [handwritten notes; text not extracted]

18 Error Analysis [handwritten notes; text not extracted]


23 Inner Products
Many methods to select z_m from the Krylov space are related to projections.
We call f : S × S → ℝ an inner product over the real vector space S if, for all vectors x, y, z and scalars α,
1. f(x, x) ≥ 0, and f(x, x) = 0 ⇔ x = 0,
2. f(αx, z) = α f(x, z),
3. f(x + y, z) = f(x, z) + f(y, z),
4. f(x, z) = f(z, x).
For a complex inner product, f : S × S → ℂ, over a complex vector space S we have, instead of property (4), that f(x, z) is the complex conjugate of f(z, x).
Inner products are often written as ⟨x, y⟩, (x, y), ⟨x, y⟩_α, etc.
We say x and y are orthogonal (w.r.t. the α-inner product), x ⊥_α y, if ⟨x, y⟩_α = 0.
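As an illustration (my own example, not from the slides): any symmetric positive definite matrix M defines an inner product ⟨x, y⟩_M = yᵀMx on ℝⁿ, and the axioms can be checked numerically:

```python
import numpy as np

rng = np.random.default_rng(3)
B = rng.standard_normal((4, 4))
M = B.T @ B + 4 * np.eye(4)      # symmetric positive definite => valid inner product

def ip(x, y):
    """M-inner product <x, y>_M = y^T M x on R^4."""
    return y @ M @ x

x, y, z = rng.standard_normal((3, 4))
a = 1.7
assert ip(x, x) > 0                                   # positivity
assert np.isclose(ip(a * x, z), a * ip(x, z))         # homogeneity in 1st argument
assert np.isclose(ip(x + y, z), ip(x, z) + ip(y, z))  # additivity
assert np.isclose(ip(x, z), ip(z, x))                 # symmetry (real case)
```

Such weighted inner products are exactly the kind used later when projecting onto Krylov spaces in a norm other than the Euclidean one.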

24 Inner Products and Norms
Each inner product defines, or induces, a norm: ‖x‖ = ⟨x, x⟩^(1/2). (proof?)
Many norms are induced by inner products, but not all. Those norms that are have additional nice properties (that we'll discuss soon).
An inner product and its induced norm satisfy |⟨x, y⟩| ≤ ‖x‖ ‖y‖ (Cauchy–Schwarz inequality).
A norm induced by an inner product satisfies the parallelogram equality:
‖x + y‖² + ‖x − y‖² = 2(‖x‖² + ‖y‖²).
In this case we can find the inner product from the norm as well:
Real case: ⟨x, y⟩ = (‖x + y‖² − ‖x − y‖²)/4.
Complex case: Re⟨x, y⟩ = (‖x + y‖² − ‖x − y‖²)/4, Im⟨x, y⟩ = (‖x + iy‖² − ‖x − iy‖²)/4.
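A sketch verifying these identities for the standard complex inner product ⟨x, y⟩ = y*x (linear in the first argument, matching the axioms above; random vectors are my own example):

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.standard_normal(5) + 1j * rng.standard_normal(5)
y = rng.standard_normal(5) + 1j * rng.standard_normal(5)

ip = np.vdot(y, x)          # <x, y> = y* x, linear in the first argument
nrm = np.linalg.norm

# Parallelogram equality for the induced norm
assert np.isclose(nrm(x + y)**2 + nrm(x - y)**2, 2 * (nrm(x)**2 + nrm(y)**2))

# Polarization: recover the inner product from the norm
re = (nrm(x + y)**2 - nrm(x - y)**2) / 4
im = (nrm(x + 1j * y)**2 - nrm(x - 1j * y)**2) / 4
assert np.isclose(re, ip.real) and np.isclose(im, ip.imag)

# Cauchy-Schwarz
assert abs(ip) <= nrm(x) * nrm(y) + 1e-12
```

The imaginary-part formula depends on the convention (linear in the first argument vs the second); with the opposite convention the sign of the imaginary part flips.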


26 [Handwritten slide; approximate transcription] The orthogonality condition ⟨b − û, u⟩ = 0 for all u ∈ U provides a recipe for finding û ∈ U that reduces an optimization problem (hard/expensive) to solving a linear system of equations (relatively easy/cheap). We now show this by construction, which will also show that û is unique.


38 Least Squares 1 [handwritten notes; text not extracted]

39 Least Squares 2 [handwritten notes; text not extracted]

42 Least Squares 3 [handwritten notes; text not extracted]


51 Eigenvalues and Eigenvectors
Let Ax = λx and y*A = λy* (for the same λ).
We call the vector x a (right) eigenvector, the vector y a left eigenvector, and λ an eigenvalue of A; the triple together is called an eigentriple (of A), and (λ, x) and (λ, y) a (right) eigenpair and a left eigenpair.
The set of all eigenvalues of A, Λ(A), is called the spectrum of A (when convenient we will count multiplicities in Λ(A)).
If the matrix A is diagonalizable (has a complete set of eigenvectors), we have A = VΛV⁻¹ ⇔ AV = VΛ, where V is a matrix with the right eigenvectors as columns and Λ is a diagonal matrix with the eigenvalues as entries. This is not always possible (more on this soon). A similar decomposition can be given for the left eigenvectors.
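A sketch of the eigendecomposition AV = VΛ and A = VΛV⁻¹ on a random (hence, with probability one, diagonalizable) matrix; the example is my own, not from the slides:

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((4, 4))

lam, V = np.linalg.eig(A)        # columns of V are right eigenvectors
L = np.diag(lam)

assert np.allclose(A @ V, V @ L)                 # AV = V Lambda
assert np.allclose(V @ L @ np.linalg.inv(V), A)  # A = V Lambda V^{-1}

# A and A^T share the same spectrum (left eigenvectors of A are
# right eigenvectors of A^T)
assert np.allclose(np.sort_complex(np.linalg.eigvals(A.T)),
                   np.sort_complex(lam), atol=1e-8)
```

For a real nonsymmetric matrix the eigenvalues and eigenvectors returned are in general complex, which is why the comparisons use complex sorting.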

52 Spectral Radius
The spectral radius ρ(A) is defined as ρ(A) = max{ |λ| : λ ∈ Λ(A) }.
Theorem: For all A and ε > 0 there exists a consistent norm ‖·‖_α such that ‖A‖_α ≤ ρ(A) + ε.
So, if ρ(A) < 1, then a consistent norm ‖·‖_α exists such that ‖A‖_α < 1: take ε = (1 − ρ(A))/2 and apply the theorem above.
Define A* = conj(A)ᵀ (complex conjugate transpose).
If A is Hermitian (A* = A), then ρ(A) = ‖A‖₂.
If A is normal (AA* = A*A), then ρ(A) = ‖A‖₂.
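Conversely, ρ(A) ≤ ‖A‖ for every consistent norm (since Ax = λx gives |λ|‖x‖ ≤ ‖A‖‖x‖). A quick check of this bound, and of ρ(A) = ‖A‖₂ in the symmetric case (random example of my own):

```python
import numpy as np

rng = np.random.default_rng(6)
A = rng.standard_normal((5, 5))
rho = max(abs(np.linalg.eigvals(A)))

# rho(A) <= ||A|| for every consistent norm
for ord_ in (1, 2, np.inf, 'fro'):
    assert rho <= np.linalg.norm(A, ord_) + 1e-12

# For a Hermitian (here real symmetric) matrix, rho(A) = ||A||_2
S = A + A.T
rho_S = max(abs(np.linalg.eigvals(S)))
assert np.isclose(rho_S, np.linalg.norm(S, 2))
```

For nonnormal A the gap between ρ(A) and ‖A‖₂ can be arbitrarily large, which is exactly why the theorem only promises a norm within ε of ρ(A).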

53 Characteristic Polynomial
The eigenvalues of A are determined by the characteristic polynomial of A:
Ax = λx ⇔ (A − λI)x = 0.
So we're looking for (eigen)values λ such that the matrix (A − λI) is singular:
det(A − λI) = 0 (this is a polynomial in λ).
This polynomial is called the characteristic polynomial of A, and the eigenvalues of A are defined to be its roots. Since the eigenvalues of a matrix are the roots of its characteristic polynomial, the Fundamental Theorem of Algebra implies that an n × n matrix A always has n eigenvalues (counting multiplicity). The eigenvalues, however, need be neither distinct nor real. Complex eigenvalues of a real matrix must come in complex conjugate pairs.
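A sketch connecting the two viewpoints numerically (random example of my own; note that in practice eigenvalues are *not* computed via polynomial roots, as root-finding is far more sensitive):

```python
import numpy as np

rng = np.random.default_rng(7)
A = rng.standard_normal((4, 4))

# Monic characteristic polynomial coefficients (numpy builds these
# from the eigenvalues internally)
coeffs = np.poly(A)
roots = np.roots(coeffs)

lam = np.linalg.eigvals(A)
assert np.allclose(np.sort_complex(roots), np.sort_complex(lam), atol=1e-8)

# Complex eigenvalues of this real matrix come in conjugate pairs
assert np.allclose(np.sort_complex(lam), np.sort_complex(lam.conj()), atol=1e-8)
```

The conjugate-pair check works because conjugating the whole spectrum of a real matrix returns the same multiset of eigenvalues.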

54 Multiplicity of Eigenvalues
Eigenvalues may be single or multiple (single or multiple roots of the characteristic polynomial). An eigenvalue with multiplicity k > 1 has k or fewer independent eigenvectors associated with it; it has at least one associated eigenvector. If it has fewer than k independent eigenvectors, we call the eigenvalue (and the matrix) defective.
The multiplicity of an eigenvalue as a (multiple) root of the characteristic polynomial is called its algebraic multiplicity. The number of independent eigenvectors associated with an eigenvalue is called its geometric multiplicity. The geometric multiplicity is smaller than or equal to the algebraic multiplicity.
A matrix that is not defective is called diagonalizable: we have the decomposition A = XΛX⁻¹ ⇔ X⁻¹AX = Λ = diag(λᵢ), where X contains the eigenvectors (as columns) and Λ contains the eigenvalues.
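The standard 2×2 example (my own illustration): a Jordan block has algebraic multiplicity 2 but geometric multiplicity 1, while the identity has both equal to 2:

```python
import numpy as np

# J = [[2, 1], [0, 2]]: eigenvalue 2 with algebraic multiplicity 2
J = np.array([[2.0, 1.0],
              [0.0, 2.0]])
lam = 2.0

# geometric multiplicity = dim null(J - lam I) = n - rank(J - lam I)
geo_mult = 2 - np.linalg.matrix_rank(J - lam * np.eye(2))
assert geo_mult == 1               # only one independent eigenvector: defective

# A non-defective double eigenvalue: the identity (lam = 1)
geo_mult_I = 2 - np.linalg.matrix_rank(np.eye(2) - 1.0 * np.eye(2))
assert geo_mult_I == 2
```

Computing the nullity via the rank is itself numerically delicate for nearly defective matrices; here the example is exact, so the rank is unambiguous.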

55 Jordan Form of a Matrix
For every matrix A ∈ ℂⁿˣⁿ there exists a nonsingular matrix X such that
X⁻¹AX = diag(J₁, …, J_q),
where each Jordan block Jᵢ ∈ ℂ^(mᵢ×mᵢ) has the eigenvalue λᵢ on its diagonal and ones on the first superdiagonal, and m₁ + m₂ + ⋯ + m_q = n.
Each block has one corresponding eigenvector: q independent eigenvectors in total.
Each block has mᵢ − 1 principal vectors (of grade ≥ 2).
If every block is of size 1, the matrix is diagonalizable.
Multiple blocks can have the same eigenvalue: λᵢ = λⱼ is possible.
The sum of the sizes of all blocks with the same eigenvalue is the algebraic multiplicity of that eigenvalue.
The number of blocks with the same eigenvalue is its geometric multiplicity.

56 Invariant Subspaces
A generalization of eigenvectors to higher dimensions is an invariant subspace. We say a subspace 𝒱 ⊂ ℂⁿ is invariant under A ∈ ℂⁿˣⁿ if for all x ∈ 𝒱: Ax ∈ 𝒱.
It is possible that an invariant subspace is the span of a set of eigenvectors, but this need not be the case. Moreover, in general, elements (vectors) of the subspace will not themselves be eigenvectors.
In many cases it is useful to consider the restriction of the matrix A to the invariant subspace 𝒱: A : 𝒱 → 𝒱. If V ∈ ℂⁿˣᵏ with range(V) = 𝒱, then AV = VL with L ∈ ℂᵏˣᵏ. Hence L represents A in the basis defined by V for the space 𝒱.

57 Similarity Transformation and Schur Decomposition
Let A have eigenpairs (λᵢ, vᵢ): Avᵢ = λᵢvᵢ.
For nonsingular B, define the similarity transformation A ↦ BAB⁻¹.
The matrix BAB⁻¹ has the same eigenvalues λᵢ as A, with eigenvectors Bvᵢ; in fact,
(BAB⁻¹)(Bvᵢ) = BAvᵢ = λᵢBvᵢ.
Moreover, BAB⁻¹ has the same Jordan-block structure as A.
In many cases, we are interested in a (complex) unitary (or real, orthogonal) similarity transformation: QAQ* with QQ* = Q*Q = I.
Schur decomposition: QAQ* = U (upper triangular).
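A sketch of the invariance of eigenvalues under similarity, and of how eigenvectors transform (random, well-conditioned B of my own choosing):

```python
import numpy as np

rng = np.random.default_rng(8)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4)) + 4 * np.eye(4)   # shifted to keep B well-conditioned

C = B @ A @ np.linalg.inv(B)       # similarity transformation

lamA = np.sort_complex(np.linalg.eigvals(A))
lamC = np.sort_complex(np.linalg.eigvals(C))
assert np.allclose(lamA, lamC, atol=1e-6)          # same spectrum

# Eigenvectors transform as v -> Bv
lam, V = np.linalg.eig(A)
x = V[:, 0]
assert np.allclose(C @ (B @ x), lam[0] * (B @ x), atol=1e-6)
```

The accuracy of the computed spectrum of C degrades as B becomes ill-conditioned, which is exactly why practical algorithms prefer unitary similarity transformations.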

58 Similarity Transformation
Similarity transformation for A: BAB⁻¹; this can be done with any nonsingular B.
Let Ax = λx; then (BAB⁻¹)(Bx) = BAx = λBx.
BAB⁻¹ has the same eigenvalues as A, and eigenvectors Bx where x is an eigenvector of A.
Although any nonsingular B is possible, the most stable and accurate algorithms use an orthogonal (unitary) matrix, as is done, for example, in the QR algorithm.

59 Similarity Transformation
Orthogonal similarity transformation for A: QAQ*, where Q*Q = I.
If
QAQ* = [ L₁  F ]
       [ 0   L₂ ],
then Λ(A) = Λ(L₁) ∪ Λ(L₂).
If we can find such a decomposition, with [Q₁ Q₂] the partition of the unitary factor conformal with the blocks, we have reduced the problem to two smaller problems. Moreover, AQ₁ = Q₁L₁ and range(Q₁) is an invariant subspace. The eigenpair L₁z = λz gives the eigenpair AQ₁z = Q₁L₁z = λQ₁z.

60 Approximation over Search Space
For large matrices we cannot use full transformations, and often we do not need all eigenvalues/eigenvectors.
Look for a proper basis Q₁ that captures the relevant eigenpairs. We do not need Q₂.
Approximation over the subspace range(Q₁): L₁ = Q₁*AQ₁.
When is an approximation good (enough)? We will rarely find AQ₁ − Q₁L₁ = 0 unless we do a huge amount of work. This is not necessary: we are working with approximations and we must deal with numerical error anyway.

61 Approximation over Search Space
Let AQ₁ − Q₁L₁ = R with ‖R‖ small relative to ‖A‖. Now,
(A − RQ₁*)Q₁ = Q₁L₁ ⇔ AQ₁ − R = Q₁L₁ = 0 residual,
so range(Q₁) is an exact invariant subspace of the perturbed matrix Â = A − RQ₁*, and ‖Â − A‖/‖A‖ = ‖R‖/‖A‖.
If ‖R‖/‖A‖ is sufficiently small, then Q₁ is acceptable. In fact, this is as good as we can expect (unless we're lucky): any numerical operation involves perturbed operands!
Note that we cannot always say that Q₁ is accurate.
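A sketch of this backward-error argument (random symmetric matrix and a random 2-dimensional search space, my own example): the projected matrix L₁ = Q₁*AQ₁ and residual R make range(Q₁) an *exact* invariant subspace of A − RQ₁*:

```python
import numpy as np

rng = np.random.default_rng(9)
A = rng.standard_normal((6, 6))
A = A + A.T                         # symmetric, for convenience

Q1, _ = np.linalg.qr(rng.standard_normal((6, 2)))   # orthonormal 2-dim search space
L1 = Q1.T @ A @ Q1                  # projected (Rayleigh-Ritz) approximation
R = A @ Q1 - Q1 @ L1                # residual

# range(Q1) is an exact invariant subspace of the perturbed matrix A - R Q1^T
A_hat = A - R @ Q1.T
assert np.allclose(A_hat @ Q1, Q1 @ L1)

# Size of the backward perturbation relative to A
rel = np.linalg.norm(R, 2) / np.linalg.norm(A, 2)
print("relative residual / backward error:", rel)
```

Since Q₁ here is random rather than adapted to A, the relative residual is not small; an iterative eigensolver drives exactly this quantity down.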

62 Eigenvalue Problems
Before we compute eigenvalues and eigenvectors numerically, we must understand what we can and cannot compute (accurately), or should not compute.
We may want Ax ≈ λx for a single eigenpair, for example with λ as small as possible (say, a minimum energy level).
In many cases we want Axᵢ ≈ λᵢxᵢ for i = 1, …, M (M large), where the λᵢ are the smallest M eigenvalues. It may be important that we do not skip any eigenvalues.
We may want the invariant subspace accurately.
We may want every eigenvector accurately.

63 Usefulness of Computed Results
In general we need to consider the accuracy of a computed answer without knowing the exact answer. This involves the sensitivity of the result we want to compute.
If some result is very sensitive to small changes in the problem, it may be impossible to compute accurately. In other cases results may be computable but at a very high price; for example, an algorithm may converge very slowly.
Sometimes it is better to compute a related but less sensitive result.

64 Sensitivity of an Eigenvalue
Sensitivity of eigenvalues to perturbations in the matrix: different eigenvalues or eigenvectors of a matrix are not equally sensitive to perturbations of the matrix.
Let Ax = λx and y*A = λy*, where ‖x‖ = ‖y‖ = 1. Consider
(A + E)(x + e) = (λ + δ)(x + e)
and drop second-order terms:
Ax + Ae + Ex ≈ λx + λe + δx.
Multiplying from the left by y* gives
y*Ax + y*Ae + y*Ex ≈ λy*x + λy*e + δy*x;
since y*A = λy*, the terms y*Ae and λy*e cancel, so
δ ≈ y*Ex / (y*x), hence |δ| ⪅ ‖E‖ / |y*x|.
Condition number of a simple eigenvalue: 1/|y*x|.
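A sketch computing these condition numbers (my own example): the rows of V⁻¹ give left eigenvectors, and after normalization the condition number 1/|y*x| is always ≥ 1 by Cauchy–Schwarz:

```python
import numpy as np

rng = np.random.default_rng(10)
A = rng.standard_normal((5, 5))

lam, V = np.linalg.eig(A)                 # right eigenvectors (columns of V)
W = np.linalg.inv(V).conj().T             # columns of W are left eigenvectors

for i in range(5):
    x = V[:, i] / np.linalg.norm(V[:, i])
    y = W[:, i] / np.linalg.norm(W[:, i])
    # y is a left eigenvector: y* A = lam_i y*
    assert np.allclose(y.conj() @ A, lam[i] * y.conj(), atol=1e-8)
    cond = 1 / abs(y.conj() @ x)          # condition number of a simple eigenvalue
    assert cond >= 1 - 1e-12              # |y* x| <= ||x|| ||y|| = 1
```

For a symmetric matrix x = y for each eigenpair, so every condition number equals 1, matching the next slide's claim that Hermitian eigenvalues are inherently well-conditioned.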

65 Sensitivity of an Eigenvalue
For a symmetric/Hermitian matrix, right and left eigenvectors are the same, so eigenvalues are inherently well-conditioned. More generally, eigenvalues are well-conditioned for normal matrices, but eigenvalues of nonnormal matrices need not be well-conditioned.
Nonnormal matrices may not have a full set of eigenvectors: the algebraic multiplicity of λ, its multiplicity as a root of det(A − λI) = 0, is then not equal to the geometric multiplicity, dim null(A − λI). In that case we can consider the conditioning of the invariant subspace associated with a Jordan block.

66 Sensitivity of an Eigenvalue
If μ is an eigenvalue of A + E, then an eigenvalue λ of A exists with
|μ − λ| ≤ ‖X⁻¹EX‖ ≤ ‖X⁻¹‖ ‖X‖ ‖E‖,
where X is the matrix of eigenvectors of A and ‖X⁻¹‖‖X‖ is a condition number (consistent norm).
A useful backward-error result is given by the residual. Let r = Ax − λx with ‖x‖ = 1. Then there exists a perturbation E with ‖E‖ ≤ ‖r‖ such that (A + E)x = λx.
Proof: take E = −rx*.
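A sketch of the residual result (my own example, using a rough Rayleigh-quotient pair): the rank-one perturbation E = −rx* turns the approximate pair into an exact eigenpair of A + E, with ‖E‖₂ = ‖r‖₂:

```python
import numpy as np

rng = np.random.default_rng(11)
A = rng.standard_normal((5, 5))
A = A + A.T

# An *approximate* eigenpair: Rayleigh quotient of a rough unit vector
x = rng.standard_normal(5)
x = x / np.linalg.norm(x)
mu = x @ A @ x                      # Rayleigh quotient
r = A @ x - mu * x                  # residual, with ||x||_2 = 1

E = -np.outer(r, x)                 # E = -r x^T, ||E||_2 = ||r||_2
assert np.allclose((A + E) @ x, mu * x)              # exact eigenpair of A + E
assert np.isclose(np.linalg.norm(E, 2), np.linalg.norm(r))
```

The identity (A − rxᵀ)x = Ax − r(xᵀx) = Ax − r = μx is exactly the one-line proof on the slide.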

67 Sensitivity of Eigenvectors [approximate transcription]
Consider A = diag(λ₁, λ₂) with eigenvectors x₁, x₂, where λ₂ = λ₁ + ε₁ and ε₁, ε₂ are arbitrarily small.
A perturbation E with ‖E‖ ≤ ε₁ + ε₂ is enough to give A + E any eigenvectors not equal to x₁ and x₂:
Let E = E₁ + E₂ and let X̂ = [x̂₁ x̂₂] be unitary (the desired eigenvectors).
Take E₁ = diag(0, −ε₁), so that A + E₁ = λ₁I (all nonzero vectors are eigenvectors), and take E₂ = X̂ diag(0, ε₂) X̂*. Then
A + E = λ₁I + X̂ diag(0, ε₂) X̂*,
which has eigenpairs (λ₁, x̂₁) and (λ₁ + ε₂, x̂₂). Close eigenvalues thus make the eigenvectors extremely sensitive.


70 Model Problems
−(pu_x)_x − (qu_y)_y + ru_x + su_y + tu = f.
Discretize: place a box V around grid point (i,j), with neighboring points (i−1,j), (i+1,j), (i,j−1), (i,j+1) and box corners A, B, C, D. [Figure: finite-volume box V around node (i,j).]
Integrate the equation over box V and use the Gauss divergence theorem to get
∫_V ((pu_x)_x + (qu_y)_y) dx dy = ∮_{∂V} (pu_x, qu_y) · n ds,
and approximate the line integral numerically.

71 Model Problems
Now we approximate the boundary integral ∮_{∂V} (pu_x, qu_y) · n ds.
We approximate the integral over each side of box V using the midpoint rule, and we approximate the derivatives using central differences:
∫_B^C p u_x n₁ dy ≈ (Δy/Δx) p_{i+1/2,j} (U_{i+1,j} − U_{i,j}), and so on for the other sides.
We approximate the integrals of ru_x, su_y, tu, and f using the area of the box, ΔxΔy, times the value at the midpoint of the box, where we use central differences for the derivatives. So u_x ≈ (U_{i+1,j} − U_{i−1,j})/(2Δx), and so on.
For various examples we will also do this when strong convection relative to the mesh size makes central differences a poor choice (as it gives interesting systems).

72 Model Problems
This gives the discrete equations
−(Δy/Δx) [ p_{i+1/2,j}(U_{i+1,j} − U_{i,j}) − p_{i−1/2,j}(U_{i,j} − U_{i−1,j}) ]
−(Δx/Δy) [ q_{i,j+1/2}(U_{i,j+1} − U_{i,j}) − q_{i,j−1/2}(U_{i,j} − U_{i,j−1}) ]
+ (Δy/2) r_{i,j}(U_{i+1,j} − U_{i−1,j}) + (Δx/2) s_{i,j}(U_{i,j+1} − U_{i,j−1})
+ ΔxΔy t_{i,j} U_{i,j} = ΔxΔy f_{i,j}.
Often we divide this result again by ΔxΔy.
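For the constant-coefficient special case p = q = 1, r = s = t = 0 with Δx = Δy = h and zero Dirichlet boundary conditions, dividing by ΔxΔy reduces these equations to the standard 5-point Laplacian. A sketch of the assembly (a dense toy version of my own; real codes would use sparse storage):

```python
import numpy as np

def laplacian_2d(n):
    """5-point finite-volume Laplacian on an n x n interior grid of the unit
    square (p = q = 1, r = s = t = 0, zero Dirichlet boundary conditions),
    after dividing the discrete equations by dx*dy."""
    h = 1.0 / (n + 1)
    N = n * n
    A = np.zeros((N, N))
    for j in range(n):
        for i in range(n):
            k = j * n + i                 # lexicographic ordering of (i, j)
            A[k, k] = 4.0 / h**2
            if i > 0:     A[k, k - 1] = -1.0 / h**2   # west neighbor
            if i < n - 1: A[k, k + 1] = -1.0 / h**2   # east neighbor
            if j > 0:     A[k, k - n] = -1.0 / h**2   # south neighbor
            if j < n - 1: A[k, k + n] = -1.0 / h**2   # north neighbor
    return A

A = laplacian_2d(10)
assert np.allclose(A, A.T)                   # symmetric
assert np.all(np.linalg.eigvalsh(A) > 0)     # positive definite
```

With nonzero convection terms r, s the central differences add skew-symmetric contributions, and the resulting nonsymmetric systems are the "interesting" ones mentioned on the previous slide.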

73 Rate of Convergence
Let x̂ be the solution of Ax = b, and suppose we have iterates x₀, x₁, x₂, …
{x_k} converges (q-)linearly to x̂ if there are N ≥ 0 and c ∈ [0,1) such that for k ≥ N: ‖x_{k+1} − x̂‖ ≤ c ‖x_k − x̂‖.
{x_k} converges (q-)superlinearly to x̂ if there are N ≥ 0 and a sequence {c_k} that converges to 0 such that for k ≥ N: ‖x_{k+1} − x̂‖ ≤ c_k ‖x_k − x̂‖.
{x_k} converges to x̂ with (q-)order at least p if there are p > 1, c ≥ 0, and N ≥ 0 such that for k ≥ N: ‖x_{k+1} − x̂‖ ≤ c ‖x_k − x̂‖^p (quadratic if p = 2, cubic if p = 3, and so on).
{x_k} converges to x̂ with j-step (q-)order at least p if there are a fixed integer j ≥ 1, p > 1, c ≥ 0, and N ≥ 0 such that for k ≥ N: ‖x_{k+j} − x̂‖ ≤ c ‖x_k − x̂‖^p.
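A sketch illustrating q-order 2 on a classical example of my own choosing, Newton's method for √2: the ratio e_{k+1}/e_k² stays bounded (near 1/(2√2)), which is exactly the definition with p = 2:

```python
import numpy as np

# Newton's method for sqrt(2): x_{k+1} = (x_k + 2/x_k)/2, quadratic convergence
xhat = np.sqrt(2.0)
x = 3.0
errs = []
for _ in range(4):
    errs.append(abs(x - xhat))
    x = 0.5 * (x + 2.0 / x)
errs.append(abs(x - xhat))

# e_{k+1} <= c e_k^2 with a modest constant c
ratios = [errs[k + 1] / errs[k] ** 2 for k in range(3)]
assert all(r < 1.0 for r in ratios)
```

By contrast, a q-linearly convergent method would show roughly constant ratios e_{k+1}/e_k instead; plotting both ratio sequences is a quick way to diagnose the order of a method empirically.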

Draft. Lecture 14 Eigenvalue Problems. MATH 562 Numerical Analysis II. Songting Luo. Department of Mathematics Iowa State University Lecture 14 Eigenvalue Problems Songting Luo Department of Mathematics Iowa State University MATH 562 Numerical Analysis II Songting Luo ( Department of Mathematics Iowa State University[0.5in] MATH562

More information

MATH 583A REVIEW SESSION #1

MATH 583A REVIEW SESSION #1 MATH 583A REVIEW SESSION #1 BOJAN DURICKOVIC 1. Vector Spaces Very quick review of the basic linear algebra concepts (see any linear algebra textbook): (finite dimensional) vector space (or linear space),

More information

AMS526: Numerical Analysis I (Numerical Linear Algebra)

AMS526: Numerical Analysis I (Numerical Linear Algebra) AMS526: Numerical Analysis I (Numerical Linear Algebra) Lecture 16: Reduction to Hessenberg and Tridiagonal Forms; Rayleigh Quotient Iteration Xiangmin Jiao Stony Brook University Xiangmin Jiao Numerical

More information

Review problems for MA 54, Fall 2004.

Review problems for MA 54, Fall 2004. Review problems for MA 54, Fall 2004. Below are the review problems for the final. They are mostly homework problems, or very similar. If you are comfortable doing these problems, you should be fine on

More information

EIGENVALUE PROBLEMS. Background on eigenvalues/ eigenvectors / decompositions. Perturbation analysis, condition numbers..

EIGENVALUE PROBLEMS. Background on eigenvalues/ eigenvectors / decompositions. Perturbation analysis, condition numbers.. EIGENVALUE PROBLEMS Background on eigenvalues/ eigenvectors / decompositions Perturbation analysis, condition numbers.. Power method The QR algorithm Practical QR algorithms: use of Hessenberg form and

More information

EXERCISES ON DETERMINANTS, EIGENVALUES AND EIGENVECTORS. 1. Determinants

EXERCISES ON DETERMINANTS, EIGENVALUES AND EIGENVECTORS. 1. Determinants EXERCISES ON DETERMINANTS, EIGENVALUES AND EIGENVECTORS. Determinants Ex... Let A = 0 4 4 2 0 and B = 0 3 0. (a) Compute 0 0 0 0 A. (b) Compute det(2a 2 B), det(4a + B), det(2(a 3 B 2 )). 0 t Ex..2. For

More information

Review of Some Concepts from Linear Algebra: Part 2

Review of Some Concepts from Linear Algebra: Part 2 Review of Some Concepts from Linear Algebra: Part 2 Department of Mathematics Boise State University January 16, 2019 Math 566 Linear Algebra Review: Part 2 January 16, 2019 1 / 22 Vector spaces A set

More information

MIT Final Exam Solutions, Spring 2017

MIT Final Exam Solutions, Spring 2017 MIT 8.6 Final Exam Solutions, Spring 7 Problem : For some real matrix A, the following vectors form a basis for its column space and null space: C(A) = span,, N(A) = span,,. (a) What is the size m n of

More information

Quantum Computing Lecture 2. Review of Linear Algebra

Quantum Computing Lecture 2. Review of Linear Algebra Quantum Computing Lecture 2 Review of Linear Algebra Maris Ozols Linear algebra States of a quantum system form a vector space and their transformations are described by linear operators Vector spaces

More information

Chap 3. Linear Algebra

Chap 3. Linear Algebra Chap 3. Linear Algebra Outlines 1. Introduction 2. Basis, Representation, and Orthonormalization 3. Linear Algebraic Equations 4. Similarity Transformation 5. Diagonal Form and Jordan Form 6. Functions

More information

ALGEBRA QUALIFYING EXAM PROBLEMS LINEAR ALGEBRA

ALGEBRA QUALIFYING EXAM PROBLEMS LINEAR ALGEBRA ALGEBRA QUALIFYING EXAM PROBLEMS LINEAR ALGEBRA Kent State University Department of Mathematical Sciences Compiled and Maintained by Donald L. White Version: August 29, 2017 CONTENTS LINEAR ALGEBRA AND

More information

Computational Methods. Eigenvalues and Singular Values

Computational Methods. Eigenvalues and Singular Values Computational Methods Eigenvalues and Singular Values Manfred Huber 2010 1 Eigenvalues and Singular Values Eigenvalues and singular values describe important aspects of transformations and of data relations

More information

Eigenvalues and Eigenvectors

Eigenvalues and Eigenvectors /88 Chia-Ping Chen Department of Computer Science and Engineering National Sun Yat-sen University Linear Algebra Eigenvalue Problem /88 Eigenvalue Equation By definition, the eigenvalue equation for matrix

More information

Linear algebra II Tutorial solutions #1 A = x 1

Linear algebra II Tutorial solutions #1 A = x 1 Linear algebra II Tutorial solutions #. Find the eigenvalues and the eigenvectors of the matrix [ ] 5 2 A =. 4 3 Since tra = 8 and deta = 5 8 = 7, the characteristic polynomial is f(λ) = λ 2 (tra)λ+deta

More information

33AH, WINTER 2018: STUDY GUIDE FOR FINAL EXAM

33AH, WINTER 2018: STUDY GUIDE FOR FINAL EXAM 33AH, WINTER 2018: STUDY GUIDE FOR FINAL EXAM (UPDATED MARCH 17, 2018) The final exam will be cumulative, with a bit more weight on more recent material. This outline covers the what we ve done since the

More information

Math 108b: Notes on the Spectral Theorem

Math 108b: Notes on the Spectral Theorem Math 108b: Notes on the Spectral Theorem From section 6.3, we know that every linear operator T on a finite dimensional inner product space V has an adjoint. (T is defined as the unique linear operator

More information

BASIC ALGORITHMS IN LINEAR ALGEBRA. Matrices and Applications of Gaussian Elimination. A 2 x. A T m x. A 1 x A T 1. A m x

BASIC ALGORITHMS IN LINEAR ALGEBRA. Matrices and Applications of Gaussian Elimination. A 2 x. A T m x. A 1 x A T 1. A m x BASIC ALGORITHMS IN LINEAR ALGEBRA STEVEN DALE CUTKOSKY Matrices and Applications of Gaussian Elimination Systems of Equations Suppose that A is an n n matrix with coefficents in a field F, and x = (x,,

More information

Math Homework 8 (selected problems)

Math Homework 8 (selected problems) Math 102 - Homework 8 (selected problems) David Lipshutz Problem 1. (Strang, 5.5: #14) In the list below, which classes of matrices contain A and which contain B? 1 1 1 1 A 0 0 1 0 0 0 0 1 and B 1 1 1

More information

Lecture 7: Positive Semidefinite Matrices

Lecture 7: Positive Semidefinite Matrices Lecture 7: Positive Semidefinite Matrices Rajat Mittal IIT Kanpur The main aim of this lecture note is to prepare your background for semidefinite programming. We have already seen some linear algebra.

More information

ON ORTHOGONAL REDUCTION TO HESSENBERG FORM WITH SMALL BANDWIDTH

ON ORTHOGONAL REDUCTION TO HESSENBERG FORM WITH SMALL BANDWIDTH ON ORTHOGONAL REDUCTION TO HESSENBERG FORM WITH SMALL BANDWIDTH V. FABER, J. LIESEN, AND P. TICHÝ Abstract. Numerous algorithms in numerical linear algebra are based on the reduction of a given matrix

More information

Computational Linear Algebra

Computational Linear Algebra Computational Linear Algebra PD Dr. rer. nat. habil. Ralf Peter Mundani Computation in Engineering / BGU Scientific Computing in Computer Science / INF Winter Term 2017/18 Part 3: Iterative Methods PD

More information

Remark By definition, an eigenvector must be a nonzero vector, but eigenvalue could be zero.

Remark By definition, an eigenvector must be a nonzero vector, but eigenvalue could be zero. Sec 6 Eigenvalues and Eigenvectors Definition An eigenvector of an n n matrix A is a nonzero vector x such that A x λ x for some scalar λ A scalar λ is called an eigenvalue of A if there is a nontrivial

More information

Properties of Matrices and Operations on Matrices

Properties of Matrices and Operations on Matrices Properties of Matrices and Operations on Matrices A common data structure for statistical analysis is a rectangular array or matris. Rows represent individual observational units, or just observations,

More information

University of Colorado at Denver Mathematics Department Applied Linear Algebra Preliminary Exam With Solutions 16 January 2009, 10:00 am 2:00 pm

University of Colorado at Denver Mathematics Department Applied Linear Algebra Preliminary Exam With Solutions 16 January 2009, 10:00 am 2:00 pm University of Colorado at Denver Mathematics Department Applied Linear Algebra Preliminary Exam With Solutions 16 January 2009, 10:00 am 2:00 pm Name: The proctor will let you read the following conditions

More information

Remark 1 By definition, an eigenvector must be a nonzero vector, but eigenvalue could be zero.

Remark 1 By definition, an eigenvector must be a nonzero vector, but eigenvalue could be zero. Sec 5 Eigenvectors and Eigenvalues In this chapter, vector means column vector Definition An eigenvector of an n n matrix A is a nonzero vector x such that A x λ x for some scalar λ A scalar λ is called

More information

Math Introduction to Numerical Analysis - Class Notes. Fernando Guevara Vasquez. Version Date: January 17, 2012.

Math Introduction to Numerical Analysis - Class Notes. Fernando Guevara Vasquez. Version Date: January 17, 2012. Math 5620 - Introduction to Numerical Analysis - Class Notes Fernando Guevara Vasquez Version 1990. Date: January 17, 2012. 3 Contents 1. Disclaimer 4 Chapter 1. Iterative methods for solving linear systems

More information

MAT Linear Algebra Collection of sample exams

MAT Linear Algebra Collection of sample exams MAT 342 - Linear Algebra Collection of sample exams A-x. (0 pts Give the precise definition of the row echelon form. 2. ( 0 pts After performing row reductions on the augmented matrix for a certain system

More information

EXAM. Exam 1. Math 5316, Fall December 2, 2012

EXAM. Exam 1. Math 5316, Fall December 2, 2012 EXAM Exam Math 536, Fall 22 December 2, 22 Write all of your answers on separate sheets of paper. You can keep the exam questions. This is a takehome exam, to be worked individually. You can use your notes.

More information

Lecture 1: Review of linear algebra

Lecture 1: Review of linear algebra Lecture 1: Review of linear algebra Linear functions and linearization Inverse matrix, least-squares and least-norm solutions Subspaces, basis, and dimension Change of basis and similarity transformations

More information

The University of Texas at Austin Department of Electrical and Computer Engineering. EE381V: Large Scale Learning Spring 2013.

The University of Texas at Austin Department of Electrical and Computer Engineering. EE381V: Large Scale Learning Spring 2013. The University of Texas at Austin Department of Electrical and Computer Engineering EE381V: Large Scale Learning Spring 2013 Assignment Two Caramanis/Sanghavi Due: Tuesday, Feb. 19, 2013. Computational

More information

Math Linear Algebra Final Exam Review Sheet

Math Linear Algebra Final Exam Review Sheet Math 15-1 Linear Algebra Final Exam Review Sheet Vector Operations Vector addition is a component-wise operation. Two vectors v and w may be added together as long as they contain the same number n of

More information

1. What is the determinant of the following matrix? a 1 a 2 4a 3 2a 2 b 1 b 2 4b 3 2b c 1. = 4, then det

1. What is the determinant of the following matrix? a 1 a 2 4a 3 2a 2 b 1 b 2 4b 3 2b c 1. = 4, then det What is the determinant of the following matrix? 3 4 3 4 3 4 4 3 A 0 B 8 C 55 D 0 E 60 If det a a a 3 b b b 3 c c c 3 = 4, then det a a 4a 3 a b b 4b 3 b c c c 3 c = A 8 B 6 C 4 D E 3 Let A be an n n matrix

More information

MATH 240 Spring, Chapter 1: Linear Equations and Matrices

MATH 240 Spring, Chapter 1: Linear Equations and Matrices MATH 240 Spring, 2006 Chapter Summaries for Kolman / Hill, Elementary Linear Algebra, 8th Ed. Sections 1.1 1.6, 2.1 2.2, 3.2 3.8, 4.3 4.5, 5.1 5.3, 5.5, 6.1 6.5, 7.1 7.2, 7.4 DEFINITIONS Chapter 1: Linear

More information

Homework 2 Foundations of Computational Math 2 Spring 2019

Homework 2 Foundations of Computational Math 2 Spring 2019 Homework 2 Foundations of Computational Math 2 Spring 2019 Problem 2.1 (2.1.a) Suppose (v 1,λ 1 )and(v 2,λ 2 ) are eigenpairs for a matrix A C n n. Show that if λ 1 λ 2 then v 1 and v 2 are linearly independent.

More information

1 Last time: least-squares problems

1 Last time: least-squares problems MATH Linear algebra (Fall 07) Lecture Last time: least-squares problems Definition. If A is an m n matrix and b R m, then a least-squares solution to the linear system Ax = b is a vector x R n such that

More information

9.1 Eigenvectors and Eigenvalues of a Linear Map

9.1 Eigenvectors and Eigenvalues of a Linear Map Chapter 9 Eigenvectors and Eigenvalues 9.1 Eigenvectors and Eigenvalues of a Linear Map Given a finite-dimensional vector space E, letf : E! E be any linear map. If, by luck, there is a basis (e 1,...,e

More information

Chapter Two Elements of Linear Algebra

Chapter Two Elements of Linear Algebra Chapter Two Elements of Linear Algebra Previously, in chapter one, we have considered single first order differential equations involving a single unknown function. In the next chapter we will begin to

More information

Introduction to Numerical Linear Algebra II

Introduction to Numerical Linear Algebra II Introduction to Numerical Linear Algebra II Petros Drineas These slides were prepared by Ilse Ipsen for the 2015 Gene Golub SIAM Summer School on RandNLA 1 / 49 Overview We will cover this material in

More information

Schur s Triangularization Theorem. Math 422

Schur s Triangularization Theorem. Math 422 Schur s Triangularization Theorem Math 4 The characteristic polynomial p (t) of a square complex matrix A splits as a product of linear factors of the form (t λ) m Of course, finding these factors is a

More information

MATH 1553-C MIDTERM EXAMINATION 3

MATH 1553-C MIDTERM EXAMINATION 3 MATH 553-C MIDTERM EXAMINATION 3 Name GT Email @gatech.edu Please read all instructions carefully before beginning. Please leave your GT ID card on your desk until your TA scans your exam. Each problem

More information

LINEAR SYSTEMS (11) Intensive Computation

LINEAR SYSTEMS (11) Intensive Computation LINEAR SYSTEMS () Intensive Computation 27-8 prof. Annalisa Massini Viviana Arrigoni EXACT METHODS:. GAUSSIAN ELIMINATION. 2. CHOLESKY DECOMPOSITION. ITERATIVE METHODS:. JACOBI. 2. GAUSS-SEIDEL 2 CHOLESKY

More information

LINEAR ALGEBRA 1, 2012-I PARTIAL EXAM 3 SOLUTIONS TO PRACTICE PROBLEMS

LINEAR ALGEBRA 1, 2012-I PARTIAL EXAM 3 SOLUTIONS TO PRACTICE PROBLEMS LINEAR ALGEBRA, -I PARTIAL EXAM SOLUTIONS TO PRACTICE PROBLEMS Problem (a) For each of the two matrices below, (i) determine whether it is diagonalizable, (ii) determine whether it is orthogonally diagonalizable,

More information

Linear Algebra Massoud Malek

Linear Algebra Massoud Malek CSUEB Linear Algebra Massoud Malek Inner Product and Normed Space In all that follows, the n n identity matrix is denoted by I n, the n n zero matrix by Z n, and the zero vector by θ n An inner product

More information

Numerical Linear Algebra

Numerical Linear Algebra University of Alabama at Birmingham Department of Mathematics Numerical Linear Algebra Lecture Notes for MA 660 (1997 2014) Dr Nikolai Chernov April 2014 Chapter 0 Review of Linear Algebra 0.1 Matrices

More information

MAT 610: Numerical Linear Algebra. James V. Lambers

MAT 610: Numerical Linear Algebra. James V. Lambers MAT 610: Numerical Linear Algebra James V Lambers January 16, 2017 2 Contents 1 Matrix Multiplication Problems 7 11 Introduction 7 111 Systems of Linear Equations 7 112 The Eigenvalue Problem 8 12 Basic

More information

MATH 221, Spring Homework 10 Solutions

MATH 221, Spring Homework 10 Solutions MATH 22, Spring 28 - Homework Solutions Due Tuesday, May Section 52 Page 279, Problem 2: 4 λ A λi = and the characteristic polynomial is det(a λi) = ( 4 λ)( λ) ( )(6) = λ 6 λ 2 +λ+2 The solutions to the

More information

DIAGONALIZATION. In order to see the implications of this definition, let us consider the following example Example 1. Consider the matrix

DIAGONALIZATION. In order to see the implications of this definition, let us consider the following example Example 1. Consider the matrix DIAGONALIZATION Definition We say that a matrix A of size n n is diagonalizable if there is a basis of R n consisting of eigenvectors of A ie if there are n linearly independent vectors v v n such that

More information