Honors Linear Algebra, Spring Homework 8 solutions by Yifei Chen
W = {v ∈ R⁴ : (v|α) = (v|β) = 0}. Writing v = (x₁, x₂, x₃, x₄), the two orthogonality conditions become a homogeneous linear system in x₁, x₂, x₃, x₄. Solving the linear system, we get a basis {v₁, v₂} of W.

The formula for the Gram-Schmidt orthogonalization process can be found in the proof of the corresponding theorem in the textbook: given a basis {β₁, β₂, β₃}, set

γ_k = β_k − Σ_{j<k} (β_k|α_j) α_j,   α_k = γ_k / ||γ_k||.

Therefore {α₁, α₂, α₃} is an orthonormal basis of R³.

Apply the Gram-Schmidt orthogonalization process to β₁, β₂ ∈ C³; notice that the inner product on C³ is (x₁, x₂, x₃ | y₁, y₂, y₃) = x₁ȳ₁ + x₂ȳ₂ + x₃ȳ₃. We get an orthonormal basis {α₁, α₂} of the subspace spanned by β₁ and β₂.

Let β = α/||α||, where ||α|| = 7. (a) The formula for the projection p : V → W is given by p(v) = (v|β)β. Writing v = (x₁, x₂) and expanding (v|β) in the given inner product (u|v) (see the example on page 7) yields the explicit formula for p(v) = E(x₁, x₂). (b) Let B = {e₁, e₂} be the standard basis. Then p(e₁) and p(e₂) can be computed from this formula, and their coordinates are the columns of the matrix of E with respect to the standard basis.
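The Gram-Schmidt computations in these problems can be mechanized. Below is a minimal sketch in Python; the sample vectors are made up for illustration, not the ones from the homework.

```python
# Gram-Schmidt orthonormalization over C^n (or R^n), with plain Python lists.
# The sample vectors below are illustrative, not the ones from the problem.

def inner(x, y):
    """(x|y) = sum_i x_i * conj(y_i), linear in the first slot."""
    return sum(a * complex(b).conjugate() for a, b in zip(x, y))

def gram_schmidt(vectors):
    """Return an orthonormal list alpha_1, ..., alpha_m with the same span."""
    ortho = []
    for beta in vectors:
        gamma = list(beta)
        for alpha in ortho:
            c = inner(beta, alpha)              # coefficient (beta_k | alpha_j)
            gamma = [g - c * a for g, a in zip(gamma, alpha)]
        norm = abs(inner(gamma, gamma)) ** 0.5  # ||gamma_k||
        ortho.append([g / norm for g in gamma])
    return ortho

basis = gram_schmidt([[1.0, 1.0, 0.0], [1.0, 0.0, 1.0], [0.0, 1.0, 1.0]])
print(all(abs(inner(basis[i], basis[j]) - (1 if i == j else 0)) < 1e-12
          for i in range(3) for j in range(3)))
```

The same `gram_schmidt` works verbatim over C^n, since `inner` conjugates its second argument.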
(c) Try to find a vector v such that v ⊥ α. Writing v = (x₁, x₂), the condition (v|α) = 0 is a single linear equation; take any nonzero solution v. Then W⊥ = Span{v}.

(d) Let γ = v/||v||. Then {β, γ} is an orthonormal basis of R². From the geometric meaning of the orthogonal projection, we get p(β) = β and p(γ) = 0. So the matrix of E with respect to the orthonormal basis {β, γ} is diag(1, 0).

The matrix of T with respect to the standard basis is the given matrix with entries involving i and 1 + i, and T* is its conjugate transpose. Computing both products TT* and T*T directly shows TT* ≠ T*T, so T does not commute with T*.

The vector space V has a basis B = {1, t, t², t³}. Applying the Gram-Schmidt process to the basis B, we can find an orthonormal basis B′ = {1, α, β, γ} (calculate yourself). For a given real number t, let g_t = ω_t·1 + α_t α + β_t β + γ_t γ, where ω_t, α_t, β_t, γ_t ∈ R. Then, for f = a_f·1 + b_f α + c_f β + d_f γ ∈ V with a_f, b_f, c_f, d_f ∈ R, the requirement f(t) = (f|g_t) reads

f(t) = a_f ω_t + b_f α_t + c_f β_t + d_f γ_t.

For a fixed t ∈ R, let f = 1, t, t², t³ respectively. Each choice gives the corresponding coefficients a_f, b_f, c_f, d_f and the value f(t), so we get 4 linear equations in the variables ω_t, α_t, β_t, γ_t. The linear system has a solution because f is chosen from a basis {1, t, t², t³}. Then the resulting g_t satisfies (f|g_t) = f(t) for all f ∈ V, by the linearity of the inner product.

Use the above orthonormal basis B′ and find the matrix A of the differentiation operator D with respect to B′. Then take the conjugate transpose A* of A; this A* is the matrix of D* with respect to the basis B′. (Notice that this follows from the corollary in the textbook; the corollary works for an orthonormal basis.)

(a) For any α₁, α₂ ∈ V, suppose α₁ = β₁ + γ₁ and α₂ = β₂ + γ₂, where β_i ∈ W and γ_j ∈ W⊥. Then

(Uα₁|α₂) = (β₁ − γ₁ | β₂ + γ₂) = (β₁|β₂) − (γ₁|γ₂) = (β₁ + γ₁ | β₂ − γ₂) = (α₁|Uα₂),
(Uα₁|Uα₂) = (β₁ − γ₁ | β₂ − γ₂) = (β₁|β₂) + (γ₁|γ₂) = (β₁ + γ₁ | β₂ + γ₂) = (α₁|α₂).

So U is self-adjoint and unitary.

(b) W is a subspace spanned by a single vector; let w be its normalization. For any α ∈ V, let β = (α|w)w and γ = α − β; then β ∈ W, γ ∈ W⊥, and α = β + γ. So Uα = β − γ = 2(α|w)w − α, and in particular Ue_j = 2(e_j|w)w − e_j for each standard basis vector e_j.
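As a numerical sanity check of part (a): for a unit vector w, the matrix of Uα = 2(α|w)w − α in the standard basis is 2ww^T − I, and it should be both self-adjoint and unitary. A sketch with a made-up unit vector w (not the one from the problem):

```python
# U alpha = 2(alpha|w)w - alpha for a unit vector w in R^3; its matrix is
# 2 w w^T - I. Check it is symmetric (self-adjoint) and orthogonal (unitary).
# w below is a made-up unit vector, not the one in the problem.
n = 3
w = [1/3**0.5, 1/3**0.5, 1/3**0.5]

U = [[2*w[i]*w[j] - (1.0 if i == j else 0.0) for j in range(n)] for i in range(n)]

def mul(X, Y):
    return [[sum(X[i][k]*Y[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

Ut = [[U[j][i] for j in range(n)] for i in range(n)]

# self-adjoint: U^T = U (real case)
print(all(abs(U[i][j] - Ut[i][j]) < 1e-12 for i in range(n) for j in range(n)))
# unitary: U^T U = I
UtU = mul(Ut, U)
print(all(abs(UtU[i][j] - (1.0 if i == j else 0.0)) < 1e-12
          for i in range(n) for j in range(n)))
```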
So the matrix of U with respect to the standard basis can be read off from the images Ue_j computed above.

The eigenvalues of the matrix

A = ( cos θ   −sin θ )
    ( sin θ    cos θ )

are e^{±iθ}. Moreover AA* = A*A, i.e., A is normal. By the Spectral Theorem, A is unitarily equivalent to diag(e^{iθ}, e^{−iθ}); we can easily check that e^{iθ} · e^{−iθ} = 1.
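The claims about the rotation matrix can be spot-checked numerically; a small Python sketch for one sample angle:

```python
# The rotation matrix A = [[cos t, -sin t], [sin t, cos t]] is normal
# (A A^T = A^T A) and its eigenvalues are e^{+it} and e^{-it}.
import math, cmath

t = 0.7  # arbitrary sample angle
A = [[math.cos(t), -math.sin(t)], [math.sin(t), math.cos(t)]]

def mul(X, Y):
    return [[sum(X[i][k]*Y[k][j] for k in range(2)) for j in range(2)] for i in range(2)]

At = [[A[j][i] for j in range(2)] for i in range(2)]
AAt, AtA = mul(A, At), mul(At, A)
print(all(abs(AAt[i][j] - AtA[i][j]) < 1e-12 for i in range(2) for j in range(2)))

# eigenvalues from the characteristic polynomial x^2 - (tr A) x + det A = 0
tr = A[0][0] + A[1][1]
det = A[0][0]*A[1][1] - A[0][1]*A[1][0]
disc = cmath.sqrt(tr*tr - 4*det)
eigs = sorted([(tr + disc)/2, (tr - disc)/2], key=lambda z: z.imag)
print(abs(eigs[1] - cmath.exp(1j*t)) < 1e-12)   # e^{+it}
print(abs(eigs[0] - cmath.exp(-1j*t)) < 1e-12)  # e^{-it}
```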
Let λ be an eigenvalue of T, and α an eigenvector for λ (notice that α ≠ 0; we are going to use this condition). Since T is unitary, we get |λ| = 1, so λ = e^{iθ}. Indeed, ||α|| = ||Tα|| = ||λα|| = |λ| ||α||, and ||α|| ≠ 0, so |λ| = 1.

On the other hand, λ is a positive real number since T is positive. From the definition of a positive operator, we get T* = T and (Tv|v) > 0 for all v ≠ 0. Since T is self-adjoint, the eigenvalue λ is real. Actually,

λ(α|α) = (λα|α) = (Tα|α) = (α|Tα) = (α|λα) = λ̄(α|α),

and (α|α) ≠ 0 since α ≠ 0; hence λ = λ̄, i.e., λ ∈ R. Moreover (Tα|α) = λ(α|α) > 0 and (α|α) > 0, so λ > 0.

All eigenvalues λ of T satisfy |λ| = 1 and λ > 0, which implies λ = 1. Applying the Spectral Theorem, T is unitarily equivalent to I, i.e., U*TU = I for some unitary matrix U. So T = UIU* = I.

Next, (Tα|β) = ((T₁ + iT₂)α|β) = (T₁α|β) + i(T₂α|β) = (α|T₁β) + i(α|T₂β) = (α|(T₁ − iT₂)β). By the uniqueness of the adjoint T*, we get T* = T₁ − iT₂.

If T₁ and T₂ are self-adjoint operators which commute, then T_i* = T_i, i = 1, 2, and T₁T₂ = T₂T₁. So we get

TT* = (T₁ + iT₂)(T₁ − iT₂) = T₁² + T₂² − iT₁T₂ + iT₂T₁ = T₁² + T₂²,
T*T = (T₁ − iT₂)(T₁ + iT₂) = T₁² + T₂² + iT₁T₂ − iT₂T₁ = T₁² + T₂².

So TT* = T*T, i.e., T is normal. On the other hand, if T is normal, let T₁ = (T + T*)/2 and T₂ = (T − T*)/(2i); see the textbook. We can easily check that T = T₁ + iT₂ and that T₁, T₂ are self-adjoint. T₁ and T₂ commute, since TT* = T*T gives

T₁T₂ = (T + T*)(T − T*)/(4i) = (T² − TT* + T*T − T*²)/(4i) = (T² − T*²)/(4i),
T₂T₁ = (T − T*)(T + T*)/(4i) = (T² + TT* − T*T − T*²)/(4i) = (T² − T*²)/(4i).

Suppose the operator T is normal and nilpotent. From the Spectral Theorem, T is unitarily equivalent to diag(λ₁, …, λ_n), where the λ_i are the eigenvalues of T. On the other hand, T is nilpotent, so all eigenvalues λ_i = 0. So T is unitarily equivalent to the zero matrix. Hence T is the zero operator.

9. It is easy to calculate that the matrix A of T_θ is

A = ( cos θ   −sin θ )
    ( sin θ    cos θ ),

so the eigenvalues of A are e^{±iθ}, as we have seen in the previous exercise. By the proof of the preceding exercise, if T_θ is positive, then the eigenvalues e^{±iθ} of T_θ must be positive numbers. Therefore θ = 2kπ, k = 0, ±1, ±2, ….

(a) It is easy to verify that the principal minors of A are positive. So by Theorem 6 in the textbook, A is positive.
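The positivity criterion in part (a) can also be checked against the definition directly; a sketch with a made-up symmetric matrix (not the matrix A from the problem):

```python
# For a made-up 2x2 real symmetric matrix: both leading principal minors are
# positive, and (Ax|x) > 0 for random nonzero x, as the criterion predicts.
import random

A = [[2.0, 1.0], [1.0, 3.0]]  # sample symmetric matrix, not the one in the problem

minor1 = A[0][0]
minor2 = A[0][0]*A[1][1] - A[0][1]*A[1][0]
print(minor1 > 0 and minor2 > 0)

random.seed(0)
all_positive = True
for _ in range(1000):
    x = [random.uniform(-1, 1), random.uniform(-1, 1)]
    if abs(x[0]) + abs(x[1]) < 1e-9:
        continue  # skip (near-)zero vectors
    Ax = [A[0][0]*x[0] + A[0][1]*x[1], A[1][0]*x[0] + A[1][1]*x[1]]
    if Ax[0]*x[0] + Ax[1]*x[1] <= 0:
        all_positive = False
print(all_positive)
```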
(b) Applying the Gram-Schmidt process to the basis {X₁, X₂}, we get

α₁ = X₁/||X₁||,   α₂ = (X₂ − (X₂|α₁)α₁) / ||X₂ − (X₂|α₁)α₁||.

(c) Let P = (α₁, α₂); then A = P*P. If you wonder why P should be taken this way, please read the proof of Theorem 5 in the textbook.

(a) Since A* = A, we get A*A = A·A = AA*, i.e., A is normal. By the previous part, there exists a unitary matrix U such that

U*AU = diag(λ₁, …, λ_n),

where λ₁, …, λ_n are the eigenvalues of A. Then

U*BU = U* ( Σ_{k≥0} A^k/k! ) U = Σ_{k≥0} (U*AU)^k/k! = diag(e^{λ₁}, …, e^{λ_n}).

So det B = det(U*BU) = e^{λ₁ + ⋯ + λ_n} = e^{tr A}.

(b) B* = ( Σ_k A^k/k! )* = Σ_k (A*)^k/k! = e^{A*}.

(c) If you know the properties of the matrix exponential, then BB* = e^A e^{A*} = e^A e^{−A} = I. Alternatively, by the proof of (a), we have A = UΛU* with Λ = diag(λ₁, …, λ_n). From A* = −A we get UΛ̄U* = −UΛU*, so λ̄_i = −λ_i, i.e., each λ_i = iθ_i is purely imaginary. So

BB* = U e^Λ U* · U e^{Λ̄} U* = U diag(e^{iθ₁}e^{−iθ₁}, …, e^{iθ_n}e^{−iθ_n}) U* = UIU* = I.

So B is unitary.
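The identities det(e^A) = e^{tr A} and (e^A)* = e^{A*} (hence e^A unitary for skew-Hermitian A) can be spot-checked with a truncated power series; the matrix below is a made-up example:

```python
# Matrix exponential by truncated power series (plenty of terms for a small
# matrix), used to check det(e^A) = e^{tr A} and that e^A is unitary when
# A* = -A. The matrix A below is a made-up skew-Hermitian example.
import cmath

n = 2

def mul(X, Y):
    return [[sum(X[i][k]*Y[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

def expm(A, terms=40):
    S = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]  # running sum, starts at I
    P = [row[:] for row in S]                                           # running term A^k / k!
    for k in range(1, terms):
        P = [[sum(P[i][m]*A[m][j] for m in range(n))/k for j in range(n)] for i in range(n)]
        S = [[S[i][j] + P[i][j] for j in range(n)] for i in range(n)]
    return S

A = [[0.3j, 0.5], [-0.5, -0.2j]]   # skew-Hermitian: A* = -A
B = expm(A)

# det(e^A) = e^{tr A}
detB = B[0][0]*B[1][1] - B[0][1]*B[1][0]
print(abs(detB - cmath.exp(A[0][0] + A[1][1])) < 1e-9)

# e^A is unitary: B* B = I
Bd = [[B[j][i].conjugate() for j in range(n)] for i in range(n)]
BdB = mul(Bd, B)
print(all(abs(BdB[i][j] - (1.0 if i == j else 0.0)) < 1e-9
          for i in range(n) for j in range(n)))
```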
(a) Since U is unitary, U is normal. By the Spectral Theorem, there exists a unitary matrix P such that P*UP = Λ, where Λ is a diagonal matrix whose main diagonal consists of the eigenvalues of U. From the condition that Uα = α implies α = 0, we get that 1 is not an eigenvalue of U. Therefore I − Λ is invertible, and so is I − U = P(I − Λ)P*. Hence (I − U)^{−1} is well defined, and we can plug U into the formula for f(z) and get f(U) = i(I + U)(I − U)^{−1}.

(b) We want to show f(U)* = f(U). From (a), we have f(U)(I − U) = i(I + U). Taking adjoints,

[f(U)(I − U)]* = [i(I + U)]*,  i.e.,  (I − U*)f(U)* = −i(I + U*).

We can use the same argument as in (a) to prove that I − U* is invertible; I omit the proof here. So showing f(U)* = f(U) is equivalent to showing

−i(I − U*)^{−1}(I + U*) = i(I + U)(I − U)^{−1},  i.e.,  −i(I + U*)(I − U) = i(I − U*)(I + U).

The verification of the last identity is easy (expand both sides and use U*U = I; both sides equal i(U − U*)) and is left to you.

(c) If T is self-adjoint, then T is normal. By the Spectral Theorem, ±i are not eigenvalues of T (the same method as in the proof of (a), since the eigenvalues of a self-adjoint operator are real). Therefore T ± iI are invertible; this is why we need T ± iI to be invertible. Let U = (T − iI)(T + iI)^{−1}. Since T is self-adjoint,

U* = [(T − iI)(T + iI)^{−1}]* = (T − iI)^{−1}(T + iI),

and therefore

U*U = (T − iI)^{−1}(T + iI)(T − iI)(T + iI)^{−1} = (T − iI)^{−1}(T − iI)(T + iI)(T + iI)^{−1} = I,

using (T + iI)(T − iI) = T² + I = (T − iI)(T + iI). Similarly UU* = I, so U is unitary.

See the definition of a self-adjoint algebra in the textbook. Let F be the family of operators obtained by letting B vary over all diagonal matrices, i.e., F = {L_B : B is diagonal}. Since L_{B₁}L_{B₂} = L_{B₁B₂} and L_{B₁}L_{B₂} = L_{B₁B₂} = L_{B₂B₁} = L_{B₂}L_{B₁} (diagonal matrices commute), we get that F is a commutative subalgebra of L(V, V). On the other hand, it is not difficult to verify that (L_B)* = L_{B*} (hint: verify that (L_B A | C) = (A | L_{B*} C), i.e., Tr(BAC*) = Tr(A(B*C)*); the last equality comes from the properties of the trace). Since B is diagonal, B* is diagonal as well; in other words, (L_B)* is in F. By the definition of a self-adjoint algebra, we get that F is self-adjoint.

To find the spectral resolution of F, we need to thoroughly understand the proof of Theorem 5 in the textbook. Let {e_ij} be the standard basis of V; then {T_i = L_{e_ii}}, i = 1, …, n, is a basis of F (verify it).
Let E_i : V → W_i be the orthogonal projection of V on W_i, where W_i = Span{e_ij : j = 1, …, n}. Let E_i′ : V → W_i⊥ be the orthogonal projection of V on W_i⊥, where W_i⊥ = Span{e_kj : k ≠ i, j = 1, …, n}. Then we have

I = E₁ + E₁′ = E₂ + E₂′ = ⋯ = E_n + E_n′   (resolutions of the identity)

and L_{e_ii} = 1·E_i + 0·E_i′ (the spectral resolution of L_{e_ii}; notice that the eigenvalues of L_{e_ii} are 1 and 0). Expanding the product of the resolutions of the identity,

α = Σ_{j₁, …, j_n} E_{1,j₁} E_{2,j₂} ⋯ E_{n,j_n} α   for all α ∈ V,

where each factor E_{i,j_i} is either E_i or E_i′. If E_{1,j₁} ⋯ E_{n,j_n} ≠ 0 for some sequence j₁, …, j_n, then exactly one factor is of the form E_k and the others are E_l′, l ≠ k (verify it when n = 2). Let α = e₁₁ and β = E₁E₂′ ⋯ E_n′ α = α. Then T_i β = L_{e_ii} β = c_i β, where c_i = 1 if i = 1 and c_i = 0 otherwise. For any T = L_B ∈ F, we have L_B = Σ_i b_i L_{e_ii} = Σ_i b_i T_i, where b_i is the i-th entry on the main diagonal of B. So Tβ = Σ_i b_i T_i β = b₁ β.
Let r₁ : F → C be the function defined by r₁(T) = b₁, where T = b₁T₁ + ⋯ + b_nT_n, i.e., take the first component; then V(r₁) = W₁. Similarly, we get r_i : F → C defined by r_i(T) = b_i, with V(r_i) = W_i. Let P_j be the orthogonal projection of V on V(r_j), 1 ≤ j ≤ n. Then T = Σ_j r_j(T) P_j is the spectral resolution of T in terms of F. The computation for the other two families is similar (work it out yourself).

See the theorem and corollary in Functional Analysis, Kosaku Yosida. One correction to the problem: in the textbook, and actually in most textbooks I have seen, including the Yosida reference, the polarization identity is

4(x|y) = ||x + y||² − ||x − y||² + i||x + iy||² − i||x − iy||².

But the polarization identity given by Prof. Ha is supposed to be

4(x|y) = ||x + y||² − ||x − y||² − i||x + iy||² + i||x − iy||².

They are both correct! The difference comes from the fact that Prof. Ha and some textbooks use different definitions of the inner product:

(cx|y) = c(x|y) in the textbook,   (cx|y) = c̄(x|y) for Prof. Ha.

This is just a different convention. You will see this in problem 6(b) as well.

If T is an operator on a complex finite-dimensional inner product space V, we want to show that the following statements are equivalent:

(i) T = 0;  (ii) (Tx|y) = 0 for all x, y ∈ V;  (iii) (Tx|x) = 0 for all x ∈ V.

Proof. (i) ⟹ (ii): since T = 0, we have (Tx|y) = (0|y) = 0 for all x, y ∈ V. (ii) ⟹ (iii): trivial! (let y = x). (iii) ⟹ (i): since (Tx|x) = 0 is in particular real for every x, and we are on a complex space, T is self-adjoint. By the Spectral Theorem, T is unitarily equivalent to the diagonal matrix diag(λ₁, …, λ_n). If we can show that all eigenvalues of T are 0, then T is unitarily equivalent to the zero matrix; hence T = 0. Let λ be an eigenvalue of T and ξ an eigenvector for λ. Then 0 = (Tξ|ξ) = (λξ|ξ) = λ(ξ|ξ). Since ξ ≠ 0, we have (ξ|ξ) > 0; therefore λ = 0.

For a real vector space, (i) ⟺ (ii) ⟹ (iii) remains true, but (iii) ⟹ (i) is not true! For instance, let T be the rotation by π/2; then (Tx|x) = 0 for all x ∈ V, but T ≠ 0.

4. Notice: the definition of positive operator given by Prof. Ha is the same as the definition of a non-negative operator.
The definition of strictly positive operator given by Prof. Ha is the same as the definition of positive operator. See the definition in the text and in hw 8.

(a) (T*T)* = T*(T*)* = T*T, so T*T is self-adjoint. Moreover

(T*Tx|x) = (Tx|Tx) = ||Tx||² ≥ 0;

the first equality comes from the definition of the adjoint operator, and the last inequality comes from the properties of the inner product.

(b) If T is self-adjoint, then all eigenvalues of T are real. Indeed, suppose λ is an eigenvalue of T and ξ is an eigenvector for λ. Then

λ(ξ|ξ) = (λξ|ξ) = (Tξ|ξ) = (ξ|Tξ) = (ξ|λξ) = λ̄(ξ|ξ).

Since ξ ≠ 0, we have (ξ|ξ) > 0; hence λ = λ̄. If moreover (Tx|x) ≥ 0 for all x ∈ V, we get that all eigenvalues of T are nonnegative. Indeed, (Tξ|ξ) = λ(ξ|ξ) ≥ 0 implies λ ≥ 0.
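Parts (a) and (b) can be illustrated numerically; a sketch with a made-up complex 2×2 matrix T (all data here is invented for illustration):

```python
# For a made-up complex 2x2 matrix T, check that S = T*T is self-adjoint and
# that (Sx|x) = ||Tx||^2 >= 0 for a few sample vectors x.
def dag(A):  # conjugate transpose
    return [[A[j][i].conjugate() for j in range(2)] for i in range(2)]

def mul(A, B):
    return [[sum(A[i][k]*B[k][j] for k in range(2)) for j in range(2)] for i in range(2)]

def apply(A, x):
    return [A[0][0]*x[0] + A[0][1]*x[1], A[1][0]*x[0] + A[1][1]*x[1]]

def ip(x, y):  # (x|y), linear in the first slot
    return x[0]*y[0].conjugate() + x[1]*y[1].conjugate()

T = [[1+2j, 0.5], [-1j, 3.0]]   # arbitrary sample matrix
S = mul(dag(T), T)

Sd = dag(S)
print(all(abs(S[i][j] - Sd[i][j]) < 1e-12 for i in range(2) for j in range(2)))  # S* = S

for x in ([1.0, 0.0], [0.3+0.4j, -1.0], [1j, 2.0-1j]):
    lhs = ip(apply(S, x), x)               # (T*T x | x)
    rhs = ip(apply(T, x), apply(T, x))     # (Tx | Tx) = ||Tx||^2
    assert abs(lhs - rhs) < 1e-12 and lhs.real >= 0
```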
(c) If (Tx|x) ≥ 0, then (Tx|x) ∈ R (there is no order on the complex numbers, i.e., we couldn't say that a non-real complex number is greater than 0). Therefore

(Tx|x) = (Tx|x)‾ = (x|Tx);

the second equality comes from the property of the inner product. Since this holds for all x, T is self-adjoint.

(d) We only need to prove that T is positive if and only if T = R² for some self-adjoint operator R; the proof of this implies the middle statement, that T = S*S for some operator S. In part (a), we have proved that if T = R² = R*R for a self-adjoint operator R, then T is positive. On the other hand, if T is positive, then T is self-adjoint, so T is normal. By the Spectral Theorem, there exists a unitary matrix U such that U*TU = Λ, where Λ = diag(λ₁, …, λ_n) is a diagonal matrix. Since T is positive, by part (b) we have λ_i ≥ 0, i = 1, …, n. Then there exist real numbers a_i such that a_i² = λ_i. Let A = diag(a₁, …, a_n), and let R = UAU*. Then T = R². Since A is a real diagonal matrix, A is self-adjoint; so is R.

If V is a real inner product space, the condition (Tx|x) ≥ 0 no longer guarantees self-adjointness. For instance, let V = R² with the standard inner product and T = R_θ : V → V, where R_θ is the rotation by an acute angle θ counter-clockwise. Then (Tx|x) = cos θ · ||x||² ≥ 0 for all x ∈ V. However, T is not self-adjoint, since the matrix of T with respect to the standard basis,

( cos θ   −sin θ )
( sin θ    cos θ ),

is not self-adjoint (symmetric, in the real case).

5. Let A be the given matrix. One eigenvalue of A is 0, with eigenspace E₀ = ker A = Span{v₁}; the other eigenvalue λ has eigenspace E_λ = ker(A − λI) = Span{v₂, v₃}. Apply the Gram-Schmidt method to v₂, v₃ to get an orthonormal basis {w₂, w₃} of E_λ; then normalize v₁ to get w₁, an orthonormal basis of E₀. Let Q = (w₁, w₂, w₃); then Q*AQ is diagonal. I omit the concrete computation here. (Actually, I have seen this problem in hw 6.) The same method can be applied to the other two matrices.

6. (a) See page 85 for the definition of orthogonal projection, and see Theorem 4(iii) for the formula to find the orthogonal projection of a vector β ∈ V.
First, apply the Gram-Schmidt method to v₁, v₂, v₃ to get an orthonormal basis {α₁, α₂, α₃} of Span{v₁, v₂, v₃}. Let α be the projection of β; then, by the formula in Theorem 4(iii), we have

α = Σ_k ((β|α_k)/(α_k|α_k)) α_k = Σ_k (β|α_k) α_k,

since the α_k are orthonormal. You'd better practice this procedure.

(b) To verify that (f|g) = (1/2π)∫₀^{2π} f(θ)ḡ(θ) dθ actually defines an inner product on V, we need to check that (f|g) satisfies the definition of an inner product in the textbook.

(i) (f + g|h) = (1/2π)∫₀^{2π} (f + g)(θ)h̄(θ) dθ = (1/2π)∫₀^{2π} f(θ)h̄(θ) dθ + (1/2π)∫₀^{2π} g(θ)h̄(θ) dθ = (f|h) + (g|h).
(ii) (cf|g) = (1/2π)∫₀^{2π} cf(θ)ḡ(θ) dθ = c(f|g). (I mention the difference between the two conventions for this axiom in the earlier problem as well.)

(iii) (f|g)‾ = ( (1/2π)∫₀^{2π} f(θ)ḡ(θ) dθ )‾ = (1/2π)∫₀^{2π} f̄(θ)g(θ) dθ = (g|f).

(iv) If f ≠ 0, then f(x₀) ≠ 0 for some x₀ ∈ [0, 2π]. Since f is continuous, there exists an open neighborhood U ⊂ [0, 2π] of x₀ on which |f| > |f(x₀)|/2 > 0. So

(f|f) = (1/2π)∫₀^{2π} |f(θ)|² dθ ≥ (1/2π)∫_U |f(θ)|² dθ > 0.

(Notice that we need the continuity of f here. Otherwise, let f(0) = 1 and f(x) = 0 for x ∈ (0, 2π]; then (f|f) = (1/2π)∫₀^{2π} |f(θ)|² dθ = 0.)

(c) Let f_k(x) = sin(kx) and g_k(x) = cos(kx), k = 0, ±1, ±2, …. Then, for k ≠ ±l,

(f_k|g_l) = (1/2π)∫₀^{2π} sin(kx)cos(lx) dx = (1/4π)∫₀^{2π} [sin(k+l)x + sin(k−l)x] dx = (1/4π)[ −cos(k+l)x/(k+l) − cos(k−l)x/(k−l) ]₀^{2π} = 0

(the remaining cases k = ±l are handled similarly). Similarly, we can show that (f_k|f_l) = 0 and (g_k|g_l) = 0 for k ≠ ±l by the formulas

sin(kx)sin(lx) = ½[cos(k−l)x − cos(k+l)x],   cos(kx)cos(lx) = ½[cos(k+l)x + cos(k−l)x].

Since the f_k, g_l are mutually orthogonal, we can normalize them to get an orthonormal basis of W. By the formula of part (a), we can then compute the required projections. The explicit computation is cumbersome, since an orthonormal basis of W consists of 9 vectors. You can compute it yourself.

(d), (e) Let {1, t, t², t³, t⁴} be a basis of P. Use the Gram-Schmidt method to get an orthonormal basis B of P. Then, by the formula in part (a), we can find the orthogonal projection of f(θ) = sin θ. To find the adjoint of T = d/dθ, we need to find the matrix A of T with respect to the orthonormal basis B; then the conjugate transpose A* of A is the matrix of the adjoint of T. The method is routine, but I didn't carry out the calculation.

7. I am not clear on the definition of A log A; please ask Prof. Ha about problem 7.

(a) Let ξ = (x₁, …, x_n), A = (a_ij), and λ = max_{i,j} |a_ij|. By the Cauchy-Schwarz inequality applied to each row,

||Aξ||² = Σ_i |Σ_j a_ij x_j|² ≤ Σ_i (Σ_j |a_ij|²)(Σ_j |x_j|²) ≤ Σ_i nλ² ||ξ||² = n²λ² ||ξ||².

Let C = nλ. Then ||Aξ|| ≤ C||ξ||, so ||A|| ≤ C is finite.
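The bound ||Aξ|| ≤ nλ||ξ|| from part (a) can be spot-checked on random data; a small sketch:

```python
# Spot-check of ||A x|| <= n * lambda * ||x||, where lambda = max_{i,j} |a_ij|,
# on random real matrices A and random vectors x.
import random, math

def norm(v):
    return math.sqrt(sum(t*t for t in v))

random.seed(0)
ok = True
for _ in range(200):
    n = random.randint(1, 5)
    A = [[random.uniform(-3, 3) for _ in range(n)] for _ in range(n)]
    x = [random.uniform(-3, 3) for _ in range(n)]
    lam = max(abs(A[i][j]) for i in range(n) for j in range(n))
    Ax = [sum(A[i][j]*x[j] for j in range(n)) for i in range(n)]
    if norm(Ax) > n * lam * norm(x) + 1e-9:
        ok = False
print(ok)
```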
(b) Following the hints, let

a = sup{ |(Aξ|η)| : ||ξ|| = ||η|| = 1 },
b = sup{ ||Aξ|| : ||ξ|| = 1 },
c = sup{ ||Aξ|| / ||ξ|| : ξ ≠ 0 },
d = sup{ |(Aξ|η)| / (||ξ|| ||η||) : ξ, η ≠ 0 }.

By the Cauchy-Schwarz inequality, |(Aξ|η)| ≤ ||Aξ|| ||η|| = ||Aξ|| when ||ξ|| = ||η|| = 1, so a ≤ b. Since

{ ||Aξ|| : ||ξ|| = 1 } = { ||Aξ|| / ||ξ|| : ||ξ|| = 1 } ⊆ { ||Aξ|| / ||ξ|| : ξ ≠ 0 },

we get b ≤ c. Since |(Aξ|η)| / (||ξ|| ||η||) = |(A(ξ/||ξ||) | η/||η||)| ≤ a (notice that ξ/||ξ|| and η/||η|| are unit vectors), we get d ≤ a. Therefore d ≤ a ≤ b ≤ c. We still need to show that c ≤ d, which closes the chain and makes all four quantities equal. To be continued…

The results of this problem can be found in standard functional analysis textbooks. I am exhausted; let me have a rest!
More informationLinear Algebra Review. Vectors
Linear Algebra Review 9/4/7 Linear Algebra Review By Tim K. Marks UCSD Borrows heavily from: Jana Kosecka http://cs.gmu.edu/~kosecka/cs682.html Virginia de Sa (UCSD) Cogsci 8F Linear Algebra review Vectors
More informationPRACTICE FINAL EXAM. why. If they are dependent, exhibit a linear dependence relation among them.
Prof A Suciu MTH U37 LINEAR ALGEBRA Spring 2005 PRACTICE FINAL EXAM Are the following vectors independent or dependent? If they are independent, say why If they are dependent, exhibit a linear dependence
More informationLecture notes on Quantum Computing. Chapter 1 Mathematical Background
Lecture notes on Quantum Computing Chapter 1 Mathematical Background Vector states of a quantum system with n physical states are represented by unique vectors in C n, the set of n 1 column vectors 1 For
More informationLinear Algebra using Dirac Notation: Pt. 2
Linear Algebra using Dirac Notation: Pt. 2 PHYS 476Q - Southern Illinois University February 6, 2018 PHYS 476Q - Southern Illinois University Linear Algebra using Dirac Notation: Pt. 2 February 6, 2018
More information18.06 Problem Set 8 - Solutions Due Wednesday, 14 November 2007 at 4 pm in
806 Problem Set 8 - Solutions Due Wednesday, 4 November 2007 at 4 pm in 2-06 08 03 Problem : 205+5+5+5 Consider the matrix A 02 07 a Check that A is a positive Markov matrix, and find its steady state
More informationReflections and Rotations in R 3
Reflections and Rotations in R 3 P. J. Ryan May 29, 21 Rotations as Compositions of Reflections Recall that the reflection in the hyperplane H through the origin in R n is given by f(x) = x 2 ξ, x ξ (1)
More informationEXAM. Exam 1. Math 5316, Fall December 2, 2012
EXAM Exam Math 536, Fall 22 December 2, 22 Write all of your answers on separate sheets of paper. You can keep the exam questions. This is a takehome exam, to be worked individually. You can use your notes.
More informationLecture notes: Applied linear algebra Part 1. Version 2
Lecture notes: Applied linear algebra Part 1. Version 2 Michael Karow Berlin University of Technology karow@math.tu-berlin.de October 2, 2008 1 Notation, basic notions and facts 1.1 Subspaces, range and
More informationReal symmetric matrices/1. 1 Eigenvalues and eigenvectors
Real symmetric matrices 1 Eigenvalues and eigenvectors We use the convention that vectors are row vectors and matrices act on the right. Let A be a square matrix with entries in a field F; suppose that
More informationOrthogonal Projection and Least Squares Prof. Philip Pennance 1 -Version: December 12, 2016
Orthogonal Projection and Least Squares Prof. Philip Pennance 1 -Version: December 12, 2016 1. Let V be a vector space. A linear transformation P : V V is called a projection if it is idempotent. That
More information(v, w) = arccos( < v, w >
MA322 F all203 Notes on Inner Products Notes on Chapter 6 Inner product. Given a real vector space V, an inner product is defined to be a bilinear map F : V V R such that the following holds: For all v,
More informationLinear algebra and applications to graphs Part 1
Linear algebra and applications to graphs Part 1 Written up by Mikhail Belkin and Moon Duchin Instructor: Laszlo Babai June 17, 2001 1 Basic Linear Algebra Exercise 1.1 Let V and W be linear subspaces
More informationQuantum Computing Lecture 2. Review of Linear Algebra
Quantum Computing Lecture 2 Review of Linear Algebra Maris Ozols Linear algebra States of a quantum system form a vector space and their transformations are described by linear operators Vector spaces
More informationorthogonal relations between vectors and subspaces Then we study some applications in vector spaces and linear systems, including Orthonormal Basis,
5 Orthogonality Goals: We use scalar products to find the length of a vector, the angle between 2 vectors, projections, orthogonal relations between vectors and subspaces Then we study some applications
More informationVector spaces and operators
Vector spaces and operators Sourendu Gupta TIFR, Mumbai, India Quantum Mechanics 1 2013 22 August, 2013 1 Outline 2 Setting up 3 Exploring 4 Keywords and References Quantum states are vectors We saw that
More informationHOMEWORK PROBLEMS FROM STRANG S LINEAR ALGEBRA AND ITS APPLICATIONS (4TH EDITION)
HOMEWORK PROBLEMS FROM STRANG S LINEAR ALGEBRA AND ITS APPLICATIONS (4TH EDITION) PROFESSOR STEVEN MILLER: BROWN UNIVERSITY: SPRING 2007 1. CHAPTER 1: MATRICES AND GAUSSIAN ELIMINATION Page 9, # 3: Describe
More informationTBP MATH33A Review Sheet. November 24, 2018
TBP MATH33A Review Sheet November 24, 2018 General Transformation Matrices: Function Scaling by k Orthogonal projection onto line L Implementation If we want to scale I 2 by k, we use the following: [
More informationOptimization Theory. A Concise Introduction. Jiongmin Yong
October 11, 017 16:5 ws-book9x6 Book Title Optimization Theory 017-08-Lecture Notes page 1 1 Optimization Theory A Concise Introduction Jiongmin Yong Optimization Theory 017-08-Lecture Notes page Optimization
More informationConceptual Questions for Review
Conceptual Questions for Review Chapter 1 1.1 Which vectors are linear combinations of v = (3, 1) and w = (4, 3)? 1.2 Compare the dot product of v = (3, 1) and w = (4, 3) to the product of their lengths.
More informationThroughout these notes we assume V, W are finite dimensional inner product spaces over C.
Math 342 - Linear Algebra II Notes Throughout these notes we assume V, W are finite dimensional inner product spaces over C 1 Upper Triangular Representation Proposition: Let T L(V ) There exists an orthonormal
More informationMATH 235. Final ANSWERS May 5, 2015
MATH 235 Final ANSWERS May 5, 25. ( points) Fix positive integers m, n and consider the vector space V of all m n matrices with entries in the real numbers R. (a) Find the dimension of V and prove your
More informationMath 416, Spring 2010 Gram-Schmidt, the QR-factorization, Orthogonal Matrices March 4, 2010 GRAM-SCHMIDT, THE QR-FACTORIZATION, ORTHOGONAL MATRICES
Math 46, Spring 00 Gram-Schmidt, the QR-factorization, Orthogonal Matrices March 4, 00 GRAM-SCHMIDT, THE QR-FACTORIZATION, ORTHOGONAL MATRICES Recap Yesterday we talked about several new, important concepts
More informationMATH 1120 (LINEAR ALGEBRA 1), FINAL EXAM FALL 2011 SOLUTIONS TO PRACTICE VERSION
MATH (LINEAR ALGEBRA ) FINAL EXAM FALL SOLUTIONS TO PRACTICE VERSION Problem (a) For each matrix below (i) find a basis for its column space (ii) find a basis for its row space (iii) determine whether
More informationMath 443 Differential Geometry Spring Handout 3: Bilinear and Quadratic Forms This handout should be read just before Chapter 4 of the textbook.
Math 443 Differential Geometry Spring 2013 Handout 3: Bilinear and Quadratic Forms This handout should be read just before Chapter 4 of the textbook. Endomorphisms of a Vector Space This handout discusses
More information1 Math 241A-B Homework Problem List for F2015 and W2016
1 Math 241A-B Homework Problem List for F2015 W2016 1.1 Homework 1. Due Wednesday, October 7, 2015 Notation 1.1 Let U be any set, g be a positive function on U, Y be a normed space. For any f : U Y let
More informationJordan normal form notes (version date: 11/21/07)
Jordan normal form notes (version date: /2/7) If A has an eigenbasis {u,, u n }, ie a basis made up of eigenvectors, so that Au j = λ j u j, then A is diagonal with respect to that basis To see this, let
More informationSingular Value Decomposition (SVD) and Polar Form
Chapter 2 Singular Value Decomposition (SVD) and Polar Form 2.1 Polar Form In this chapter, we assume that we are dealing with a real Euclidean space E. Let f: E E be any linear map. In general, it may
More informationMath 113 Final Exam: Solutions
Math 113 Final Exam: Solutions Thursday, June 11, 2013, 3.30-6.30pm. 1. (25 points total) Let P 2 (R) denote the real vector space of polynomials of degree 2. Consider the following inner product on P
More information(v, w) = arccos( < v, w >
MA322 F all206 Notes on Inner Products Notes on Chapter 6 Inner product. Given a real vector space V, an inner product is defined to be a bilinear map F : V V R such that the following holds: Commutativity:
More information(VII.E) The Singular Value Decomposition (SVD)
(VII.E) The Singular Value Decomposition (SVD) In this section we describe a generalization of the Spectral Theorem to non-normal operators, and even to transformations between different vector spaces.
More informationLecture 2: Linear Algebra Review
EE 227A: Convex Optimization and Applications January 19 Lecture 2: Linear Algebra Review Lecturer: Mert Pilanci Reading assignment: Appendix C of BV. Sections 2-6 of the web textbook 1 2.1 Vectors 2.1.1
More informationMATRICES ARE SIMILAR TO TRIANGULAR MATRICES
MATRICES ARE SIMILAR TO TRIANGULAR MATRICES 1 Complex matrices Recall that the complex numbers are given by a + ib where a and b are real and i is the imaginary unity, ie, i 2 = 1 In what we describe below,
More informationREPRESENTATION THEORY WEEK 7
REPRESENTATION THEORY WEEK 7 1. Characters of L k and S n A character of an irreducible representation of L k is a polynomial function constant on every conjugacy class. Since the set of diagonalizable
More information