Quantum Computing Lecture 2. Review of Linear Algebra


Maris Ozols

Linear algebra

States of a quantum system form a vector space, and their transformations are described by linear operators. Vector spaces and linear operators are studied in linear algebra. This lecture reviews the definitions from linear algebra that we will need in the rest of the course.

Unless stated otherwise, all vector spaces are over ℂ. Recall: any z ∈ ℂ is of the form z = a + ib for some a, b ∈ ℝ.

Matrices

An n × m matrix is an array of numbers a_ij arranged as follows:

    A = [ a_11  a_12  …  a_1m
          a_21  a_22  …  a_2m
           ⋮     ⋮         ⋮
          a_n1  a_n2  …  a_nm ]

For any A, we write A_ij to denote the entry in the i-th row and the j-th column of A. In the matrix above we have A_ij = a_ij. We denote the set of all n × m complex matrices by M_n,m(ℂ).

Basic matrix operations

Matrices can be added (if they are of the same shape) and multiplied by a constant (scalar). Both operations are performed entry-wise. If A, B ∈ M_n,m(ℂ) and c ∈ ℂ, then

    (A + B)_ij = A_ij + B_ij        (cA)_ij = c A_ij

Example:

    [ a_11 a_12 a_13 ]   [ b_11 b_12 b_13 ]   [ a_11+b_11  a_12+b_12  a_13+b_13 ]
    [ a_21 a_22 a_23 ] + [ b_21 b_22 b_23 ] = [ a_21+b_21  a_22+b_22  a_23+b_23 ]

    c [ a_11 a_12 a_13 ] = [ c·a_11  c·a_12  c·a_13 ]
      [ a_21 a_22 a_23 ]   [ c·a_21  c·a_22  c·a_23 ]
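The entry-wise rules above translate directly into code. The following is a minimal sketch, not part of the lecture, using nested lists as matrices; the helper names `mat_add` and `scalar_mul` are my own.

```python
def mat_add(A, B):
    """Entry-wise sum: (A + B)_ij = A_ij + B_ij. A and B must have the same shape."""
    return [[A[i][j] + B[i][j] for j in range(len(A[0]))] for i in range(len(A))]

def scalar_mul(c, A):
    """Entry-wise scaling: (cA)_ij = c * A_ij."""
    return [[c * A[i][j] for j in range(len(A[0]))] for i in range(len(A))]

A = [[1, 2, 3], [4, 5, 6]]
B = [[10, 20, 30], [40, 50, 60]]
print(mat_add(A, B))        # [[11, 22, 33], [44, 55, 66]]
print(scalar_mul(2, A))     # [[2, 4, 6], [8, 10, 12]]
```

Complex scalars work the same way, since Python's `complex` supports `+` and `*`.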

Conjugate transpose

For A ∈ M_n,m(ℂ) we define:

complex conjugate Ā, with (Ā)_ij = (A_ij)*:

    A = [ a_11 a_12 a_13        Ā = [ a_11* a_12* a_13*
          a_21 a_22 a_23 ]            a_21* a_22* a_23* ]

transpose A^T, with (A^T)_ij = A_ji:

    A^T = [ a_11 a_21
            a_12 a_22
            a_13 a_23 ]

conjugate transpose (adjoint) A† = (Ā)^T, with (A†)_ij = (A_ji)*:

    A† = [ a_11* a_21*
           a_12* a_22*
           a_13* a_23* ]

Warning: mathematicians often use A* to denote A†.

Vectors and inner product

A vector is a matrix that has only one row or one column:

    row vector: ( a_1  a_2  …  a_n )        column vector: ( b_1, b_2, …, b_n )^T

Just like matrices, vectors of the same shape can be added and multiplied by scalars. They form a vector space, denoted ℂⁿ.

A row vector and a column vector can be multiplied together if they have the same number of entries. The result is a number:

    ( a_1  a_2  …  a_n ) ( b_1, b_2, …, b_n )^T = a_1 b_1 + a_2 b_2 + … + a_n b_n = ∑_{i=1}^n a_i b_i

This is a special case of the matrix product.
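The three operations (conjugate, transpose, adjoint) compose exactly as defined above. A minimal illustrative sketch, with helper names of my own choosing; note that Python's `int` and `float` also have a `.conjugate()` method, so mixed real/complex entries are fine.

```python
def conj(A):
    """Complex conjugate: conjugate each entry of A."""
    return [[A[i][j].conjugate() for j in range(len(A[0]))] for i in range(len(A))]

def transpose(A):
    """Transpose: (A^T)_ij = A_ji, so an n x m matrix becomes m x n."""
    return [[A[i][j] for i in range(len(A))] for j in range(len(A[0]))]

def adjoint(A):
    """Conjugate transpose (adjoint): A† = (conj(A))^T."""
    return transpose(conj(A))

A = [[1 + 2j, 3], [0, 1j]]
print(adjoint(A))
```

For a real matrix, `adjoint` coincides with `transpose`, as expected.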

Matrix multiplication

If A is an n × m matrix and B is an m × l matrix, then C = AB is the n × l matrix with entries

    C_ik = ∑_{j=1}^m A_ij B_jk

for all i = 1, …, n and k = 1, …, l. Note that C_ik is just the product of the i-th row of A and the k-th column of B.

Note: it is possible to multiply A and B only if the number of columns of A agrees with the number of rows of B:

    [n × m] · [m × l] = [n × l]

Properties of matrix multiplication

Associativity: (AB)C = A(BC) = ABC
Distributivity: A(B + C) = AB + AC and (A + B)C = AC + BC
Scalar multiplication: λ(AB) = (λA)B = A(λB) = (AB)λ
Complex conjugate: conj(AB) = conj(A) conj(B)
Transpose: (AB)^T = B^T A^T (reversed order!)
Conjugate transpose: (AB)† = B† A† (reversed order!)

Warning: matrix multiplication is not commutative: in general AB ≠ BA. In fact, BA might not even make sense when AB is well-defined (e.g., when A is 3 × 2 and B is 2 × 1).
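The formula C_ik = ∑_j A_ij B_jk can be coded verbatim. A minimal sketch (the name `mat_mul` is mine), which also demonstrates non-commutativity on a small example:

```python
def mat_mul(A, B):
    """C = AB with C_ik = sum_j A_ij * B_jk; shapes [n x m]·[m x l] = [n x l]."""
    n, m, l = len(A), len(B), len(B[0])
    assert len(A[0]) == m, "columns of A must match rows of B"
    return [[sum(A[i][j] * B[j][k] for j in range(m)) for k in range(l)]
            for i in range(n)]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]
print(mat_mul(A, B))   # [[2, 1], [4, 3]]
print(mat_mul(B, A))   # [[3, 4], [1, 2]]  -- AB != BA
```

Swapping the factors here swaps the columns of A versus its rows, so the two products differ.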

Dirac bra-ket notation

Column vector (ket):

    |ψ⟩ = ( a_1, a_2, …, a_n )^T

Row vector (bra):

    ⟨ψ| = ( a_1*  a_2*  …  a_n* )

Note: ket and bra are conjugate transposes of each other: ⟨ψ| = (|ψ⟩)† and |ψ⟩ = (⟨ψ|)†.

For |ψ⟩, |ϕ⟩ ∈ ℂⁿ and A ∈ M_n,n(ℂ) the following are valid products:

    inner product:         ⟨ψ| · |ϕ⟩ = ⟨ψ|ϕ⟩ ∈ ℂ
    outer product:         |ψ⟩ · ⟨ϕ| = |ψ⟩⟨ϕ| ∈ M_n,n(ℂ)
    matrix-vector product: A|ψ⟩ and ⟨ψ|A

Quiz

Let |ψ⟩, |ϕ⟩ ∈ ℂⁿ and A ∈ M_n,n(ℂ). Which of these make sense?

    (a) |ψ⟩ + |ϕ⟩            (h) |ψ⟩⟨ϕ| A
    (b) |ψ⟩ |ϕ⟩              (i) ⟨ψ| A |ϕ⟩
    (c) A |ψ⟩                (j) |ψ⟩ A |ϕ⟩
    (d) |ψ⟩ A                (k) ⟨ψ| A |ϕ⟩ + ⟨ψ|ϕ⟩
    (e) ⟨ψ| A                (l) |ψ⟩ ⟨ϕ|ψ⟩
    (f) ⟨ψ| A + ⟨ϕ|          (m) ⟨ψ| ⟨ϕ| A
    (g) ⟨ψ|ϕ⟩                (n) |ψ⟩⟨ψ| ϕ⟩ = ⟨ψ|ϕ⟩ |ψ⟩
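One way to see which Dirac products type-check is to look at the shapes they produce. A minimal sketch, assuming kets are flat lists and using my own helper names `bra`, `inner`, and `outer`:

```python
def bra(ket):
    """<psi| : conjugate the entries of |psi> (the row/column shape is implicit)."""
    return [a.conjugate() for a in ket]

def inner(u, v):
    """<u|v> = sum_i conj(u_i) * v_i -- a single complex number."""
    return sum(a.conjugate() * b for a, b in zip(u, v))

def outer(u, v):
    """|u><v| -- an n x n matrix with entries u_i * conj(v_j)."""
    return [[a * b.conjugate() for b in v] for a in u]

psi = [1, 1j]
phi = [1, 0]
print(inner(psi, phi))   # a scalar
print(outer(psi, phi))   # a 2 x 2 matrix
```

The inner product collapses two vectors to a scalar, while the outer product expands them to a matrix; that shape difference is exactly what the quiz probes.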

Inner product

The complex number ⟨u|v⟩ is called the inner product of |u⟩, |v⟩ ∈ ℂⁿ. If |u⟩ = ( a_1, …, a_n )^T and |v⟩ = ( b_1, …, b_n )^T, then their inner product is

    ⟨u|v⟩ = ( a_1*  …  a_n* ) ( b_1, …, b_n )^T = ∑_{i=1}^n a_i* b_i

Simple observations:

    ⟨u|v⟩ = ( ⟨v|u⟩ )*
    ⟨u|u⟩ = ∑_{i=1}^n |a_i|² is a real number
    ⟨u|u⟩ ≥ 0, and ⟨u|u⟩ = 0 iff |u⟩ = 0 (the all-zeroes vector)

Norm and orthogonality

The norm of a vector |u⟩ is the non-negative real number

    ‖u‖ = √⟨u|u⟩ = √( ∑_{i=1}^n |a_i|² )

A unit vector is a vector with norm 1, i.e., ∑_{i=1}^n |a_i|² = 1. Recall: qubit states α|0⟩ + β|1⟩ are unit vectors, since |α|² + |β|² = 1.

Cauchy–Schwarz inequality: |⟨u|v⟩| ≤ ‖u‖ ‖v‖ for any |u⟩, |v⟩ ∈ ℂⁿ.

If |u⟩ and |v⟩ are unit vectors then |⟨u|v⟩| ∈ [0, 1], so |⟨u|v⟩| = cos θ for some angle θ ∈ [0, π/2]. Vectors |u⟩ and |v⟩ are orthogonal if ⟨u|v⟩ = 0 (i.e., θ = π/2).
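The norm, orthogonality, and the Cauchy–Schwarz bound can be checked numerically. A small illustrative sketch (helper names `inner` and `norm` are mine):

```python
import math

def inner(u, v):
    """<u|v> = sum_i conj(u_i) * v_i."""
    return sum(a.conjugate() * b for a, b in zip(u, v))

def norm(u):
    """||u|| = sqrt(<u|u>); <u|u> is real and non-negative."""
    return math.sqrt(inner(u, u).real)

u = [3, 4]
v = [1, 1j]
print(norm(u))                                 # 5.0
print(inner([1, 0], [0, 1]))                   # 0 -- orthogonal vectors
print(abs(inner(u, v)) <= norm(u) * norm(v))   # True -- Cauchy-Schwarz
```

Taking `.real` is safe here because ⟨u|u⟩ = ∑|a_i|² is real by construction.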

Basis

A basis of ℂⁿ is a minimal collection of vectors |v_1⟩, …, |v_n⟩ such that every vector |v⟩ ∈ ℂⁿ can be expressed as a linear combination of these:

    |v⟩ = α_1 |v_1⟩ + … + α_n |v_n⟩

for some coefficients α_i ∈ ℂ. In particular, |v_1⟩, …, |v_n⟩ are linearly independent, meaning that no |v_i⟩ can be expressed as a linear combination of the rest.

The number n (the size of the basis) is uniquely determined by the vector space and is called its dimension. The dimension of ℂⁿ is n.

Given a basis, every vector |v⟩ can be uniquely represented as an n-tuple of scalars (α_1, …, α_n), called the coordinates of |v⟩ in this basis.

An orthonormal basis is a basis made up of unit vectors that are pairwise orthogonal:

    ⟨v_i|v_j⟩ = δ_ij = { 1 if i = j
                         0 if i ≠ j

Standard basis for ℂⁿ

The following are all bases for ℂ² (the last two are orthonormal):

    { (3, 2)^T, (4, i)^T },   { (1, 0)^T, (0, 1)^T },   { (1/√2)(1, 1)^T, (1/√2)(1, −1)^T }

The standard or computational basis for ℂⁿ is

    |1⟩ = (1, 0, …, 0)^T,   |2⟩ = (0, 1, 0, …, 0)^T,   …,   |n⟩ = (0, …, 0, 1)^T

(Sometimes we will use |0⟩, |1⟩, …, |n−1⟩ instead, e.g., when n = 2.)

Any vector can be expanded in the standard basis:

    ( a_1, …, a_n )^T = ∑_{i=1}^n a_i |i⟩
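The expansion of a vector in the standard basis is easy to verify in code. A minimal sketch using 1-indexed basis labels; `basis_ket` and `combine` are hypothetical helper names of my own:

```python
def basis_ket(i, n):
    """Standard basis vector |i> of C^n (1-indexed): 1 in position i, 0 elsewhere."""
    return [1 if j == i - 1 else 0 for j in range(n)]

def combine(coeffs, kets):
    """Linear combination sum_i alpha_i |v_i>."""
    n = len(kets[0])
    return [sum(a * k[j] for a, k in zip(coeffs, kets)) for j in range(n)]

v = [2, 5, 7]
kets = [basis_ket(i, 3) for i in (1, 2, 3)]
print(combine(v, kets))   # [2, 5, 7] -- v = 2|1> + 5|2> + 7|3>
```

The coefficients in the standard basis are just the entries of the vector itself, which is what makes this basis "standard".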

Outer product and projectors

With every pair of vectors |u⟩ ∈ ℂⁿ, |v⟩ ∈ ℂ^m we can associate a matrix |u⟩⟨v| ∈ M_n,m(ℂ), known as the outer product of |u⟩ and |v⟩. For any |w⟩ ∈ ℂ^m it satisfies the property

    ( |u⟩⟨v| ) |w⟩ = ⟨v|w⟩ |u⟩

If |u⟩ is a unit vector then |u⟩⟨u| is the projector onto the one-dimensional subspace generated by |u⟩. This can be easily seen if |u⟩ and |v⟩ are real and we set cos θ = ⟨u|v⟩. Then

    |u⟩⟨u| · |v⟩ = ⟨u|v⟩ |u⟩ = cos θ |u⟩

(Picture: |v⟩ projected at angle θ onto the line spanned by |u⟩.)

Expanding a matrix in the standard basis

For example,

    |1⟩⟨2| = ( 1, 0 )^T ( 0  1 ) = [ 0 1
                                     0 0 ]

Similarly, |i⟩⟨j| has a 1 at coordinates (i, j) and zeroes everywhere else. Any n × m matrix can be expressed as a linear combination of outer products:

    [ a_11 … a_1m
       ⋮        ⋮
      a_n1 … a_nm ] = ∑_{i=1}^n ∑_{j=1}^m a_ij |i⟩⟨j|

We can recover the (i, j) entry of A as follows: ⟨i| A |j⟩ = a_ij.
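The identity (|u⟩⟨v|)|w⟩ = ⟨v|w⟩|u⟩ can be verified directly. A small sketch (helper names `inner`, `outer`, and `apply` are mine):

```python
def inner(u, v):
    return sum(a.conjugate() * b for a, b in zip(u, v))

def outer(u, v):
    """|u><v| as a matrix with entries u_i * conj(v_j)."""
    return [[a * b.conjugate() for b in v] for a in u]

def apply(M, w):
    """Matrix-vector product M|w>."""
    return [sum(M[i][j] * w[j] for j in range(len(w))) for i in range(len(M))]

u = [1, 0]          # unit vector
v = [0.6, 0.8]      # unit vector
w = [2, 1]
lhs = apply(outer(u, v), w)          # (|u><v|) |w>
rhs = [inner(v, w) * x for x in u]   # <v|w> |u>
print(lhs, rhs)                      # the two sides agree
```

Taking v = u shows the projector property: |u⟩⟨u| applied to |w⟩ gives ⟨u|w⟩|u⟩, the component of |w⟩ along |u⟩.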

Eigenvalues and eigenvectors

An eigenvector of A ∈ M_n,n(ℂ) is a non-zero vector |v⟩ ∈ ℂⁿ such that

    A |v⟩ = λ |v⟩

for some complex number λ (the eigenvalue corresponding to |v⟩). The eigenvalues of A are the roots of the characteristic polynomial:

    det(A − λI) = 0

where det denotes the determinant and I is the n × n identity matrix:

    I = ∑_{i=1}^n |i⟩⟨i|

Each n × n matrix has at least one eigenvalue.

Diagonal representation

A matrix A ∈ M_n,n(ℂ) is diagonalisable if

    A = ∑_{i=1}^n λ_i |v_i⟩⟨v_i|

where |v_1⟩, …, |v_n⟩ is an orthonormal set of eigenvectors of A with corresponding eigenvalues λ_1, …, λ_n. This is called the spectral decomposition of A. Equivalently, A can be written as the diagonal matrix

    diag( λ_1, …, λ_n )

in the basis |v_1⟩, …, |v_n⟩ of its eigenvectors.
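For a 2 × 2 matrix the characteristic polynomial is simply λ² − tr(A)λ + det(A) = 0, so its roots can be computed with the quadratic formula. A minimal sketch (the function name `eig2` is mine), applied to the Pauli X matrix:

```python
import cmath

def eig2(A):
    """Eigenvalues of a 2x2 matrix: roots of lambda^2 - tr(A)*lambda + det(A) = 0."""
    (a, b), (c, d) = A
    tr, det = a + d, a * d - b * c
    disc = cmath.sqrt(tr * tr - 4 * det)
    return (tr + disc) / 2, (tr - disc) / 2

X = [[0, 1], [1, 0]]     # Pauli X: tr = 0, det = -1
print(eig2(X))           # eigenvalues 1 and -1
```

Using `cmath.sqrt` rather than `math.sqrt` keeps the computation valid even when the discriminant is negative, i.e. when the eigenvalues are genuinely complex.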

Normal and Hermitian matrices

A matrix A ∈ M_n,n(ℂ) is normal if AA† = A†A.

Theorem: A matrix is diagonalisable if and only if it is normal.

A is Hermitian if A† = A. A normal matrix is Hermitian if and only if it has real eigenvalues.

Unitary matrices

A matrix A ∈ M_n,n(ℂ) is unitary if

    AA† = A†A = I

where I is the n × n identity matrix. Unitary operators are normal and therefore diagonalisable.

Unitary operators preserve inner products: if U is unitary, |u′⟩ = U|u⟩ and |v′⟩ = U|v⟩, then

    ⟨u′|v′⟩ = ( U|u⟩ )† ( U|v⟩ ) = ⟨u| U†U |v⟩ = ⟨u|v⟩

All eigenvalues of a unitary operator have absolute value 1.
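These definitions can be checked numerically for a concrete matrix. A small sketch using the Hadamard matrix H = (1/√2)[[1, 1], [1, −1]], which is both Hermitian and unitary; the helper names are mine, and the comparison against I is approximate because of floating-point rounding:

```python
def adjoint(A):
    """Conjugate transpose A†."""
    return [[A[i][j].conjugate() for i in range(len(A))] for j in range(len(A[0]))]

def mat_mul(A, B):
    return [[sum(A[i][j] * B[j][k] for j in range(len(B))) for k in range(len(B[0]))]
            for i in range(len(A))]

s = 2 ** -0.5
H = [[s, s], [s, -s]]            # Hadamard matrix

print(adjoint(H) == H)           # True: H is Hermitian (H† = H)
P = mat_mul(H, adjoint(H))
print(P)                         # approximately the 2x2 identity: H is unitary
```

Since H is both Hermitian and unitary, it is in particular normal, so the spectral theorem applies to it.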

Tensor product

Let A and B be matrices of arbitrary shape. Their tensor product is the following block matrix:

    A ⊗ B = [ A_11 B  A_12 B  …  A_1m B
              A_21 B  A_22 B  …  A_2m B
                ⋮       ⋮          ⋮
              A_n1 B  A_n2 B  …  A_nm B ]

where the block at coordinates (i, j) is equal to A_ij B. If A is n × m and B is n′ × m′, then A ⊗ B is nn′ × mm′.

The tensor product applies to vectors too, e.g., |ψ⟩ ⊗ |ϕ⟩ and ⟨ψ| ⊗ ⟨ϕ|. Note: |ψ⟩|ϕ⟩ is a shorthand for |ψ⟩ ⊗ |ϕ⟩.

Summary

Matrix multiplication: AB = C where ∑_j A_ij B_jk = C_ik; A, B, C have dimensions [n × m] · [m × l] = [n × l]
Conjugate transpose: A† = (Ā)^T, (AB)† = B†A†
Dirac notation: ket |ψ⟩ and bra ⟨ψ| = (|ψ⟩)†
Inner product: ⟨ψ|ϕ⟩; it is 0 for orthogonal vectors
Norm: ‖ψ‖ = √⟨ψ|ψ⟩; it is 1 for unit vectors
Orthonormal basis: ⟨v_i|v_j⟩ = δ_ij
Outer product: |ψ⟩⟨ϕ|; projector: |ψ⟩⟨ψ| where ‖ψ‖ = 1
Eigenvalues and eigenvectors: A|v⟩ = λ|v⟩
Spectral decomposition: A = ∑_i λ_i |v_i⟩⟨v_i| iff A is normal
Normal: AA† = A†A; Hermitian: A† = A; Unitary: AA† = A†A = I
Tensor product: A ⊗ B is a block matrix with (i, j)-th block A_ij B
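The block structure of the tensor product defined above can be implemented directly: entry (i, j) of A ⊗ B is A_(i÷p, j÷q) · B_(i mod p, j mod q), where B is p × q. A minimal sketch (the function name `kron` is mine, after the Kronecker product), applied to the two-qubit state |0⟩ ⊗ |1⟩:

```python
def kron(A, B):
    """Tensor (Kronecker) product: block (i, j) of A (x) B equals A_ij * B."""
    n, m = len(A), len(A[0])
    p, q = len(B), len(B[0])
    return [[A[i // p][j // q] * B[i % p][j % q] for j in range(m * q)]
            for i in range(n * p)]

ket0 = [[1], [0]]          # |0> as a 2x1 column
ket1 = [[0], [1]]          # |1> as a 2x1 column
print(kron(ket0, ket1))    # |0> (x) |1> = (0, 1, 0, 0)^T
```

Note that a 2 × 1 vector tensored with a 2 × 1 vector gives a 4 × 1 vector, matching the nn′ × mm′ shape rule.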