Quantum algorithms for linear algebra (QALGO workshop, Riga)


Slide 1/26: Quantum algorithms for linear algebra
Center for Quantum Technologies and Nanyang Technological University, Singapore. September 22, 2015.

Slide 2/26: Overview
1. Introduction
2. The augmented QRAM
3. Quantum Singular Value Estimation
4. Applications, Questions

Slide 3/26: Introduction
- Quantum algorithms for linear algebra: linear systems [HHL09], preprints on quantum machine learning [LMR13, ...].
- Assumptions about the data, such as sparsity or a bounded $\ell$-norm.
- Polynomial dependence on the dimension of the data in the general case.
- Goal: quantum algorithms that work for all data, with poly-logarithmic dependence on the dimension.
- What linear algebra problems can we solve with such guarantees?

Slide 4/26: Introduction
- How to design a quantum algorithm for linear algebra/machine learning? The encode-process-extract framework.
- Encoding: vectors and matrices encoded into quantum states, via the augmented QRAM.
- Processing: coherent operations on the quantum encodings, quantum singular value estimation.
- Extraction: sampling the quantum state to obtain classically useful information; low rank approximation [P14] and recommender systems [KP15].

Slide 5/26: Quantum memory models
- The $O(\sqrt{n})$ speedup for quantum search assumes that the data is stored in QRAM:

  $\sum_{i \in [n]} \phi_i |i\rangle \xrightarrow{\text{QRAM}} \sum_{i \in [n]} \phi_i |i\rangle |x_i\rangle$    (1)

- Storage time is discounted; pre-processing can be done at the time of storage.
- An $O(\mathrm{nnz}(x))$ overhead for pre-processing vectors is reasonable: all entries of the vector must be read anyway.
- Quantum big data model: a massive dataset is stored in quantum memory, storage time is discounted; find quantum speedups for linear algebra and machine learning tasks.
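
To see what transformation (1) does, here is a minimal classical simulation in numpy (purely illustrative; the register sizes and the data vector are made up):

```python
import numpy as np

# Toy illustration of the QRAM map |i>|0> -> |i>|x_i> on the joint
# (address, data) state space.
n, d = 4, 4
x = np.array([2, 0, 3, 1])            # stored data, x_i in {0, ..., d-1}
phi = np.ones(n) / np.sqrt(n)         # query amplitudes over addresses

state_in = np.zeros((n, d))
state_in[:, 0] = phi                  # sum_i phi_i |i>|0>
state_out = np.zeros((n, d))
for i in range(n):
    state_out[i, x[i]] = phi[i]       # sum_i phi_i |i>|x_i>

assert np.isclose(np.linalg.norm(state_out), 1.0)   # norm is preserved
```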

Slide 6/26: Encoding, augmented QRAM
Definition (vector state): Given $x \in \mathbb{R}^n$, the vector state $|x\rangle$ is defined as $\frac{1}{\|x\|} \sum_{i \in [n]} x_i |i\rangle$.
- Vector state preparation using QRAM requires time $\tilde{O}(\sqrt{n})$ in the worst case, by the search lower bounds.
- But we can pre-process, change the memory organization and add classical data structures once we have a QRAM.

Theorem: The vector state $|x\rangle$ can be prepared in time $O(\mathrm{polylog}(n))$ if $x \in \mathbb{R}^n$ is stored in the augmented QRAM.
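
A two-line numpy sanity check of the definition (the vector is arbitrary):

```python
import numpy as np

x = np.array([3.0, 0.0, -4.0])
amplitudes = x / np.linalg.norm(x)    # |x> = (1/||x||) sum_i x_i |i>
assert np.isclose(np.sum(amplitudes ** 2), 1.0)
```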

Slide 7/26: Augmented QRAM
[Figure: a high-level view of the augmented QRAM. Classical data structures and memory organization layered on top of the RAM proposals [GLM08] give an augmented RAM; the same augmentations on top of a QRAM give the augmented QRAM.]
- The bottleneck is the realization of the QRAM itself; the augmentations are classical and can be implemented.

Slide 8/26: Quantum singular value estimation
- Goal: a generic linear algebra algorithm with running time poly-logarithmic in the matrix dimensions, using the augmented QRAM.
- Let $M \in \mathbb{R}^{m \times n}$ have singular value decomposition $M = \sum_i \sigma_i u_i v_i^t$.
- Given a superposition over the singular vectors $\sum_i \alpha_i |v_i\rangle$ (respectively $|u_i\rangle$), we want to obtain $\sum_i \alpha_i |v_i\rangle |\overline{\sigma}_i\rangle$ where $\overline{\sigma}_i \in \sigma_i \pm \epsilon \|M\|_F$ for all $i$.

Theorem: If $M \in \mathbb{R}^{m \times n}$ is stored in the augmented QRAM, there is an algorithm with running time $O(\mathrm{polylog}(mn)/\epsilon)$ that performs quantum singular value estimation with probability at least $1 - 1/\mathrm{poly}(n)$.

Slide 9/26: Quantum singular value estimation
- The self-analysis approach from [LMR13] requires time $\tilde{O}(1/\epsilon^3)$, and an analysis of coherence is required [P14].
- Reduce singular value estimation to estimating the principal angles between subspaces associated with $M$ (Jordan's lemma).
- Implement the reflections in these subspaces as unitary operators using the augmented QRAM.
- Apply phase estimation to the product of the reflection operators to estimate the angles, and thus the singular values.
- Applications: quantum projections/linear systems, and low rank approximation by column selection/recommender systems.

Slide 10/26: Overview
- Vector state preparation.
- Augmented QRAM.
- Quantum singular value estimation.
- Applications and open questions.

Slide 11/26: Vector states using QRAM
- There is a unitary $U$ with $U|0\rangle = |\phi\rangle$ where

  $|\phi\rangle = \frac{1}{\sqrt{n}} \sum_{i \in [n]} |i\rangle \left( x_i |0\rangle + \beta_i |1\rangle \right)$    (2)

- QRAM query, conditional rotation, post-select on $|0\rangle$, erase the query register. Success probability $\frac{\|x\|^2}{n}$.
- Write $|\phi\rangle = \sin(\theta) |x, 0\rangle + \cos(\theta) |x', 1\rangle$; the reflection in $|\phi\rangle$ is $S_\phi = U S_0 U^{-1}$, and the reflection in $|x, 0\rangle$ is a phase flip when the ancilla is $|1\rangle$.
- The product of reflections $S_\phi S_x$ is a rotation by $2\theta$; after $k$ iterations:

  $(S_\phi S_x)^k |\phi\rangle = \sin((2k+1)\theta) |x, 0\rangle + \cos((2k+1)\theta) |x', 1\rangle$    (3)
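
A quick numerical check of the rotation formula (3), with made-up data: the iteration count needed before the $|x, 0\rangle$ amplitude is large scales as $\sqrt{n}/\|x\|$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1024
x = rng.standard_normal(n) / np.sqrt(n)   # entries of size ~ 1/sqrt(n)

theta = np.arcsin(np.linalg.norm(x) / np.sqrt(n))
# Each S_phi S_x iteration rotates by 2*theta; stop when (2k+1)*theta ~ pi/2.
k = int(np.round((np.pi / (2 * theta) - 1) / 2))
p_success = np.sin((2 * k + 1) * theta) ** 2
print(k, p_success)                       # k = O(sqrt(n)/||x||), p close to 1
```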

Slide 12/26: Amplitude amplification
- As $\theta \geq \sin(\theta) = \frac{\|x\|}{\sqrt{n}}$, for $k = O(\frac{\sqrt{n}}{\|x\|})$ the success probability is a constant.
[Figure: the rotation picture for amplitude amplification: $|\phi\rangle$ at angle $\theta$ above the $|x, 0\rangle$ axis, with $S_x|\phi\rangle$ and $S_\phi S_x|\phi\rangle$ illustrating a rotation by $2\theta$ towards $|x, 0\rangle$.]
- Amplitude amplification can be made exact if the success probability is known.

Slide 13/26: Coherent exact amplitude amplification
- There is a unitary operator $U$ such that $U|i, 0^{\lceil \log n \rceil + 1}\rangle = |\phi_i\rangle = \sin(\theta_i)\, |i, x^i, 0\rangle + \cos(\theta_i)\, |i, (x^i)', 1\rangle$.
- The $x^i \in \mathbb{R}^n$ and the $\sin(\theta_i)$ are stored in QRAM.
- Then $\sum_i \alpha_i |i, 0^l\rangle \to \sum_i \alpha_i |i, x^i\rangle$ requires time $\tilde{O}\left( \frac{T(U)}{\min_i \sin(\theta_i)} \right)$.
- Proof: Let $t_i$ be the exact number of rotations required, $\pi/2 = (2t_i + 1)\theta_i'$, choosing the largest $\theta_i' \leq \theta_i$. Compute $\sum_i \alpha_i |i, 0^l, t_i\rangle$.
- Apply the unitary $R(S_\phi S_x)$, where $R|0\rangle = |0\rangle$ and $R|t\rangle = |t-1\rangle$ for $t \geq 1$, and the reflections $S_\phi$ and $S_x$ are conditioned on the auxiliary register being non-zero.
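
The angle trick from the proof, as a classical parameter computation (a sketch; in the quantum algorithm the rotation angle is slowed down to $\theta'$ coherently):

```python
import numpy as np

def exact_aa_params(theta):
    """Iteration count t and largest theta' <= theta with (2t+1)*theta' = pi/2,
    so that t rotations by 2*theta' land exactly on the target state."""
    t = int(np.ceil((np.pi / (2 * theta) - 1) / 2))
    theta_prime = np.pi / (2 * (2 * t + 1))
    assert theta_prime <= theta
    return t, theta_prime

t, tp = exact_aa_params(0.05)
print(t, np.sin((2 * t + 1) * tp) ** 2)   # success probability exactly 1
```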

Slide 14/26: Quantum key value map
- A quantum key value map with pairs $(K_i, V_i) \in \mathbb{N}$ for $i \in [m]$ implements the following transformation in time $\tilde{O}(1)$:

  $\sum_{i \in [m]} \alpha_i |K_i\rangle \to \sum_{i \in [m]} \alpha_i |V_i\rangle$    (4)

- Store $K_i$ at address $f(V_i)$ and $V_i$ at address $f(K_i)$:

  $\sum_{i \in [m]} \alpha_i |K_i\rangle \xrightarrow{f(K), \text{QRAM}} \sum_{i \in [m]} \alpha_i |K_i, f(K_i), V_i\rangle \xrightarrow{f(K), f(V)} \sum_{i \in [m]} \alpha_i |K_i, f(V_i), V_i\rangle \xrightarrow{\text{QRAM}, f(V)} \sum_{i \in [m]} \alpha_i |f(V_i), V_i\rangle$    (5)

- Collisions can be handled.
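
A classical analogue of the three-step chain in (5), with plain Python dicts standing in for the hash function $f$ and the QRAM lookups (the key-value pairs are made up):

```python
# Store V_i at "address" f(K_i) and K_i at "address" f(V_i).
pairs = [(10, 7), (42, 3), (5, 19)]      # (K_i, V_i)
at_f_of_K = {K: V for K, V in pairs}
at_f_of_V = {V: K for K, V in pairs}

def key_to_value(K):
    V = at_f_of_K[K]          # lookup at f(K_i): the register gains V_i
    assert at_f_of_V[V] == K  # lookup at f(V_i): K_i can be erased reversibly
    return V                  # only V_i remains, as in (4)

print([key_to_value(K) for K, _ in pairs])   # [7, 3, 19]
```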

Slide 15/26: Sparse vector state preparation
- Set up a key value map $X_0 + i \leftrightarrow t_i$ between memory addresses and the indices of $v$.
[Figure: a sparse $v \in \mathbb{R}^8$ with nonzero entries $v_1, v_3, v_5, v_7, v_8$ stored contiguously in QRAM, together with a key value map between memory addresses and indices.]
- Can prepare $|v\rangle$ in time $\tilde{O}(\sqrt{\mathrm{nnz}(v)})$; with $k = \mathrm{nnz}(v)$:

  $\frac{1}{\sqrt{k}} \sum_{i \in [k]} |X_0 + i\rangle \xrightarrow{\text{AA}} \frac{1}{\|v\|} \sum_{i \in [k]} v_{t_i} |X_0 + i\rangle \xrightarrow{\text{key value map}} \frac{1}{\|v\|} \sum_{i \in [k]} v_{t_i} |t_i\rangle = |v\rangle$

Slide 16/26: Augmented QRAM components
- Vector state preparation time is $\tilde{O}(\frac{\sqrt{n}}{\|v\|})$; this is constant if the entries of $v$ lie in $[1/\sqrt{n}, 2/\sqrt{n}]$.
- The augmented QRAM is therefore organized into bins by magnitude.
[Figure: entries of vectors $v_1, v_2, \ldots$ grouped into bins, with a key value map between (vector, index) pairs and memory addresses.]
- In addition, we maintain counts and offsets as metadata in quantum memory.
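
A sketch of the binning and metadata in plain Python (the exact bin boundaries are an assumption; here bin $k$ holds entries with $|v_j| \in [2^{-k-1}, 2^{-k})$):

```python
import numpy as np

v = np.array([0.6, 0.3, 0.28, 0.1, 0.09])    # made-up vector entries

bins = {}
for j, vj in enumerate(v):
    k = int(np.floor(-np.log2(abs(vj))))      # 2^{-k-1} <= |v_j| < 2^{-k}
    bins.setdefault(k, []).append(j)

# Metadata: per-bin counts and offsets into a contiguous memory layout.
counts = {k: len(idx) for k, idx in sorted(bins.items())}
offsets, off = {}, 0
for k, c in counts.items():
    offsets[k], off = off, off + c
print(bins, counts, offsets)
```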

Slide 17/26: Augmented QRAM architecture
[Figure: the augmented QRAM architecture for $v_1, v_2, v_3 \in \mathbb{R}^8$: QRAM1 holds the binned entries $B_1, \ldots, B_5$; QRAM2 holds the key value map $(i, j) \leftrightarrow (k, t)$; QRAM3 holds the metadata (counts and offsets); a controller coordinates the three.]
- Pre-processing can be done as the controller streams over the vector entries.

Slide 18/26: State preparation with augmented QRAM
- We describe $|i, 0\rangle \to |i, x\rangle$; for queries in superposition, use coherent exact amplitude amplification.
- The metadata: counts $c_{ik}$ and offsets $o_{ik}$ for the elements of $x$ in bin $B_k$.
- Let $l(k) = 2^{\lceil \log c(k) \rceil}$, the smallest power of 2 that is at least $c(k)$.
- Prepare the state $\frac{1}{\sqrt{n}} \sum_{k, t} |k, t + o_{ik}\rangle$ as follows:

  $\frac{1}{\sqrt{n}} \sum_{k \in [b]} \sqrt{l(k)}\, |k, 0^{\log N}\rangle \to \frac{1}{\sqrt{n}} \sum_{k \in [b]} \left( \sum_{t \in [c(k)]} |k, t, 0\rangle + \sum_{c(k) < t \leq l(k)} |k, t, 1\rangle \right)$    (6)

Slide 19/26: State preparation with augmented QRAM
- Add an ancilla and apply a conditional rotation:

  $\frac{1}{\sqrt{n}} \sum |k, t + o_{ik}\rangle \left( \frac{x_j}{2^k} |0\rangle + \beta |1\rangle \right)$    (7)

- As $x_j \in [2^{k-1}, 2^k]$ within bin $B_k$, the success probability is at least $1/4$.
- Apply exact amplitude amplification to get $\sum_j x_j |i, k, t + o_{ik}\rangle$ (up to normalization).
- Apply the key value map $(k, t + o_{ik}) \to j$ to obtain $\sum_j x_j |i, j\rangle = |i\rangle |x\rangle$.
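
Why the success probability in (7) is at least 1/4: within bin $B_k$ the rotation amplitude $x_j / 2^k$ lies in $[1/2, 1]$. A numeric check with made-up entries:

```python
import numpy as np

rng = np.random.default_rng(0)
k = -3                                                    # any bin index works
x_j = rng.uniform(2.0 ** (k - 1), 2.0 ** k, size=1000)    # entries of bin B_k
p_success = (x_j / 2.0 ** k) ** 2        # probability of post-selecting |0>
assert p_success.min() >= 0.25
```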

Slide 20/26: Jordan's lemma
- $P, Q$ are the projectors onto subspaces $\mathcal{P}, \mathcal{Q}$, and $PQP = \sum_{\lambda_i > 0} \lambda_i v_i v_i^t$.
- $w_i := Q v_i / \|Q v_i\|_2$ is an eigenvector of $QPQ$ with eigenvalue $\lambda_i$, and $\mathrm{Span}(v_i, w_i)$ is an invariant subspace under the action of $P, Q$.
[Figure: the two-dimensional invariant subspace spanned by $v_i = P v_i$ and $w_i = Q w_i$, with principal angle $\theta$ between them and $\sigma_i^2 = \lambda_i = \cos^2(\theta)$.]
- Estimating the singular values of $PQ$ reduces to estimating the principal angles $\theta_i$.
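
A numpy check of the lemma on random projectors (illustrative; the dimensions are made up):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_projector(dim, rank):
    """Orthogonal projector onto a random rank-dimensional subspace."""
    Q, _ = np.linalg.qr(rng.standard_normal((dim, rank)))
    return Q @ Q.T

P, Q = random_projector(6, 3), random_projector(6, 3)

lam, V = np.linalg.eigh(P @ Q @ P)             # PQP = sum_i lambda_i v_i v_i^t
i = np.argmax(lam)
v = V[:, i]

w = Q @ v / np.linalg.norm(Q @ v)              # w_i = Q v_i / ||Q v_i||
assert np.allclose(Q @ P @ Q @ w, lam[i] * w)  # eigenvector of QPQ, same eigenvalue
```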

Slide 21/26: Singular values and principal angles
- If $\|M\|_F = 1$ then $M = A^t B$ with $A \in \mathbb{R}^{mn \times m}$, $B \in \mathbb{R}^{mn \times n}$ such that $A^t A = I_m$ and $B^t B = I_n$.
- Proof: Select $a_i = |i, m^i\rangle$ and $b_j = |p, j\rangle$, where $m^i$ is the $i$-th row of $M$ and $|p\rangle = \sum_{i \in [m]} \|m^i\| |i\rangle$.
- Orthogonality follows from the definition, and $a_i^t b_j = m_{ij}$.
- Define the projectors $P = AA^t$ and $Q = BB^t$; then $PQ = A M B^t$, which has the same nonzero singular values as $M$.
- It suffices to estimate the singular values of $PQ$, which are the cosines of the principal angles between $\mathcal{P}$ and $\mathcal{Q}$.
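
The factorization, verified numerically (a sketch with a random $M$; the ordering of the Kronecker-style indices is an assumption):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 3, 4
M = rng.standard_normal((m, n))
M /= np.linalg.norm(M)                         # ||M||_F = 1

row_norms = np.linalg.norm(M, axis=1)
A = np.zeros((m * n, m))                       # columns a_i = |i>|m^i>
B = np.zeros((m * n, n))                       # columns b_j = |p>|j>
for i in range(m):
    A[i * n:(i + 1) * n, i] = M[i] / row_norms[i]
    B[i * n:(i + 1) * n, :] = row_norms[i] * np.eye(n)

assert np.allclose(A.T @ A, np.eye(m))         # A^t A = I_m
assert np.allclose(B.T @ B, np.eye(n))         # B^t B = I_n
assert np.allclose(A.T @ B, M)                 # M = A^t B

PQ = (A @ A.T) @ (B @ B.T)                     # P = AA^t, Q = BB^t
print(np.sort(np.linalg.svd(PQ, compute_uv=False))[::-1][:m])
print(np.linalg.svd(M, compute_uv=False))      # same nonzero singular values
```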

Slide 22/26: Reflections
- What is the reflection in $\mathcal{P} = \mathrm{Col}(A)$?
- Let $U$ be the multiplication-by-$A$ unitary: $U|x, 0^{\log n}\rangle = |Ax\rangle = \sum_i x_i |i, m^i\rangle$.
- $R_A = U R_0 U^{-1}$ can be implemented as a unitary using the augmented QRAM.
- Multiplication by $B$ is simpler: $V|0^{\log m}, x\rangle = |p, x\rangle$.
- $R_B = V R_0 V^{-1}$ can likewise be implemented as a unitary using the augmented QRAM.
- $R_A R_B$ has eigenvalues $e^{\pm 2i\theta_i}$; use phase estimation.
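
The phase estimation step, numerically: the eigenphases of $R_A R_B$ encode the principal angles, and $\sigma_i = \cos(\theta_i)$. A self-contained sketch with random orthonormal $A, B$ (the trivial $\pm 1$ eigenvalues show up as spurious 0/1 values):

```python
import numpy as np

rng = np.random.default_rng(2)
d, m, n = 12, 3, 4
A, _ = np.linalg.qr(rng.standard_normal((d, m)))   # A^t A = I_m
B, _ = np.linalg.qr(rng.standard_normal((d, n)))   # B^t B = I_n
M = A.T @ B                                        # plays the role of M = A^t B

R_A = 2 * (A @ A.T) - np.eye(d)                    # reflection in Col(A)
R_B = 2 * (B @ B.T) - np.eye(d)                    # reflection in Col(B)

# Eigenvalues of R_A R_B are e^{+-2i theta_i} on the nontrivial subspaces.
phases = np.angle(np.linalg.eigvals(R_A @ R_B))
estimated = np.unique(np.round(np.cos(np.abs(phases) / 2), 6))
print(estimated)                                   # contains sigma_i = cos(theta_i)
print(np.round(np.linalg.svd(M, compute_uv=False), 6))
```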

Slide 23/26: Quantum singular value estimation
Theorem: If $M \in \mathbb{R}^{m \times n}$ has SVD $M = \sum_i \sigma_i u_i v_i^t$ and is stored in the augmented QRAM, there is a quantum algorithm that transforms $\sum_i \alpha_i |v_i\rangle \to \sum_i \alpha_i |v_i\rangle |\overline{\sigma}_i\rangle$ such that $\overline{\sigma}_i \in \sigma_i \pm \epsilon \|M\|_F$ for all $i$, with probability at least $1 - 1/\mathrm{poly}(n)$, in time $O(\mathrm{polylog}(mn)/\epsilon)$.
- Can be used for quantum projections.

Slide 24/26: Applications
- Low rank approximation by column sampling.
- A comparison of quantum and classical CX decomposition for power law decay.

Slide 25/26: Questions
- Quantum recommender systems.
- Can quantum computing help with big data?
- Quadratic speedups for Markov chain methods?
- Iterative quantum algorithms? Gradient descent? Optimization?
- Learning and inference on quantum max entropy/graphical models? A quantum exponential family?

Slide 26/26: The Last Slide
Thank You.
