Quantum query complexity of entropy estimation

1 Quantum query complexity of entropy estimation Xiaodi Wu QuICS, University of Maryland MSR Redmond, July 19th, 2017 Joint Center for Quantum Information and Computer Science

2 Outline Motivation and Problem Statements Main Results Techniques Open Questions and On-going Work

6 Motivation Why did I come across this problem? Quantum property testing [e.g., the survey by Montanaro and de Wolf] (quantum testers of classical properties!) Testing of distributional properties: a well-motivated branch of the classical property-testing literature, also connected to learning problems. The objects are distributions rather than Boolean functions, which might bring new insights or call for new techniques!

8 Entropies Given any distribution p over a discrete set X, the Shannon entropy of p is defined by H(p) := − Σ_{x∈X: p(x)>0} p_x log p_x. One important generalization of Shannon entropy is the Rényi entropy of order α, denoted H_α(p), defined by H_α(p) = (1/(1−α)) log Σ_{x∈X} p_x^α when α ≠ 1, and H_α(p) = H(p) when α = 1.
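
For concreteness, both definitions are easy to check numerically; a minimal Python sketch (the helper names and the base-2 logarithm are my choices, not from the talk):

```python
import math

def shannon_entropy(p):
    """H(p) = -sum_x p_x log p_x, summing only over p_x > 0."""
    return -sum(px * math.log2(px) for px in p if px > 0)

def renyi_entropy(p, alpha):
    """H_alpha(p) = (1/(1-alpha)) log sum_x p_x^alpha; alpha = 1 is the Shannon limit."""
    if alpha == 1:
        return shannon_entropy(p)
    return math.log2(sum(px ** alpha for px in p if px > 0)) / (1 - alpha)

uniform = [1 / 8] * 8
print(shannon_entropy(uniform))    # 3.0 bits for the uniform distribution on 8 points
print(renyi_entropy(uniform, 2))   # also 3.0: all Renyi entropies agree on uniform
```

On a uniform distribution all orders coincide, which is a convenient sanity check for both functions.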

10 Problem Statement For convenience, assume X = [n]. A natural question: given access to samples obtained by taking independent draws from p, determine the number of draws necessary to estimate H(p) or H_α(p) within error ɛ with high probability. Motivation: this is a theoretically appealing topic with intimate connections to statistics, information theory, learning theory, and algorithm design.

12 Our Question Main question: is there any quantum speed-up for estimating Shannon and Rényi entropies? Our question aligns with the emerging topic of quantum property testing and focuses on the quantum advantage in testing classical statistical properties. The first research paper on this topic is by Bravyi, Harrow, and Hassidim [BHH 11], who discovered quantum speed-ups for testing uniformity, orthogonality, and statistical difference of unknown distributions, followed by Chakraborty et al. [CFMdW 10].

13 Outline Motivation and Problem Statements Main Results Techniques Open Questions and On-going Work

17 Classical results Classically, this fundamental question has been intensively studied in both the theoretical computer science and information theory communities. If ɛ is a constant, then the classical query complexity is: for Shannon entropy [VV 11], [JVHW 15]: Θ(n / log n); for Rényi entropy [AOST 17]: O(n^{1/α} / log n) and Ω(n^{1/α − o(1)}) when 0 < α < 1; O(n / log n) and Ω(n^{1 − o(1)}) when α > 1, α ∉ N; Θ(n^{1 − 1/α}) when α > 1, α ∈ N.

18 Quantum results Our results:

α             | classical bounds                                | quantum bounds (this talk)
0 < α < 1     | O(n^{1/α} / log n), Ω(n^{1/α − o(1)}) [AOST 17] | Õ(n^{1/α − 1/2}), Ω(max{n^{1/(7α) − o(1)}, n^{1/3}})
α = 1         | Θ(n / log n) [VV 11, JVHW 15]                   | Õ(√n), Ω(n^{1/3})
α > 1, α ∉ N  | O(n / log n), Ω(n^{1 − o(1)}) [AOST 17]         | Õ(n^{1 − 1/(2α)}), Ω(max{n^{1/3}, n^{1/2 − 1/(2α)}})
α = 2         | Θ(√n) [AOST 17]                                 | Θ(n^{1/3})
α > 2, α ∈ N  | Θ(n^{1 − 1/α}) [AOST 17]                        | Õ(n^{ν(1 − 1/α)}), ν < 3/4, Ω(n^{1/2 − 1/(2α)})
α = ∞         | Θ(n / log n) [VV 11]                            | Õ(Q((log n)-distinctness)), Ω(√n)

Table 1: Summary of classical and quantum query complexity of H_α(p) for α > 0, assuming ɛ = Θ(1).

19 Quantum results Figure 1: Visualization of classical and quantum query complexity of H_α(p). The x-axis represents α and the y-axis represents the exponent of n. Red curves and points represent quantum upper bounds, green curves and points represent classical tight bounds, and the magenta curve represents quantum lower bounds.

20 Quantum results: in my April talk at MSR Figure 2: Visualization of classical and quantum query complexity of H_α(p). The x-axis represents α and the y-axis represents the exponent of n. Red curves and points represent quantum upper bounds, green curves and points represent classical tight bounds, and the magenta curve represents quantum lower bounds.

21 Outline Motivation and Problem Statements Main Results Techniques Open Questions and On-going Work

24 Sample vs Query model Ref. [BHH 11] models any discrete distribution p = (p_i)_{i=1}^n on [n] by an oracle O_p : [S] → [n]. Each probability p_i (i ∈ [n]) is proportional to the size of the pre-image of i under O_p: p_i = |{s ∈ [S] : O_p(s) = i}| / S for all i ∈ [n]. If one samples s uniformly from [S] and outputs O_p(s), one obtains a sample drawn from distribution p. It is shown in [BHH 11] that the query complexity in this oracle model and the sample complexity of independent samples are in fact equivalent classically.
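
The oracle encoding can be mimicked classically; a toy sketch (the distribution, S, and the pre-image sizes are illustrative, not from the talk):

```python
import random
from collections import Counter

# Encode p = (1/2, 1/4, 1/4) on [3] as an oracle O_p : [S] -> [n] with S = 4;
# p_i is the fraction of seeds s in [S] mapped to i.
oracle = [1, 1, 2, 3]        # pre-image sizes 2, 1, 1 out of S = 4
S = len(oracle)

def sample():
    """A uniform s in [S] followed by O_p(s) is one draw from p."""
    return oracle[random.randrange(S)]

random.seed(0)
counts = Counter(sample() for _ in range(100_000))
print({i: counts[i] / 100_000 for i in sorted(counts)})   # near {1: 0.5, 2: 0.25, 3: 0.25}
```

The empirical frequencies recover the encoded p, which is exactly the equivalence between the oracle model and i.i.d. sampling in the classical case.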

25 Quantum Query Model Quantumly, O_p is promoted to a unitary operator Ô_p acting on C^S ⊗ C^{n+1} such that Ô_p |s⟩|0⟩ = |s⟩|O_p(s)⟩ for all s ∈ [S]. This is more powerful than O_p because we may take a superposition of states as an input.

26 Roadmap of quantum entropy estimation: for all α > 0 Tools: span programs, simulated annealing, amplitude amplification, phase estimation, learning graphs for k-distinctness, Chebyshev cooling, quantum speedup of Chebyshev's inequality, quantum counting. Targets: min-entropy estimation, α-Rényi entropy estimation (α ∈ N), α-Rényi entropy estimation (α ∉ N), Shannon entropy estimation.

29 High-level Framework A general distribution property estimation problem: given a discrete distribution p = (p_i)_{i=1}^n on [n] and a function f : (0, 1] → R, estimate F(p) := Σ_{i∈[n]} p_i f(p_i) with small additive or multiplicative error with high success probability. If f(x) = log(1/x), then F(p) is the Shannon entropy H(p) (additive error); if f(x) = x^{α−1} for some α > 0, α ≠ 1, then H_α(p) = (1/(1−α)) log F(p) (multiplicative error). Inspired by BHH, we formulate a framework for approximating F(p).

34 High-level Framework Algorithm: estimate F(p) = Σ_i p_i f(p_i).
1. Set l, M ∈ N.
2. Regard the following subroutine as A:
3.   Draw a sample i ∈ [n] according to p;
4.   Use amplitude estimation with M queries to obtain an estimate p̃_i of p_i;
5.   Output X = f(p̃_i).
6. Run A l times inside the quantum speedup of Chebyshev's inequality and output an estimate F̃(p) of F(p).
Intuitively, performance depends on E(F̃(p)) and σ(F̃(p)). Want: E(F̃(p)) close to F(p) and σ(F̃(p)) small. Pros: intuitively correct; (tedious) calculation only!? Cons: very limited quantum speedup with only the basic framework.
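
The flow of this framework can be emulated classically by substituting a naive frequency estimate for amplitude estimation; a schematic sketch only (the noise model and parameter choices are mine, and none of the quantum guarantees carry over):

```python
import math
import random

def subroutine_A(p, M):
    """One run of A: draw i ~ p, then estimate p_i and output X = f(p_tilde_i).

    Amplitude estimation with M quantum queries is replaced here by a classical
    stand-in (an empirical frequency from M draws), so only the flow is faithful."""
    i = random.choices(range(len(p)), weights=p)[0]
    hits = sum(random.random() < p[i] for _ in range(M))
    p_tilde = max(hits / M, 1 / M)      # clip away 0 so the log below stays finite
    return math.log(1 / p_tilde)        # f(x) = log(1/x) targets Shannon entropy

def estimate_F(p, l=2000, M=200):
    """Average l runs of A: the classical analogue of the mean-estimation step."""
    return sum(subroutine_A(p, M) for _ in range(l)) / l

random.seed(1)
p = [0.5, 0.25, 0.125, 0.125]
print(estimate_F(p), "vs H(p) =", -sum(x * math.log(x) for x in p))
```

The output concentrates near H(p), illustrating why the analysis reduces to controlling the expectation and standard deviation of F̃(p).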

39 Quantum algorithm for Shannon entropy estimation Two new ingredients (cf. BHH) when f(p) = log(1/p): (1) Montanaro's quantum speedup of Monte Carlo methods. Let A be a quantum algorithm outputting X such that Var(X) ≤ σ². For ɛ with 0 < ɛ < 4σ, by using A and A^{−1} Õ(σ/ɛ) times, one can output an estimate Ẽ(X) of E(X) such that Pr[|Ẽ(X) − E(X)| ≥ ɛ] ≤ 1/4. Classically, one needs Θ(σ²/ɛ²) uses of A. (2) A fine-tuned analysis to bound the value of log(1/p) when p ≈ 0: analyze the full distribution of amplitude estimation. Complexity: classical Θ(n / log n) vs quantum Õ(√n).
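
To see the scaling gap concretely, one can tabulate the two query budgets; a toy calculation with constants and log factors suppressed, so the numbers are only indicative:

```python
def classical_queries(sigma, eps):
    """Chebyshev-style mean estimation needs Theta(sigma^2 / eps^2) runs of A."""
    return (sigma / eps) ** 2

def quantum_queries(sigma, eps):
    """Montanaro's quantum mean estimation needs O~(sigma / eps) uses of A and A^{-1}."""
    return sigma / eps

for eps in (0.1, 0.01, 0.001):
    print(eps, classical_queries(1.0, eps), quantum_queries(1.0, eps))
# the classical cost grows quadratically faster as eps shrinks
```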

44 Quantum algorithm for α-Rényi entropy: α ∉ N Issue with the basic framework for Rényi entropy: multiplicative errors (Montanaro's algorithm yields no speedup in the worst case); quantum advantage only when 1/2 < α < 2. New observations: given E[X] ∈ [a, b] with b = O(a), we develop a variant of Montanaro's algorithm with a quadratic speedup. Let P_α(p) = Σ_i p_i^α. For α_1, α_2 > 0 with α_1/α_2 = 1 ± 1/log(n), we have P_{α_1}(p) = Θ(P_{α_2}(p)^{α_1/α_2}). Use P_{α_2} to estimate the interval [a, b] for P_{α_1}, where α_2 is closer to 1. Recursively solve the P_{α_2} case until α_2 ≈ 1, where a speed-up is already known.

46 Quantum algorithm for α-Rényi entropy: α ∉ N Recursively call roughly O(log(n)) times until α ≈ 1. For α > 1: α → α(1 + 1/log n)^{−1} → α(1 + 1/log n)^{−2} → ⋯; for 0 < α < 1: α → α(1 − 1/log n)^{−1} → α(1 − 1/log n)^{−2} → ⋯. Similarity and difference: cooling schedules in simulated annealing and volume estimation use multi-section with multiplicative factors, a similar design principle; we adapt this design principle to our context.
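
The reduction schedule can be written out explicitly; a sketch of the cooling-style sequence of orders (the stopping rule at 1.1 is my choice for illustration):

```python
import math

def alpha_schedule(alpha, n, stop=1.1):
    """For alpha > 1: repeatedly divide by (1 + 1/log n) until the order is near 1.
    (For 0 < alpha < 1 one would divide by (1 - 1/log n) instead.)"""
    r = 1 + 1 / math.log(n)
    seq = [alpha]
    while seq[-1] > stop:
        seq.append(seq[-1] / r)
    return seq

seq = alpha_schedule(3.5, n=10**6)
print(len(seq), seq[-1])   # O(log n) recursive calls; the final order is close to 1
```

The length of the sequence grows like log(α) · log(n), matching the "roughly O(log(n)) calls" count for constant α.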

51 Quantum algorithm for α-Rényi entropy: α ∈ N In the sample model, this problem has a good empirical estimator based on the α-th frequency moment. Frequency moments: a sequence a_1, …, a_M ∈ [n] is given, with the occurrences of 1, …, n being m_1, …, m_n respectively. You are asked to give a good approximation of F_k = Σ_{i∈[n]} m_i^k for k ∈ N. Empirical estimation, key observation: Σ_{i∈[n]} p_i^α ≈ Σ_{i∈[n]} (m_i/M)^k = F_k/M^k. Note: this is not the best classical estimator. Why?
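
The key observation is easy to check directly; a small sketch (the sample sequence and k are arbitrary):

```python
from collections import Counter

def frequency_moment(seq, k):
    """F_k = sum_i m_i^k, where m_i counts occurrences of symbol i in seq."""
    return sum(m ** k for m in Counter(seq).values())

# M = 8 samples over [3]; F_k / M^k is the empirical estimate of sum_i p_i^k.
seq = [1, 1, 1, 1, 2, 2, 3, 3]
M, k = len(seq), 2
print(frequency_moment(seq, k))            # 4^2 + 2^2 + 2^2 = 24
print(frequency_moment(seq, k) / M ** k)   # 24 / 64 = 0.375 = sum p_i^2 for p = (1/2, 1/4, 1/4)
```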

55 Quantum algorithm for α-Rényi entropy: α ∈ N Want to use this empirical estimator in the query model! Issues and solutions. Issue 1: how to generate M samples? We cannot query M times (the same complexity as classical). Key observation: treat our quantum oracle O_p as a sequence of S samples; the α-th frequency moment of O_p(1), …, O_p(S) is exactly S^α Σ_{i∈[n]} p_i^α. Issue 2: we need a quantum algorithm for the α-th frequency moment with sub-linear query complexity (e.g., Õ(n^{(3/4)(1 − 1/α)}) [Montanaro 16]), where the n there is actually S in this context. Key observation: roughly replace S by αn in the algorithm and redo the analysis.

61 The min-entropy (i.e., α = +∞) case: find max_i p_i How about using the integer-α algorithm? Intuitively, there exists some α such that H_α(p) (sub-linear for any α) is a good enough approximation of H_min(p). NO: the Õ(⋅) hides an exponential dependence on α. Instead, another reduction, to log(n)-distinctness. Poissonized sampling: the numbers of occurrences of the i-th point are independent Poisson random variables, parameterized by a guess value λ and p_i, i ∈ [n]. log(n) is a natural cut-off threshold for the Poisson distribution: when λ max_i p_i passes this threshold, a log(n)-collision can be found w.h.p.; otherwise, update λ and try again.
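
The thresholding behaviour behind the reduction can be simulated; a rough sketch (the distribution, λ, and the exact cut-off handling are illustrative):

```python
import math
import random

def poisson(mu, rng):
    """Sample Poi(mu) by Knuth's method (adequate for the moderate means used here)."""
    L, k, prod = math.exp(-mu), 0, 1.0
    while True:
        prod *= rng.random()
        if prod < L:
            return k
        k += 1

def max_occurrences(p, lam, rng):
    """Poissonized sampling: point i independently occurs Poi(lam * p_i) times."""
    return max(poisson(lam * pi, rng) for pi in p)

rng = random.Random(7)
n = 64
p = [0.5] + [0.5 / (n - 1)] * (n - 1)    # max_i p_i = 1/2
cutoff = math.log(n)                      # log(n): the natural Poisson cut-off
# Once lam * max_i p_i is well past the cut-off, some point occurs >= log(n)
# times w.h.p. (a log(n)-collision); otherwise one would update lam and retry.
print(max_occurrences(p, lam=8 * cutoff / max(p), rng=rng) >= cutoff)
```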

64 Quantum lower bounds Any quantum algorithm that approximates the α-Rényi entropy of a discrete distribution on [n] with success probability at least 2/3 must use: Ω(n^{1/(7α) − o(1)}) quantum queries when 0 < α < 3/7; Ω(n^{1/3}) quantum queries when 3/7 ≤ α ≤ 3; Ω(n^{1/2 − 1/(2α)}) quantum queries when α ≥ 3; Ω(√n) quantum queries when α = +∞. Techniques: reductions from the collision, Hamming weight, and symmetric function problems; the polynomial method, inspired by the collision lower bound, for a better error dependence.

65 Outline Motivation and Problem Statements Main Results Techniques Open Questions and On-going Work

69 Questions k-distinctness for super-constant k? A problem interesting in its own right; our reduction from the min-entropy case adds another motivation! No quantum speed-up from the existing algorithms by Ambainis and Belovs when k = log(n). Differences between quantum and classical? At a high level, we want testers with correct expectations but small variances. Many techniques have been proposed classically, and they are quite different from ours. Can we leverage the design principles of classical testers?

70 Thank You!! Q & A

74 Classical estimators for Shannon entropy A first choice: the empirical estimator. If we take M samples and the occurrences of 1, …, n are m_1, …, m_n respectively, then the empirical distribution is (m_1/M, …, m_n/M), and H_emp(p) = − Σ_{i∈[n]} (m_i/M) log(m_i/M). To approximate H(p) within error ɛ with high probability, this needs M = Θ(n/ɛ²). Not that bad compared to the best estimator, which needs M = Θ(n/(ɛ² log n)).
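
The empirical estimator is one line of code; a minimal sketch showing its convergence (the test distribution and sample sizes are mine):

```python
import math
import random
from collections import Counter

def empirical_entropy(samples):
    """H_emp = -sum_i (m_i/M) log(m_i/M), summed over the observed symbols."""
    M = len(samples)
    return -sum((m / M) * math.log(m / M) for m in Counter(samples).values())

random.seed(3)
p = [0.5, 0.25, 0.25]
true_H = -sum(x * math.log(x) for x in p)   # about 1.0397 nats
for M in (10, 100, 10_000):
    est = empirical_entropy(random.choices([0, 1, 2], weights=p, k=M))
    print(M, est)   # approaches true_H as M grows (the estimator is biased low)
```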

77 Classical estimators for Shannon entropy Construction of the best estimator: [VV 11]: a very clever (but complicated) application of linear programming under Poissonized samples (explained later); [JVHW 15]: polynomial approximation.

81 Classical estimators for Rényi entropy Still, we can first consider the empirical estimator: H_{α,emp}(p) = (1/(1−α)) log Σ_{i∈[n]} (m_i/M)^α. In [AOST 17], it is shown that the query complexity of the empirical estimator is Θ(n^{1/α}) when 0 < α < 1, and Θ(n) when α > 1. Recall the classical query complexity of Rényi entropy: O(n^{1/α} / log n) and Ω(n^{1/α − o(1)}) when 0 < α < 1; O(n / log n) and Ω(n^{1 − o(1)}) when α > 1, α ∉ N; Θ(n^{1 − 1/α}) when α > 1, α ∈ N.

84 Classical estimators for Rényi entropy The main difference occurs only when α > 1, α ∈ N. This is not a minor point: in particular, the 2-Rényi entropy (also known as the collision entropy) has classical query complexity Θ(√n), which is much less than that of Shannon entropy (Θ(n / log n)). Collision entropy gives a rough estimate of Shannon entropy, and it measures the quality of random number generators and key derivation in cryptographic applications.

86 Classical estimators for Rényi entropy When α > 1 and α ∈ N, [AOST 17] proposes the following bias-corrected estimator: H_{α,bias}(p) = (1/(1−α)) log Σ_{i∈[n]} m_i(m_i − 1) ⋯ (m_i − α + 1) / M^α. This is actually the best estimator, with query complexity Θ(n^{1 − 1/α}). The analysis is mainly based on a Poissonization technique.
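
The bias-corrected estimator replaces each (m_i/M)^α by a falling factorial; a sketch for α = 2 (the sampling setup is mine):

```python
import math
import random
from collections import Counter

def falling_factorial(m, a):
    """m (m-1) ... (m-a+1); equals 0 once m < a, which suppresses low-count noise."""
    out = 1
    for j in range(a):
        out *= m - j
    return out

def renyi_bias_corrected(samples, alpha):
    """H_{alpha,bias} = (1/(1-alpha)) log sum_i m_i(m_i-1)...(m_i-alpha+1) / M^alpha."""
    M = len(samples)
    power_sum = sum(falling_factorial(m, alpha)
                    for m in Counter(samples).values()) / M ** alpha
    return math.log(power_sum) / (1 - alpha)

random.seed(5)
p = [0.5, 0.25, 0.25]
true_H2 = -math.log(sum(x * x for x in p))     # collision entropy, about 0.9808 nats
samples = random.choices([0, 1, 2], weights=p, k=5000)
print(renyi_bias_corrected(samples, 2), "vs", true_H2)
```

For α = 2 the falling factorial m_i(m_i − 1)/M² has expectation (1 − 1/M) p_i² under multinomial sampling, so the correction removes most of the upward bias of the plug-in power sum.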

91 An intuition of Poissonization When you take M samples with M fixed, the occurrence counts of 1, …, n are not independent. But if you take M ∼ Poi(λ), then the occurrence count of i is m_i ∼ Poi(λ p_i), and all the m_i are mutually independent. Moreover, with high probability λ/2 ≤ M ≤ 2λ, so Poissonization does not affect the query complexity beyond a constant factor. A key reason to use the bias-corrected estimator: if a random variable X ∼ Poi(λ), then for all r ∈ N, E[X(X − 1) ⋯ (X − r + 1)] = λ^r. This property helps a lot in the analysis of [AOST 17] based on the Chernoff-Hoeffding inequality.
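
The falling-factorial identity is easy to sanity-check by simulation; a sketch (λ, r, and the sample count are arbitrary):

```python
import math
import random

def poisson(mu, rng):
    """Sample Poi(mu) by Knuth's method."""
    L, k, prod = math.exp(-mu), 0, 1.0
    while True:
        prod *= rng.random()
        if prod < L:
            return k
        k += 1

def falling_factorial(x, r):
    """x (x-1) ... (x-r+1)."""
    out = 1
    for j in range(r):
        out *= x - j
    return out

rng = random.Random(11)
lam, r, N = 3.0, 2, 100_000
mean = sum(falling_factorial(poisson(lam, rng), r) for _ in range(N)) / N
# The identity says E[X(X-1)] = lam^2 exactly; the empirical mean should be close.
print(mean, "vs lam^r =", lam ** r)
```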

92 References G. Valiant and P. Valiant. Estimating the unseen: an n/log(n)-sample estimator for entropy and support size, shown optimal via new CLTs. Proceedings of the forty-third annual ACM Symposium on Theory of Computing (STOC), ACM, 2011. J. Jiao, K. Venkat, Y. Han, and T. Weissman. Minimax estimation of functionals of discrete distributions. IEEE Transactions on Information Theory, 61.5 (2015). J. Acharya, A. Orlitsky, A. T. Suresh, and H. Tyagi. Estimating Rényi entropy of discrete distributions. IEEE Transactions on Information Theory, 63.1 (2017). S. Bravyi, A. W. Harrow, and A. Hassidim. Quantum algorithms for testing properties of distributions. IEEE Transactions on Information Theory, 57.6 (2011). A. Montanaro. Quantum speedup of Monte Carlo methods. Proc. R. Soc. A, Vol. 471, No. 2181, The Royal Society, 2015.


More information

Quantum algorithms. Andrew Childs. Institute for Quantum Computing University of Waterloo

Quantum algorithms. Andrew Childs. Institute for Quantum Computing University of Waterloo Quantum algorithms Andrew Childs Institute for Quantum Computing University of Waterloo 11th Canadian Summer School on Quantum Information 8 9 June 2011 Based in part on slides prepared with Pawel Wocjan

More information

Lecture 12: Lower Bounds for Element-Distinctness and Collision

Lecture 12: Lower Bounds for Element-Distinctness and Collision Quantum Computation (CMU 18-859BB, Fall 015) Lecture 1: Lower Bounds for Element-Distinctness and Collision October 19, 015 Lecturer: John Wright Scribe: Titouan Rigoudy 1 Outline In this lecture, we will:

More information

Lecture 21: Counting and Sampling Problems

Lecture 21: Counting and Sampling Problems princeton univ. F 14 cos 521: Advanced Algorithm Design Lecture 21: Counting and Sampling Problems Lecturer: Sanjeev Arora Scribe: Today s topic of counting and sampling problems is motivated by computational

More information

Quantum algorithms (CO 781/CS 867/QIC 823, Winter 2013) Andrew Childs, University of Waterloo LECTURE 13: Query complexity and the polynomial method

Quantum algorithms (CO 781/CS 867/QIC 823, Winter 2013) Andrew Childs, University of Waterloo LECTURE 13: Query complexity and the polynomial method Quantum algorithms (CO 781/CS 867/QIC 823, Winter 2013) Andrew Childs, University of Waterloo LECTURE 13: Query complexity and the polynomial method So far, we have discussed several different kinds of

More information

Example continued. Math 425 Intro to Probability Lecture 37. Example continued. Example

Example continued. Math 425 Intro to Probability Lecture 37. Example continued. Example continued : Coin tossing Math 425 Intro to Probability Lecture 37 Kenneth Harris kaharri@umich.edu Department of Mathematics University of Michigan April 8, 2009 Consider a Bernoulli trials process with

More information

1 Approximate Counting by Random Sampling

1 Approximate Counting by Random Sampling COMP8601: Advanced Topics in Theoretical Computer Science Lecture 5: More Measure Concentration: Counting DNF Satisfying Assignments, Hoeffding s Inequality Lecturer: Hubert Chan Date: 19 Sep 2013 These

More information

Lecture notes for quantum semidefinite programming (SDP) solvers

Lecture notes for quantum semidefinite programming (SDP) solvers CMSC 657, Intro to Quantum Information Processing Lecture on November 15 and 0, 018 Fall 018, University of Maryland Prepared by Tongyang Li, Xiaodi Wu Lecture notes for quantum semidefinite programming

More information

{X i } realize. n i=1 X i. Note that again X is a random variable. If we are to

{X i } realize. n i=1 X i. Note that again X is a random variable. If we are to 3 Convergence This topic will overview a variety of extremely powerful analysis results that span statistics, estimation theorem, and big data. It provides a framework to think about how to aggregate more

More information

The Quantum Query Complexity of Algebraic Properties

The Quantum Query Complexity of Algebraic Properties The Quantum Query Complexity of Algebraic Properties Sebastian Dörn Institut für Theoretische Informatik Universität Ulm 89069 Ulm, Germany Thomas Thierauf Fak. Elektronik und Informatik HTW Aalen 73430

More information

arxiv: v4 [quant-ph] 14 Mar 2018

arxiv: v4 [quant-ph] 14 Mar 2018 A Survey of Quantum Property Testing Ashley Montanaro Ronald de Wolf arxiv:1310.2035v4 [quant-ph] 14 Mar 2018 March 15, 2018 Abstract The area of property testing tries to design algorithms that can efficiently

More information

arxiv:quant-ph/ v1 29 May 2003

arxiv:quant-ph/ v1 29 May 2003 Quantum Lower Bounds for Collision and Element Distinctness with Small Range arxiv:quant-ph/0305179v1 29 May 2003 Andris Ambainis Abstract We give a general method for proving quantum lower bounds for

More information

A better lower bound for quantum algorithms searching an ordered list

A better lower bound for quantum algorithms searching an ordered list A better lower bound for quantum algorithms searching an ordered list Andris Ambainis Computer Science Division University of California Berkeley, CA 94720, e-mail: ambainis@cs.berkeley.edu Abstract We

More information

Quantum Query Complexity of Boolean Functions with Small On-Sets

Quantum Query Complexity of Boolean Functions with Small On-Sets Quantum Query Complexity of Boolean Functions with Small On-Sets Seiichiro Tani TT/JST ERATO-SORST. Joint work with Andris Ambainis Univ. of Latvia Kazuo Iwama Kyoto Univ. Masaki akanishi AIST. Harumichi

More information

Most data analysis starts with some data set; we will call this data set P. It will be composed of a set of n

Most data analysis starts with some data set; we will call this data set P. It will be composed of a set of n 3 Convergence This topic will overview a variety of extremely powerful analysis results that span statistics, estimation theorem, and big data. It provides a framework to think about how to aggregate more

More information

Notes 6 : First and second moment methods

Notes 6 : First and second moment methods Notes 6 : First and second moment methods Math 733-734: Theory of Probability Lecturer: Sebastien Roch References: [Roc, Sections 2.1-2.3]. Recall: THM 6.1 (Markov s inequality) Let X be a non-negative

More information

Compute the Fourier transform on the first register to get x {0,1} n x 0.

Compute the Fourier transform on the first register to get x {0,1} n x 0. CS 94 Recursive Fourier Sampling, Simon s Algorithm /5/009 Spring 009 Lecture 3 1 Review Recall that we can write any classical circuit x f(x) as a reversible circuit R f. We can view R f as a unitary

More information

CSE548, AMS542: Analysis of Algorithms, Spring 2014 Date: May 12. Final In-Class Exam. ( 2:35 PM 3:50 PM : 75 Minutes )

CSE548, AMS542: Analysis of Algorithms, Spring 2014 Date: May 12. Final In-Class Exam. ( 2:35 PM 3:50 PM : 75 Minutes ) CSE548, AMS54: Analysis of Algorithms, Spring 014 Date: May 1 Final In-Class Exam ( :35 PM 3:50 PM : 75 Minutes ) This exam will account for either 15% or 30% of your overall grade depending on your relative

More information

arxiv: v1 [quant-ph] 24 Jul 2015

arxiv: v1 [quant-ph] 24 Jul 2015 Quantum Algorithm for Triangle Finding in Sparse Graphs François Le Gall Shogo Nakajima Department of Computer Science Graduate School of Information Science and Technology The University of Tokyo, Japan

More information

6.842 Randomness and Computation Lecture 5

6.842 Randomness and Computation Lecture 5 6.842 Randomness and Computation 2012-02-22 Lecture 5 Lecturer: Ronitt Rubinfeld Scribe: Michael Forbes 1 Overview Today we will define the notion of a pairwise independent hash function, and discuss its

More information

From Adversaries to Algorithms. Troy Lee Rutgers University

From Adversaries to Algorithms. Troy Lee Rutgers University From Adversaries to Algorithms Troy Lee Rutgers University Quantum Query Complexity As in classical complexity, it is difficult to show lower bounds on the time or number of gates required to solve a problem

More information

Lower Bound Techniques for Statistical Estimation. Gregory Valiant and Paul Valiant

Lower Bound Techniques for Statistical Estimation. Gregory Valiant and Paul Valiant Lower Bound Techniques for Statistical Estimation Gregory Valiant and Paul Valiant The Setting Given independent samples from a distribution (of discrete support): D Estimate # species Estimate entropy

More information

Lecture 4: Phase Transition in Random Graphs

Lecture 4: Phase Transition in Random Graphs Lecture 4: Phase Transition in Random Graphs Radu Balan February 21, 2017 Distributions Today we discuss about phase transition in random graphs. Recall on the Erdöd-Rényi class G n,p of random graphs,

More information

Lecture Learning infinite hypothesis class via VC-dimension and Rademacher complexity;

Lecture Learning infinite hypothesis class via VC-dimension and Rademacher complexity; CSCI699: Topics in Learning and Game Theory Lecture 2 Lecturer: Ilias Diakonikolas Scribes: Li Han Today we will cover the following 2 topics: 1. Learning infinite hypothesis class via VC-dimension and

More information

Lecture 13: Lower Bounds using the Adversary Method. 2 The Super-Basic Adversary Method [Amb02]

Lecture 13: Lower Bounds using the Adversary Method. 2 The Super-Basic Adversary Method [Amb02] Quantum Computation (CMU 18-859BB, Fall 015) Lecture 13: Lower Bounds using the Adversary Method October 1, 015 Lecturer: Ryan O Donnell Scribe: Kumail Jaffer 1 Introduction There are a number of known

More information

On the Sample Complexity of Noise-Tolerant Learning

On the Sample Complexity of Noise-Tolerant Learning On the Sample Complexity of Noise-Tolerant Learning Javed A. Aslam Department of Computer Science Dartmouth College Hanover, NH 03755 Scott E. Decatur Laboratory for Computer Science Massachusetts Institute

More information

MA008/MIIZ01 Design and Analysis of Algorithms Lecture Notes 2

MA008/MIIZ01 Design and Analysis of Algorithms Lecture Notes 2 MA008 p.1/36 MA008/MIIZ01 Design and Analysis of Algorithms Lecture Notes 2 Dr. Markus Hagenbuchner markus@uow.edu.au. MA008 p.2/36 Content of lecture 2 Examples Review data structures Data types vs. data

More information

Random Walks on graphs

Random Walks on graphs November 2007 UW Probability Seminar 1 Random Walks on graphs Random walk on G: Simple to analyze: satisfying some natural properties Mixes quickly to stationary distribution. efficient sampling of the

More information

March 1, Florida State University. Concentration Inequalities: Martingale. Approach and Entropy Method. Lizhe Sun and Boning Yang.

March 1, Florida State University. Concentration Inequalities: Martingale. Approach and Entropy Method. Lizhe Sun and Boning Yang. Florida State University March 1, 2018 Framework 1. (Lizhe) Basic inequalities Chernoff bounding Review for STA 6448 2. (Lizhe) Discrete-time martingales inequalities via martingale approach 3. (Boning)

More information

Lecture 4: Two-point Sampling, Coupon Collector s problem

Lecture 4: Two-point Sampling, Coupon Collector s problem Randomized Algorithms Lecture 4: Two-point Sampling, Coupon Collector s problem Sotiris Nikoletseas Associate Professor CEID - ETY Course 2013-2014 Sotiris Nikoletseas, Associate Professor Randomized Algorithms

More information

Hamiltonian simulation and solving linear systems

Hamiltonian simulation and solving linear systems Hamiltonian simulation and solving linear systems Robin Kothari Center for Theoretical Physics MIT Quantum Optimization Workshop Fields Institute October 28, 2014 Ask not what you can do for quantum computing

More information

Stanford University CS254: Computational Complexity Handout 8 Luca Trevisan 4/21/2010

Stanford University CS254: Computational Complexity Handout 8 Luca Trevisan 4/21/2010 Stanford University CS254: Computational Complexity Handout 8 Luca Trevisan 4/2/200 Counting Problems Today we describe counting problems and the class #P that they define, and we show that every counting

More information

Introduction The Search Algorithm Grovers Algorithm References. Grovers Algorithm. Quantum Parallelism. Joseph Spring.

Introduction The Search Algorithm Grovers Algorithm References. Grovers Algorithm. Quantum Parallelism. Joseph Spring. Quantum Parallelism Applications Outline 1 2 One or Two Points 3 4 Quantum Parallelism We have discussed the concept of quantum parallelism and now consider a range of applications. These will include:

More information

Quantum Queries for Testing Distributions

Quantum Queries for Testing Distributions Quantum Queries for Testing Distributions Sourav Chakraborty Eldar Fischer Arie Matsliah Ronald de Wolf Abstract We consider probability distributions given in the form of an oracle f : [n] [m] that we

More information

On the method of typical bounded differences. Lutz Warnke. Georgia Tech

On the method of typical bounded differences. Lutz Warnke. Georgia Tech On the method of typical bounded differences Lutz Warnke Georgia Tech What is this talk about? Motivation Behaviour of a function of independent random variables ξ 1,..., ξ n : X = F (ξ 1,..., ξ n ) the

More information

Lecture notes on OPP algorithms [Preliminary Draft]

Lecture notes on OPP algorithms [Preliminary Draft] Lecture notes on OPP algorithms [Preliminary Draft] Jesper Nederlof June 13, 2016 These lecture notes were quickly assembled and probably contain many errors. Use at your own risk! Moreover, especially

More information

3 Finish learning monotone Boolean functions

3 Finish learning monotone Boolean functions COMS 6998-3: Sub-Linear Algorithms in Learning and Testing Lecturer: Rocco Servedio Lecture 5: 02/19/2014 Spring 2014 Scribes: Dimitris Paidarakis 1 Last time Finished KM algorithm; Applications of KM

More information

arxiv: v1 [quant-ph] 6 Feb 2013

arxiv: v1 [quant-ph] 6 Feb 2013 Exact quantum query complexity of EXACT and THRESHOLD arxiv:302.235v [quant-ph] 6 Feb 203 Andris Ambainis Jānis Iraids Juris Smotrovs University of Latvia, Raiņa bulvāris 9, Riga, LV-586, Latvia February

More information

Course Notes. Part IV. Probabilistic Combinatorics. Algorithms

Course Notes. Part IV. Probabilistic Combinatorics. Algorithms Course Notes Part IV Probabilistic Combinatorics and Algorithms J. A. Verstraete Department of Mathematics University of California San Diego 9500 Gilman Drive La Jolla California 92037-0112 jacques@ucsd.edu

More information

arxiv: v2 [quant-ph] 25 May 2018

arxiv: v2 [quant-ph] 25 May 2018 The Polynomial Method Strikes Back: Tight Quantum Query Bounds via Dual Polynomials arxiv:1710.09079v2 [quant-ph] 25 May 2018 Mark Bun Princeton University mbun@cs.princeton.edu Robin Kothari Microsoft

More information

Frequency Estimators

Frequency Estimators Frequency Estimators Outline for Today Randomized Data Structures Our next approach to improving performance. Count-Min Sketches A simple and powerful data structure for estimating frequencies. Count Sketches

More information

Lecture 7: Passive Learning

Lecture 7: Passive Learning CS 880: Advanced Complexity Theory 2/8/2008 Lecture 7: Passive Learning Instructor: Dieter van Melkebeek Scribe: Tom Watson In the previous lectures, we studied harmonic analysis as a tool for analyzing

More information

arxiv: v3 [quant-ph] 12 May 2010 Abstract

arxiv: v3 [quant-ph] 12 May 2010 Abstract New Results on Quantum Property Testing Sourav Chakraborty Eldar Fischer Arie Matsliah Ronald de Wolf arxiv:1005.0523v3 [quant-ph] 12 May 2010 Abstract We present several new examples of speed-ups obtainable

More information

Bounds for Error Reduction with Few Quantum Queries

Bounds for Error Reduction with Few Quantum Queries Bounds for Error Reduction with Few Quantum Queries Sourav Chakraborty, Jaikumar Radhakrishnan 2,3, and andakumar Raghunathan Department of Computer Science, University of Chicago, Chicago, IL 60637, USA

More information

Lecture 1: Introduction to Sublinear Algorithms

Lecture 1: Introduction to Sublinear Algorithms CSE 522: Sublinear (and Streaming) Algorithms Spring 2014 Lecture 1: Introduction to Sublinear Algorithms March 31, 2014 Lecturer: Paul Beame Scribe: Paul Beame Too much data, too little time, space for

More information

Introduction to Quantum Computing

Introduction to Quantum Computing Introduction to Quantum Computing Part II Emma Strubell http://cs.umaine.edu/~ema/quantum_tutorial.pdf April 13, 2011 Overview Outline Grover s Algorithm Quantum search A worked example Simon s algorithm

More information

Decoupling course outline Decoupling theory is a recent development in Fourier analysis with applications in partial differential equations and

Decoupling course outline Decoupling theory is a recent development in Fourier analysis with applications in partial differential equations and Decoupling course outline Decoupling theory is a recent development in Fourier analysis with applications in partial differential equations and analytic number theory. It studies the interference patterns

More information

{X i } realize. n i=1 X i. Note that again X is a random variable. If we are to

{X i } realize. n i=1 X i. Note that again X is a random variable. If we are to 3 Convergence This topic will overview a variety of extremely powerful analysis results that span statistics, estimation theorem, and big data. It provides a framework to think about how to aggregate more

More information

On the complexity of approximate multivariate integration

On the complexity of approximate multivariate integration On the complexity of approximate multivariate integration Ioannis Koutis Computer Science Department Carnegie Mellon University Pittsburgh, PA 15213 USA ioannis.koutis@cs.cmu.edu January 11, 2005 Abstract

More information

Trace Reconstruction Revisited

Trace Reconstruction Revisited Trace Reconstruction Revisited Andrew McGregor 1, Eric Price 2, and Sofya Vorotnikova 1 1 University of Massachusetts Amherst {mcgregor,svorotni}@cs.umass.edu 2 IBM Almaden Research Center ecprice@mit.edu

More information

Discrete quantum random walks

Discrete quantum random walks Quantum Information and Computation: Report Edin Husić edin.husic@ens-lyon.fr Discrete quantum random walks Abstract In this report, we present the ideas behind the notion of quantum random walks. We further

More information

A Las Vegas approximation algorithm for metric 1-median selection

A Las Vegas approximation algorithm for metric 1-median selection A Las Vegas approximation algorithm for metric -median selection arxiv:70.0306v [cs.ds] 5 Feb 07 Ching-Lueh Chang February 8, 07 Abstract Given an n-point metric space, consider the problem of finding

More information

Testing Monotone High-Dimensional Distributions

Testing Monotone High-Dimensional Distributions Testing Monotone High-Dimensional Distributions Ronitt Rubinfeld Computer Science & Artificial Intelligence Lab. MIT Cambridge, MA 02139 ronitt@theory.lcs.mit.edu Rocco A. Servedio Department of Computer

More information

Lecture 5: Probabilistic tools and Applications II

Lecture 5: Probabilistic tools and Applications II T-79.7003: Graphs and Networks Fall 2013 Lecture 5: Probabilistic tools and Applications II Lecturer: Charalampos E. Tsourakakis Oct. 11, 2013 5.1 Overview In the first part of today s lecture we will

More information

Lecture 4: Proof of Shannon s theorem and an explicit code

Lecture 4: Proof of Shannon s theorem and an explicit code CSE 533: Error-Correcting Codes (Autumn 006 Lecture 4: Proof of Shannon s theorem and an explicit code October 11, 006 Lecturer: Venkatesan Guruswami Scribe: Atri Rudra 1 Overview Last lecture we stated

More information

Robustness and duality of maximum entropy and exponential family distributions

Robustness and duality of maximum entropy and exponential family distributions Chapter 7 Robustness and duality of maximum entropy and exponential family distributions In this lecture, we continue our study of exponential families, but now we investigate their properties in somewhat

More information

Janson s Inequality and Poisson Heuristic

Janson s Inequality and Poisson Heuristic Janson s Inequality and Poisson Heuristic Dinesh K CS11M019 IIT Madras April 30, 2012 Dinesh (IITM) Janson s Inequality April 30, 2012 1 / 11 Outline 1 Motivation Dinesh (IITM) Janson s Inequality April

More information

COS598D Lecture 3 Pseudorandom generators from one-way functions

COS598D Lecture 3 Pseudorandom generators from one-way functions COS598D Lecture 3 Pseudorandom generators from one-way functions Scribe: Moritz Hardt, Srdjan Krstic February 22, 2008 In this lecture we prove the existence of pseudorandom-generators assuming that oneway

More information

P, NP, NP-Complete, and NPhard

P, NP, NP-Complete, and NPhard P, NP, NP-Complete, and NPhard Problems Zhenjiang Li 21/09/2011 Outline Algorithm time complicity P and NP problems NP-Complete and NP-Hard problems Algorithm time complicity Outline What is this course

More information

Notes on the second moment method, Erdős multiplication tables

Notes on the second moment method, Erdős multiplication tables Notes on the second moment method, Erdős multiplication tables January 25, 20 Erdős multiplication table theorem Suppose we form the N N multiplication table, containing all the N 2 products ab, where

More information

When to Use Bit-Wise Neutrality

When to Use Bit-Wise Neutrality When to Use Bit-Wise Neutrality Tobias Friedrich Department 1: Algorithms and Complexity Max-Planck-Institut für Informatik Saarbrücken, Germany Frank Neumann Department 1: Algorithms and Complexity Max-Planck-Institut

More information

CS Data Structures and Algorithm Analysis

CS Data Structures and Algorithm Analysis CS 483 - Data Structures and Algorithm Analysis Lecture II: Chapter 2 R. Paul Wiegand George Mason University, Department of Computer Science February 1, 2006 Outline 1 Analysis Framework 2 Asymptotic

More information

arxiv: v1 [quant-ph] 21 Jun 2011

arxiv: v1 [quant-ph] 21 Jun 2011 arxiv:1106.4267v1 [quant-ph] 21 Jun 2011 An optimal quantum algorithm to approximate the mean and its application for approximating the median of a set of points over an arbitrary distance Gilles Brassard,

More information

Testing that distributions are close

Testing that distributions are close Tugkan Batu, Lance Fortnow, Ronitt Rubinfeld, Warren D. Smith, Patrick White May 26, 2010 Anat Ganor Introduction Goal Given two distributions over an n element set, we wish to check whether these distributions

More information

Quantum Oracle Classification

Quantum Oracle Classification Quantum Oracle Classification The Case of Group Structure Mark Zhandry Princeton University Query Complexity x O(x) O:Xà Y Info about O Examples: Pre- image of given output Collision Complete description

More information

Non-Interactive Zero Knowledge (II)

Non-Interactive Zero Knowledge (II) Non-Interactive Zero Knowledge (II) CS 601.442/642 Modern Cryptography Fall 2017 S 601.442/642 Modern CryptographyNon-Interactive Zero Knowledge (II) Fall 2017 1 / 18 NIZKs for NP: Roadmap Last-time: Transformation

More information

Lecture 9: March 26, 2014

Lecture 9: March 26, 2014 COMS 6998-3: Sub-Linear Algorithms in Learning and Testing Lecturer: Rocco Servedio Lecture 9: March 26, 204 Spring 204 Scriber: Keith Nichols Overview. Last Time Finished analysis of O ( n ɛ ) -query

More information

CS264: Beyond Worst-Case Analysis Lecture #18: Smoothed Complexity and Pseudopolynomial-Time Algorithms

CS264: Beyond Worst-Case Analysis Lecture #18: Smoothed Complexity and Pseudopolynomial-Time Algorithms CS264: Beyond Worst-Case Analysis Lecture #18: Smoothed Complexity and Pseudopolynomial-Time Algorithms Tim Roughgarden March 9, 2017 1 Preamble Our first lecture on smoothed analysis sought a better theoretical

More information

New Results on Quantum Property Testing

New Results on Quantum Property Testing New Results on Quantum Property Testing Sourav Chakraborty 1, Eldar Fischer 2, Arie Matsliah 3, and Ronald de Wolf 3 1 Chennai Mathematical Institute, Chennai, India. sourav@cmi.ac.in 2 Computer Science

More information

Quantum algorithms for testing Boolean functions

Quantum algorithms for testing Boolean functions Quantum algorithms for testing Boolean functions Dominik F. Floess Erika Andersson SUPA, School of Engineering and Physical Sciences Heriot-Watt University, Edinburgh EH4 4AS, United Kingdom dominikfloess@gmx.de

More information

Reduction of Variance. Importance Sampling

Reduction of Variance. Importance Sampling Reduction of Variance As we discussed earlier, the statistical error goes as: error = sqrt(variance/computer time). DEFINE: Efficiency = = 1/vT v = error of mean and T = total CPU time How can you make

More information

Lecture 8: February 8

Lecture 8: February 8 CS71 Randomness & Computation Spring 018 Instructor: Alistair Sinclair Lecture 8: February 8 Disclaimer: These notes have not been subjected to the usual scrutiny accorded to formal publications. They

More information

Complexity-Theoretic Foundations of Quantum Supremacy Experiments

Complexity-Theoretic Foundations of Quantum Supremacy Experiments Complexity-Theoretic Foundations of Quantum Supremacy Experiments Scott Aaronson, Lijie Chen UT Austin, Tsinghua University MIT July 7, 2017 Scott Aaronson, Lijie Chen (UT Austin, Tsinghua University Complexity-Theoretic

More information

1 of 7 7/16/2009 6:12 AM Virtual Laboratories > 7. Point Estimation > 1 2 3 4 5 6 1. Estimators The Basic Statistical Model As usual, our starting point is a random experiment with an underlying sample

More information

Approximating Submodular Functions. Nick Harvey University of British Columbia

Approximating Submodular Functions. Nick Harvey University of British Columbia Approximating Submodular Functions Nick Harvey University of British Columbia Approximating Submodular Functions Part 1 Nick Harvey University of British Columbia Department of Computer Science July 11th,

More information

CS264: Beyond Worst-Case Analysis Lecture #15: Smoothed Complexity and Pseudopolynomial-Time Algorithms

CS264: Beyond Worst-Case Analysis Lecture #15: Smoothed Complexity and Pseudopolynomial-Time Algorithms CS264: Beyond Worst-Case Analysis Lecture #15: Smoothed Complexity and Pseudopolynomial-Time Algorithms Tim Roughgarden November 5, 2014 1 Preamble Previous lectures on smoothed analysis sought a better

More information

1 Hoeffding s Inequality

1 Hoeffding s Inequality Proailistic Method: Hoeffding s Inequality and Differential Privacy Lecturer: Huert Chan Date: 27 May 22 Hoeffding s Inequality. Approximate Counting y Random Sampling Suppose there is a ag containing

More information

Simultaneous Communication Protocols with Quantum and Classical Messages

Simultaneous Communication Protocols with Quantum and Classical Messages Simultaneous Communication Protocols with Quantum and Classical Messages Oded Regev Ronald de Wolf July 17, 2008 Abstract We study the simultaneous message passing model of communication complexity, for

More information

Affine extractors over large fields with exponential error

Affine extractors over large fields with exponential error Affine extractors over large fields with exponential error Jean Bourgain Zeev Dvir Ethan Leeman Abstract We describe a construction of explicit affine extractors over large finite fields with exponentially

More information

Asymptotic Analysis. Slides by Carl Kingsford. Jan. 27, AD Chapter 2

Asymptotic Analysis. Slides by Carl Kingsford. Jan. 27, AD Chapter 2 Asymptotic Analysis Slides by Carl Kingsford Jan. 27, 2014 AD Chapter 2 Independent Set Definition (Independent Set). Given a graph G = (V, E) an independent set is a set S V if no two nodes in S are joined

More information

ICML '97 and AAAI '97 Tutorials

ICML '97 and AAAI '97 Tutorials A Short Course in Computational Learning Theory: ICML '97 and AAAI '97 Tutorials Michael Kearns AT&T Laboratories Outline Sample Complexity/Learning Curves: nite classes, Occam's VC dimension Razor, Best

More information

Lecture 5: Two-point Sampling

Lecture 5: Two-point Sampling Randomized Algorithms Lecture 5: Two-point Sampling Sotiris Nikoletseas Professor CEID - ETY Course 2017-2018 Sotiris Nikoletseas, Professor Randomized Algorithms - Lecture 5 1 / 26 Overview A. Pairwise

More information

Quantum Algorithms for Evaluating Min-Max Trees

Quantum Algorithms for Evaluating Min-Max Trees Quantum Algorithms for Evaluating Min-Max Trees Richard Cleve 1,2,DmitryGavinsky 1, and D. L. Yonge-Mallo 1 1 David R. Cheriton School of Computer Science and Institute for Quantum Computing, University

More information

The query register and working memory together form the accessible memory, denoted H A. Thus the state of the algorithm is described by a vector

The query register and working memory together form the accessible memory, denoted H A. Thus the state of the algorithm is described by a vector 1 Query model In the quantum query model we wish to compute some function f and we access the input through queries. The complexity of f is the number of queries needed to compute f on a worst-case input

More information

Taming Big Probability Distributions

Taming Big Probability Distributions Taming Big Probability Distributions Ronitt Rubinfeld June 18, 2012 CSAIL, Massachusetts Institute of Technology, Cambridge MA 02139 and the Blavatnik School of Computer Science, Tel Aviv University. Research

More information

The Query Complexity of Witness Finding

The Query Complexity of Witness Finding The Query Complexity of Witness Finding Akinori Kawachi Benjamin Rossman Osamu Watanabe September 10, 2016 Abstract We study the following information-theoretic witness finding problem: for a hidden nonempty

More information

On the query complexity of counterfeiting quantum money

On the query complexity of counterfeiting quantum money On the query complexity of counterfeiting quantum money Andrew Lutomirski December 14, 2010 Abstract Quantum money is a quantum cryptographic protocol in which a mint can produce a state (called a quantum

More information

CS 344 Design and Analysis of Algorithms. Tarek El-Gaaly Course website:

CS 344 Design and Analysis of Algorithms. Tarek El-Gaaly Course website: CS 344 Design and Analysis of Algorithms Tarek El-Gaaly tgaaly@cs.rutgers.edu Course website: www.cs.rutgers.edu/~tgaaly/cs344.html Course Outline Textbook: Algorithms by S. Dasgupta, C.H. Papadimitriou,

More information

Asymptotic Notation. such that t(n) cf(n) for all n n 0. for some positive real constant c and integer threshold n 0

Asymptotic Notation. such that t(n) cf(n) for all n n 0. for some positive real constant c and integer threshold n 0 Asymptotic Notation Asymptotic notation deals with the behaviour of a function in the limit, that is, for sufficiently large values of its parameter. Often, when analysing the run time of an algorithm,

More information

Computational Learning Theory

Computational Learning Theory 1 Computational Learning Theory 2 Computational learning theory Introduction Is it possible to identify classes of learning problems that are inherently easy or difficult? Can we characterize the number

More information