Information-Theoretic Limits of Group Testing: Phase Transitions, Noisy Tests, and Partial Recovery

Information-Theoretic Limits of Group Testing: Phase Transitions, Noisy Tests, and Partial Recovery
Jonathan Scarlett
Laboratory for Information and Inference Systems (LIONS)
École Polytechnique Fédérale de Lausanne (EPFL), Switzerland
Bristol Statistics Seminar [October 2015]
Joint work with Volkan Cevher

Group testing

In this talk:
Defective set $S \sim \mathrm{Uniform}$ over the $\binom{p}{k}$ size-$k$ subsets of $\{1,\dots,p\}$
Measurement matrix $X \sim \prod_{i=1}^{n} \prod_{j=1}^{p} P_X(x_{i,j})$ with $P_X \sim \mathrm{Bernoulli}(\nu/k)$
Perfect recovery: $P_e := \mathbb{P}[\hat{S} \ne S]$
Partial recovery: $P_e(d_{\max}) := \mathbb{P}\big[ |S \setminus \hat{S}| > d_{\max} \,\cup\, |\hat{S} \setminus S| > d_{\max} \big]$
Goal: conditions on $n$ for $P_e \to 0$ or $P_e(d_{\max}) \to 0$
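
As a quick sanity check of this setup, here is a minimal simulation sketch of the measurement model (the values of $p$, $k$, $n$ and the choice $\nu = \ln 2$ are illustrative, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)

# Problem size: p items, k defectives, n pooled tests (illustrative values).
p, k, n = 500, 10, 200
nu = np.log(2)  # Bernoulli design parameter; nu = ln(2) is a common choice

# Defective set S drawn uniformly from the size-k subsets of {0, ..., p-1}.
S = rng.choice(p, size=k, replace=False)

# i.i.d. Bernoulli(nu/k) test matrix: X[i, j] = 1 if item j is in test i.
X = rng.random((n, p)) < nu / k

# Noiseless observation: test i is positive iff it contains any defective item.
Y = X[:, S].any(axis=1)

print(f"fraction of positive tests: {Y.mean():.3f} "
      f"(expected 1 - (1 - nu/k)^k ≈ {1 - (1 - nu / k) ** k:.3f})")
```

With $\nu = \ln 2$, roughly half of the tests come out positive, which is the choice that maximizes the per-test entropy.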

Noiseless Group Testing (Exact Recovery)

Observation model
$$ Y = \mathbb{1}\Big\{ \bigcup_{i \in S} \{X_i = 1\} \Big\} $$
Bernoulli measurements, sparsity $k = O(p^{\theta})$

Sufficient for $P_e \to 0$:
$$ n \ge \inf_{\nu > 0} \max\bigg\{ \frac{\theta}{e^{-\nu}\nu(1-\theta)},\ \frac{1}{H_2(e^{-\nu})} \bigg\} \Big( k \log \frac{p}{k} \Big)(1+\eta) $$
Necessary ($P_e \to 1$ if):
$$ n \le \frac{k \log \frac{p}{k}}{\log 2}\,(1-\eta) $$
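
To see where these constants come from numerically, the sketch below evaluates the achievability constant $\inf_{\nu>0}\max\{\cdot\}$ on a grid (grid and $\theta$ values are illustrative). At $\theta = 1/3$ the two terms balance at $\nu = \ln 2$ and the constant meets the converse constant $1/\log 2$, which is exactly the phase transition claimed on the next slide:

```python
import numpy as np

def H2(q):
    """Binary entropy in nats."""
    return -q * np.log(q) - (1 - q) * np.log(1 - q)

def noiseless_constant(theta, nus=np.linspace(0.01, 3.0, 3000)):
    """inf over nu of max{theta / (e^{-nu} nu (1-theta)), 1 / H2(e^{-nu})}."""
    vals = np.maximum(theta / (np.exp(-nus) * nus * (1 - theta)),
                      1.0 / H2(np.exp(-nus)))
    return nus[np.argmin(vals)], vals.min()

for theta in [0.1, 1 / 3, 0.5]:
    nu_star, c = noiseless_constant(theta)
    print(f"theta={theta:.3f}: constant ≈ {c:.3f} (times k log(p/k)) at nu ≈ {nu_star:.3f}; "
          f"converse constant 1/log2 ≈ {1 / np.log(2):.3f}")
```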

Noiseless Group Testing (Exact Recovery)

Key implication: i.i.d. Bernoulli measurements are asymptotically as good as optimal adaptive measurements when $k = O(p^{1/3})$.

Noisy Group Testing (Exact Recovery)

Observation model
$$ Y = \mathbb{1}\Big\{ \bigcup_{i \in S} \{X_i = 1\} \Big\} \oplus Z, \qquad Z \sim \mathrm{Bernoulli}(\rho) $$
Bernoulli measurements, sparsity $k = O(p^{\theta})$

Sufficient for $P_e \to 0$:
$$ n \ge \inf_{\delta_2 \in (0,1)} \max\bigg\{ \zeta(\rho, \delta_2, \theta),\ \frac{1}{\log 2 - H_2(\rho)} \bigg\} \Big( k \log \frac{p}{k} \Big)(1+\eta), $$
where
$$ \zeta(\rho, \delta_2, \theta) := \frac{2}{\log 2} \max\Bigg\{ \frac{2\big(1 + \frac{1}{3}\delta_2(1-2\rho)\big)}{\delta_2^2 (1-2\rho)^2} \cdot \frac{\theta}{1-\theta},\ \frac{1+2\theta}{1-\theta} \cdot \frac{1}{(1-2\rho) \log\frac{1-\rho}{\rho}\,(1-\delta_2)} \Bigg\}. $$
Necessary ($P_e \to 1$ if):
$$ n \le \frac{k \log \frac{p}{k}}{\log 2 - H_2(\rho)}\,(1-\eta) $$
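
A minimal sketch of the noisy observation model, together with the converse threshold $k\log(p/k)/(\log 2 - H_2(\rho))$ evaluated for illustrative parameters:

```python
import numpy as np

def H2(q):
    return -q * np.log(q) - (1 - q) * np.log(1 - q)

rng = np.random.default_rng(1)
p, k, n, nu, rho = 500, 10, 400, np.log(2), 0.05

S = rng.choice(p, size=k, replace=False)
X = rng.random((n, p)) < nu / k
Z = rng.random(n) < rho              # Bernoulli(rho) test-outcome noise
Y = X[:, S].any(axis=1) ^ Z          # noiseless OR outcome flipped by Z

print(f"fraction of positive tests: {Y.mean():.3f}")
print(f"converse threshold k log(p/k) / (log2 - H2(rho)) ≈ "
      f"{k * np.log(p / k) / (np.log(2) - H2(rho)):.0f} tests (here n = {n})")
```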

Noisy Group Testing (Exact Recovery)

Key implication: i.i.d. Bernoulli measurements are asymptotically as good as optimal adaptive measurements when $k = O(p^{\theta})$ for sufficiently small $\theta$.

Partial Recovery

Recovery criterion
$$ P_e(d_{\max}) := \mathbb{P}\big[ |S \setminus \hat{S}| > d_{\max} \,\cup\, |\hat{S} \setminus S| > d_{\max} \big] $$
where $d_{\max} = \alpha^{*} k$ for some $\alpha^{*} \in (0,1)$

Sufficient for $P_e(d_{\max}) \to 0$:
$$ n \ge \frac{k \log \frac{p}{k}}{\log 2 - H_2(\rho)}\,(1+\eta) $$
Necessary ($P_e(d_{\max}) \to 1$ if):
$$ n \le \frac{(1-\alpha^{*})\, k \log \frac{p}{k}}{\log 2 - H_2(\rho)}\,(1-\eta) $$

For small $\theta$, the reduction in the number of tests is at most a factor of $1-\alpha^{*}$ asymptotically.
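
The following sketch makes the "factor $1-\alpha^{*}$" remark concrete by evaluating the two thresholds for a few illustrative values of $\alpha^{*}$:

```python
import numpy as np

def H2(q):
    return -q * np.log(q) - (1 - q) * np.log(1 - q)

p, k, rho = 10**6, 100, 0.1
base = k * np.log(p / k) / (np.log(2) - H2(rho))   # exact-recovery threshold

# Allowing d_max = alpha* k errors scales the converse by (1 - alpha*),
# while the achievability threshold above is unchanged.
for alpha_star in [0.0, 0.1, 0.5]:
    print(f"alpha* = {alpha_star:.1f}: converse ≈ {(1 - alpha_star) * base:.0f}, "
          f"achievability ≈ {base:.0f} tests")
```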

Channel coding

Message $S$ → Encoder → Codeword $X_S$ → Channel $P^n_{Y|X_S}$ → Output $Y$ → Decoder → Estimate $\hat{S}$

e.g., see [Wainwright, 2009], [Atia and Saligrama, 2012], [Aksoylar et al., 2013]

Thresholding Techniques for Channel Coding

Mutual information
$$ I(X; Y) := \sum_{x,y} P_{XY}(x,y) \log \frac{P_{Y|X}(y|x)}{P_Y(y)} $$
Information density
$$ \imath(x; y) := \log \frac{P_{Y|X}(y|x)}{P_Y(y)} $$
Non-asymptotic channel coding bounds [Han, 2003]:
$$ P_e \le \mathbb{P}\big[ \imath^n(X; Y) \le nR - \log \delta \big] + \delta $$
$$ P_e \ge \mathbb{P}\big[ \imath^n(X; Y) \le nR + \log \delta \big] - \delta $$
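
A small Monte Carlo sketch of the achievability bound for a BSC($\rho$) with equiprobable inputs (blocklength, rate, and $\delta$ are illustrative); the tail probability $\mathbb{P}[\imath^n \le nR - \log\delta]$ is small once $R$ is safely below the mutual information:

```python
import numpy as np

rng = np.random.default_rng(2)
n, rho, trials, delta = 500, 0.11, 10000, 0.01

# BSC(rho) with equiprobable inputs: P_Y is uniform, so the per-symbol
# information density is log(2(1-rho)) if y = x and log(2 rho) otherwise.
flips = rng.random((trials, n)) < rho
i_n = np.where(flips, np.log(2 * rho), np.log(2 * (1 - rho))).sum(axis=1)

I = np.log(2) + rho * np.log(rho) + (1 - rho) * np.log(1 - rho)  # nats/use
R = 0.7 * I                                                      # rate below I
print(f"I = {I:.4f} nats, rate R = {R:.4f}")
print(f"achievability tail P[i^n <= nR - log(delta)] ≈ "
      f"{(i_n <= n * R - np.log(delta)).mean():.4f}")
```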

Non-Asymptotic Bounds

Information density
$$ \imath(x_{s_{\mathrm{dif}}}; y \mid x_{s_{\mathrm{eq}}}) := \log \frac{P_{Y|X_{s_{\mathrm{dif}}} X_{s_{\mathrm{eq}}}}(y \mid x_{s_{\mathrm{dif}}}, x_{s_{\mathrm{eq}}})}{P_{Y|X_{s_{\mathrm{eq}}}}(y \mid x_{s_{\mathrm{eq}}})} $$
where $(s_{\mathrm{dif}}, s_{\mathrm{eq}})$ is a partition of $s$

Achievability:
$$ P_e \le \mathbb{P}\Bigg[ \bigcup_{(s_{\mathrm{dif}}, s_{\mathrm{eq}})} \Bigg\{ \imath^n(X_{s_{\mathrm{dif}}}; Y \mid X_{s_{\mathrm{eq}}}) \le \log \frac{\binom{p-k}{|s_{\mathrm{dif}}|}}{\delta_1} \Bigg\} \Bigg] + \delta_1 $$
Converse:
$$ P_e \ge \mathbb{P}\Bigg[ \imath^n(X_{s_{\mathrm{dif}}}; Y \mid X_{s_{\mathrm{eq}}}) \le \log \binom{p-k+|s_{\mathrm{dif}}|}{|s_{\mathrm{dif}}|} + \log \delta_1 \Bigg] - \delta_1 $$

Achievability Bound

Decoder: search for the unique set $s \in \mathcal{S}$ such that
$$ \imath^n(x_{s_{\mathrm{dif}}}; y \mid x_{s_{\mathrm{eq}}}) > \gamma_{|s_{\mathrm{dif}}|} $$
for all partitions $(s_{\mathrm{dif}}, s_{\mathrm{eq}})$ of $s$ with $s_{\mathrm{dif}} \ne \emptyset$.

Initial bound:
$$ P_e \le \mathbb{P}\Bigg[ \bigcup_{(s_{\mathrm{dif}}, s_{\mathrm{eq}})} \Big\{ \imath^n(X_{s_{\mathrm{dif}}}; Y \mid X_{s_{\mathrm{eq}}}) \le \gamma_{|s_{\mathrm{dif}}|} \Big\} \Bigg] + \sum_{\bar{s} \in \mathcal{S} \setminus \{s\}} \mathbb{P}\Big[ \imath^n(X_{\bar{s} \setminus s}; Y \mid X_{\bar{s} \cap s}) > \gamma_{|\bar{s} \setminus s|} \Big] $$

Analysis of the second term (with $\ell := |\bar{s} \setminus s|$):
$$
\begin{aligned}
\mathbb{P}\Big[ \imath^n(X_{\bar{s} \setminus s}; Y \mid X_{\bar{s} \cap s}) > \gamma_\ell \Big]
&= \sum_{x_{\bar{s} \cap s},\, x_{\bar{s} \setminus s},\, y} P_X^{n(k-\ell)}(x_{\bar{s} \cap s})\, P_X^{n\ell}(x_{\bar{s} \setminus s})\, P^n_{Y|X_{s_{\mathrm{eq}}}}(y \mid x_{\bar{s} \cap s})\, \mathbb{1}\Bigg\{ \log \frac{P^n_{Y|X_{s_{\mathrm{dif}}} X_{s_{\mathrm{eq}}}}(y \mid x_{\bar{s} \setminus s}, x_{\bar{s} \cap s})}{P^n_{Y|X_{s_{\mathrm{eq}}}}(y \mid x_{\bar{s} \cap s})} > \gamma_\ell \Bigg\} \\
&\le \sum_{x_{\bar{s} \cap s},\, x_{\bar{s} \setminus s},\, y} P_X^{n(k-\ell)}(x_{\bar{s} \cap s})\, P_X^{n\ell}(x_{\bar{s} \setminus s})\, P^n_{Y|X_{s_{\mathrm{dif}}} X_{s_{\mathrm{eq}}}}(y \mid x_{\bar{s} \setminus s}, x_{\bar{s} \cap s})\, e^{-\gamma_\ell} \\
&= e^{-\gamma_\ell},
\end{aligned}
$$
where the inequality bounds $P^n_{Y|X_{s_{\mathrm{eq}}}}$ by $P^n_{Y|X_{s_{\mathrm{dif}}} X_{s_{\mathrm{eq}}}} e^{-\gamma_\ell}$ on the event inside the indicator (a change of measure).

After some re-arrangements and choosing $\{\gamma_\ell\}$ appropriately,
$$ P_e \le \mathbb{P}\Bigg[ \bigcup_{(s_{\mathrm{dif}}, s_{\mathrm{eq}})} \Bigg\{ \imath^n(X_{s_{\mathrm{dif}}}; Y \mid X_{s_{\mathrm{eq}}}) \le \log \frac{\binom{p-k}{|s_{\mathrm{dif}}|}}{\delta_1} \Bigg\} \Bigg] + \delta_1 $$
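
For concreteness, here is a brute-force sketch of this threshold decoder on a tiny noiseless instance, using the thresholds $\gamma_\ell = \log\big(\binom{p-k}{\ell}/\delta_1\big)$ suggested by the bound above (all parameter values are illustrative; with them, the true set is typically the unique output):

```python
import numpy as np
from itertools import combinations
from math import comb, log

rng = np.random.default_rng(3)
p, k, n, nu, delta1 = 12, 2, 40, np.log(2), 0.1

S = set(rng.choice(p, size=k, replace=False).tolist())
X = rng.random((n, p)) < nu / k
Y = X[:, sorted(S)].any(axis=1)

def info_density(s_dif, s_eq):
    # i^n(x_{s_dif}; y | x_{s_eq}) for the noiseless OR channel; -inf if the
    # candidate is inconsistent with y (the numerator probability is zero).
    or_eq = X[:, sorted(s_eq)].any(axis=1)
    or_all = or_eq | X[:, sorted(s_dif)].any(axis=1)
    if np.any(or_all != Y):
        return -np.inf
    q0 = (1 - nu / k) ** len(s_dif)          # P(Y=0 | all of s_eq absent)
    p_y = np.where(or_eq, 1.0, np.where(Y, 1 - q0, q0))
    return -np.log(p_y).sum()                # numerator is 1, so i^n = -log P(y|x_eq)

def decode():
    hits = []
    for s in combinations(range(p), k):
        if all(info_density(set(d), set(s) - set(d)) > log(comb(p - k, len(d)) / delta1)
               for l in range(1, k + 1) for d in combinations(s, l)):
            hits.append(sorted(s))
    return hits

print("true S:", sorted(S), "| decoder output:", decode())
```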

Converse Bound

Consider a genie that reveals part of the defective set, $S_{\mathrm{eq}} \subset S$, to the decoder; the decoder is left to estimate $S_{\mathrm{dif}} := S \setminus S_{\mathrm{eq}}$.

Starting point:
$$ P_e(s_{\mathrm{eq}}) \ge \mathbb{P}[A(s_{\mathrm{eq}})] - \mathbb{P}\big[A(s_{\mathrm{eq}}) \cap \{\text{no error}\}\big], $$
where $A(s_{\mathrm{eq}}) = \big\{ \imath^n(X_{S_{\mathrm{dif}}}; Y \mid X_{s_{\mathrm{eq}}}) \le \gamma \big\}$.

Analysis of the second term:
$$
\begin{aligned}
\mathbb{P}\big[A(s_{\mathrm{eq}}) \cap \{\text{no error}\}\big]
&= \frac{1}{\binom{p-k+\ell}{\ell}} \sum_{s_{\mathrm{dif}} \in \mathcal{S}_{\mathrm{dif}}(s_{\mathrm{eq}})} \sum_{(x,y) \in \mathcal{D}(s_{\mathrm{dif}} \mid s_{\mathrm{eq}})} P_X^{np}(x)\, P^n_{Y|X_{s_{\mathrm{dif}}} X_{s_{\mathrm{eq}}}}(y \mid x_{s_{\mathrm{dif}}}, x_{s_{\mathrm{eq}}})\, \mathbb{1}\Bigg\{ \log \frac{P^n_{Y|X_{s_{\mathrm{dif}}} X_{s_{\mathrm{eq}}}}(y \mid x_{s_{\mathrm{dif}}}, x_{s_{\mathrm{eq}}})}{P^n_{Y|X_{s_{\mathrm{eq}}}}(y \mid x_{s_{\mathrm{eq}}})} \le \gamma \Bigg\} \\
&\le \frac{1}{\binom{p-k+\ell}{\ell}} \sum_{s_{\mathrm{dif}} \in \mathcal{S}_{\mathrm{dif}}(s_{\mathrm{eq}})} \sum_{(x,y) \in \mathcal{D}(s_{\mathrm{dif}} \mid s_{\mathrm{eq}})} P_X^{np}(x)\, P^n_{Y|X_{s_{\mathrm{eq}}}}(y \mid x_{s_{\mathrm{eq}}})\, e^{\gamma} \\
&= \frac{e^{\gamma}}{\binom{p-k+\ell}{\ell}},
\end{aligned}
$$
since the decoding regions $\mathcal{D}(\,\cdot \mid s_{\mathrm{eq}})$ partition the space of $(x,y)$ pairs.

After some re-arrangements and choosing $\{\gamma_\ell\}$ appropriately,
$$ P_e \ge \mathbb{P}\Bigg[ \imath^n(X_{s_{\mathrm{dif}}}; Y \mid X_{s_{\mathrm{eq}}}) \le \log \binom{p-k+|s_{\mathrm{dif}}|}{|s_{\mathrm{dif}}|} + \log \delta_1 \Bigg] - \delta_1 $$
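
A Monte Carlo sketch of the resulting converse bound for the noiseless model with the full-set partition $s_{\mathrm{dif}} = S$ (so the binomial coefficient is $\binom{p-k+k}{k} = \binom{p}{k}$); parameters are illustrative:

```python
import numpy as np
from math import comb, log

rng = np.random.default_rng(6)
p, k, nu, trials, delta1 = 500, 10, np.log(2), 20000, 0.1
q0 = (1 - nu / k) ** k                   # P(Y = 0) per test, ≈ e^{-nu} = 1/2

def pe_lower_bound(n):
    # i^n for s_dif = S (s_eq empty): each test contributes -log P(y).
    y = rng.random((trials, n)) < 1 - q0
    i_n = np.where(y, -np.log(1 - q0), -np.log(q0)).sum(axis=1)
    thresh = log(comb(p - k + k, k)) + log(delta1)   # = log C(p, k) + log delta1
    return (i_n <= thresh).mean() - delta1

for n in [40, 60, 80, 100]:
    print(f"n = {n}: P_e >= {max(0.0, pe_lower_bound(n)):.3f}")
```

The lower bound switches off once $n$ exceeds roughly $\log\binom{p}{k}/\log 2$ tests, matching the converse threshold on the earlier slides.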

Application Techniques

General steps for application:
1. Bound the tail probabilities $\mathbb{P}\big[ \imath^n(X_{s_{\mathrm{dif}}}; Y \mid X_{s_{\mathrm{eq}}}) \le \mathbb{E}[\imath^n] \pm n\delta \big]$
2. Control and simplify the remainder terms

General form of corollaries: $P_e \to 0$ if
$$ n \ge \max_{(s_{\mathrm{dif}}, s_{\mathrm{eq}})} \frac{\log \binom{p-k}{|s_{\mathrm{dif}}|}}{I(X_{s_{\mathrm{dif}}}; Y \mid X_{s_{\mathrm{eq}}})}\,(1+\eta), $$
and $P_e \to 1$ if
$$ n \le \max_{(s_{\mathrm{dif}}, s_{\mathrm{eq}})} \frac{\log \binom{p-k+|s_{\mathrm{dif}}|}{|s_{\mathrm{dif}}|}}{I(X_{s_{\mathrm{dif}}}; Y \mid X_{s_{\mathrm{eq}}})}\,(1-\eta). $$
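
As an illustration, the sketch below evaluates the achievability-side maximum for the noiseless OR channel, using the standard per-test mutual information $I_\ell = (1-\nu/k)^{k-\ell} H_2\big((1-\nu/k)^{\ell}\big)$ (in nats) and ignoring the $(1+\eta)$ factor; parameters are illustrative:

```python
import numpy as np
from math import comb, log

def H2(x):
    return -x * np.log(x) - (1 - x) * np.log(1 - x)

def tests_needed(p, k, nu):
    """max over l of log C(p-k, l) / I_l for the noiseless OR channel."""
    q = 1 - nu / k
    need = [log(comb(p - k, l)) / (q ** (k - l) * H2(q ** l))
            for l in range(1, k + 1)]
    return max(need), int(np.argmax(need)) + 1

for k in [5, 20, 80]:
    n_req, l_star = tests_needed(p=10**4, k=k, nu=np.log(2))
    print(f"k = {k}: n ≳ {n_req:.0f}, bottleneck at |s_dif| = {l_star}")
```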

Concentration Inequalities

Noiseless and noisy cases (this one alone is enough for partial recovery!):
$$ \mathbb{P}\Big[ \big| \imath^n(X_{s_{\mathrm{dif}}}; Y \mid X_{s_{\mathrm{eq}}}) - nI^{(\ell)} \big| \ge n\delta \Big] \le 2 \exp\bigg( -\frac{\delta^2 n}{4(8+\delta)} \bigg) $$
Proved using Bernstein's inequality; the only property used is $|\mathcal{Y}| = 2$.

Noiseless case:
$$ \mathbb{P}\Big[ \imath^n \le nI^{(\ell)}(1-\delta_2) \Big] \le \exp\bigg( -n \frac{\ell}{k}\, e^{-\nu} \nu \big( (1-\delta_2)\log(1-\delta_2) + \delta_2 \big)(1-\epsilon) \bigg) $$
Proved by writing $\imath^n(X_{s_{\mathrm{dif}}}; Y \mid X_{s_{\mathrm{eq}}})$ in terms of Binomial random variables, then applying Binomial tail bounds.

Noisy case with crossover probability $\rho$:
$$ \mathbb{P}\Big[ \imath^n \le nI^{(\ell)}(1-\delta_2) \Big] \le \exp\bigg( -n \frac{\ell}{k}\, e^{-\nu} \nu\, \frac{\delta_2^2 (1-2\rho)^2}{2\big(1 + \frac{1}{3}\delta_2(1-2\rho)\big)}\,(1-\epsilon) \bigg) $$
Proved using Bennett's inequality; may be somewhat crude.
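
The sketch below checks the first (Bernstein-type) inequality by Monte Carlo for the noisy case with $s_{\mathrm{dif}} = S$; the empirical tail is far below the bound, consistent with the remark that these generic bounds are loose (parameters illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
k, nu, rho, n, trials = 10, np.log(2), 0.11, 1000, 5000
q0 = (1 - nu / k) ** k                       # P(noiseless OR = 0) per test, ~ 1/2

# Per-test information density for s_dif = S (s_eq empty) in the noisy case:
# u is the noiseless OR outcome, y = u XOR z with z ~ Bernoulli(rho).
u = rng.random((trials, n)) < 1 - q0
y = u ^ (rng.random((trials, n)) < rho)
p1 = (1 - q0) * (1 - rho) + q0 * rho         # marginal P(Y = 1)
i_vals = (np.where(y == u, np.log(1 - rho), np.log(rho))
          - np.where(y, np.log(p1), np.log(1 - p1)))
i_n = i_vals.sum(axis=1)

# Exact per-test mutual information I^(l) for l = k.
I = 0.0
for ui, pu in [(0, q0), (1, 1 - q0)]:
    for yi in (0, 1):
        pyu = (1 - rho) if yi == ui else rho
        I += pu * pyu * np.log(pyu / (p1 if yi else 1 - p1))

for delta in [0.3, 0.5]:
    emp = (np.abs(i_n - n * I) >= n * delta).mean()
    bnd = 2 * np.exp(-delta**2 * n / (4 * (8 + delta)))
    print(f"delta = {delta}: empirical {emp:.5f} <= bound {bnd:.5f}")
```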

Recap of Results

Noiseless case (exact recovery):
Sufficient for $P_e \to 0$: $n \ge \inf_{\nu>0} \max\Big\{ \frac{\theta}{e^{-\nu}\nu(1-\theta)}, \frac{1}{H_2(e^{-\nu})} \Big\} \big(k \log\frac{p}{k}\big)(1+\eta)$
Necessary ($P_e \to 1$ if): $n \le \frac{k \log(p/k)}{\log 2}(1-\eta)$

Noisy case (exact recovery):
Sufficient for $P_e \to 0$: $n \ge \inf_{\delta_2 \in (0,1)} \max\Big\{ \zeta(\rho,\delta_2,\theta), \frac{1}{\log 2 - H_2(\rho)} \Big\} \big(k \log\frac{p}{k}\big)(1+\eta)$
Necessary ($P_e \to 1$ if): $n \le \frac{k \log(p/k)}{\log 2 - H_2(\rho)}(1-\eta)$

Noiseless and noisy cases (partial recovery):
Sufficient for $P_e(d_{\max}) \to 0$: $n \ge \frac{k \log(p/k)}{\log 2 - H_2(\rho)}(1+\eta)$
Necessary ($P_e(d_{\max}) \to 1$ if): $n \le \frac{(1-\alpha^{*})\, k \log(p/k)}{\log 2 - H_2(\rho)}(1-\eta)$

More General Support Recovery

$X \in \mathbb{R}^{n \times p}$, $Y \in \mathbb{R}^{n}$, $\beta \in \mathbb{R}^{p}$

General probabilistic models:
Support $S \sim \mathrm{Uniform}$ over the $\binom{p}{k}$ size-$k$ subsets of $\{1,\dots,p\}$
Measurement matrix $X \sim \prod_{i=1}^{n}\prod_{j=1}^{p} P_X(x_{i,j})$
Non-zero entries $\beta_S \sim P_{\beta_S}$
Observations $(Y \mid X, \beta) \sim P^n_{Y|X_S \beta_S}$
Support recovery: $P_e := \mathbb{P}[\hat{S} \ne S]$
Partial recovery: $P_e(d_{\max}) := \mathbb{P}\big[ |S \setminus \hat{S}| > d_{\max} \cup |\hat{S} \setminus S| > d_{\max} \big]$
Goal: conditions on $n$ for $P_e \to 0$ or $P_e(d_{\max}) \to 0$

Channel coding

Channel state $\beta_S \sim P_{\beta_S}$
Message $S$ → Encoder → Codeword $X_S$ → Channel $P^n_{Y|X_S \beta_S}$ → Output $Y$ → Decoder → Estimate $\hat{S}$

e.g., see [Wainwright, 2009], [Atia and Saligrama, 2012], [Aksoylar et al., 2013]

More General Support Recovery

Steps for applying the generalized non-asymptotic bounds:
1. Construct a typical set $T_\beta$ of non-zero entries $\beta_S$
2. Bound the tail probabilities $\mathbb{P}\big[ \imath^n(X_{s_{\mathrm{dif}}}; Y \mid X_{s_{\mathrm{eq}}}, \beta_s) \le \mathbb{E}[\imath^n] \pm n\delta \,\big|\, \beta_S = b_s \big]$
3. Control and simplify the remainder terms

General form of corollaries: $P_e \to 0$ if
$$ n \ge \max_{(s_{\mathrm{dif}}, s_{\mathrm{eq}})} \frac{\log \binom{p-k}{|s_{\mathrm{dif}}|}}{I(X_{s_{\mathrm{dif}}}; Y \mid X_{s_{\mathrm{eq}}}, \beta_s = b_s)}\,(1+\eta) \quad \text{for all } b_s \in T_\beta, $$
and $P_e \to 1$ if
$$ n \le \max_{(s_{\mathrm{dif}}, s_{\mathrm{eq}})} \frac{\log \binom{p-k+|s_{\mathrm{dif}}|}{|s_{\mathrm{dif}}|}}{I(X_{s_{\mathrm{dif}}}; Y \mid X_{s_{\mathrm{eq}}}, \beta_s = b_s)}\,(1-\eta) \quad \text{for all } b_s \in T_\beta. $$

Exact Recovery for Linear and 1-bit Models

Observation models
$$ Y = \langle X, \beta \rangle + Z \qquad\text{and}\qquad Y = \mathrm{sign}\big(\langle X, \beta \rangle + Z\big) $$

Case 1: Gaussian $X$, discrete $\beta_S$, sparsity $k = \Theta(1)$, low SNR. Necessary and sufficient conditions:
Linear: $n \sim \max_{(s_{\mathrm{dif}}, s_{\mathrm{eq}})} \dfrac{|s_{\mathrm{dif}}| \log p}{\frac{1}{2\sigma^2} \sum_{i \in s_{\mathrm{dif}}} b_i^2}$
1-bit: $n \sim \max_{(s_{\mathrm{dif}}, s_{\mathrm{eq}})} \dfrac{|s_{\mathrm{dif}}| \log p}{\frac{1}{\pi\sigma^2} \sum_{i \in s_{\mathrm{dif}}} b_i^2}$
Only a factor of $\frac{\pi}{2}$ difference.

Case 2: Gaussian $X$, fixed $\beta_S$, sparsity $k = \Theta(p)$, moderate SNR. Conditions:
Linear: $\Theta(p)$ sufficient
1-bit: $\Omega(p \log p)$ necessary
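
The factor-$\pi/2$ claim can be checked numerically for a single coefficient: at low SNR the linear model gives $I \approx b^2/(2\sigma^2)$ per measurement, while the 1-bit model gives $I \approx b^2/(\pi\sigma^2)$. A Monte Carlo sketch (the values of $b$ are illustrative; uses scipy for the Gaussian CDF):

```python
import numpy as np
from scipy.stats import norm

def H2(q):
    q = np.clip(q, 1e-12, 1 - 1e-12)
    return -q * np.log(q) - (1 - q) * np.log(1 - q)

rng = np.random.default_rng(5)
x = rng.standard_normal(10**6)   # Gaussian measurement values
sigma = 1.0

for b in [0.3, 0.1, 0.03]:
    I_lin = 0.5 * np.log(1 + b**2 / sigma**2)                 # Gaussian channel, exact
    # 1-bit: I = H(Y) - H(Y|X) = log 2 - E[H2(Phi(b X / sigma))], by symmetry.
    I_1bit = np.log(2) - H2(norm.cdf(b * x / sigma)).mean()   # Monte Carlo over X
    print(f"b = {b}: I_lin / I_1bit = {I_lin / I_1bit:.3f} (pi/2 ≈ {np.pi / 2:.3f})")
```

As $b \to 0$ the printed ratio approaches $\pi/2 \approx 1.571$.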

Partial Recovery for Linear and 1-bit Models

Partial recovery
$$ P_e(d_{\max}) := \mathbb{P}\big[ |S \setminus \hat{S}| > d_{\max} \cup |\hat{S} \setminus S| > d_{\max} \big] $$
Linear and 1-bit models: Gaussian $X$, Gaussian $\beta_S$, sparsity $k = o(p)$, allowed errors $d_{\max} = \alpha^{*} k$

Sufficient for $P_e(d_{\max}) \to 0$:
$$ n \ge \max_{\alpha \in [\alpha^{*}, 1]} \frac{\alpha\, k \log\frac{p}{k}}{f(\alpha)}\,(1+\eta) $$
Necessary ($P_e(d_{\max}) \to 1$ if):
$$ n \le \max_{\alpha \in [\alpha^{*}, 1]} \frac{(\alpha - \alpha^{*})\, k \log\frac{p}{k}}{f(\alpha)}\,(1-\eta) $$

Conclusion

Contributions:
New information-theoretic limits for group testing and other support recovery problems
Exact thresholds (phase transitions), or near-exact

Implications for group testing:
Optimality of i.i.d. Bernoulli measurements in many cases
Limited gain from moving to partial recovery

Future work:
Closing the remaining gaps (better concentration inequalities?)
Non-i.i.d. measurements
Applications to other non-linear models
Information-spectrum type analysis of other statistical problems


Further Details

Further details (group testing): accepted to the 2016 SODA conference
Further details (general models): submitted to the IEEE Transactions on Information Theory

References

M. Wainwright, "Information-theoretic limits on sparsity recovery in the high-dimensional and noisy setting," IEEE Trans. Inf. Theory, vol. 55, no. 12, Dec. 2009.
G. Atia and V. Saligrama, "Boolean compressed sensing and noisy group testing," IEEE Trans. Inf. Theory, vol. 58, no. 3, March 2012.
C. Aksoylar, G. Atia, and V. Saligrama, "Sparse signal processing with linear and non-linear observations: A unified Shannon theoretic approach," April 2013.
T. S. Han, Information-Spectrum Methods in Information Theory. Springer, 2003.
