Fundamental Limits in Asynchronous Communication


1 Fundamental Limits in Asynchronous Communication
Aslan Tchamkerten (Telecom ParisTech), with V. Chandar (D.E. Shaw), D. Tse (Stanford), G. Wornell (MIT), L. Li (Telecom ParisTech), G. Caire (TU Berlin)

2 What is asynchronism? When timing of information transmission is unknown to the receiver.

3 Why important? Because virtually all communication systems are a priori asynchronous.

4 Channel burstiness
Data is generated steadily at the source, but the asynchronous channel delivers the symbols x_1, x_2, ..., x_8 to the receiver in bursts. Example: the internet.

5 Channel burstiness: models
Insertion-deletion-substitution channel (Dobrushin 1967): W(y|x), x in X, y in Y*. A channel coding theorem exists,
    C = lim_{n -> inf} (1/n) max_{X^n} I(X^n; Y^n),
but no single-letter expression is known, even for the pure deletion channel (Kanoria-Montanari 2013).

6 Timing channels (Anantharam-Verdú 1996)
Arrival times (a_1, a_2, ..., a_n) enter a queue and exit at departure times (d_1, d_2, ..., d_n); the inputs (x_{a_1}, ..., x_{a_n}) pass through W to produce (y_{d_1}, ..., y_{d_n}). This model assumes that positions are known; timing information is embedded in the interarrival times a_{i+1} - a_i and the interdeparture times d_{i+1} - d_i. Well understood for exponential service times, less so for other service distributions.

7 Source burstiness
This tutorial focuses mostly on one type of asynchronism, source burstiness; the setups above were only briefly reviewed. Source burstiness: data arrives in bursts at the transmitter and is sent over a synchronous channel to the receiver.

8 Small amounts of data, because
- sources produce small amounts of data
- resources for transmission are limited (energy harvesting systems, Ulukus et al. 2015)

9 This tutorial: source burstiness
Packets are sent once in a long while. Fundamental limits (bits per channel use, or bits per joule)? Efficient communication strategies?

10 Roadmap
- Preliminaries
- Detection
- Capacity & capacity per unit cost
- Receiver sampling constraint
- Finite length analysis

11 Preliminaries

12 Channel
W = {W(y|x), x in X, y in Y}: a DMC. X denotes an input random variable to W and Y an output random variable of W. Hence if X ~ P then (X, Y) ~ PW, i.e., P(.)W(.|.).

13 Notation
W_a = W(.|a),  Y_a ~ W_a,
D(Y_a || Y_b) = D(W(.|a) || W(.|b)) = sum_y W(y|a) log [W(y|a)/W(y|b)].

14 Typicality
Empirical distribution: P^_{x^n}(a) = (1/n) |{i : x_i = a}|.
Typical sequences: x^n in T(V) if ||P^_{x^n} - V|| <= mu_n, with slack mu_n = o(1), mu_n = omega(1/sqrt(n)).
If X^n is i.i.d. P then
    Pr(X^n in T(P)) = 1 - o(1)
    Pr(X^n in T(V)) ≐ 2^{-n D(V||P)}
where D(V||P) = sum_x V(x) log [V(x)/P(x)] and a_n ≐ b_n means (1/n) log(a_n/b_n) -> 0 as n -> inf.
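As a quick sanity check on the estimate Pr(X^n in T(V)) ≐ 2^{-n D(V||P)}, the snippet below (a hypothetical numerical example, not from the slides) computes the exact probability that a fair binary sequence of length n = 300 has empirical distribution exactly V = (0.7, 0.3), and compares the resulting exponent with D(V||P). The two agree up to the polynomial prefactor absorbed by the ≐ notation.

```python
import math

def kl_bits(v, p):
    """KL divergence D(V||P) in bits."""
    return sum(vi * math.log2(vi / pi) for vi, pi in zip(v, p) if vi > 0)

def log2_binom(n, k):
    """log2 of the binomial coefficient C(n, k), via lgamma."""
    return (math.lgamma(n + 1) - math.lgamma(k + 1)
            - math.lgamma(n - k + 1)) / math.log(2)

n = 300
P = [0.5, 0.5]        # true i.i.d. distribution of X^n
V = [0.7, 0.3]        # target empirical distribution (type)
k = int(V[0] * n)     # 210 occurrences of the first symbol

# Exact probability that X^n has empirical distribution exactly V:
log2_prob = log2_binom(n, k) + n * math.log2(0.5)
exponent = -log2_prob / n
D = kl_bits(V, P)
print(f"empirical exponent {exponent:.4f} vs D(V||P) = {D:.4f} bits/symbol")
```

At n = 300 the exact exponent exceeds D(V||P) by roughly (log n)/n, exactly the polynomial correction the ≐ notation hides.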

15 Typicality (cont.)
Hence, if (X^n, Y^n) is i.i.d. PW then
    Pr((X^n, Y^n) in T(J)) ≐ 2^{-n D(J||PW)}
and if (X^n, Y^n) is i.i.d. P x W_a then
    Pr((X^n, Y^n) in T(J)) ≐ 2^{-n D(J||P,W_a)}
where D(J||P,W_a) = sum_{x,y} J(x,y) log [J(x,y)/(P(x)W(y|a))].

16 Energy-limited synchronous communication
Message m of B bits, codeword x^n(m).
Input cost: k : X -> R_+.
Encoding: m -> x^n(m).
Cost for sending m: k(x^n(m)) := sum_{i=1}^n k(x_i(m)).
Channel: x -> W(.|x) -> Y.
Decoding: m^(Y^n).

17 Energy-limited synchronous communication
Rate per unit cost: R = B / max_m k(x^n(m)) bits/joule.
R is achievable if Pr(m^ != m) -> 0 as n -> inf.
C_sync := sup{achievable R}.

18 Theorem (Gallager, Verdú, late 80's)
    C_sync = max_X I(X;Y) / E k(X).
If there exists a zero-cost symbol 0:
    C_sync = max_x D(Y_x || Y_0) / k(x).
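The zero-cost-symbol formula can be checked numerically. The sketch below uses a hypothetical binary-input DMC (the channel values and costs are illustrative, not from the slides): the free symbol is 0, symbol 1 costs 1 joule, and the ratio I(X;Y)/E k(X) approaches max_x D(Y_x || Y_0)/k(x) as the input probability of the costly symbol vanishes.

```python
import math

def kl_bits(p, q):
    return sum(a * math.log2(a / b) for a, b in zip(p, q) if a > 0)

# Hypothetical DMC with a zero-cost symbol: input 0 is free, input 1 costs 1.
W = {0: [0.9, 0.1], 1: [0.2, 0.8]}   # W[x] = output distribution given x
cost = {0: 0.0, 1: 1.0}

# Zero-cost-symbol formula: C_sync = max_x D(Y_x || Y_0) / k(x)
C_formula = kl_bits(W[1], W[0]) / cost[1]

def ratio(p1):
    """I(X;Y)/E[k(X)] when Pr(X=1) = p1, using I = sum_x p(x) D(W_x||P_Y)."""
    py = [(1 - p1) * W[0][y] + p1 * W[1][y] for y in (0, 1)]
    I = (1 - p1) * kl_bits(W[0], py) + p1 * kl_bits(W[1], py)
    return I / (p1 * cost[1])

ratios = [ratio(p) for p in (1e-1, 1e-2, 1e-3, 1e-4)]
print(round(C_formula, 4), [round(r, 4) for r in ratios])
```

The ratios increase toward the formula from below, illustrating why very sparse use of the costly symbol is optimal in the per-unit-cost regime.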

19 Detection

20 Detection
A single message x^B is sent at a random time nu, uniform over {1, 2, ..., A}, across W. At the receiver, Y_t ~ W(.|x_{t-nu+1}) for t in {nu, nu+1, ..., nu+B-1}, and Y_t ~ W_* otherwise, where * is the idle (pure noise) input symbol.

21 Given Y^{A+B-1}, produce an estimate nu^(Y^{A+B-1}) of nu. What is the largest A for which Pr(nu^ != nu) -> 0 as B -> inf? Which x^B is optimal?

22 A natural scheme
Suppose a slotted model, i.e., the receiver knows nu mod B. Suppose x^B = a, a, ..., a. Declare nu^ such that Y_{nu^}^{nu^+B-1} is typical w.r.t. W_a.

23 Analysis
P(nu^ != nu) <= P(miss detection) + P(false alarm)
            = P(Y_nu^{nu+B-1} not in T(W_a)) + (A/B) P(Ybar^B in T(W_a)),
where Ybar^B is i.i.d. W_*.

24 Analysis
P(Y_nu^{nu+B-1} not in T(W_a)) = o(1)
Pr(Ybar^B in T(W_a)) ≐ 2^{-B D(Y_a || Y_*)}
=> (A/B) P(Ybar^B in T(W_a)) = o(1) if A = 2^{alpha B} with alpha < D(Y_a || Y_*).

25 Analysis
Hence Pr(nu^ != nu) = o(1) as B -> inf if A = 2^{alpha B} with alpha < D(Y_a || Y_*).
Largest alpha: alpha_o = max_a D(Y_a || Y_*), achieved by abar = argmax_a D(Y_a || Y_*).
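The slotted scheme of slides 22-25 is easy to simulate. The sketch below is a hypothetical instance (binary outputs, W_a giving ones with probability 0.1, pure noise W_* giving fair coin flips, and typicality replaced by a simple threshold on the empirical frequency); with these numbers D(Y_a || Y_*) is about 0.53 bits, comfortably above the effective asynchronism exponent, so the error probability is small.

```python
import random

random.seed(1)

B, K = 60, 500              # block length; number of slots (A = K*B)
p_sig, p_noise = 0.1, 0.5   # Pr(Y=1) within the pilot slot vs pure noise
mu = 0.1                    # typicality slack on the empirical frequency

def frac_ones(p, n):
    return sum(random.random() < p for _ in range(n)) / n

errors, trials = 0, 50
for _ in range(trials):
    true_slot = random.randrange(K)
    est = None
    for s in range(K):
        p_hat = frac_ones(p_sig if s == true_slot else p_noise, B)
        if abs(p_hat - p_sig) <= mu:   # slot looks W_a-typical: declare it
            est = s
            break
    errors += (est != true_slot)
error_rate = errors / trials
print("estimated error probability:", error_rate)
```

Both the miss and the false-alarm events are exponentially rare in B here, which is why even 500 noise slots per trial rarely fool the detector.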

26 All observations
Sliding-window sequential detection:
    tau = min{t >= 1 : Y_{t-B+1}^t in T(W_abar)}
achieves the same performance as the non-sequential rule.

27 Non-slotted model?
The slotted model guarantees a pure hypothesis: each slot is either all noise or all signal. In the non-slotted model, with the constant codeword abar^B, Pr(|nu^ - nu| >= 1) = Omega(1)!

28 Non-slotted model
Modify abar^B slightly to get low autocorrelation: keep x^B close to abar^B, while the Hamming distance between x^B and any of its shifts is Omega(B).
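One simple way to see that such sequences exist (the slides do not specify the modification, so the random construction below is only an illustrative assumption) is to flip a random quarter of the constant pilot's positions: with high probability every cyclic shift then disagrees with the original in a constant fraction of positions.

```python
import random

random.seed(0)

B = 200
# Start from the constant pilot a^B and flip a random 25% of positions to b:
x = ['a'] * B
for i in random.sample(range(B), B // 4):
    x[i] = 'b'

def cyclic_shift_dist(s):
    """Hamming distance between x^B and its cyclic shift by s."""
    return sum(x[i] != x[(i + s) % B] for i in range(B))

min_dist = min(cyclic_shift_dist(s) for s in range(1, B))
print(f"min Hamming distance over all shifts: {min_dist} = {min_dist/B:.2f} B")
```

The minimum over all B-1 shifts concentrates around 0.37 B for this construction, so the receiver can no longer confuse nu with nu plus or minus a few positions.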

29 alpha > alpha_o: converse (intuition)
Assume x^B = a^B and A = 2^{alpha B}. Reveal nu mod B. Typicality decoding is MAP (can be shown).
    Pr(Y_nu^{nu+B-1} in T(W_a)) = 1 - o(1),  Pr(Ybar^B in T(W_a)) ≐ 2^{-B D(Y_a||Y_*)}
=> E[#false alarms] ≐ 2^{B(alpha - D(Y_a||Y_*))}, which blows up when alpha > D(Y_a||Y_*).
A non-constant x^B brings no gain (can be shown).

30 Summary
Theorem (Chandar-T-Wornell 2008): Pr(nu^ != nu) = o(1) as B -> inf iff A = 2^{alpha B} with
    alpha < max_a D(Y_a || Y_*).
Optimal rule: tau = min{t >= 1 : Y_{t-B+1}^t in T(W_{x^B})}.

31 Information transmission

32 Bursty communication model
Message m of B bits is encoded as x^n(m) and sent at time nu in {1, ..., A}, with A = 2^{alpha B}. Here nu is when transmission starts, i.e., when the information becomes available.

33 Information transmission
Outside the transmission window [nu, nu+n-1] the receiver observes pure noise, i.i.d. W_*. The receiver decodes at a stopping time tau. Delay constraint (for a meaningful problem): given d, we want Pr(tau - nu <= d) = 1 - o(1).

34 What is C(A, d)? How does it compare with C = max_X I(X;Y)?

35 Small delay capacity
Theorem (Chandar-T-Tse 2013): with A = 2^{alpha B},
    C(alpha, d = 2^{o(B)}) = max_X min{ I(X;Y), D(X,Y || X,Y_*)/(1+alpha) }
    d_min(alpha) = B / max_{X in P} I(X;Y)
where P is the set of inputs that achieve the capacity above, and D(X,Y || X,Y_*) := D(P_{XY} || P_X x P_{Y_*}).
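The max-min expression is straightforward to evaluate for small alphabets. The sketch below grids over the input distribution of a hypothetical binary-signalling channel with an idle symbol (all channel values are illustrative assumptions, not from the slides), using I(X;Y) = sum_x p(x) D(W_x || P_Y) and the fact that D(X,Y || X,Y_*) = sum_x p(x) D(W_x || W_*) is linear in the input distribution.

```python
import math

def kl_bits(p, q):
    return sum(a * math.log2(a / b) for a, b in zip(p, q) if a > 0)

# Hypothetical binary-input channel plus an idle symbol * seen by the
# receiver outside the transmission window:
W = {0: [0.9, 0.1], 1: [0.1, 0.9]}   # W[x] = output distribution given x
W_star = [0.8, 0.2]                  # output distribution of the idle symbol

def objective(p1, alpha):
    p = [1 - p1, p1]
    py = [p[0] * W[0][y] + p[1] * W[1][y] for y in (0, 1)]
    I = sum(p[x] * kl_bits(W[x], py) for x in (0, 1))          # I(X;Y)
    Dterm = sum(p[x] * kl_bits(W[x], W_star) for x in (0, 1))  # D(X,Y||X,Y_*)
    return min(I, Dterm / (1 + alpha))

def C(alpha, grid=2000):
    return max(objective(i / grid, alpha) for i in range(grid + 1))

caps = [C(a) for a in (0.0, 0.5, 1.0, 2.0)]
print([round(c, 4) for c in caps])
```

For this channel C(0) = C(0.5), a numerical instance of the alpha_c > 0 phenomenon of slide 37: the capacity-achieving output distribution (uniform) differs from Y_*.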

36 Remark
    C(alpha, d = 2^{o(B)}) = max_X min{ I(X;Y), D(X,Y || X,Y_*)/(1+alpha) }
=> C(alpha, d = 2^{o(B)}) <= C.

37 Remark (cont.)
    alpha_c := sup{alpha : C(alpha) = C} > 0
iff the (unique) capacity-achieving output distribution Ytilde of the synchronous channel differs from Y_*, since
    D(X,Y || X,Y_*) = I(X;Y) + D(Ytilde || Y_*) = C + D(Ytilde || Y_*),
with D(Ytilde || Y_*) > 0.

38 Small delay capacity: alternative scaling
A = 2^{beta n}; equivalently A = 2^{alpha B} = 2^{(alpha R) n}, i.e., beta = alpha R.
Theorem (Chandar-T-Tse 2013):
    C(beta, d = 2^{o(n)}) = max_{X : D(Y||Y_*) >= beta} I(X;Y),   d_min(beta) = n.

39 Achievability
Random codebook {x^n(m)} i.i.d. P. Decode at
    tau = min{t >= 1 : exists m s.t. (x^n(m), Y_{t-n+1}^t) in T(PW)}.

40 Simplified analysis (slotted)
Assume m = 1 is sent; let Ybar^n be i.i.d. W_*.
    P((X^n(1), Y^n) in T(PW)) = 1 - o(1)
    P((X^n(2), Y^n) in T(PW)) ≐ 2^{-n D(X,Y||X,Y)}
    P((X^n(2), Ybar^n) in T(PW)) ≐ 2^{-n D(X,Y||X,Y_*)}

41 Simplified analysis (slotted)
Hence Pr(error) -> 0 if
    2^{nR} 2^{-n D(X,Y||X,Y)} = o(1), i.e., R < D(X,Y||X,Y) = I(X;Y)
    (A/n) 2^{nR} 2^{-n D(X,Y||X,Y_*)} ≐ 2^{-n(D(X,Y||X,Y_*) - R(1+alpha))} = o(1), i.e., R(1+alpha) < D(X,Y||X,Y_*)
=> C(alpha, d = 2^{o(B)}) >= max_X min{ I(X;Y), D(X,Y||X,Y_*)/(1+alpha) }.

42 Converse: intuition (small delay, delta = 0)
Effective output process: pure noise from nu + d onwards. It differs from the real output process only when nu + n > nu + d. Reveal the effective output process to the receiver.

43 Converse: intuition (delta = 0)
Consider extended codewords x^d(m), the symbols sent from nu to nu + d:
    x^d(m) = x^n(m) followed by *'s, if nu + n <= nu + d
    x^d(m) = the first d symbols of x^n(m), if nu + d < nu + n.

44 Converse: intuition (delta = 0)
Without essential loss of generality, assume {x^d(m)} is of constant composition P. From synchronous communication:
    R = (log M)/d <= I(X;Y) with X ~ P.

45 Converse (cont.)
The decoding delay constraint implies the receiver can locate the codeword. Reveal the effective output process to the receiver, reveal nu mod d, and ask the receiver to detect and isolate the sent codeword.

46 Typic-decoding is essentially optimal (can be shown).

47 Typic-decoding error probability
Assuming m = 1 is sent; Ybar^d i.i.d. W_*:
    P((X^d(1), Y^d) in T(PW)) = 1 - o(1)
    P((X^d(2), Y^d) in T(PW)) ≐ 2^{-d D(X,Y||X,Y)}
    P((X^d(2), Ybar^d) in T(PW)) ≐ 2^{-d D(X,Y||X,Y_*)}
Union bound over messages: R <= D(X,Y||X,Y) = I(X;Y).
Union bound over messages and slots: R(1+alpha) <= D(X,Y||X,Y_*).

48 Typic-decoding error probability
R <= I(X;Y) and R(1+alpha) <= D(X,Y||X,Y_*), i.e.,
    R <= min{ I(X;Y), D(X,Y||X,Y_*)/(1+alpha) }
=> C(alpha, delta = 0) <= max_X min{ I(X;Y), D(X,Y||X,Y_*)/(1+alpha) }.

49 Large delay capacity
Delay d = 2^{delta B} <= 2^{alpha B} = A.
Theorem (Chandar-T-Tse 2013): C(alpha, delta) = C(alpha, 0) for any delta > 0.
Proof idea: reduce asynchronism by delaying transmission.

50 Energy to transmit asynchronously

51 Capacity per unit cost
Theorem (Chandar-T-Tse 2013):
    C(alpha, delta = 0) = max_X min{ I(X;Y)/E k(X), D(X,Y||X,Y_*)/((1+alpha) E k(X)) }
    d_min(alpha) = B / max_{X in P} I(X;Y),
where P is the set of inputs that achieve C(alpha, delta = 0). Moreover C(alpha, delta) = C(alpha, 0) for any delta >= 0.
Proof: follow the capacity arguments.

52 A natural case: k(*) = 0
    C(alpha, delta = 0) = (1/(1+alpha)) max_x D(Y_x || Y_*)/k(x) = C/(1+alpha).
Example: Gaussian channel x -> x + Z, Z ~ N(0, N_0/2), k(x) = x^2, * = 0:
    C(alpha) = (log_2 e) / (N_0 (1+alpha))  bits/joule.
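For the Gaussian example, D(Y_x || Y_0)/k(x) is the same for every x, which is what makes the formula input-independent. The sketch below (a numerical check, using brute-force integration rather than the closed form D(N(x, s^2) || N(0, s^2)) = x^2/(2 s^2) nats) verifies that the ratio equals log_2(e)/N_0 for several values of x.

```python
import math

def kl_gauss_bits(x, var):
    """D(N(x,var) || N(0,var)) in bits, by a brute-force Riemann sum."""
    s = math.sqrt(var)
    total, dy = 0.0, 1e-3
    y = x - 12 * s
    while y < x + 12 * s:
        p = math.exp(-(y - x) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)
        q = math.exp(-y ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)
        total += p * math.log2(p / q) * dy
        y += dy
    return total

N0 = 2.0
target = math.log2(math.e) / N0   # claimed capacity per unit cost at alpha = 0
ratios = [kl_gauss_bits(x, N0 / 2) / x ** 2 for x in (0.5, 1.0, 2.0)]
print([round(r, 5) for r in ratios], "target:", round(target, 5))
```

All three ratios match log_2(e)/N_0 to integration accuracy; dividing by (1 + alpha) then gives the asynchronous value on the slide.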

53 Alternative proof for C(alpha, delta = 0) when k(*) = 0
Equivalent scenarios: asynchronous transmission of m within delay d, versus synchronous transmission of the pair (m, nu) via PPM (pulse-position modulation).

54 Alternative proof for C(alpha) when k(*) = 0
Since k(*) = 0: the cost to transmit B bits asynchronously equals the cost to transmit log(A/d) + B ≐ alpha B + B bits synchronously via PPM.

55 Alternative proof for C(alpha) when k(*) = 0
Hence, the minimum cost to transmit 1 bit asynchronously is at least the minimum cost to transmit 1+alpha bits synchronously, i.e.,
    1/C(alpha) >= (1+alpha)/C, i.e., (1+alpha) C(alpha) <= C.

56 Alternative proof for C(alpha) when k(*) = 0
Converse: optimality of PPM => (1+alpha) C(alpha) >= C. QED

57 Energy to transmit and receive asynchronously

58 Transmission cost: k(x). Reception cost: sampling rate rho.
Motivation: power consumption at receivers scales linearly with rho.

59 Transmitter: sends x^n(m) at time nu. Receiver: samples, stops at tau, decodes. Sampling rate rho = (number of samples taken until tau)/tau. For instance, the receiver operates at rate 30% if Pr(rho <= 30%) = 1 - o(1).

60 What is C_async(alpha, rho)?

61 Theorem (non-adaptive sampling, T-Chandar-Caire 2014): for any alpha > 0 and 0 < rho < 1,
    C_async(alpha, rho) = C_async(alpha, 1)
    d_min(alpha, rho) = (1/rho) d_min(alpha, 1).

62 Achievability (asynchronism reduction): take a coding scheme under full sampling, stretch the codewords by a factor 1/rho, and sample every (1/rho)-th output.
Converse: a non-adaptive sampling budget of rho A implies a delay of order 1/rho (change-of-measure argument).

63 Theorem (adaptive sampling, T-Chandar-Caire 2014): for any alpha >= 0 and 0 < rho <= 1,
    C_async(alpha, rho) = C_async(alpha, 1)
    d_min(alpha, rho) = d_min(alpha, 1)(1 + o(1)).
Minimum input energy, minimum delay, arbitrarily small sampling rate.

64 Achievability
Transmitter: random coding. Receiver: a decoding function, a stopping rule, and a sampling strategy.

65 Two-phase scheme
Sparse sampling: at multiples of 1/rho, compute the empirical distribution P^ of the last log(n) samples. If P_*(T(P^)) <= epsilon, i.e., the samples look unlike pure noise, switch to full sampling for a period n. At the end of the full-sampling period, typic-decode, or return to sparse mode if no codeword is found.
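The two-phase idea can be sketched in a few lines. The simulation below is a simplified, hypothetical instance (binary outputs, the transmission time fixed for the demo, and the type-probability test P_*(T(P^)) <= epsilon replaced by a plain threshold on the fraction of ones); it is meant only to show the sparse/full alternation, not the exact scheme of the paper.

```python
import random

random.seed(7)

A, n = 20000, 600            # asynchronism window and codeword length
p_noise, p_sig = 0.5, 0.1    # Pr(Y=1) under pure noise W_* vs during the burst
nu = 12000                   # transmission start (fixed here for the demo)
Y = [int(random.random() < (p_sig if nu <= t < nu + n else p_noise))
     for t in range(A)]

skip, w = 10, 40     # sparse phase: one sample every `skip` steps, test last w
sampled, window, found_at, t = 0, [], None, 0
while t < A - n:
    sampled += 1
    window = (window + [Y[t]])[-w:]
    if len(window) == w and sum(window) / w <= 0.25:   # unlike pure noise
        sampled += n                  # full sampling for a period of length n
        if sum(Y[t:t + n]) / n <= 0.4:
            found_at = t              # looks like the codeword: stop, decode
            break
        window = []                   # false alarm: back to sparse sampling
        t += n
    t += skip

rate = sampled / t
print("burst at", nu, "| woke at", found_at, "| sampling rate ~", round(rate, 3))
```

The receiver spends most of its time sampling one output in ten, pays for a full-sampling period only on (rare) false alarms, and still wakes up inside the burst.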

66 Theorem (adaptive sampling, T-Chandar-Caire 2014): for any alpha >= 0 and 0 < rho <= 1, C_async(alpha, rho) = C_async(alpha, 1) and d_min(alpha, rho) = d_min(alpha, 1)(1 + o(1)).
No loss for constant sampling rate; but what if rho -> 0?

67 Theorem (min, min, min; Chandar-T 2016): for any alpha >= 0 and 0 < rho <= 1:
    if rho = omega(1/B): C_async(alpha, rho) = C_async(alpha, 1) and d_min(alpha, rho) = d_min(alpha, 1)(1 + o(1))
    if rho = o(1/B): unreliable communication.

68 Multi-phase
Test H_* (pure noise) vs H_msg at increasing sampling rates rho_0 < rho_1 < ... < rho_l = 1: if a test returns H_*, skip samples and repeat; after l consecutive H_msg decisions, stop and decode.

69 At each s-instant {t = j Delta, j in N}, test N_0, N_1, ..., N_l samples. If a test returns H_*, skip samples until the next s-instant and repeat. If l consecutive tests return H_msg, decode.

70 The parameters can be chosen such that rho = omega(1/B).

71 (Recall) Theorem (min, min, min; Chandar-T 2016): full asymptotic performance iff rho = omega(1/B); if rho = o(1/B), communication is unreliable. Artefacts of asymptotic analysis?

72 Finite length analysis (k(x) = 1)

73 Synchronous communication
x^n(m) -> W -> y^n, with Pr(m^ != m) <= epsilon.
Theorem (Strassen 1962, Polyanskiy et al. 2010):
    B*(n, epsilon) = nC - sqrt(n V_eps(W)) Q^{-1}(epsilon) + O(log n),
where B*(n, epsilon) is the largest message size (in bits) and the dispersion V_eps(W) governs the gap to capacity.
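The normal approximation is easy to evaluate for a concrete channel. The sketch below (a standard numerical example, not from the slides) uses the BSC with crossover probability delta, for which C = 1 - h_2(delta) and the dispersion is V = delta(1 - delta) log_2^2((1 - delta)/delta), and drops the O(log n) term.

```python
import math
from statistics import NormalDist

def h2(p):
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Normal approximation for a BSC(delta):
#   C = 1 - h2(delta),  V = delta*(1-delta) * log2((1-delta)/delta)^2
delta, n, eps = 0.11, 1000, 1e-3
C = 1 - h2(delta)
V = delta * (1 - delta) * math.log2((1 - delta) / delta) ** 2
Q_inv = NormalDist().inv_cdf(1 - eps)        # Q^{-1}(eps)
B_star = n * C - math.sqrt(n * V) * Q_inv    # up to the O(log n) term
print(f"B*({n}, {eps}) ~ {B_star:.1f} bits (nC = {n * C:.1f})")
```

At n = 1000 the second-order term already costs close to a fifth of nC, which is the whole point of finite-length analysis.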

74 What about asynchronism A = 2^{beta n}, sampling rate rho, and delay d? What is the dispersion?

75 Recall: C(beta) = max_{X : D(Y||Y_*) >= beta} I(X;Y), with d_min(beta) = n.
Definition: V_eps(W, beta), the dispersion of the synchronous channel W under the input constraint D(Y||Y_*) >= beta; it reduces to the unconstrained dispersion V_eps(W) for beta <= beta_c.

76 Minimum delay: d_n = d_min = n
Theorem (full sampling, Polyanskiy 2013, Li-T. 2017):
    B*(n, eps, beta, rho_n = 1) = n C(beta) - sqrt(n V_eps(W, beta)) Q^{-1}(eps) + O(log n)
Theorem (sparse sampling, Li-T. 2017):
    B*(n, eps, beta, rho_n = omega(1/sqrt(n))) = n C(beta) - sqrt(n V_eps(W, beta)) Q^{-1}(eps) + o(sqrt(n))
    B*(n, eps, beta, rho_n) = n C(beta) - Theta(1/rho_n) + O(sqrt(n)), for rho_n = omega(1/n) & O(1/sqrt(n)).

77 Close to minimum delay: d_n = n(1 + o(1))
Theorem (sparse sampling, Li-T. 2017):
    B*(n, eps, beta, rho_n = omega(1/n)) = n C(beta) - sqrt(n V_eps(W, beta)) Q^{-1}(eps) + O(log n).
That is, with delay n(1 + o(1)), even near-minimum sampling rates rho_n = omega(1/n) achieve the full-sampling dispersion.

78 Intuition: d_n = d_min = n
B*(n, eps, beta, rho_n) = n C(beta) - Theta(1/rho_n) + O(sqrt(n)): at sampling rate rho the decoder misses Theta(1/rho) symbols of the sent codeword. So if rho = o(1/sqrt(n)) the decoder misses omega(sqrt(n)) symbols, and codes whose effective length is n - omega(sqrt(n)) incur an omega(sqrt(n))-order penalty in the synchronous dispersion expansion.

79 Summary
Capacity and capacity per unit cost; tradeoffs between rate, delay, and sampling rate; finite length analysis.

80 Open issues
- Bit asynchronism: capacity per unit cost?
- Random/multiple access with bursty interference (Chandar-T.-Caire 2014, Shahi et al. 2016, Farkas-Kói 2016)
- Asynchronous networks (Shomorony et al. 2014, Gallager 1976)

81 Bibliography
1. Dobrushin, R. L. "Shannon's theorems for channels with synchronization errors." Problemy Peredachi Informatsii 3.4 (1967).
2. Kanoria, Yashodhan, and Andrea Montanari. "Optimal coding for the binary deletion channel with small deletion probability." IEEE Transactions on Information Theory (2013).
3. Anantharam, Venkat, and Sergio Verdú. "Bits through queues." IEEE Transactions on Information Theory 42.1 (1996).
4. Ulukus, Sennur, et al. "Energy harvesting wireless communications: A review of recent advances." IEEE Journal on Selected Areas in Communications 33.3 (2015).
5. Verdú, Sergio. "On channel capacity per unit cost." IEEE Transactions on Information Theory 36.5 (1990).
6. Gallager, R. G. "Energy limited channels: Coding, multiaccess, and spread spectrum." M.I.T. Laboratory for Information and Decision Systems, Report LIDS-P-1714.
7. Chandar, Venkat, Aslan Tchamkerten, and Gregory Wornell. "Optimal sequential frame synchronization." IEEE Transactions on Information Theory 54.8 (2008).
8. Chandar, Venkat, Aslan Tchamkerten, and David Tse. "Asynchronous capacity per unit cost." IEEE Transactions on Information Theory 59.3 (2013).
9. Tchamkerten, Aslan, Venkat Chandar, and Giuseppe Caire. "Energy and sampling constrained asynchronous communication." IEEE Transactions on Information Theory (2014).
10. Chandar, Venkat, and Aslan Tchamkerten. "How to quickly detect a change while sleeping almost all the time." arXiv preprint (2015).
11. Strassen, Volker. "Asymptotische Abschätzungen in Shannons Informationstheorie." Trans. Third Prague Conf. Inf. Theory (1962).
12. Polyanskiy, Yury, H. Vincent Poor, and Sergio Verdú. "Channel coding rate in the finite blocklength regime." IEEE Transactions on Information Theory 56.5 (2010).
13. Polyanskiy, Yury. "Asynchronous communication: Exact synchronization, universality, and dispersion." IEEE Transactions on Information Theory 59.3 (2013).

82 Bibliography (cont.)
14. Li, Longguang, and Aslan Tchamkerten. "Infinite dispersion in bursty communication." ISIT 2017.
15. Shahi, Sara, Daniela Tuninetti, and Natasha Devroye. "On the capacity of strong asynchronous multiple access channels with a large number of users." ISIT 2016.
16. Farkas, Lóránt, and Tamás Kói. "Universal random access error exponents for codebooks of different word-lengths." arXiv preprint (2016).
17. Shomorony, Ilan, et al. "Diamond networks with bursty traffic: Bounds on the minimum energy-per-bit." IEEE Transactions on Information Theory 60.1 (2014).
18. Gallager, Robert. "Basic limits on protocol information in data communication networks." IEEE Transactions on Information Theory 22.4 (1976).

85 Typical scenarios

86 Message m (B bits), codeword c^n(m), A = 2^{alpha B}, delay d = 2^{delta B}: what is C_async(alpha, delta, rho)?

87 Message m (B bits), codeword c^n(m), A = 2^{alpha B}: what is C_async(alpha, rho)?

89 Synchronous communication
Voice, video: packets sent contiguously; negligible cost to acquire synchronization.

93 How does asynchronism impact capacity per unit cost?

94 Bursty communication model
Message m (B bits) is sent as x^n(m); during the idle periods before and after, the output Y is i.i.d. W(.|*) noise. tau: stopping time with respect to Y_1, Y_2, ...

95 Bursty communication model
A: level of asynchronism. nu: transmission start time. One message is sent over the window [1, A].

96 Given A, find C_async(A) = sup{achievable R}.

97 A natural scheme
Random code {x^n(m)} i.i.d. P. Sequential typicality decoding:
    tau = inf{t >= 1 : (x^n(m), Y_{t-n+1}^t) is PW-typical for some m}.

98 Analysis: of the random code {x^n(m)} i.i.d. P with the sequential typicality rule above.

99 Message m (B bits), codeword c^n(m): what is C_async(A, rho)?

100 Bursty communication model
Message m (B bits), codeword c^n(m), transmitted at some time in [1, A].

101 Bursty communication model
During the idle periods before and after c^n(m), the output Y is i.i.d. W(.|*).

102 Input cost: k(x). Output cost: sampling rate rho = (number of samples taken until tau)/tau.

103 What is C_async(A, rho)?

104 rho = 1 (full sampling)
Theorem (Chandar, T., Tse 2010): with A = 2^{alpha B},
    C_async(A, rho = 1) = max_X min{ I(X;Y)/E k(X), (I(X;Y) + D(Y||Y_*))/((1+alpha) E k(X)) }.
Minimum delay d_min(alpha, rho = 1) (= Theta(B)).

105 0 < rho <= 1
Theorem (Chandar, T., Caire 2013): for any alpha > 0,
    C_async(alpha, rho) = C_async(alpha, 1),  d_min(alpha, rho) = d_min(alpha, rho = 1)(1 + o(1)).

106 No loss for constant sampling rate. What if rho -> 0?

107 Main result
Theorem: for any alpha > 0:
    if rho = omega(1/B): C_async(A, rho) = C_async(A, 1) and d_min(alpha, rho) = d_min(alpha, rho = 1)(1 + o(1))
    if rho = o(1/B): unreliable communication.

108 Achievability
Transmitter: random coding. Receiver: a decoding function, a stopping rule, and a sampling strategy. Message detection with rho = omega(1/B) samples?

109 Multi-zoom
Test H_* (pure noise) vs H_msg at increasing sampling rates rho_0 < rho_1 < ... < rho_l = 1: if a test returns H_*, skip samples and repeat; after l consecutive H_msg decisions, stop and decode.

110 At each s-instant {t = j Delta, j in N}, test N_0, N_1, ..., N_l samples. If a test returns H_*, skip samples until the next s-instant and repeat. If l consecutive tests return H_msg, decode.

111 The parameters can be chosen such that rho = omega(1/B).

112 Summary
Fundamental tradeoffs for bursty communication. To combat asynchronism we only need strong signals; finer output sampling does not help. The decoder can sleep almost all the time and yet be (asymptotically) maximally efficient.

113 Design the tests such that the first phase dominates:
    #{noise samples per s-instant} is about N_0, so rho is about N_0/Delta
    reliable testing requires N_0 = omega(1), hence rho Delta = omega(1)
    but Delta = o(B), otherwise the change is missed
    => rho = omega(1/B).


More information

Asymptotic Expansion and Error Exponent of Two-Phase Variable-Length Coding with Feedback for Discrete Memoryless Channels

Asymptotic Expansion and Error Exponent of Two-Phase Variable-Length Coding with Feedback for Discrete Memoryless Channels 1 Asymptotic Expansion and Error Exponent of Two-Phase Variable-Length oding with Feedback for Discrete Memoryless hannels Tsung-Yi hen, Adam R. Williamson and Richard D. Wesel tsungyi.chen@northwestern.edu,

More information

On the variable-delay reliability function of discrete memoryless channels with access to noisy feedback

On the variable-delay reliability function of discrete memoryless channels with access to noisy feedback ITW2004, San Antonio, Texas, October 24 29, 2004 On the variable-delay reliability function of discrete memoryless channels with access to noisy feedback Anant Sahai and Tunç Şimşek Electrical Engineering

More information

A New Metaconverse and Outer Region for Finite-Blocklength MACs

A New Metaconverse and Outer Region for Finite-Blocklength MACs A New Metaconverse Outer Region for Finite-Blocklength MACs Pierre Moulin Dept of Electrical Computer Engineering University of Illinois at Urbana-Champaign Urbana, IL 680 Abstract An outer rate region

More information

Energy State Amplification in an Energy Harvesting Communication System

Energy State Amplification in an Energy Harvesting Communication System Energy State Amplification in an Energy Harvesting Communication System Omur Ozel Sennur Ulukus Department of Electrical and Computer Engineering University of Maryland College Park, MD 20742 omur@umd.edu

More information

Cooperative HARQ with Poisson Interference and Opportunistic Routing

Cooperative HARQ with Poisson Interference and Opportunistic Routing Cooperative HARQ with Poisson Interference and Opportunistic Routing Amogh Rajanna & Mostafa Kaveh Department of Electrical and Computer Engineering University of Minnesota, Minneapolis, MN USA. Outline

More information

Approximately achieving the feedback interference channel capacity with point-to-point codes

Approximately achieving the feedback interference channel capacity with point-to-point codes Approximately achieving the feedback interference channel capacity with point-to-point codes Joyson Sebastian*, Can Karakus*, Suhas Diggavi* Abstract Superposition codes with rate-splitting have been used

More information

Strong Converse Theorems for Classes of Multimessage Multicast Networks: A Rényi Divergence Approach

Strong Converse Theorems for Classes of Multimessage Multicast Networks: A Rényi Divergence Approach Strong Converse Theorems for Classes of Multimessage Multicast Networks: A Rényi Divergence Approach Silas Fong (Joint work with Vincent Tan) Department of Electrical & Computer Engineering National University

More information

Lecture 3: Channel Capacity

Lecture 3: Channel Capacity Lecture 3: Channel Capacity 1 Definitions Channel capacity is a measure of maximum information per channel usage one can get through a channel. This one of the fundamental concepts in information theory.

More information

Job Scheduling and Multiple Access. Emre Telatar, EPFL Sibi Raj (EPFL), David Tse (UC Berkeley)

Job Scheduling and Multiple Access. Emre Telatar, EPFL Sibi Raj (EPFL), David Tse (UC Berkeley) Job Scheduling and Multiple Access Emre Telatar, EPFL Sibi Raj (EPFL), David Tse (UC Berkeley) 1 Multiple Access Setting Characteristics of Multiple Access: Bursty Arrivals Uncoordinated Transmitters Interference

More information

A Comparison of Two Achievable Rate Regions for the Interference Channel

A Comparison of Two Achievable Rate Regions for the Interference Channel A Comparison of Two Achievable Rate Regions for the Interference Channel Hon-Fah Chong, Mehul Motani, and Hari Krishna Garg Electrical & Computer Engineering National University of Singapore Email: {g030596,motani,eleghk}@nus.edu.sg

More information

LECTURE 10. Last time: Lecture outline

LECTURE 10. Last time: Lecture outline LECTURE 10 Joint AEP Coding Theorem Last time: Error Exponents Lecture outline Strong Coding Theorem Reading: Gallager, Chapter 5. Review Joint AEP A ( ɛ n) (X) A ( ɛ n) (Y ) vs. A ( ɛ n) (X, Y ) 2 nh(x)

More information

Generalized Writing on Dirty Paper

Generalized Writing on Dirty Paper Generalized Writing on Dirty Paper Aaron S. Cohen acohen@mit.edu MIT, 36-689 77 Massachusetts Ave. Cambridge, MA 02139-4307 Amos Lapidoth lapidoth@isi.ee.ethz.ch ETF E107 ETH-Zentrum CH-8092 Zürich, Switzerland

More information

The Method of Types and Its Application to Information Hiding

The Method of Types and Its Application to Information Hiding The Method of Types and Its Application to Information Hiding Pierre Moulin University of Illinois at Urbana-Champaign www.ifp.uiuc.edu/ moulin/talks/eusipco05-slides.pdf EUSIPCO Antalya, September 7,

More information

Sum Capacity of General Deterministic Interference Channel with Channel Output Feedback

Sum Capacity of General Deterministic Interference Channel with Channel Output Feedback Sum Capacity of General Deterministic Interference Channel with Channel Output Feedback Achaleshwar Sahai Department of ECE, Rice University, Houston, TX 775. as27@rice.edu Vaneet Aggarwal Department of

More information

Lecture 11: Quantum Information III - Source Coding

Lecture 11: Quantum Information III - Source Coding CSCI5370 Quantum Computing November 25, 203 Lecture : Quantum Information III - Source Coding Lecturer: Shengyu Zhang Scribe: Hing Yin Tsang. Holevo s bound Suppose Alice has an information source X that

More information

The Capacity of the Semi-Deterministic Cognitive Interference Channel and its Application to Constant Gap Results for the Gaussian Channel

The Capacity of the Semi-Deterministic Cognitive Interference Channel and its Application to Constant Gap Results for the Gaussian Channel The Capacity of the Semi-Deterministic Cognitive Interference Channel and its Application to Constant Gap Results for the Gaussian Channel Stefano Rini, Daniela Tuninetti, and Natasha Devroye Department

More information

Channel Dispersion and Moderate Deviations Limits for Memoryless Channels

Channel Dispersion and Moderate Deviations Limits for Memoryless Channels Channel Dispersion and Moderate Deviations Limits for Memoryless Channels Yury Polyanskiy and Sergio Verdú Abstract Recently, Altug and Wagner ] posed a question regarding the optimal behavior of the probability

More information

IET Commun., 2009, Vol. 3, Iss. 4, pp doi: /iet-com & The Institution of Engineering and Technology 2009

IET Commun., 2009, Vol. 3, Iss. 4, pp doi: /iet-com & The Institution of Engineering and Technology 2009 Published in IET Communications Received on 10th March 2008 Revised on 10th July 2008 ISSN 1751-8628 Comprehensive partial decoding approach for two-level relay networks L. Ghabeli M.R. Aref Information

More information

Lecture 5 Channel Coding over Continuous Channels

Lecture 5 Channel Coding over Continuous Channels Lecture 5 Channel Coding over Continuous Channels I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw November 14, 2014 1 / 34 I-Hsiang Wang NIT Lecture 5 From

More information

Guess & Check Codes for Deletions, Insertions, and Synchronization

Guess & Check Codes for Deletions, Insertions, and Synchronization Guess & Check Codes for Deletions, Insertions, and Synchronization Serge Kas Hanna, Salim El Rouayheb ECE Department, Rutgers University sergekhanna@rutgersedu, salimelrouayheb@rutgersedu arxiv:759569v3

More information

On Common Information and the Encoding of Sources that are Not Successively Refinable

On Common Information and the Encoding of Sources that are Not Successively Refinable On Common Information and the Encoding of Sources that are Not Successively Refinable Kumar Viswanatha, Emrah Akyol, Tejaswi Nanjundaswamy and Kenneth Rose ECE Department, University of California - Santa

More information

Reliable Computation over Multiple-Access Channels

Reliable Computation over Multiple-Access Channels Reliable Computation over Multiple-Access Channels Bobak Nazer and Michael Gastpar Dept. of Electrical Engineering and Computer Sciences University of California, Berkeley Berkeley, CA, 94720-1770 {bobak,

More information

Performance-based Security for Encoding of Information Signals. FA ( ) Paul Cuff (Princeton University)

Performance-based Security for Encoding of Information Signals. FA ( ) Paul Cuff (Princeton University) Performance-based Security for Encoding of Information Signals FA9550-15-1-0180 (2015-2018) Paul Cuff (Princeton University) Contributors Two students finished PhD Tiance Wang (Goldman Sachs) Eva Song

More information

Coding into a source: an inverse rate-distortion theorem

Coding into a source: an inverse rate-distortion theorem Coding into a source: an inverse rate-distortion theorem Anant Sahai joint work with: Mukul Agarwal Sanjoy K. Mitter Wireless Foundations Department of Electrical Engineering and Computer Sciences University

More information

On the Secrecy Capacity of Fading Channels

On the Secrecy Capacity of Fading Channels On the Secrecy Capacity of Fading Channels arxiv:cs/63v [cs.it] 7 Oct 26 Praveen Kumar Gopala, Lifeng Lai and Hesham El Gamal Department of Electrical and Computer Engineering The Ohio State University

More information

An Achievable Rate for the Multiple Level Relay Channel

An Achievable Rate for the Multiple Level Relay Channel An Achievable Rate for the Multiple Level Relay Channel Liang-Liang Xie and P. R. Kumar Department of Electrical and Computer Engineering, and Coordinated Science Laboratory University of Illinois, Urbana-Champaign

More information

Network coding for multicast relation to compression and generalization of Slepian-Wolf

Network coding for multicast relation to compression and generalization of Slepian-Wolf Network coding for multicast relation to compression and generalization of Slepian-Wolf 1 Overview Review of Slepian-Wolf Distributed network compression Error exponents Source-channel separation issues

More information

Capacity Upper Bounds for the Deletion Channel

Capacity Upper Bounds for the Deletion Channel Capacity Upper Bounds for the Deletion Channel Suhas Diggavi, Michael Mitzenmacher, and Henry D. Pfister School of Computer and Communication Sciences, EPFL, Lausanne, Switzerland Email: suhas.diggavi@epfl.ch

More information

Interference Channel aided by an Infrastructure Relay

Interference Channel aided by an Infrastructure Relay Interference Channel aided by an Infrastructure Relay Onur Sahin, Osvaldo Simeone, and Elza Erkip *Department of Electrical and Computer Engineering, Polytechnic Institute of New York University, Department

More information

II. THE TWO-WAY TWO-RELAY CHANNEL

II. THE TWO-WAY TWO-RELAY CHANNEL An Achievable Rate Region for the Two-Way Two-Relay Channel Jonathan Ponniah Liang-Liang Xie Department of Electrical Computer Engineering, University of Waterloo, Canada Abstract We propose an achievable

More information

Introduction to Convolutional Codes, Part 1

Introduction to Convolutional Codes, Part 1 Introduction to Convolutional Codes, Part 1 Frans M.J. Willems, Eindhoven University of Technology September 29, 2009 Elias, Father of Coding Theory Textbook Encoder Encoder Properties Systematic Codes

More information

for some error exponent E( R) as a function R,

for some error exponent E( R) as a function R, . Capacity-achieving codes via Forney concatenation Shannon s Noisy Channel Theorem assures us the existence of capacity-achieving codes. However, exhaustive search for the code has double-exponential

More information

The Binary Energy Harvesting Channel. with a Unit-Sized Battery

The Binary Energy Harvesting Channel. with a Unit-Sized Battery The Binary Energy Harvesting Channel 1 with a Unit-Sized Battery Kaya Tutuncuoglu 1, Omur Ozel 2, Aylin Yener 1, and Sennur Ulukus 2 1 Department of Electrical Engineering, The Pennsylvania State University

More information

16.36 Communication Systems Engineering

16.36 Communication Systems Engineering MIT OpenCourseWare http://ocw.mit.edu 16.36 Communication Systems Engineering Spring 2009 For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms. 16.36: Communication

More information

Performance Analysis of Physical Layer Network Coding

Performance Analysis of Physical Layer Network Coding Performance Analysis of Physical Layer Network Coding by Jinho Kim A dissertation submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy Electrical Engineering: Systems)

More information

On Gaussian Interference Channels with Mixed Gaussian and Discrete Inputs

On Gaussian Interference Channels with Mixed Gaussian and Discrete Inputs On Gaussian nterference Channels with Mixed Gaussian and Discrete nputs Alex Dytso Natasha Devroye and Daniela Tuninetti University of llinois at Chicago Chicago L 60607 USA Email: odytso devroye danielat

More information

Information Capacity of an Energy Harvesting Sensor Node

Information Capacity of an Energy Harvesting Sensor Node Information Capacity of an Energy Harvesting Sensor Node R Rajesh, Vinod Sharma and Pramod Viswanath arxiv:1212.3177v1 [cs.it] 13 Dec 2012 Abstract Energy harvesting sensor nodes are gaining popularity

More information

Lecture 14 February 28

Lecture 14 February 28 EE/Stats 376A: Information Theory Winter 07 Lecture 4 February 8 Lecturer: David Tse Scribe: Sagnik M, Vivek B 4 Outline Gaussian channel and capacity Information measures for continuous random variables

More information

Lecture 8: Shannon s Noise Models

Lecture 8: Shannon s Noise Models Error Correcting Codes: Combinatorics, Algorithms and Applications (Fall 2007) Lecture 8: Shannon s Noise Models September 14, 2007 Lecturer: Atri Rudra Scribe: Sandipan Kundu& Atri Rudra Till now we have

More information

On the Error Exponents of ARQ Channels with Deadlines

On the Error Exponents of ARQ Channels with Deadlines On the Error Exponents of ARQ Channels with Deadlines Praveen Kumar Gopala, Young-Han Nam and Hesham El Gamal arxiv:cs/06006v [cs.it] 8 Oct 2006 March 22, 208 Abstract We consider communication over Automatic

More information

Exercise 1. = P(y a 1)P(a 1 )

Exercise 1. = P(y a 1)P(a 1 ) Chapter 7 Channel Capacity Exercise 1 A source produces independent, equally probable symbols from an alphabet {a 1, a 2 } at a rate of one symbol every 3 seconds. These symbols are transmitted over a

More information

Lecture 4 Channel Coding

Lecture 4 Channel Coding Capacity and the Weak Converse Lecture 4 Coding I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw October 15, 2014 1 / 16 I-Hsiang Wang NIT Lecture 4 Capacity

More information

5958 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 56, NO. 12, DECEMBER 2010

5958 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 56, NO. 12, DECEMBER 2010 5958 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 56, NO. 12, DECEMBER 2010 Capacity Theorems for Discrete, Finite-State Broadcast Channels With Feedback and Unidirectional Receiver Cooperation Ron Dabora

More information

Interference Channels with Source Cooperation

Interference Channels with Source Cooperation Interference Channels with Source Cooperation arxiv:95.319v1 [cs.it] 19 May 29 Vinod Prabhakaran and Pramod Viswanath Coordinated Science Laboratory University of Illinois, Urbana-Champaign Urbana, IL

More information

On the Capacity of the Interference Channel with a Relay

On the Capacity of the Interference Channel with a Relay On the Capacity of the Interference Channel with a Relay Ivana Marić, Ron Dabora and Andrea Goldsmith Stanford University, Stanford, CA {ivanam,ron,andrea}@wsl.stanford.edu Abstract Capacity gains due

More information

Lecture 4: Proof of Shannon s theorem and an explicit code

Lecture 4: Proof of Shannon s theorem and an explicit code CSE 533: Error-Correcting Codes (Autumn 006 Lecture 4: Proof of Shannon s theorem and an explicit code October 11, 006 Lecturer: Venkatesan Guruswami Scribe: Atri Rudra 1 Overview Last lecture we stated

More information

Multiaccess Channels with State Known to One Encoder: A Case of Degraded Message Sets

Multiaccess Channels with State Known to One Encoder: A Case of Degraded Message Sets Multiaccess Channels with State Known to One Encoder: A Case of Degraded Message Sets Shivaprasad Kotagiri and J. Nicholas Laneman Department of Electrical Engineering University of Notre Dame Notre Dame,

More information

A Half-Duplex Cooperative Scheme with Partial Decode-Forward Relaying

A Half-Duplex Cooperative Scheme with Partial Decode-Forward Relaying A Half-Duplex Cooperative Scheme with Partial Decode-Forward Relaying Ahmad Abu Al Haija, and Mai Vu, Department of Electrical and Computer Engineering McGill University Montreal, QC H3A A7 Emails: ahmadabualhaija@mailmcgillca,

More information

Error Exponent Region for Gaussian Broadcast Channels

Error Exponent Region for Gaussian Broadcast Channels Error Exponent Region for Gaussian Broadcast Channels Lihua Weng, S. Sandeep Pradhan, and Achilleas Anastasopoulos Electrical Engineering and Computer Science Dept. University of Michigan, Ann Arbor, MI

More information

Primary Rate-Splitting Achieves Capacity for the Gaussian Cognitive Interference Channel

Primary Rate-Splitting Achieves Capacity for the Gaussian Cognitive Interference Channel Primary Rate-Splitting Achieves Capacity for the Gaussian Cognitive Interference Channel Stefano Rini, Ernest Kurniawan and Andrea Goldsmith Technische Universität München, Munich, Germany, Stanford University,

More information

Lecture 6 Channel Coding over Continuous Channels

Lecture 6 Channel Coding over Continuous Channels Lecture 6 Channel Coding over Continuous Channels I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw November 9, 015 1 / 59 I-Hsiang Wang IT Lecture 6 We have

More information

Achieving Shannon Capacity Region as Secrecy Rate Region in a Multiple Access Wiretap Channel

Achieving Shannon Capacity Region as Secrecy Rate Region in a Multiple Access Wiretap Channel Achieving Shannon Capacity Region as Secrecy Rate Region in a Multiple Access Wiretap Channel Shahid Mehraj Shah and Vinod Sharma Department of Electrical Communication Engineering, Indian Institute of

More information

The Hallucination Bound for the BSC

The Hallucination Bound for the BSC The Hallucination Bound for the BSC Anant Sahai and Stark Draper Wireless Foundations Department of Electrical Engineering and Computer Sciences University of California at Berkeley ECE Department University

More information

IN MOST problems pertaining to hypothesis testing, one

IN MOST problems pertaining to hypothesis testing, one 1602 IEEE SIGNAL PROCESSING LETTERS, VOL. 23, NO. 11, NOVEMBER 2016 Joint Detection and Decoding in the Presence of Prior Information With Uncertainty Suat Bayram, Member, IEEE, Berkan Dulek, Member, IEEE,

More information

Stability Analysis in a Cognitive Radio System with Cooperative Beamforming

Stability Analysis in a Cognitive Radio System with Cooperative Beamforming Stability Analysis in a Cognitive Radio System with Cooperative Beamforming Mohammed Karmoose 1 Ahmed Sultan 1 Moustafa Youseff 2 1 Electrical Engineering Dept, Alexandria University 2 E-JUST Agenda 1

More information

Information Theory Meets Game Theory on The Interference Channel

Information Theory Meets Game Theory on The Interference Channel Information Theory Meets Game Theory on The Interference Channel Randall A. Berry Dept. of EECS Northwestern University e-mail: rberry@eecs.northwestern.edu David N. C. Tse Wireless Foundations University

More information

Memory in Classical Information Theory: A Brief History

Memory in Classical Information Theory: A Brief History Beyond iid in information theory 8- January, 203 Memory in Classical Information Theory: A Brief History sergio verdu princeton university entropy rate Shannon, 948 entropy rate: memoryless process H(X)

More information