New communication strategies for broadcast and interference networks

1 New communication strategies for broadcast and interference networks S. Sandeep Pradhan (Joint work with Arun Padakandla and Aria Sahebi) University of Michigan, Ann Arbor

2 Distributed Information Coding Proliferation of wireless data and sensor network applications Supported by distributed information processing Information-theoretic perspective

3 1: Distributed Field Gathering

4 2: Broadcast and Interference Networks [Figure: Mobile Transmitter 1 → Mobile Receiver 1, Mobile Transmitter 2 → Mobile Receiver 2]

5 Information and Coding theory: Tradition Information Theory: Develop efficient communication strategies No constraints on memory/computation for encoding/decoding Obtain performance limits that are independent of technology

6 Information and Coding theory: Tradition Information Theory: Develop efficient communication strategies No constraints on memory/computation for encoding/decoding Obtain performance limits that are independent of technology Coding Theory: Approach these limits using algebraic codes (Ex: linear codes) Fast encoding and decoding algorithms Objective: practical implementability of optimal communication systems

7 Information theory: Orders of magnitude Subatomic scale: Physicists Atomic scale: Chemists

8 Information theory: Orders of magnitude Subatomic scale: Physicists Atomic scale: Chemists Human scale: Biologists Astronomical scale: Astronomers

9 Information theory: Orders of magnitude Subatomic scale: Physicists Atomic scale: Chemists Human scale: Biologists Astronomical scale: Astronomers Information-theory scale: 10^n, n sufficiently large.

10 Probability versus Algebra Information Theory Tools: based on probability Finding the optimal communication system directly is difficult

11 Probability versus Algebra Information Theory Tools: based on probability Finding the optimal communication system directly is difficult Random Coding: Build a collection of communication systems (ensemble) Put a probability distribution on them Show good average performance Craft ensembles using probability

12 Probability versus Algebra Information Theory Tools: based on probability Finding the optimal communication system directly is difficult Random Coding: Build a collection of communication systems (ensemble) Put a probability distribution on them Show good average performance Craft ensembles using probability Coding Theory Tools: Abstract algebra (groups, fields) Exploit algebraic structure to develop algorithms of polynomial complexity for encoding/decoding Study a very small ensemble at a time.

13 Random Coding in networks Prob. distribution on a collection of codebooks (ensemble) Extensions of Shannon ensembles

14 Random Coding in networks Prob. distribution on a collection of codebooks (ensemble) Extensions of Shannon ensembles Lots of bad codebooks in the ensemble Average performance significantly affected by these bad codes Do not achieve optimality in general Many problems have remained open for decades.

15 Coding theory to the rescue? It turns out that algebraic structure can be used to weed out bad codes

16 Coding theory to the rescue? It turns out that algebraic structure can be used to weed out bad codes Gain barely noticeable in point-to-point communication Improvement in second order performance (error exponents)

17 Coding theory to the rescue? It turns out that algebraic structure can be used to weed out bad codes Gain barely noticeable in point-to-point communication Improvement in second order performance (error exponents) Gains significant in multi-terminal communication

18 Coding theory to the rescue? It turns out that algebraic structure can be used to weed out bad codes Gain barely noticeable in point-to-point communication Improvement in second order performance (error exponents) Gains significant in multi-terminal communication Time for questions?

19 Broadcast Networks [Figure: Base Station serving Mobile Receiver 1 and Mobile Receiver 2]

20 Point-to-point communication Start with the binary symmetric channel BSC(δ): each input bit is flipped with probability δ. Y = X + N, N ~ Be(δ), and + is addition modulo 2. Capacity = max_{P(x)} I(X;Y) = 1 − h(δ).
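
As a quick numeric check (a minimal sketch, not part of the original slides), the capacity formula is easy to evaluate:

```python
import math

def h(p: float) -> float:
    """Binary entropy h(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(delta: float) -> float:
    """Capacity of BSC(delta): 1 - h(delta) bits per channel use."""
    return 1.0 - h(delta)

print(bsc_capacity(0.11))  # ~0.5002, matching the tweet example below
```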

21 Picture of an optimal code Output is within a ball around a transmitted codeword Maximum likelihood decoding

22 Picture of an optimal code Output is within a ball around a transmitted codeword Maximum likelihood decoding Time for questions?

23 Twitter and Eddington Number Suppose you want to tweet over a BSC: 140 characters

24 Twitter and Eddington Number Suppose you want to tweet over a BSC: 140 characters Entropy of tweets = 1.9 bits/character, i.e., 266 bits. Suppose δ = 0.11, then C = 0.5 bits/channel use

25 Twitter and Eddington Number Suppose you want to tweet over a BSC: 140 characters Entropy of tweets = 1.9 bits/character, i.e., 266 bits. Suppose δ = 0.11, then C = 0.5 bits/channel use A tweet can be sent by using the BSC 532 times. Number of possible tweets = 2^266

26 Twitter and Eddington Number Suppose you want to tweet over a BSC: 140 characters Entropy of tweets = 1.9 bits/character, i.e., 266 bits. Suppose δ = 0.11, then C = 0.5 bits/channel use A tweet can be sent by using the BSC 532 times. Number of possible tweets = 2^266 Roughly the number of protons in the observable universe Named after Arthur Eddington.
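
The slide's arithmetic, spelled out in a short sketch (the 1.9 bits/character and δ = 0.11 figures are the slide's):

```python
import math

h = lambda p: -p * math.log2(p) - (1 - p) * math.log2(1 - p)

tweet_bits = 140 * 1.9                # 266 bits of entropy per tweet
capacity = 1 - h(0.11)                # ~0.5 bits per channel use
channel_uses = tweet_bits / capacity  # ~532 uses of the BSC

print(round(tweet_bits), round(capacity, 3), round(channel_uses))
print(f"possible tweets: 2^266 ~ 10^{266 * math.log10(2):.0f}")  # ~10^80, the Eddington number
```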

27 BSC with cost constraint (1/n) E[w_H(X^n)] ≤ q, i.e., a codeword has at most a fraction q of 1's

28 BSC with cost constraint (1/n) E[w_H(X^n)] ≤ q, i.e., a codeword has at most a fraction q of 1's Capacity-cost function: C(q) = max_{E[w_H(X)] ≤ q} I(X;Y) = H(Y) − H(Y|X) = h(q * δ) − h(δ), where q * δ = (1 − q)δ + q(1 − δ)

29 BSC with cost constraint (1/n) E[w_H(X^n)] ≤ q, i.e., a codeword has at most a fraction q of 1's Capacity-cost function: C(q) = max_{E[w_H(X)] ≤ q} I(X;Y) = H(Y) − H(Y|X) = h(q * δ) − h(δ), where q * δ = (1 − q)δ + q(1 − δ) Achieved by X ~ Be(q)
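
A sketch of the capacity-cost function, with the binary convolution q * δ written out:

```python
import math

def h(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0, 1) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def conv(q, delta):
    """Binary convolution q * delta = (1 - q)*delta + q*(1 - delta)."""
    return (1 - q) * delta + q * (1 - delta)

def capacity_cost(q, delta):
    """C(q) = h(q * delta) - h(delta), achieved by X ~ Be(q)."""
    return h(conv(q, delta)) - h(delta)

delta = 0.11
for q in (0.05, 0.1, 0.2, 0.5):
    print(f"q = {q}: C(q) = {capacity_cost(q, delta):.3f} bits/use")
```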

30 Picture of an optimal code Big circle: the set of all words with a fraction q of 1's

31 BSC with cost constraint and interference Y = X + S + N, with S ~ Be(0.5) and N ~ Be(δ) S is non-causally observable only at the encoder

32 BSC with cost constraint and interference Y = X + S + N, with S ~ Be(0.5) and N ~ Be(δ) S is non-causally observable only at the encoder (1/n) E[w_H(X^n)] ≤ q

33 Applications Digital watermarking, data hiding, covert communication [Block diagram: watermark (on YouTube) → Encoder → X; Y = X + S + N, with S the original image; Y → Decoder; the watermarker is "Big Govt."] Blind watermarking

34 Applications Digital watermarking, data hiding, covert communication [Block diagram: watermark (on YouTube) → Encoder → X; Y = X + S + N, with S the original image; Y → Decoder; the watermarker is "Big Govt."] Blind watermarking You want big govt. but you don't trust it too much

35 BSC with cost constraint and interference Q1: What is the communication strategy?

36 BSC with cost constraint and interference Q1: What is the communication strategy? A1. Try cancelling it

37 BSC with cost constraint and interference Q1: What is the communication strategy? A1. Try cancelling it You cannot; you do not have enough ones.

38 BSC with cost constraint and interference Q1: What is the communication strategy? A1. Try cancelling it You cannot; you do not have enough ones. A2. Ride on the interference

39 BSC with cost constraint and interference Q1: What is the communication strategy? A1. Try cancelling it You cannot; you do not have enough ones. A2. Ride on the interference Nudge the interference with the channel input toward a codeword But you have only a fraction q of ones.

40 BSC with cost constraint and interference Q1: What is the communication strategy? A1. Try cancelling it You cannot; you do not have enough ones. A2. Ride on the interference Nudge the interference with the channel input toward a codeword But you have only a fraction q of ones. Gelfand-Pinsker: Nudge toward a codeword from a set

41 BSC with cost constraint and interference Q1: What is the communication strategy? A1. Try cancelling it You cannot; you do not have enough ones. A2. Ride on the interference Nudge the interference with the channel input toward a codeword But you have only a fraction q of ones. Gelfand-Pinsker: Nudge toward a codeword from a set Q2. How large should the set be? Rate of the set: 1 − h(q).

42 Picture of an optimal set of codewords

43 Picture of an optimal set of codewords All these codewords are assigned to a single message

44 Picture of an optimal set of codewords All these codewords are assigned to a single message Select a codeword to which you can nudge the interference... by spending just a fraction q of ones: U = X + S

45 Picture of an optimal set of codewords All these codewords are assigned to a single message Select a codeword to which you can nudge the interference... by spending just a fraction q of ones: U = X + S New effective channel: Y = U + N with capacity 1 − h(δ)

46 Precoding for Interference Rate of the composite codebook: 1 − h(δ) Rate of a sub-codebook: 1 − h(q) Transmission rate: the difference = h(q) − h(δ) Capacity in general case [Gelfand-Pinsker 80]

47 Precoding for Interference Rate of the composite codebook: 1 − h(δ) Rate of a sub-codebook: 1 − h(q) Transmission rate: the difference = h(q) − h(δ) Capacity in general case [Gelfand-Pinsker 80] C(q) = max_{P(U,X|S): E[w_H(X)] ≤ q} [ I(U;Y) − I(U;S) ]
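
For this binary dirty-paper example the general formula collapses to the difference of the two rates above; a sketch:

```python
import math

h = lambda p: 0.0 if p in (0, 1) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def gp_binary_rate(q, delta):
    """Composite-codebook rate 1 - h(delta) minus the 1 - h(q) spent
    indexing each message's sub-codebook: h(q) - h(delta)."""
    return max(0.0, h(q) - h(delta))

delta = 0.11
for q in (0.15, 0.25, 0.5):
    print(f"q = {q}: rate = {gp_binary_rate(q, delta):.3f} bits/use")
```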

48 Picture of capacity-cost function [Plot, δ = 0.1: Rate vs. cost q, comparing C(q) = h(q) − h(δ) with the no-interference curve k(q) = h(q * δ) − h(δ)] Bottom line: rate loss as compared to no interference

49 Broadcast Channel: Cover 72 [Block diagram: (M1, M2) → Encoder → X → Broadcast Channel → Y → Decoder 1 → M̂1, and → Z → Decoder 2 → M̂2] Channel with one input and multiple outputs Same signal should contain info. meant for both receivers Capacity region still not known in general

50 Broadcast Channel: Cover 72 [Block diagram: (M1, M2) → Encoder → X → Broadcast Channel → Y → Decoder 1 → M̂1, and → Z → Decoder 2 → M̂2] Channel with one input and multiple outputs Same signal should contain info. meant for both receivers Capacity region still not known in general Time for questions?

51 Marton's Coding Strategy: Two receivers Create a signal that carries information for the second receiver

52 Marton's Coding Strategy: Two receivers Create a signal that carries information for the second receiver This signal acts as interference for the signal of the first How to tackle (self) interference?

53 Marton's Coding Strategy: Two receivers Create a signal that carries information for the second receiver This signal acts as interference for the signal of the first How to tackle (self) interference? Make the first receiver decode a large portion of the interference This portion is given by a (univariate) function

54 Marton's Coding Strategy: Two receivers Create a signal that carries information for the second receiver This signal acts as interference for the signal of the first How to tackle (self) interference? Make the first receiver decode a large portion of the interference This portion is given by a (univariate) function The rest is precoded for using the Gelfand-Pinsker strategy

55 Marton's Coding Strategy: Two receivers Create a signal that carries information for the second receiver This signal acts as interference for the signal of the first How to tackle (self) interference? Make the first receiver decode a large portion of the interference This portion is given by a (univariate) function The rest is precoded for using the Gelfand-Pinsker strategy This strategy is optimal for many special cases We do not know whether it is optimal in general

56 Example: so-called non-degraded channel [Channel: Y = X1 + X2 + N1, Z = X2 + N2] N1 ~ Be(δ), N2 ~ Be(ǫ), and no constraint on X2 Hamming weight constraint on X1: (1/n) E[w_H(X1^n)] ≤ q

57 Example: so-called non-degraded channel [Channel: Y = X1 + X2 + N1, Z = X2 + N2] N1 ~ Be(δ), N2 ~ Be(ǫ), and no constraint on X2 Hamming weight constraint on X1: (1/n) E[w_H(X1^n)] ≤ q Fix R2 = 1 − h(ǫ), and assume δ < ǫ

58 Example: so-called non-degraded channel [Channel: Y = X1 + X2 + N1, Z = X2 + N2] N1 ~ Be(δ), N2 ~ Be(ǫ), and no constraint on X2 Hamming weight constraint on X1: (1/n) E[w_H(X1^n)] ≤ q Fix R2 = 1 − h(ǫ), and assume δ < ǫ When q * δ ≤ ǫ, Rec. 1 can decode the interference completely

59 Example: so-called non-degraded channel [Channel: Y = X1 + X2 + N1, Z = X2 + N2] N1 ~ Be(δ), N2 ~ Be(ǫ), and no constraint on X2 Hamming weight constraint on X1: (1/n) E[w_H(X1^n)] ≤ q Fix R2 = 1 − h(ǫ), and assume δ < ǫ When q * δ ≤ ǫ, Rec. 1 can decode the interference completely a.k.a. no interference: R1 = h(q * δ) − h(δ)

60 Example: so-called non-degraded channel [Channel: Y = X1 + X2 + N1, Z = X2 + N2] N1 ~ Be(δ), N2 ~ Be(ǫ), and no constraint on X2 Hamming weight constraint on X1: (1/n) E[w_H(X1^n)] ≤ q Fix R2 = 1 − h(ǫ), and assume δ < ǫ When q * δ ≤ ǫ, Rec. 1 can decode the interference completely a.k.a. no interference: R1 = h(q * δ) − h(δ) Otherwise, precode for X2: R1 = h(q) − h(δ)
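
The two regimes for R1, as a sketch (δ = 0.1 is used elsewhere in the slides; ǫ = 0.2 is my illustrative choice):

```python
import math

h = lambda p: 0.0 if p in (0, 1) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)
conv = lambda q, d: (1 - q) * d + q * (1 - d)

def r1(q, delta, eps):
    """Rec. 1's rate with R2 fixed at 1 - h(eps): decode X2 fully when
    q * delta <= eps, otherwise precode for it (Gelfand-Pinsker)."""
    if conv(q, delta) <= eps:
        return h(conv(q, delta)) - h(delta)  # interference-free rate
    return max(0.0, h(q) - h(delta))         # precoding rate

delta, eps = 0.1, 0.2
for q in (0.05, 0.1, 0.15, 0.3):
    print(f"q = {q}: R1 = {r1(q, delta, eps):.3f}")
```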

61 Picture of Rate Region [Plot: rate R1 vs. cost q] Decode a univariate function of the interference & precode for the rest

62 Broadcast with more receivers Marton's strategy can be easily extended Consider the 3-receiver case: At receiver 1:

63 Broadcast with more receivers Marton's strategy can be easily extended Consider the 3-receiver case: At receiver 1: (self) interference from the signals of Rec. 2 and Rec. 3

64 Broadcast with more receivers Marton's strategy can be easily extended Consider the 3-receiver case: At receiver 1: (self) interference from the signals of Rec. 2 and Rec. 3 Decode a univariate function of the signal meant for Rec. 2 and a univariate function of the signal meant for Rec. 3.

65 Broadcast with more receivers Marton's strategy can be easily extended Consider the 3-receiver case: At receiver 1: (self) interference from the signals of Rec. 2 and Rec. 3 Decode a univariate function of the signal meant for Rec. 2 and a univariate function of the signal meant for Rec. 3. Precode for the rest

66 Broadcast with more receivers Marton's strategy can be easily extended Consider the 3-receiver case: At receiver 1: (self) interference from the signals of Rec. 2 and Rec. 3 Decode a univariate function of the signal meant for Rec. 2 and a univariate function of the signal meant for Rec. 3. Precode for the rest All this is done using random codes No need for linear or algebraic codes till now

67 Broadcast with more receivers Marton's strategy can be easily extended Consider the 3-receiver case: At receiver 1: (self) interference from the signals of Rec. 2 and Rec. 3 Decode a univariate function of the signal meant for Rec. 2 and a univariate function of the signal meant for Rec. 3. Precode for the rest All this is done using random codes No need for linear or algebraic codes till now We can show that such a strategy is strictly suboptimal

68 New Strategy Decode a bivariate function of the signals meant for the other two

69 New Strategy Decode a bivariate function of the signals meant for the other two It turns out that to exploit this we need linear codes

70 New Strategy Decode a bivariate function of the signals meant for the other two It turns out that to exploit this we need linear codes [Channel: Y = X1 + X2 + X3 + N1 at Rec. 1; Z2 = X2 + N2 at Rec. 2; Z3 = X3 + N3 at Rec. 3] N2, N3 ~ Be(ǫ), and no constraints on X2 and X3 N1 ~ Be(δ) and the usual: (1/n) E[w_H(X1^n)] ≤ q

71 New Strategy Decode a bivariate function of the signals meant for the other two It turns out that to exploit this we need linear codes [Channel: Y = X1 + X2 + X3 + N1 at Rec. 1; Z2 = X2 + N2 at Rec. 2; Z3 = X3 + N3 at Rec. 3] N2, N3 ~ Be(ǫ), and no constraints on X2 and X3 N1 ~ Be(δ) and the usual: (1/n) E[w_H(X1^n)] ≤ q Let R2 = R3 = 1 − h(ǫ), the incorrigible brutes! Let δ < ǫ

72 Deficiency of random codes δ = 0.1 [Plot comparing h(ǫ), 1 − h(δ), and 2(1 − h(ǫ))]

73 Deficiency of random codes δ = 0.1 [Plot comparing h(ǫ), 1 − h(δ), and 2(1 − h(ǫ))] Marton wishes to decode the full interference (X2, X3): requires 1 − h(q * δ) > 2(1 − h(ǫ))

74 Deficiency of random codes δ = 0.1 [Plot comparing h(ǫ), 1 − h(δ), and 2(1 − h(ǫ))] Marton wishes to decode the full interference (X2, X3): requires 1 − h(q * δ) > 2(1 − h(ǫ)) a.k.a. never going to happen Marton ends up precoding, incurring a rate loss

75 Deficiency of random codes δ = 0.1 [Plot comparing h(ǫ), 1 − h(δ), and 2(1 − h(ǫ))] Marton wishes to decode the full interference (X2, X3): requires 1 − h(q * δ) > 2(1 − h(ǫ)) a.k.a. never going to happen Marton ends up precoding, incurring a rate loss New Approach: Try decoding the actual interference: X2 + X3

76 Deficiency of random codes δ = 0.1 [Plot comparing h(ǫ), 1 − h(δ), and 2(1 − h(ǫ))] Marton wishes to decode the full interference (X2, X3): requires 1 − h(q * δ) > 2(1 − h(ǫ)) a.k.a. never going to happen Marton ends up precoding, incurring a rate loss New Approach: Try decoding the actual interference: X2 + X3 Benefit if the range of X2 + X3 is smaller than the range of (X2, X3)

77 Deficiency of random codes δ = 0.1 [Plot comparing h(ǫ), 1 − h(δ), and 2(1 − h(ǫ))] Marton wishes to decode the full interference (X2, X3): requires 1 − h(q * δ) > 2(1 − h(ǫ)) a.k.a. never going to happen Marton ends up precoding, incurring a rate loss New Approach: Try decoding the actual interference: X2 + X3 Benefit if the range of X2 + X3 is smaller than the range of (X2, X3) If X2 and X3 are random, this won't happen

78 Picture of sum of two random sets [Figure: the sumset is much larger than either set]

79 Picture of sum of two cosets of a linear code [Figure: the sum is again a single coset of the same size]
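
A small sketch of the contrast in these two pictures, over F_2^10 with assumed toy parameters: the XOR-sumset of two random 16-element sets blows up, while the sum of two 16-element cosets of one linear code is again a single 16-element coset.

```python
import random
random.seed(1)

n, k = 10, 4    # ambient space F_2^10, code dimension 4

# Systematic generator rows (pivot bit i plus random parity) guarantee rank k.
gens = [(1 << i) | (random.getrandbits(n - k) << k) for i in range(k)]
code = {0}
for g in gens:
    code |= {c ^ g for c in code}    # close the span under each generator

s1, s2 = random.getrandbits(n), random.getrandbits(n)
coset_sum = {(c1 ^ s1) ^ (c2 ^ s2) for c1 in code for c2 in code}

rand1 = random.sample(range(2 ** n), len(code))
rand2 = random.sample(range(2 ** n), len(code))
rand_sum = {a ^ b for a in rand1 for b in rand2}

print(len(code), len(coset_sum))   # 16 16: still one coset
print(len(code), len(rand_sum))    # 16 vs typically 200+: random sets spread out
```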

80 Exploits of Linear Codes The incorrigible brutes can have their capacities

81 Exploits of Linear Codes The incorrigible brutes can have their capacities We just need their codebooks to behave algebraically We know that linear codes achieve the capacity of the BSC

82 Exploits of Linear Codes The incorrigible brutes can have their capacities We just need their codebooks to behave algebraically We know that linear codes achieve the capacity of the BSC rate of X2 = rate of X3 = rate of X2 + X3 = 1 − h(ǫ) Since δ < ǫ, we have for small q: q * δ < ǫ

83 Exploits of Linear Codes The incorrigible brutes can have their capacities We just need their codebooks to behave algebraically We know that linear codes achieve the capacity of the BSC rate of X2 = rate of X3 = rate of X2 + X3 = 1 − h(ǫ) Since δ < ǫ, we have for small q: q * δ < ǫ Hence 1 − h(q * δ) > 1 − h(ǫ) Rec. 1 can decode the actual interference and subtract it off

84 Exploits of Linear Codes The incorrigible brutes can have their capacities We just need their codebooks to behave algebraically We know that linear codes achieve the capacity of the BSC rate of X2 = rate of X3 = rate of X2 + X3 = 1 − h(ǫ) Since δ < ǫ, we have for small q: q * δ < ǫ Hence 1 − h(q * δ) > 1 − h(ǫ) Rec. 1 can decode the actual interference and subtract it off Then she decodes her own message at rate h(q * δ) − h(δ): R1 = h(q * δ) − h(δ), R2 = R3 = 1 − h(ǫ)
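
Side by side, the random-coding (precoding) rate and the linear-coding rate for Rec. 1; a sketch with δ = 0.1 and an illustrative ǫ = 0.2 (the ǫ value is my choice, not the slides'):

```python
import math

h = lambda p: 0.0 if p in (0, 1) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)
conv = lambda q, d: (1 - q) * d + q * (1 - d)

delta, eps = 0.1, 0.2
for q in (0.02, 0.05, 0.1):
    assert conv(q, delta) < eps            # Rec. 1 can decode X2 + X3
    marton = max(0.0, h(q) - h(delta))     # precode for (X2, X3)
    linear = h(conv(q, delta)) - h(delta)  # decode X2 + X3, subtract, decode own message
    print(f"q = {q}: precoding {marton:.3f} vs linear codes {linear:.3f}")
```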

85 Symmetry and addition saved the world We have banked on

86 Symmetry and addition saved the world We have banked on Channels of Rec. 2 and 3 are symmetric so uniform input distribution achieves capacity

87 Symmetry and addition saved the world We have banked on Channels of Rec. 2 and 3 are symmetric so uniform input distribution achieves capacity Interference in the broadcast channel is additive

88 Symmetry and addition saved the world We have banked on Channels of Rec. 2 and 3 are symmetric so uniform input distribution achieves capacity Interference in the broadcast channel is additive But Shannon theory is all about not getting bogged down in an example The objective is to develop a theory for the general case

89 However? Caution: Even in point-to-point communication In general, linear codes do not achieve Shannon capacity of an arbitrary discrete memoryless channel

90 However? Caution: Even in point-to-point communication In general, linear codes do not achieve Shannon capacity of an arbitrary discrete memoryless channel What hope do we have in using them for network communication for the arbitrary discrete memoryless case?

91 Thesis Algebraic structure in codes may be necessary in a fundamental way

92 Thesis Algebraic structure in codes may be necessary in a fundamental way Algebraic structure alone is not sufficient We need the right mix of algebraic structure and non-linearity Nested algebraic codes appear to be a universal structure

93 Noisy Channel Coding in Point-to-point case [Block diagram: M → Channel Encoder → X^n → Channel → Y^n → Channel Decoder → M̂] Given: channel input alphabet X, output alphabet Y, with p_{Y|X} and cost function w(x) Find: the maximum transmission rate R for a target cost W.

94 Noisy Channel Coding in Point-to-point case [Block diagram: M → Channel Encoder → X^n → Channel → Y^n → Channel Decoder → M̂] Given: channel input alphabet X, output alphabet Y, with p_{Y|X} and cost function w(x) Find: the maximum transmission rate R for a target cost W. Answer: Shannon Capacity-Cost function (Shannon 49) C(W) = max_{p_X: E[w(X)] ≤ W} I(X;Y)

95 Picture of a near-optimal channel code Obtained from the Shannon ensemble Box = X^n Red dot = codeword C = codebook

96 Picture of a near-optimal channel code Obtained from the Shannon ensemble Box = X^n Red dot = codeword C = codebook C has the Packing Property C has the Shaping Property

97 Picture of a near-optimal channel code Obtained from the Shannon ensemble Box = X^n Red dot = codeword C = codebook C has the Packing Property C has the Shaping Property Shape region = typical set Size (rate) of code = I(X;Y) Codeword density = I(X;Y) − H(X) = −H(X|Y)

98 New Result: An optimal linear code Let |X| = p, a prime C1 = codebook C1 has the Packing Property Size (rate) of code = log|X| − H(X|Y)

99 New Result: An optimal linear code Let |X| = p, a prime C1 = codebook C1 has the Packing Property Size (rate) of code = log|X| − H(X|Y) The finite field is Z_p Bounding region = X^n Density = −H(X|Y)

100 New Theorem: An optimal nested linear code C1 = fine code (red & black) C2 = coarse code (black) C1 has the Packing property Going beyond symmetry

101 New Theorem: An optimal nested linear code C1 = fine code (red & black) C2 = coarse code (black) C1 has the Packing property C2 has the Shaping property Size of C1 = log|X| − H(X|Y) Size of C2 = log|X| − H(X) Going beyond symmetry

102 New Theorem: An optimal nested linear code C1 = fine code (red & black) C2 = coarse code (black) C1 has the Packing property C2 has the Shaping property Size of C1 = log|X| − H(X|Y) Size of C2 = log|X| − H(X) Going beyond symmetry Codebook = C1/C2 Codebook size (rate) = I(X;Y) Achieves C(W)
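
A toy sketch of the nested structure over F_2 (the dimensions here are assumed for illustration): a fine code C1 contains a coarse code C2, and messages are indexed by the cosets in C1/C2.

```python
import random
random.seed(0)

n, k1, k2 = 10, 6, 3    # C2 is generated by the first k2 rows of C1's generator

def span(gens):
    """All F_2 linear combinations of the generator rows (ints as bit vectors)."""
    s = {0}
    for g in gens:
        s |= {c ^ g for c in s}
    return s

# Systematic rows (unit pivot plus random parity) guarantee full rank.
g1 = [(1 << i) | (random.getrandbits(n - k1) << k1) for i in range(k1)]
c1, c2 = span(g1), span(g1[:k2])     # nested by construction: C2 <= C1

assert c2 <= c1                       # C2 is a subgroup of C1
cosets = {frozenset(x ^ v for v in c2) for x in c1}
print(len(c1), len(c2), len(cosets))  # 64 8 8: |C1/C2| = |C1| / |C2|
```

In the theorem, the rates of C1 and C2 are matched to log|X| − H(X|Y) and log|X| − H(X), so the coset index carries exactly the difference, I(X;Y), per symbol.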

103 Going beyond addition X2 ∨ X3 (logical OR function)

104 Going beyond addition X2 ∨ X3 (logical OR function) What kind of glasses do you wear so this looks like addition?

105 Going beyond addition X2 ∨ X3 (logical OR function) What kind of glasses do you wear so this looks like addition? Can be embedded in the addition table of F_3

106 Going beyond addition X2 ∨ X3 (logical OR function) What kind of glasses do you wear so this looks like addition? Can be embedded in the addition table of F_3 Map binary sources into F_3, and use linear codes built on F_3 Can do better than traditional random coding

107 Going beyond addition X2 ∨ X3 (logical OR function) What kind of glasses do you wear so this looks like addition? Can be embedded in the addition table of F_3 Map binary sources into F_3, and use linear codes built on F_3 Can do better than traditional random coding In general we embed bivariate functions in groups
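
A sketch of the embedding (the identity map {0,1} ⊂ Z_3 is my choice of embedding): the mod-3 sum of the embedded symbols determines the OR, which the mod-2 sum does not.

```python
from itertools import product

for x2, x3 in product((0, 1), repeat=2):
    s3 = (x2 + x3) % 3     # what a linear code over F_3 can deliver to the decoder
    assert (x2 | x3) == (0 if s3 == 0 else 1)   # OR recovered from the F_3 sum
    # over F_2 this fails: (1, 1) gives XOR 0 but OR 1

print("OR(x2, x3) is a function of x2 + x3 mod 3")
```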

108 Groups - An Introduction G - a finite abelian group of order n G ≅ Z_{p1^e1} × Z_{p2^e2} × ... × Z_{pk^ek}: G is isomorphic to a direct product of possibly repeating primary cyclic groups For g ∈ G, g = (g1, ..., gk) with gi ∈ Z_{pi^ei} Call gi the ith digit of g

109 Groups - An Introduction G - a finite abelian group of order n G ≅ Z_{p1^e1} × Z_{p2^e2} × ... × Z_{pk^ek}: G is isomorphic to a direct product of possibly repeating primary cyclic groups For g ∈ G, g = (g1, ..., gk) with gi ∈ Z_{pi^ei} Call gi the ith digit of g Prove coding theorems for primary cyclic groups
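
A concrete sketch of the digit decomposition for the smallest interesting case, Z_6 ≅ Z_2 × Z_3 (here each primary component has exponent 1):

```python
# The digit map g -> (g mod 2, g mod 3) is an isomorphism from Z_6 to
# Z_2 x Z_3 (Chinese remainder theorem): addition becomes digit-wise.
digits = lambda g: (g % 2, g % 3)

for a in range(6):
    for b in range(6):
        da, db = digits(a), digits(b)
        assert digits((a + b) % 6) == ((da[0] + db[0]) % 2, (da[1] + db[1]) % 3)

print("Z_6 = Z_2 x Z_3, with addition acting digit by digit")
```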

110 Nested Group Codes Group code over Z_{p^r}^n: C ≤ Z_{p^r}^n with C = Image(φ) for some homomorphism φ: Z_{p^r}^k → Z_{p^r}^n

111 Nested Group Codes Group code over Z_{p^r}^n: C ≤ Z_{p^r}^n with C = Image(φ) for some homomorphism φ: Z_{p^r}^k → Z_{p^r}^n (C1, C2) nested if C2 ⊆ C1

112 Nested Group Codes Group code over Z_{p^r}^n: C ≤ Z_{p^r}^n with C = Image(φ) for some homomorphism φ: Z_{p^r}^k → Z_{p^r}^n (C1, C2) nested if C2 ⊆ C1 We need: C1 ≤ Z_{p^r}^n: good packing code C2 ≤ Z_{p^r}^n: good covering code

113 Good Group Packing Codes Good group channel code C2 for the triple (U, V, P_UV) Assume U = Z_{p^r} for some prime p and exponent r > 0

114 Good Group Packing Codes Good group channel code C2 for the triple (U, V, P_UV) Assume U = Z_{p^r} for some prime p and exponent r > 0 Lemma: Exists for large n if (1/n) log|C2| ≤ log p^r − max_{0 ≤ i < r} [r/(r − i)] (H(U|V) − H([U]_i|V))

115 Good Group Packing Codes Good group channel code C2 for the triple (U, V, P_UV) Assume U = Z_{p^r} for some prime p and exponent r > 0 Lemma: Exists for large n if (1/n) log|C2| ≤ log p^r − max_{0 ≤ i < r} [r/(r − i)] (H(U|V) − H([U]_i|V)) [U]_i is a function of U and depends on the group Extra penalty for imposing group structure beyond linearity

116 Good Group Packing Codes Good group channel code C2 for the triple (U, V, P_UV) Assume U = Z_{p^r} for some prime p and exponent r > 0 Lemma: Exists for large n if (1/n) log|C2| ≤ log p^r − max_{0 ≤ i < r} [r/(r − i)] (H(U|V) − H([U]_i|V)) [U]_i is a function of U and depends on the group Extra penalty for imposing group structure beyond linearity Time for questions?
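
A sketch evaluating the lemma's rate bound; the entropy values below are made-up placeholders for p = 2, r = 2, just to show the max over i:

```python
import math

def group_packing_bound(p, r, H_U_given_V, H_Ui_given_V):
    """(1/n) log|C2| must stay below
    log p^r - max_{0 <= i < r} (r / (r - i)) * (H(U|V) - H([U]_i|V));
    H_Ui_given_V[i] holds H([U]_i | V) for i = 0, ..., r - 1."""
    penalty = max((r / (r - i)) * (H_U_given_V - H_Ui_given_V[i])
                  for i in range(r))
    return r * math.log2(p) - penalty

# Hypothetical entropies for U = Z_4 (p = 2, r = 2); illustrative only.
print(group_packing_bound(p=2, r=2, H_U_given_V=0.9, H_Ui_given_V=[0.0, 0.5]))  # 1.1
```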

117 A Distributed Source Coding Problem [Diagram: X^n → Encoder 1 (Rate = R1), Y^n → Encoder 2 (Rate = R2), Z^n → Encoder 3 (Rate = R3), all into a Joint Decoder producing (X̂^n, Ŷ^n, Ẑ^n)] Encoders observe different components of a vector source Central decoder receives quantized observations from the encoders Given source distribution p_XYZ Best known rate region - Berger-Tung Rate Region, 77

118 Conclusions Presented a coding scheme based on nested group codes Can recover known rate regions of the broadcast channel Offers rate gains over random coding schemes

119 Conclusions Presented a coding scheme based on nested group codes Can recover known rate regions of the broadcast channel Offers rate gains over random coding schemes New bridge between probability and algebra, between information theory and coding theory

120 Conclusions Presented a coding scheme based on nested group codes Can recover known rate regions of the broadcast channel Offers rate gains over random coding schemes New bridge between probability and algebra, between information theory and coding theory It was thought that probability and algebra were nemeses

121 Conclusions Presented a coding scheme based on nested group codes Can recover known rate regions of the broadcast channel Offers rate gains over random coding schemes New bridge between probability and algebra, between information theory and coding theory It was thought that probability and algebra were nemeses Instead, a match made in heaven
