Recursions for the Trapdoor Channel and an Upper Bound on its Capacity


2014 IEEE International Symposium on Information Theory

Tobias Lutz
Institute for Communications Engineering, Technische Universität München, Munich, Germany

Abstract—The problem of maximizing the n-letter mutual information of the trapdoor channel is considered. It is shown that (1/2) log(5/2) ≈ 0.6610 bits per use is an upper bound on the capacity of the trapdoor channel. This upper bound, which is the tightest upper bound known, proves that feedback increases the capacity of the trapdoor channel.

I. INTRODUCTION AND CHANNEL MODEL

The trapdoor channel was introduced by David Blackwell in 1961 [1] and is used by Robert Ash [2] both as a book cover and as an introductory example for channels with memory. The mapping of channel inputs to channel outputs can be described as follows. Consider a box that contains a ball labeled s_0 ∈ {0,1}, where the index refers to time. Both the sender and the receiver know the initial ball. In time slot 1, the sender places a new ball labeled x_1 ∈ {0,1} in the box. In the same time slot, the receiver chooses one of the two balls s_0 or x_1 at random, while the other ball remains in the box. The chosen ball is interpreted as the channel output y_1, while the remaining ball becomes the channel state s_1. The same procedure is applied in every future channel use. In time slot 2, for instance, the sender places a new ball x_2 ∈ {0,1} in the box, and the corresponding channel output y_2 is either x_2 or s_1. The transmission process is visualized in Fig. 1. Fig. 1(a) shows the trapdoor channel at time t, when the sender places ball x_t in the box. In the same time slot, the receiver chooses randomly one of the two balls x_t or s_{t-1} as channel output, in the figure s_{t-1}. Consequently, the upcoming channel state s_t becomes x_t, see Fig. 1(b). At time t+1, the sender places a new ball x_{t+1} in the box and the receiver draws y_{t+1} from {s_t, x_{t+1}}. Table I depicts the probability of an output y_t given an input x_t and a state s_{t-1}.

[Fig. 1. The trapdoor channel. (a) The trapdoor channel at time t. (b) The trapdoor channel at time t+1; here y_t = s_{t-1}.]

TABLE I
TRANSITION PROBABILITIES OF THE TRAPDOOR CHANNEL

  x_t  s_{t-1} | P(y_t = 0 | x_t, s_{t-1})   P(y_t = 1 | x_t, s_{t-1})
   0      0    |            1                            0
   0      1    |           1/2                          1/2
   1      0    |           1/2                          1/2
   1      1    |            0                            1

Despite the simplicity of the trapdoor channel, deriving its capacity seems challenging and is an open problem. One feature that makes the problem cumbersome is that the distribution of the output symbols may depend on events happening arbitrarily far back in the past, since each ball has a positive probability of remaining in the channel over any finite number of channel uses. Instead of maximizing I(X;Y), one rather has to consider the multi-letter mutual information, i.e., lim sup_{n→∞} max (1/n) I(X^n; Y^n).

Let P_n(s) denote the matrix of conditional probabilities of output sequences of length n given input sequences of length n, where the initial state equals s. The following ordering of the entries of P_n(s) is assumed: row indices represent input sequences and column indices represent output sequences. The (i,j)th entry of P_n(s), indicated as [P_n(s)]_{i,j}, is the conditional probability of the binary output sequence corresponding to the integer j given the binary input sequence corresponding to the integer i, 0 ≤ i, j ≤ 2^n − 1. For instance, if n = 3, then [P_3(s)]_{5,3} denotes the conditional probability that the channel input (x_1, x_2, x_3) = (1,0,1) will be mapped to the channel output (y_1, y_2, y_3) = (0,1,1). Kobayashi and Morita [3] showed that P_n(s), s ∈ {0,1}, satisfies the recursion laws

  P_{n+1}(0) = [ P_n(0)         0
                 (1/2) P_n(1)   (1/2) P_n(0) ]                    (1)

  P_{n+1}(1) = [ (1/2) P_n(1)   (1/2) P_n(0)
                 0              P_n(1) ]                          (2)

where the initial matrices are given by P_0(0) = P_0(1) = 1.
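The recursions (1) and (2) are straightforward to implement. The following minimal numerical sketch — our illustration, not part of the paper, assuming numpy — builds P_n(0) and P_n(1) and checks them against Table I; the function name trapdoor_matrices is our own choice, and the later sketches in this transcription reuse it.

    import numpy as np

    def trapdoor_matrices(n):
        """Return (P_n(0), P_n(1)) built with the Kobayashi-Morita
        recursions (1)-(2), starting from P_0(0) = P_0(1) = 1."""
        P0 = np.array([[1.0]])   # P_0(0)
        P1 = np.array([[1.0]])   # P_0(1)
        for _ in range(n):
            Z = np.zeros_like(P0)
            P0_next = np.block([[P0, Z], [0.5 * P1, 0.5 * P0]])   # eq. (1)
            P1_next = np.block([[0.5 * P1, 0.5 * P0], [Z, P1]])   # eq. (2)
            P0, P1 = P0_next, P1_next
        return P0, P1

    P1_0, P1_1 = trapdoor_matrices(1)
    assert np.allclose(P1_0, [[1, 0], [0.5, 0.5]])    # Table I, initial state 0
    assert np.allclose(P1_1, [[0.5, 0.5], [0, 1]])    # Table I, initial state 1

    P3_0, _ = trapdoor_matrices(3)
    assert np.allclose(P3_0.sum(axis=1), 1.0)         # each row is a pmf
    print(P3_0[5, 3])   # P(y = 011 | x = 101, s_0 = 0)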
Ahlswede and Kaspi [4] derived the zero-error capacity of the trapdoor channel, which equals 0.5 b/u. Permuter et al. [5] considered the trapdoor channel under the additional assumption of having a unit-delay feedback link available from the receiver to the sender; they established that the feedback capacity of the trapdoor channel is equal to the logarithm of the golden ratio.

In this paper, we consider the problem of maximizing the n-letter mutual information of the trapdoor channel for any n ∈ ℕ. We relax the problem by permitting distributions that are not probability distributions. The resulting optimization problem is convex, but the feasible set is larger than the probability simplex. Using the method of Lagrange multipliers, via a theorem presented in [2], we show that (1/2) log(5/2) ≈ 0.6610 b/u is an upper bound on the capacity of the trapdoor channel. Specifically, the same absolute maximum (1/2) log(5/2) b/u results for all trapdoor channels which process input blocks of even length n, and the sequence of absolute maxima corresponding to trapdoor channels which process inputs of odd length converges to (1/2) log(5/2) b/u from below as the block length increases. Unfortunately, the absolute maxima of our relaxed optimization are attained outside the probability simplex; otherwise we would have established the capacity. Nevertheless, (1/2) log(5/2) ≈ 0.6610 b/u is, to the best of our knowledge, the tightest known capacity upper bound. Moreover, this bound is less than the feedback capacity of the trapdoor channel.

The notation used in this paper is as follows. The symbols ℕ₀ and ℕ refer to the natural numbers with and without zero, respectively. The input corresponding to the ith row of P_n(s) is denoted as x_i. Further, I_n denotes the 2^n × 2^n identity matrix, Ĩ_n is a 2^n × 2^n matrix whose secondary-diagonal entries are all equal to 1 while the remaining entries are all equal to 0, and 1_{2^n} denotes a column vector of length 2^n consisting only of ones. The vector 1_{2^n}^T is the transpose of 1_{2^n}. The functions exp and log indicate the exponential function to base 2 and the logarithm to base 2. If applied to a vector/matrix, log or exp of each element is taken and a vector/matrix results. Finally, the symbol ∘ refers to the Hadamard product, i.e., the entry-wise product of two matrices.

II. A LAGRANGE MULTIPLIER APPROACH TO THE TRAPDOOR CHANNEL

A. Problem Formulation

We derive an upper bound on the capacity of the trapdoor channel. Specifically, for any n ∈ ℕ, we find a solution to the optimization problem

  maximize_{P_{X^n}}  (1/n) I(X^n; Y^n | s_0 = s)
    = (1/n) Σ_{i=0}^{2^n−1} Σ_{j=0}^{2^n−1} p_i [P_n(s)]_{i,j} log( [P_n(s)]_{i,j} / Σ_{k=0}^{2^n−1} p_k [P_n(s)]_{k,j} )   (3)

  subject to  Σ_{i=0}^{2^n−1} p_i = 1   (4)

              Σ_{k=0}^{2^n−1} p_k [P_n(s)]_{k,j} ≥ 0,  0 ≤ j ≤ 2^n − 1.   (5)

Note that the pmf P_{X^n} represents the 2^n values p_0, ..., p_{2^n−1}, where p_i denotes the probability of the ith input sequence x_i, i.e., the binary sequence corresponding to the integer i. Constraint (5) guarantees that the argument of the logarithm does not become negative. The feasible set, defined by (4) and (5), is convex. It includes the set of probability mass functions, but might be larger. To see this, note that (5) is a weighted sum of all p_k where each weight [P_n(s)]_{k,j} is non-negative. Clearly, (4) and (5) are satisfied by probability distributions. However, there might exist distributions which involve negative values and sum up to one but still satisfy (5). Moreover, the objective function (1/n) I(X^n; Y^n | s_0 = s) is concave on the feasible set, which follows by using the same arguments that show that mutual information is concave on the set of input probability distributions. Consequently, the optimization problem is convex and every solution maximizes (1/n) I(X^n; Y^n | s_0 = s). In the following, we use the terminology

  C_n := max_{P_{X^n}} (1/n) I(X^n; Y^n | s_0 = s).   (12)
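For orientation — this is our addition, not used in the paper — the maximum of (3) over the probability simplex itself can be approximated with the standard Blahut–Arimoto algorithm, giving a baseline against which the relaxed maxima C_n of Theorem II.7 below can be compared; trapdoor_matrices is the helper from the first sketch.

    import numpy as np

    def blahut_arimoto(P, iters=3000):
        """Approximate max_{p in simplex} I(X;Y), in bits, for a channel
        matrix P whose rows are the conditional pmfs P(y | x)."""
        def divergences(p):
            q = p @ P                                   # induced output pmf
            with np.errstate(divide="ignore"):
                logratio = np.where(P > 0, np.log2(P / q), 0.0)
            return (P * logratio).sum(axis=1)           # D(P(.|x_i) || q)

        p = np.full(P.shape[0], 1.0 / P.shape[0])
        for _ in range(iters):
            p = p * np.exp2(divergences(p))             # Blahut-Arimoto update
            p /= p.sum()
        D = divergences(p)
        return float(p @ D), p                          # I(X;Y) at the final p

    # Per-letter maximum over the simplex for block length n = 2:
    P2, _ = trapdoor_matrices(2)
    C2_simplex, _ = blahut_arimoto(P2)
    print(C2_simplex / 2)   # <= 0.5*log2(5/2) ~ 0.6610, the relaxed maximum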
Taking the limit of the sequence (C_n)_{n∈ℕ} as n grows, one obtains either the capacity of the trapdoor channel or an upper bound on the capacity, depending on whether the limit is attained inside or outside the set of probability distributions. Since, for symmetry reasons, it does not matter whether the optimization is with respect to initial state 0 or 1, we do not have to distinguish between lower capacity and upper capacity [6, Ch. 4.6].

B. Using a Result from the Literature

The reason for considering (5), and not the more natural constraints p_k ≥ 0 for all k, is that a closed-form solution can be obtained by applying the method of Lagrange multipliers to (3) and (4). As a byproduct, (5) will be automatically satisfied. In particular, setting the partial derivatives of

  (1/n) I(X^n; Y^n | s_0 = s) + λ Σ_{i=0}^{2^n−1} p_i   (6)

with respect to each of the p_i equal to zero results in a closed-form solution of the considered optimization problem. This was done in [2, Th. 3.3.3] for general discrete memoryless channels whose channel matrix is square and nonsingular. Note that P_n(s) is square and nonsingular (see, for instance, Lemma II.2(b)). Moreover, we assume that the channel P_n(s) is memoryless by repeatedly using it over a large number of input blocks of length n. Consequently, C_n might be an upper bound on the capacity of the channel P_n(s). The reason is that some input blocks possibly drive the channel P_n(s) into the opposite state s̄, i.e., the upcoming input block would see the channel P_n(s̄), whose C_n is equal to that of P_n(s) by symmetry, but not P_n(s). However, by assuming that the channel does not change over time, the sender always knows the channel state before a new block is transmitted. Hence, C_n might be an upper bound even if it is attained on the set of probability distributions. Nevertheless, this issue can be ignored if n goes to infinity, because in the asymptotic regime the channel P_n(s) is used only once. Indeed, we are interested in the asymptotic regime, since the limit of the sequence (C_n)_{n∈ℕ} is also its supremum (see Theorem II.7 and Remark II.8).
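To make the closed-form recipe concrete — our illustration, not part of the paper — the following sketch applies it to the smallest trapdoor channel, the 2 × 2 matrix P_1(0), which is a Z-channel; the quantities C, d and p anticipate (7)–(9) below.

    import numpy as np

    # One-shot trapdoor channel started in state 0 (a Z-channel): eq. (1), n = 0.
    P = np.array([[1.0, 0.0],
                  [0.5, 0.5]])

    # Conditional entropies H(Y | X = x_i), with the convention 0 log 0 = 0.
    with np.errstate(divide="ignore", invalid="ignore"):
        h = -np.sum(np.where(P > 0, P * np.log2(P), 0.0), axis=1)   # h = [0, 1]

    B = np.linalg.inv(P)
    d = B.T @ np.exp2(-B @ h)     # the d_k of (9), in matrix-vector form
    C = np.log2(d.sum())          # the closed-form maximum (7): log2(5/4)
    p = d / d.sum()               # the maximizer (8)

    print(C, np.log2(5 / 4))      # both ~ 0.3219 b/u
    print(p)                      # [0.6, 0.4] >= 0: here the maximizer is a
                                  # valid pmf, so the relaxation is tight at n = 1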

In summary, it is valid to apply [2, Th. 3.3.3], which yields

  C_n = (1/n) log Σ_{j=0}^{2^n−1} exp( −Σ_{i=0}^{2^n−1} [P_n(s)^{−1}]_{j,i} H(Y^n | X^n = x_i) )   (7)

attained at

  p_k = 2^{−n C_n} d_k,  k = 0, ..., 2^n − 1,   (8)

where d_k is given by

  d_k = Σ_{j=0}^{2^n−1} [P_n(s)^{−1}]_{j,k} exp( −Σ_{i=0}^{2^n−1} [P_n(s)^{−1}]_{j,i} H(Y^n | X^n = x_i) ).   (9)

Clearly, (p_0, ..., p_{2^n−1}) is a probability distribution only if d_k ≥ 0 for all k. Observe that the Lagrangian (6) does not involve the constraint (5). However, the proof of [2, Th. 3.3.3] shows that

  Σ_{k=0}^{2^n−1} p_k [P_n(s)]_{k,j} = 2^{−n C_n} exp( −Σ_{i=0}^{2^n−1} [P_n(s)^{−1}]_{j,i} H(Y^n | X^n = x_i) ) > 0   (10)

for all 0 ≤ j ≤ 2^n − 1. Hence, (5) is satisfied. For computational reasons, we write (7) in matrix-vector notation, which reads

  C_n = (1/n) log( 1_{2^n}^T exp( P_n(s)^{−1} (P_n(s) ∘ log P_n(s)) 1_{2^n} ) ).   (11)

In the remainder, we use (11) instead of (7), and we find exact numerical expressions for C_n in Theorem II.7 below.

C. Useful Recursions

In this section, we derive recursions for −(P_n(s) ∘ log P_n(s)) 1_{2^n} and −P_n(s)^{−1} (P_n(s) ∘ log P_n(s)) 1_{2^n}, as stated in Lemma II.4 and Lemma II.5. The recursions, interesting by themselves, are needed to prove the main result. Both expressions are defined next.

Definition II.1.
(a) The conditional entropy vector h_n(s) of P_n(s), s ∈ {0,1}, is

  h_n(s) := ( H(Y^n | X^n = x_0), ..., H(Y^n | X^n = x_{2^n−1}) )^T = −(P_n(s) ∘ log P_n(s)) 1_{2^n}   (13)

where n ∈ ℕ.
(b) The weighted conditional entropy vector ω_n(s) of P_n(s), s ∈ {0,1}, is

  ω_n(s) := P_n(s)^{−1} h_n(s)   (14)
          = −P_n(s)^{−1} (P_n(s) ∘ log P_n(s)) 1_{2^n}   (15)

where n ∈ ℕ.

We remark that h_n(s) and ω_n(s) are column vectors with 2^n entries. The following two lemmas provide tools that we need for the proofs of Lemma II.4 and Lemma II.5.

Lemma II.2.
(a) The trapdoor channel matrices P_{n+1}(0) and P_{n+1}(1), n ∈ ℕ, satisfy the following recursions:

  P_{n+1}(0) = [ P_{n−1}(0)          0                  0                  0
                 (1/2) P_{n−1}(1)    (1/2) P_{n−1}(0)   0                  0
                 (1/4) P_{n−1}(1)    (1/4) P_{n−1}(0)   (1/2) P_{n−1}(0)   0
                 0                   (1/2) P_{n−1}(1)   (1/4) P_{n−1}(1)   (1/4) P_{n−1}(0) ]   (16)

  P_{n+1}(1) = [ (1/4) P_{n−1}(1)    (1/4) P_{n−1}(0)   (1/2) P_{n−1}(0)   0
                 0                   (1/2) P_{n−1}(1)   (1/4) P_{n−1}(1)   (1/4) P_{n−1}(0)
                 0                   0                  (1/2) P_{n−1}(1)   (1/2) P_{n−1}(0)
                 0                   0                  0                  P_{n−1}(1) ]          (17)

(b) Let M_1 := P_{n−1}(0)^{−1} P_{n−1}(1) and M_2 := P_{n−1}(1)^{−1} P_{n−1}(0). The inverses of P_{n+1}(0) and P_{n+1}(1), n ∈ ℕ, satisfy the following recursions:

  P_{n+1}(0)^{−1} = [ P_{n−1}(0)^{−1}           0                        0                   0
                      −M_1 P_{n−1}(0)^{−1}      2 P_{n−1}(0)^{−1}        0                   0
                      0                         −P_{n−1}(0)^{−1}         2 P_{n−1}(0)^{−1}   0
                      2 M_1^2 P_{n−1}(0)^{−1}   −3 M_1 P_{n−1}(0)^{−1}   −2 P_{n−1}(0)^{−1}  4 P_{n−1}(0)^{−1} ]   (18)

  P_{n+1}(1)^{−1} = [ 4 P_{n−1}(1)^{−1}   −2 P_{n−1}(1)^{−1}   −3 M_2 P_{n−1}(1)^{−1}   2 M_2^2 P_{n−1}(1)^{−1}
                      0                   2 P_{n−1}(1)^{−1}    −P_{n−1}(1)^{−1}         0
                      0                   0                    2 P_{n−1}(1)^{−1}        −M_2 P_{n−1}(1)^{−1}
                      0                   0                    0                        P_{n−1}(1)^{−1} ]           (19)

Proof. (a): Substituting (1) and (2) into P_{n+1}(0) and P_{n+1}(1), where the four inner matrices are again expressed as in (1) and (2), yields (16) and (17).
(b): Two versions of the matrix inversion lemma are [7]

  [ A 0 ; C D ]^{−1} = [ A^{−1} 0 ; −D^{−1} C A^{−1} D^{−1} ]   (20)

  [ A B ; 0 D ]^{−1} = [ A^{−1} −A^{−1} B D^{−1} ; 0 D^{−1} ].   (21)

Now divide (16) and (17) into four blocks of equal size. A twofold application of (20) and (21), first to P_{n+1}(0) and P_{n+1}(1) and, subsequently, to each of the blocks of P_{n+1}(0) and P_{n+1}(1), yields (18) and (19). ∎
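The block recursions of Lemma II.2 are easy to check numerically. The sketch below — again our addition, reusing trapdoor_matrices from the first sketch — verifies (16) and (18) for one block length.

    import numpy as np
    # assumes trapdoor_matrices(n) from the first sketch

    n = 4                                    # verify at block length n + 1 = 5
    P0, P1 = trapdoor_matrices(n - 1)        # P_{n-1}(0), P_{n-1}(1)
    Z = np.zeros_like(P0)

    # Two-step channel recursion (16):
    P_rec = np.block([
        [P0,        Z,         Z,         Z],
        [0.5 * P1,  0.5 * P0,  Z,         Z],
        [0.25 * P1, 0.25 * P0, 0.5 * P0,  Z],
        [Z,         0.5 * P1,  0.25 * P1, 0.25 * P0],
    ])
    assert np.allclose(P_rec, trapdoor_matrices(n + 1)[0])

    # Two-step inverse recursion (18), with M_1 = P_{n-1}(0)^{-1} P_{n-1}(1):
    B = np.linalg.inv(P0)
    M1 = B @ P1
    B_rec = np.block([
        [B,               Z,            Z,       Z],
        [-M1 @ B,         2 * B,        Z,       Z],
        [Z,               -B,           2 * B,   Z],
        [2 * M1 @ M1 @ B, -3 * M1 @ B,  -2 * B,  4 * B],
    ])
    assert np.allclose(B_rec, np.linalg.inv(trapdoor_matrices(n + 1)[0]))
    print("block recursions (16) and (18) verified for n =", n)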

A transformation relating P_n(0) to P_n(1), P_n(0)^{−1} to P_n(1)^{−1}, h_n(0) to h_n(1), and ω_n(0) to ω_n(1) is derived next.

Lemma II.3. Let P_n(0) and P_n(1) be trapdoor channel matrices, n ∈ ℕ. We have the following identities.
(a)
  P_n(1) = Ĩ_n P_n(0) Ĩ_n   (22)
  P_n(0) = Ĩ_n P_n(1) Ĩ_n.  (23)
(b)
  P_n(1)^{−1} = Ĩ_n P_n(0)^{−1} Ĩ_n   (24)
  P_n(0)^{−1} = Ĩ_n P_n(1)^{−1} Ĩ_n.  (25)
(c)
  h_n(1) = Ĩ_n h_n(0)   (26)
  h_n(0) = Ĩ_n h_n(1).  (27)
(d)
  ω_n(1) = Ĩ_n ω_n(0)   (28)
  ω_n(0) = Ĩ_n ω_n(1).  (29)
(e) The row sums of P_n(0)^{−1} and P_n(1)^{−1} are 1.

Proof. See [8]. ∎

We can now state the recursive laws for the conditional entropy vector and the weighted conditional entropy vector.

Lemma II.4. For n ∈ ℕ, h_{n+1}(0) satisfies the recursion

  h_{n+1}(0) = [ h_{n−1}(0)
                 (1/2) h_{n−1}(0) + (1/2) Ĩ_{n−1} h_{n−1}(0) + 1_{2^{n−1}}
                 (3/4) h_{n−1}(0) + (1/4) Ĩ_{n−1} h_{n−1}(0) + (3/2) 1_{2^{n−1}}
                 (1/4) h_{n−1}(0) + (3/4) Ĩ_{n−1} h_{n−1}(0) + (3/2) 1_{2^{n−1}} ]   (30)

where the initial values are given by h_0(0) = 0 and h_1(0) = (0, 1)^T.

Proof. See [8]. ∎

Lemma II.5.
(a) For n ∈ ℕ, ω_{2(n+1)}(0) satisfies the recursion

  ω_{2(n+1)}(0) = [ ω_{2n}(0)
                    ω_{2n}(0) + 2 · 1_{2^{2n}}
                    ω_{2n}(0) + 2 · 1_{2^{2n}}
                    ω_{2n}(0) ]   (31)

with initial value ω_2(0) = (0, 2, 2, 0)^T.
(b) For n ∈ ℕ, ω_{2n+1}(0) satisfies the recursion

  ω_{2n+1}(0) = [ ω_{2n−1}(0)
                  Ĩ_{2n−1} ω_{2n−1}(0)
                  ω_{2n−1}(0) + 2 · 1_{2^{2n−1}}
                  Ĩ_{2n−1} ω_{2n−1}(0) + 2 · 1_{2^{2n−1}} ]   (32)

with initial value ω_1(0) = (0, 2)^T.

Proof. (a): The proof is by induction. The case n = 1 can be verified using Definition II.1(b). Now assume that (31) holds for some n. In order to show (31) for n + 1, we evaluate ω_{2(n+1)}(0) using (15), replacing P_{2(n+1)}(0)^{−1} and h_{2(n+1)}(0) with (18) and (30). The details of the simplification steps can be found in [8].
(b): Recall the recursions

  P_{n+1}(0)^{−1} = [ P_n(0)^{−1}                       0
                      −P_n(0)^{−1} P_n(1) P_n(0)^{−1}   2 P_n(0)^{−1} ]   (33)

  P_{n+1}(1)^{−1} = [ 2 P_n(1)^{−1}   −P_n(1)^{−1} P_n(0) P_n(1)^{−1}
                      0               P_n(1)^{−1} ]                        (34)

which follow from (1) and (2). Computing the first half, i.e., the first 2^{2n} entries, of ω_{2n+1}(0) based on Definition II.1(b) and using (33) yields

  [ω_{2n+1}(0)]_{first half} = −P_{2n}(0)^{−1} (P_{2n}(0) ∘ log P_{2n}(0)) 1_{2^{2n}}.   (35)

By definition, the right-hand side of (35) is ω_{2n}(0); an analogous computation shows that the second half equals ω_{2n}(0) + 2 · 1_{2^{2n}}. Hence,

  ω_{2n+1}(0) = [ ω_{2n}(0) ; ω_{2n}(0) + 2 · 1_{2^{2n}} ].   (36)

It remains to express ω_{2n}(0) in (36) in terms of ω_{2n−1}(0). By the same argument as just used, the first half of the vector ω_{2n}(0) equals ω_{2n−1}(0). Since ω_{2n}(0) is a palindrome* by (31), the second half of ω_{2n}(0) equals Ĩ_{2n−1} ω_{2n−1}(0). Hence,

  ω_{2n}(0) = [ ω_{2n−1}(0) ; Ĩ_{2n−1} ω_{2n−1}(0) ].   (37)

By replacing ω_{2n}(0) in (36) with (37), we obtain (32). The initial value ω_1(0) = (0, 2)^T follows from (36) for n = 0, noting that ω_0(0) = 0. ∎

* A palindrome is a finite sequence of symbols, numbers or elements which reads the same backwards as forwards.

Remark II.6. The recursions derived in Lemma II.4 and Lemma II.5 are with respect to the initial state s = 0. They can be transformed into recursions with respect to the initial state s = 1 by using (26) and (28) from Lemma II.3.
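The following sketch — our addition, with trapdoor_matrices from the first sketch — checks Lemma II.3(a)/(e), the recursion (30) of Lemma II.4, and the recursions (31)–(32) of Lemma II.5 against direct computation from Definition II.1; the helper h_omega_direct is reused in the last sketch below.

    import numpy as np
    # assumes trapdoor_matrices(n) from the first sketch

    def h_omega_direct(n):
        """h_n(0) and omega_n(0) computed directly from Definition II.1."""
        P, _ = trapdoor_matrices(n)
        with np.errstate(divide="ignore", invalid="ignore"):
            h = -np.sum(np.where(P > 0, P * np.log2(P), 0.0), axis=1)   # (13)
        return h, np.linalg.inv(P) @ h                                  # (14)

    def h_rec(n):
        """h_n(0) via the two-step recursion (30) of Lemma II.4."""
        h = {0: np.array([0.0]), 1: np.array([0.0, 1.0])}
        for m in range(1, n):
            x = h[m - 1]; r = x[::-1]                 # r = I~ h_{m-1}(0)
            h[m + 1] = np.concatenate([x,
                                       0.50 * x + 0.50 * r + 1.0,
                                       0.75 * x + 0.25 * r + 1.5,
                                       0.25 * x + 0.75 * r + 1.5])
        return h[n]

    def omega_rec(n):
        """omega_n(0) via the recursions (31) and (32) of Lemma II.5."""
        w = {1: np.array([0.0, 2.0]), 2: np.array([0.0, 2.0, 2.0, 0.0])}
        for m in range(3, n + 1):
            x = w[m - 2]
            if m % 2 == 0:                            # (31): even block lengths
                w[m] = np.concatenate([x, x + 2.0, x + 2.0, x])
            else:                                     # (32): odd block lengths
                w[m] = np.concatenate([x, x[::-1], x + 2.0, x[::-1] + 2.0])
        return w[n]

    for n in range(1, 9):
        P0, P1 = trapdoor_matrices(n)
        J = np.fliplr(np.eye(2 ** n))                 # the exchange matrix I~
        assert np.allclose(P1, J @ P0 @ J)            # (22)
        assert np.allclose(np.linalg.inv(P0).sum(axis=1), 1.0)   # Lemma II.3(e)
        h, w = h_omega_direct(n)
        assert np.allclose(h, h_rec(n))               # Lemma II.4
        assert np.allclose(w, omega_rec(n))           # Lemma II.5
    print("Lemmas II.3-II.5 verified for n = 1, ..., 8")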

D. Proof of the Main Result

In this section, we evaluate (11) based on Lemma II.5.

Theorem II.7. Consider the convex optimization problem (3) to (5). The absolute maximum for input blocks of even length n is

  C_n = (1/2) log(5/2) b/u  for all even n ∈ ℕ.   (38)

For input blocks of odd length n = 2m + 1, the absolute maximum is

  C_n = ( log(5/4) + m log(5/2) ) / (2m + 1) b/u   (39)

where m ∈ ℕ₀.

Proof. Without loss of generality, the initial state is assumed to be s = 0. Recall (11), which for input blocks of length n reads

  C_n = (1/n) log( 1_{2^n}^T exp(−ω_n(0)) ) b/u.   (40)

For n = 1 and n = 2, a straightforward computation using the initial values in (31) and (32) shows that C_1 = log(5/4) b/u and C_2 = (1/2) log(5/2) b/u. Now assume that (38) and (39) hold for some n. In particular, suppose that

  1_{2^{2n}}^T exp(−ω_{2n}(0)) = (5/2)^n   (41)

  1_{2^{2n−1}}^T exp(−ω_{2n−1}(0)) = (5/4) (5/2)^{n−1}.   (42)

We now show that (41) and (42) hold if n is replaced by n + 1. By means of the recursions derived in Lemma II.5, we have

  1_{2^{2(n+1)}}^T exp(−ω_{2(n+1)}(0)) = 2 (1 + 2^{−2}) 1_{2^{2n}}^T exp(−ω_{2n}(0)) = (5/2)^{n+1}   (43)

  1_{2^{2n+1}}^T exp(−ω_{2n+1}(0)) = 2 (1 + 2^{−2}) 1_{2^{2n−1}}^T exp(−ω_{2n−1}(0))   (44)
                                   = (5/4) (5/2)^n.   (45)

Observe that we used the property 1_{2^{2n−1}}^T exp(−Ĩ_{2n−1} ω_{2n−1}(0)) = 1_{2^{2n−1}}^T exp(−ω_{2n−1}(0)) in (44). Finally, using (40) together with (43) to (45), we obtain

  C_{2(n+1)} = (1/(2n+2)) log (5/2)^{n+1} = (1/2) log(5/2) b/u

  C_{2n+1} = (1/(2n+1)) log( (5/4)(5/2)^n ) = ( log(5/4) + n log(5/2) ) / (2n + 1) b/u.  ∎

Remark II.8. Observe that lim_{n→∞} C_{2n+1} = (1/2) log(5/2), where the convergence is from below. Hence, we have

  max_{n∈ℕ} C_n = (1/2) log(5/2) ≈ 0.6610 b/u.

Unfortunately, the distributions corresponding to (38) and (39) involve negative probabilities; otherwise, the capacity of the trapdoor channel would have been established. We elaborate on this issue in the following remark.

Remark II.9. Note that the condition d_k ≥ 0 does not hold for all k ∈ {0, ..., 2^n − 1}. This can be verified as follows. For a trapdoor channel P_n(0), the coefficients (9) read in matrix-vector notation

  d = (d_0, ..., d_{2^n−1})^T = (P_n(0)^{−1})^T exp(−ω_n(0)).   (46)

We now compute the second-to-last row of (P_n(0)^{−1})^T by the following recursive scheme: apply the matrix inversion lemma in the form of (20) to P_n(0), written as in (1), and subsequently take the transpose; then replace the bottom-right block of this matrix, which is 2 (P_{n−1}(0)^{−1})^T, by the same expression with n replaced by n − 1, multiplied by two; repeat the procedure on the bottom-right block until it equals 2^{n−1} (P_1(0)^{−1})^T. This shows that the second-to-last row of (P_n(0)^{−1})^T equals (0, ..., 0, 2^{n−1}, −2^{n−1}). Further, using Lemma II.5, it follows that the second-to-last and the last entry of ω_n(0) equal 2 and 0 if n is even, and 4 and 2 if n ≥ 3 is odd. Inserting the gathered quantities into (46) yields

  d_{2^n−2} = −3 · 2^{n−3}  if n is even,
  d_{2^n−2} = −3 · 2^{n−5}  if n ≥ 3 is odd.

Hence, the condition d_k ≥ 0 does not hold for all k ∈ {0, ..., 2^n − 1}.
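Remark II.9 is easy to confirm numerically; the sketch below — our addition, reusing trapdoor_matrices and h_omega_direct from the earlier sketches — evaluates (46) and checks the sign of d_{2^n−2}.

    import numpy as np
    # assumes trapdoor_matrices and h_omega_direct from the earlier sketches

    for n in range(2, 9):
        P, _ = trapdoor_matrices(n)
        _, w = h_omega_direct(n)
        d = np.linalg.inv(P).T @ np.exp2(-w)           # eq. (46)
        k = 2 ** n - 2                                  # second-to-last index
        expected = -3 * 2.0 ** (n - 3 if n % 2 == 0 else n - 5)
        assert np.isclose(d[k], expected)
        print(n, d[k])   # strictly negative, so the maximizer is not a pmf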
III. CONCLUSIONS

We have focused on the convex optimization problem (3) to (5), whose feasible set is larger than the probability simplex. An absolute maximum of the n-letter mutual information was established for any n ∈ ℕ by using the method of Lagrange multipliers. The same absolute maximum (1/2) log(5/2) ≈ 0.6610 b/u results for all even n, and the sequence of absolute maxima corresponding to odd block lengths converges from below to (1/2) log(5/2) b/u as the block length increases. Unfortunately, all absolute maxima are attained outside the probability simplex. Hence, instead of establishing the capacity of the trapdoor channel, we have shown only that (1/2) log(5/2) b/u is an upper bound on the capacity. To the best of our knowledge, this upper bound is the tightest known bound. Notably, it is strictly smaller than the feedback capacity [5]. Moreover, the result gives an indirect justification that the capacity of the trapdoor channel is attained on the boundary of the probability simplex.

ACKNOWLEDGMENT

The author is supported by the German Ministry of Education and Research in the framework of the Alexander von Humboldt-Professorship and would like to thank Prof. Haim Permuter, who suggested the use of [2, Th. 3.3.3]. Moreover, the author wishes to thank Prof. Gerhard Kramer and Prof. Tsachy Weissman for helpful discussions.

REFERENCES

[1] D. Blackwell, "Information theory," in Modern Mathematics for the Engineer, E. F. Beckenbach, Ed. New York: McGraw-Hill, 1961.
[2] R. Ash, Information Theory. New York: Interscience Publishers, 1965.
[3] K. Kobayashi and H. Morita, "An input/output recursion for the trapdoor channel," in Proc. IEEE Int. Symp. Inf. Theory, Lausanne, Switzerland, Jun. 30 - Jul. 5, 2002.
[4] R. Ahlswede and A. H. Kaspi, "Optimal coding strategies for certain permuting channels," IEEE Trans. Inf. Theory, vol. 33, no. 3, pp. 310-314, 1987.
[5] H. Permuter, P. Cuff, B. Van Roy, and T. Weissman, "Capacity of the trapdoor channel with feedback," IEEE Trans. Inf. Theory, vol. 54, no. 7, pp. 3150-3165, Jul. 2008.
[6] R. G. Gallager, Information Theory and Reliable Communication. New York: John Wiley & Sons, 1968.
[7] G. H. Golub and C. F. Van Loan, Matrix Computations, 3rd ed. Baltimore, MD: The Johns Hopkins University Press, 1996.
[8] T. Lutz, "Various views on the trapdoor channel and an upper bound on its capacity," 2014, available online.
