Research Collection
Conference Paper

The Relation Between Block Length and Reliability for a Cascade of AWGN Links

Author(s): Subramanian, Ramanan
Publication Date: 2012
Permanent Link:
Rights / License: In Copyright - Non-Commercial Use Permitted

This page was generated automatically upon download from the ETH Zurich Research Collection. For more information, please consult the Terms of use.

ETH Library

Int. Zurich Seminar on Communications (IZS), February 29 - March 2, 2012

The Relation Between Block Length and Reliability for a Cascade of AWGN Links

Ramanan Subramanian
Institute for Telecommunications Research, University of South Australia, Mawson Lakes, SA 5095, Australia
Ramanan.Subramanian@unisa.edu.au

Abstract: This paper presents a fundamental information-theoretic scaling law for a heterogeneous cascade of links corrupted by Gaussian noise. For any set of fixed-rate encoding/decoding strategies applied at the links, it is claimed that a scaling of Θ(log n) in block length is both sufficient and necessary for reliable communication. A proof of this claim is provided using tools from the convergence theory for inhomogeneous Markov chains.

I. INTRODUCTION

Consider a line network in which a message originating at a certain node is conveyed to the intended destination through a series of hops. The reception at every hop is impaired by zero-mean additive white Gaussian noise (AWGN) of a certain variance. The originating node as well as the intermediate nodes are subject to an average transmit-power constraint P_0 for the duration of transmission of any message. The messages, drawn from a certain set of size M, are encoded and transmitted as codewords in R^N that satisfy the transmit-power constraint. Every intermediate hop decodes (possibly erroneously) the original message from its noisy observation and re-encodes it before transmitting to the next hop. One can form the row-stochastic probability transition matrix P_i for the i-th hop, in which each row gives the conditional probability distribution of the message decoded at hop i, given a certain message sent by the previous hop:

      ⎡ p_11^(i)  p_12^(i)  ···  p_1M^(i) ⎤
P_i = ⎢ p_21^(i)  p_22^(i)  ···  p_2M^(i) ⎥
      ⎢    ⋮         ⋮       ⋱      ⋮     ⎥
      ⎣ p_M1^(i)  p_M2^(i)  ···  p_MM^(i) ⎦

Assume that a network of the type outlined above consists of n hops. If the encoder-decoder pairs are identical at the hops, and if the noise variance at every hop is σ², then the matrices P_i will be identical to, say, P for all i. Then, the rows of P^n give the conditional probability distributions of the message Ŵ_n decoded at the final destination, given a certain message W at the originating node. For decisions made on signals corrupted by Gaussian noise, P has the property that any column is either all-zero or contains only non-zero entries. For such a matrix P, the rows of P^n tend to be identical for large n (see Theorem 4.9 in [1]) if M, N, P_0, and σ² are independent of n. In other words, the probability distribution of Ŵ_n will be almost independent of the original message W. This implies that the mutual information I(W; Ŵ_n) between the original message and the decoded message tends to zero with n. Hence, communication in the network is highly unreliable if N and M do not vary with the number of hops n.

On the other hand, if N is large and if one chooses M = 2^{NR} for any R < (1/2) log(1 + P_0/σ²), the random-coding theorem states that there exists a channel code of block length N such that the probability of error during decoding at any link is bounded above by P_B ≤ e^{-N E(R, P_0/σ²)}, where E(·,·) is the random-coding exponent for the corresponding AWGN channel [2], [3]. By employing such a code for each link in the line network model, choosing any f(n) ∈ ω(1), and letting N vary with n as

N = (log n + f(n)) / E(R, P_0/σ²),

one obtains the following union bound on the probability of communication failure in the network:

P[Ŵ_n ≠ W] ≤ n P_B ≤ n e^{-N E(R, P_0/σ²)} = e^{-f(n)} → 0.   (1)

Hence, it is possible to transmit messages with very high reliability in the network for a fixed code rate R, if the code length N satisfies N ∈ Θ(log n).

(This work has been supported by the Australian Research Council under the ARC Discovery Grant DP.)
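For concreteness, the block-length choice behind Eqn. (1) can be tabulated. In the sketch below, the exponent value E0 and the choice f(n) = log log n are illustrative placeholders rather than quantities computed in the paper; any f(n) ∈ ω(1) gives the same qualitative behaviour.

    import math

    E0 = 0.1   # hypothetical per-link random-coding exponent E(R, P_0/sigma^2), in nats

    def block_length(n):
        """N = (log n + f(n)) / E, with the illustrative choice f(n) = log log n."""
        return (math.log(n) + math.log(math.log(n))) / E0

    for n in (10, 100, 10_000, 1_000_000):
        N = block_length(n)
        failure = n * math.exp(-N * E0)   # union bound of Eqn. (1): equals e^{-f(n)}
        print(f"n={n:>9}  N = {N:6.1f}  P[failure] <= {failure:.3f}")

The block length grows only logarithmically in the number of hops, yet the end-to-end failure bound e^{-f(n)} still tends to zero.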
On the other hand, the previous discussion about diminishing I(W; Ŵ_n) implies that any fixed-rate, fixed-length coding scheme is not scalable for the line network. This leads to the natural question whether the scaling Θ(log n) for N is indeed optimal. In this paper, we show that for any scaling of N satisfying N ∈ o(log n), the mutual information I(W; Ŵ_n) diminishes to zero with n for any set of fixed-rate codes. This implies that the scaling order Θ(log n) for N is necessary for reliable communication in the network. Hence, the sufficiency criterion for N given by the union bound in Eqn. (1) is indeed optimal up to a constant factor. While this scaling problem has been solved partially in the past for cascades of homogeneous DMCs by Niesen et al. in [4], it has also been pointed out in the same work that the tools developed there apply neither to non-homogeneous cascades nor to cascades of continuous-input channels, as is the case in the current paper. In contrast, the results developed here are applicable to continuous-input AWGN links where the links can be non-identical.

(Notation: for any f(n) > 0 and g(n) > 0, f(n) ∈ ω(g(n)) ⟺ lim_{n→∞} f(n)/g(n) = ∞; f(n) ∈ o(g(n)) ⟺ lim_{n→∞} f(n)/g(n) = 0; and f(n) ∈ Θ(g(n)) ⟺ ∃ c > 0 s.t. lim_{n→∞} f(n)/g(n) = c.)
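The necessity direction rests on the observation that the rows of P^n coalesce when the per-hop matrix P is fixed. This is easy to see numerically; in the sketch below a toy M-ary symmetric matrix stands in for the code-induced AWGN matrix (an assumption made purely for illustration), and the printed quantity is the row spread δ(P^n) defined later in Eqn. (10).

    import numpy as np

    # Toy stand-in for the per-hop matrix P: decode correctly w.p. 1-p,
    # err uniformly over the other M-1 messages otherwise.
    M, p = 8, 0.05
    P = np.full((M, M), p / (M - 1))
    np.fill_diagonal(P, 1 - p)

    def row_spread(Q):
        """delta(Q): largest gap between any two rows in any single column."""
        return float(np.max(Q.max(axis=0) - Q.min(axis=0)))

    for n in (1, 10, 100, 1000):
        print(n, row_spread(np.linalg.matrix_power(P, n)))
    # The spread decays geometrically in n: after enough hops the rows of
    # P^n coincide, and the destination's estimate is nearly independent of W.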

II. NOTATIONS AND DEFINITIONS

The set of all real numbers is denoted by R and the set of all natural numbers by N. Natural logarithms are assumed in the notations, unless the base is specified. The notation ‖·‖ represents the L2 norm throughout. We employ the following terminology in this paper. Let N ∈ N be the code length or block length of the transmission scheme. A code rate R > 0 is a real number such that 2^{NR} is an integer. Let M ≜ {1, 2, 3, ..., 2^{NR}} be the message alphabet. Let P_0 > 0 be the input power constraint for transmission.

Definition 1: A rate-R, length-N code C with a power constraint of P_0 is an ordering of M = 2^{NR} elements of R^N, called code words, that satisfies:

C = (x_1, x_2, x_3, ..., x_M)  s.t.  ∀ w ∈ M, ‖x_w‖² ≤ N P_0.   (2)

Definition 2: A rate-R, length-N decision rule R is an ordering of M = 2^{NR} subsets of R^N, called decision regions, spanning R^N and pairwise disjoint:

R = (R_1, R_2, ..., R_M)  s.t.  ⋃_{w∈M} R_w = R^N, and ∀ w ≠ w′, R_w ∩ R_w′ = ∅.   (3)

Definition 3: The encoding function ENC_C : M → R^N for a code C is defined by:

ENC_C(w) = x_w,   (4)

where x_w is the w-th code word in C.

Definition 4: The decoding function DEC_R : R^N → M for a decision rule R is defined by:

DEC_R(y) = Σ_{w∈M} w · 1_{R_w}(y),   (5)

where 1_{(·)} is the indicator function and R_w is the w-th subset in R.

III. NETWORK MODEL

The line network model to be considered is described in Fig. 1. There are n+1 nodes in the network, identified by the indices {0, 1, 2, ..., n}. The n hops in the network are each associated with noise variances σ_i², 1 ≤ i ≤ n. For encoding, nodes 0, 1, ..., n−1 choose C_0, C_1, ..., C_{n−1} respectively to transmit, each code being rate-R, length-N with a power constraint of P_0. For decoding, nodes 1, 2, ..., n choose rate-R, length-N decision rules R_1, R_2, ..., R_n for reception. From here on, we write ENC_i and DEC_i in place of ENC_{C_i} and DEC_{R_i} respectively, for simplicity. Node 0 generates a random message W ∈ M with probability distribution p_W(w) and intends to convey the same to node n through the cascade of noisy links in a multihop fashion. Each of the n−1 intermediate nodes estimates the message sent by the node in the previous hop from its own noisy observation, re-encodes the decoded message, and transmits the resulting codeword to the next hop. The codeword transmitted by node i, for any 0 ≤ i ≤ n−1, is given by

X_i = ENC_i(Ŵ_i),   (6)

where Ŵ_i is the estimate of the message at node i after decoding. Note that Ŵ_0 = W in this notation. The observation received by node i, for any 1 ≤ i ≤ n, is given by Y_i, which follows a conditional density function that depends on the codeword X_{i−1} sent by the previous node:

p_{Y_i|X_{i−1}}(y|x) = (2πσ_i²)^{−N/2} e^{−‖y−x‖²/(2σ_i²)}.   (7)

The above density function follows from the assumptions of AWGN and memorylessness in the channel. The message Ŵ_i decoded by node i is given by

Ŵ_i = DEC_i(Y_i).   (8)

Note that the random variable Ŵ_n represents the message decoded by the final destination, as per the above notation.
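Definitions 1-4 and the hop equations (6)-(8) translate directly into simulation. The sketch below is mine rather than the paper's: the random codebook scaled onto the power sphere and the minimum-distance decision rule are illustrative choices, whereas the results of Section IV hold for any code and decision rule.

    import numpy as np

    rng = np.random.default_rng(0)
    N, R, P0, sigma2 = 16, 0.25, 1.0, 0.5
    M = int(round(2 ** (N * R)))       # message alphabet size, 2^{NR}

    # Definition 1: M code words in R^N with ||x_w||^2 <= N * P0
    # (each word scaled onto the power-constraint sphere).
    X = rng.standard_normal((M, N))
    X *= np.sqrt(N * P0) / np.linalg.norm(X, axis=1, keepdims=True)

    def enc(w):                        # Definition 3: ENC(w) = x_w
        return X[w]

    def dec(y):                        # Definitions 2 and 4: a minimum-distance
        # rule, whose decision regions implicitly partition R^N
        return int(np.argmin(np.linalg.norm(X - y, axis=1)))

    def hop(w):                        # Eqns (6)-(8): one AWGN link
        y = enc(w) + np.sqrt(sigma2) * rng.standard_normal(N)
        return dec(y)

    w = int(rng.integers(M))
    errors = sum(hop(w) != w for _ in range(1000))
    print(f"M = {M}, estimated per-hop error probability: {errors / 1000:.3f}")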
IV. AN UPPER BOUND ON THE RELIABILITY OF THE NETWORK

The key result presented here is summarized by the following theorem and its corollary:

Theorem 1: In a line network consisting of a cascade of n AWGN links, let the noise variances satisfy σ_i² ≥ σ_0² > 0 for m ≤ n of the links. Then, for any choice of codes C_0, C_1, ..., C_{n−1} and decision rules R_1, R_2, ..., R_n with rate R and length N, and for any ɛ > 0, the following holds for sufficiently large m:

I(W; Ŵ_n) ≤ 2^{NR} [ 1 − (N/2)^{N/2} Γ(N/2+1)^{−1} e^{−N E(P_0/σ_0²)} ]^{m^{1−ɛ}},

where

E(S) ≜ (1/2) [ exp{cosh^{−1}(1 + S/2)} + cosh^{−1}(1 + S/2) ].

The corollary to the above theorem leads to the upper bound on the reliability-versus-block-length scaling claimed in the introductory section:

Corollary 1: Let R > 0, σ_0² > 0, and r > 0 be independent of n. In a line network of n links, let the noise variance be at least σ_0² for ⌈n^r⌉ of the hops. Then, for any N ∈ o(log n), lim_{n→∞} I(W; Ŵ_n) = 0.

Proof of Theorem 1: At any hop i, the process of channel encoding, noisy reception, followed by decoding induces the conditional probabilities p_{Ŵ_i|Ŵ_{i−1}}(k|j), j, k ∈ M. For every hop i, let P_i be the M × M row-stochastic matrix whose entry in row j and column k is p_{Ŵ_i|Ŵ_{i−1}}(k|j). Note that the j-th row in P_i gives the conditional probability mass function on the estimate of the message W at hop i (i.e., Ŵ_i), given that the message sent by hop i−1 was j. Let

P ≜ ∏_{i=1}^{n} P_i.   (9)

Then, P clearly represents the row-stochastic probability transition matrix between the original message W and the message decoded at the final destination, Ŵ_n.
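The matrices P_i of Eqn. (9) can be estimated by Monte Carlo and multiplied out, making the cascade's loss of information directly visible. The sketch below reuses the illustrative random-codebook construction from the previous snippet, with deliberately small toy parameters (hypothetical choices, not values from the paper).

    import numpy as np

    rng = np.random.default_rng(1)
    N, M, P0, sigma2 = 8, 4, 1.0, 0.5          # small toy sizes for speed
    X = rng.standard_normal((M, N))
    X *= np.sqrt(N * P0) / np.linalg.norm(X, axis=1, keepdims=True)

    def hop(j):
        """One AWGN link: transmit code word j, decode by minimum distance."""
        y = X[j] + np.sqrt(sigma2) * rng.standard_normal(N)
        return int(np.argmin(np.linalg.norm(X - y, axis=1)))

    def estimate_P(trials=4000):
        """Monte-Carlo estimate of the M x M per-hop transition matrix P_i."""
        P = np.zeros((M, M))
        for j in range(M):
            for _ in range(trials):
                P[j, hop(j)] += 1
        return P / trials

    Pi = estimate_P()
    Pcas = np.linalg.matrix_power(Pi, 50)      # Eqn (9) with 50 identical hops
    for hops, Q in ((1, Pi), (50, Pcas)):
        print(hops, "hop(s): row spread =", float(np.max(Q.max(0) - Q.min(0))))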

[Fig. 1: Line Network Model. Node 0 encodes W as X_0 = ENC_0(W); each intermediate node i decodes Ŵ_i = DEC_i(Y_i) and re-encodes X_i = ENC_i(Ŵ_i); the destination outputs Ŵ_n = DEC_n(Y_n).]

The transition matrix P, along with p_W, the probability mass function of the original message W, together induces a joint distribution between W and Ŵ_n. Our goal is to find an upper bound on I(W; Ŵ_n) under the constraints given in the statement of Theorem 1. For any M × M row-stochastic matrix Q = {Q_jk}, let us consider the following measures of deviation:

δ(Q) ≜ max_{j,j′} max_k |Q_jk − Q_j′k|,   (10)

λ(Q) ≜ min_{j,j′} Σ_{k=1}^{M} min{Q_jk, Q_j′k}.   (11)

Both of the above deviations measure the degree to which the rows of Q differ. The measure 1 − λ(Q) is known as Dobrushin's coefficient of ergodicity and plays a role in the convergence of non-homogeneous stochastic processes [5]. Moreover, for any stochastic matrix Q,

0 ≤ λ(Q) ≤ 1.   (12)

Now consider the total variational distance between the joint distribution of W and Ŵ_n and the product of the respective marginal distributions:

τ(W; Ŵ_n) ≜ Σ_{w,ŵ∈M} | p_W(w) p_{Ŵ_n}(ŵ) − p_{W,Ŵ_n}(w, ŵ) |.

In the above expression, substituting p_W(w) p_{Ŵ_n|W}(ŵ|w) for p_{W,Ŵ_n}(w, ŵ) and Σ_{w′∈M} p_{Ŵ_n|W}(ŵ|w′) p_W(w′) for p_{Ŵ_n}(ŵ), and by using the triangle inequality, one obtains the following bound:

τ(W; Ŵ_n) ≤ Σ_w p_W(w) Σ_{w′} p_W(w′) · M δ(P) = M δ(P) = 2^{NR} δ(P).   (13)

Recalling the definition of P in our case, and from the product bound for ergodicity coefficients in [6] (equivalently, in [7]), we have:

δ(P) = δ(∏_{i=1}^{n} P_i) ≤ ∏_{i=1}^{n} (1 − λ(P_i)) ≤ ∏_{i: σ_i² ≥ σ_0²} (1 − λ(P_i)).   (14)

The second inequality above follows by applying Eqn. (12) to those hops for which σ_i² < σ_0². Note that, as per the statement of the theorem, there are at least m terms in the product in the last expression above. Now consider λ(P_i) for any hop i such that σ_i² ≥ σ_0². Suppose P_i has L distinct columns with indices k_1, k_2, ..., k_L such that

∀ j, P_{j k_1} = p_{Ŵ_i|Ŵ_{i−1}}(k_1|j) ≥ p_1,   (15)
 ⋮
∀ j, P_{j k_L} = p_{Ŵ_i|Ŵ_{i−1}}(k_L|j) ≥ p_L.   (16)

Then from the definition in Eqn. (11), it follows that

λ(P_i) ≥ Σ_{l=1}^{L} p_l.   (17)

We now show that the conditions given by Eqns. (15)-(16) hold true in our case, so that we can apply Eqn. (17) to our bound. Let α > 0. Consider one particular hop i satisfying σ_i² ≥ σ_0². Define the spherical regions

S_1 ≜ { y ∈ R^N : ‖y‖ ≤ √(N P_0) },
S_α ≜ { y ∈ R^N : ‖y‖ ≤ α √(N P_0) }.

Clearly, any code word in C_{i−1} transmitted by the previous hop has to lie in S_1. Next, observe that R_i has to contain decision regions, say L in number, spanning S_α entirely. Let us designate them as R_{k_1}, R_{k_2}, ..., R_{k_L}. Also define, for each 1 ≤ l ≤ L, V_l ≜ R_{k_l} ∩ S_α. Hence, we have:

⋃_{l=1}^{L} V_l = S_α, and ∀ l ≠ l′, V_l ∩ V_l′ = ∅.   (18)

Let Ŵ_{i−1} = j be the message as decoded by the previous hop, so that the code word x_j from C_{i−1} was transmitted by node i−1. The probability that this is decoded as message k_l at hop i is given by:

p_{Ŵ_i|Ŵ_{i−1}}(k_l|j) = ∫_{R_{k_l}} p_{Y_i|X_{i−1}}(y|x_j) dy
  ≥ ∫_{V_l} p_{Y_i|X_{i−1}}(y|x_j) dy
  =(a) (2πσ_i²)^{−N/2} ∫_{V_l} e^{−‖y−x_j‖²/(2σ_i²)} dy
  ≥(b) (2πσ_i²)^{−N/2} ∫_{V_l} e^{−(‖y‖+‖x_j‖)²/(2σ_i²)} dy
  ≥(c) (2πσ_i²)^{−N/2} e^{−(1+α)² N P_0/(2σ_i²)} |V_l|,   (19)

where |V_l| denotes the N-dimensional volume of V_l. In the above series of inequalities, step (a) follows from Eqn. (7), step (b) from the triangle inequality, and step (c) from the fact that any y ∈ V_l must satisfy ‖y‖ ≤ α√(N P_0) (since V_l ⊆ S_α), and from the fact that x_j, being a code word in C_{i−1}, naturally satisfies the power-constraint criterion ‖x_j‖² ≤ N P_0. Note that the lower bound given by Eqn. (19) is independent of the message j decoded by the previous hop. Hence, the conditions given by Eqns. (15)-(16) are satisfied for

p_l = (2πσ_i²)^{−N/2} e^{−(1+α)² N P_0/(2σ_i²)} |V_l|.   (20)

Hence, the bound given by Eqn. (17) holds true.
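The deviation measures of Eqns. (10)-(11) and the product bound of Eqn. (14) can be sanity-checked numerically. In the sketch below the cascade consists of randomly generated stochastic matrices, an illustrative stand-in for the code-induced P_i:

    import numpy as np

    rng = np.random.default_rng(2)

    def delta(Q):
        """Eqn (10): max over row pairs and columns of |Q_jk - Q_j'k|."""
        return float(np.max(Q.max(axis=0) - Q.min(axis=0)))

    def lam(Q):
        """Eqn (11): min over row pairs of the overlap sum_k min(Q_jk, Q_j'k)."""
        M = Q.shape[0]
        return min(float(np.minimum(Q[j], Q[k]).sum())
                   for j in range(M) for k in range(M) if j != k)

    M, n = 5, 8
    Ps = [rng.dirichlet(np.ones(M), size=M) for _ in range(n)]  # random row-stochastic
    lhs = delta(np.linalg.multi_dot(Ps))
    rhs = float(np.prod([1 - lam(P) for P in Ps]))
    print(f"delta(prod) = {lhs:.3e} <= prod(1 - lambda) = {rhs:.3e}: {lhs <= rhs}")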

In our case, it gives:

λ(P_i) ≥ Σ_{l=1}^{L} p_l = (2πσ_i²)^{−N/2} e^{−(1+α)² N P_0/(2σ_i²)} Σ_{l=1}^{L} |V_l|
      = Γ(N/2+1)^{−1} ( α² N P_0 / (2σ_i²) )^{N/2} e^{−(1+α)² N P_0/(2σ_i²)}.   (21)

The last equality follows from the fact that the V_l are mutually disjoint and span the entire N-dimensional sphere S_α (Eqn. (18)). The volume of S_α is computed as:

|S_α| = ( π α² N P_0 )^{N/2} / Γ(N/2 + 1).   (22)

The ideas in the above argument are outlined in Figs. 2a and 2b.

[Fig. 2: Geometrical argument used for bounding λ(P_i). (a) Intersecting regions of S_α with the decision rule R_i, giving the subsets V_1, ..., V_L. (b) Integrating the density function of the noise centered at x_j over a subset V_l of S_α.]

Next, note that the bound given by Eqn. (21) holds true for any α > 0. Writing S ≜ P_0/σ_i², the right-hand side of Eqn. (21) equals (N/2)^{N/2} Γ(N/2+1)^{−1} exp{−N [ (1+α)² S/2 − (1/2) log(α² S) ]}. Hence, we can maximize the right-hand side of Eqn. (21) over all α > 0 to obtain

λ(P_i) ≥ (N/2)^{N/2} Γ(N/2+1)^{−1} e^{−N E(P_0/σ_i²)},   (23)

where the function E(S) is as given in the statement of the theorem. Noting further that E(S) is a non-decreasing function of S, we will have, for hops satisfying σ_i² ≥ σ_0²,

λ(P_i) ≥ (N/2)^{N/2} Γ(N/2+1)^{−1} e^{−N E(P_0/σ_0²)}.   (24)

Since at least m hops satisfy the minimum-noise-variance criterion σ_i² ≥ σ_0², we can now combine Eqn. (24) with Eqn. (14) to obtain

δ(P) ≤ [ 1 − (N/2)^{N/2} Γ(N/2+1)^{−1} e^{−N E(P_0/σ_0²)} ]^m.   (25)

Applying the above to Eqn. (13), we will have

τ(W; Ŵ_n) ≤ 2^{NR} [ 1 − (N/2)^{N/2} Γ(N/2+1)^{−1} e^{−N E(P_0/σ_0²)} ]^m.   (26)

We can finally bound I(W; Ŵ_n), using Lemma 2.7 of Chapter 1 in [8], by:

I(W; Ŵ_n) ≤ 2NR τ(W; Ŵ_n) log 2 − τ(W; Ŵ_n) log τ(W; Ŵ_n).   (27)

The bound given by the statement of the theorem then follows from combining the above with Eqn. (26), by observing that for sufficiently large m, τ(W; Ŵ_n) ≤ e^{−1}, that −x log x is non-decreasing for x ≤ e^{−1}, and finally from the fact that, for any ɛ > 0, x^ɛ ≥ log x for sufficiently large x. ∎

Proof of Corollary 1: For the corollary, we have m = ⌈n^r⌉. Moreover, Γ(N/2+1) ≤ (N/2)^{N/2} for N ≥ 2. Hence, for any ɛ ∈ (0, r), the bound given by Theorem 1 reduces to:

I(W; Ŵ_n) ≤ 2^{NR} ( 1 − e^{−N E(P_0/σ_0²)} )^{n^{r−ɛ}}   (28)
  ≤ 2^{NR} e^{ −n^{r−ɛ} e^{−N E(P_0/σ_0²)} }   (29)
  = e^{ NR log 2 − e^{ (r−ɛ) log n − N E(P_0/σ_0²) } }.   (30)

The last expression diminishes to 0 with n if N ∈ o(log n). Hence, we will have I(W; Ŵ_n) → 0 for any N ∈ o(log n). ∎
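The closed form of E(S) in Theorem 1 is the value of the optimization over α carried out between Eqns. (21) and (23). The sketch below cross-checks the two numerically; it is my own consistency check rather than code from the paper, and it assumes SciPy for the one-dimensional minimization.

    import numpy as np
    from scipy.optimize import minimize_scalar

    def E_closed(S):
        """E(S) = (1/2) [ exp(arccosh(1 + S/2)) + arccosh(1 + S/2) ]."""
        t = np.arccosh(1.0 + S / 2.0)
        return 0.5 * (np.exp(t) + t)

    def E_variational(S):
        """min over alpha > 0 of (1+a)^2 * S/2 - (1/2) * log(a^2 * S)."""
        f = lambda a: (1 + a) ** 2 * S / 2 - 0.5 * np.log(a ** 2 * S)
        return minimize_scalar(f, bounds=(1e-6, 50.0), method="bounded").fun

    for S in (0.1, 1.0, 4.0, 10.0):
        print(f"S = {S:5.1f}  closed: {E_closed(S):.6f}  variational: {E_variational(S):.6f}")

The two agree because the minimizer satisfies S α (1 + α) = 1, which gives the cosh^{−1}(1 + S/2) term after substitution.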
REFERENCES

[1] David A. Levin, Yuval Peres, and Elizabeth L. Wilmer, Markov Chains and Mixing Times, American Mathematical Society, 1st edition, 2008.
[2] Claude E. Shannon, "Probability of error for optimal codes in a Gaussian channel," in Claude Elwood Shannon: Collected Papers, pp. 279-324, 1993.
[3] R. Gallager, "A simple derivation of the coding theorem and some applications," IEEE Trans. Inform. Theory, vol. 11, no. 1, pp. 3-18, Jan. 1965.
[4] U. Niesen, C. Fragouli, and D. Tuninetti, "On capacity of line networks," IEEE Trans. Inform. Theory, vol. 53, no. 11, pp. 4039-4058, Nov. 2007.
[5] Adolf Rhodius, "On the maximum of ergodicity coefficients, the Dobrushin ergodicity coefficient, and products of stochastic matrices," Linear Algebra and its Applications, vol. 253, no. 1-3, pp. 141-154, 1997.
[6] J. Hajnal and M. S. Bartlett, "Weak ergodicity in non-homogeneous Markov chains," Mathematical Proceedings of the Cambridge Philosophical Society, vol. 54, pp. 233-246, 1958.
[7] J. Wolfowitz, "Products of indecomposable, aperiodic, stochastic matrices," Proceedings of the American Mathematical Society, vol. 14, no. 5, pp. 733-737, Oct. 1963.
[8] I. Csiszár and J. Körner, Information Theory: Coding Theorems for Discrete Memoryless Systems, Probability and Mathematical Statistics, Academic Press, 1981.
