Strong Converse Theorems for Classes of Multimessage Multicast Networks: A Rényi Divergence Approach
1 Strong Converse Theorems for Classes of Multimessage Multicast Networks: A Rényi Divergence Approach

Silas Fong (Joint work with Vincent Tan)
Department of Electrical & Computer Engineering
National University of Singapore
{silas fong, vtan}@nus.edu.sg
January 21, 2015

Silas Fong (NUS), Strong Converse for MMNs, January 21, 2015, slide 1/20
2 Overview
1. Background: Multimessage Multicast Networks; Weak Converse vs Strong Converse
2. Problem Formulation: Network Model; Cut-Set Outer Bound and Noisy Network Coding Inner Bound
3. Main Result: Main Theorem; Strong Converse for Classes of MMNs
4. Proof Sketch: Rényi Divergence; Proof Sketch
5. Conclusion
3 Multimessage Multicast Networks (MMNs)
- Multiple sources transmit messages to multiple destinations.
- Each source transmits one message.
- Each destination decodes all the messages.
- Examples include the butterfly network and the relay channel.
[Figures: a butterfly-type network with sources s1, s2, relays r1, r2 and destinations d1, d2, and a three-node relay channel with source s, relay r and destination d.]
4 Some MMNs with Known Capacity Regions
[Figure: a three-node network with inputs $X_1$, $X_2$ at nodes 1 and 2 and outputs $Y_2$, $Y_3$ at nodes 2 and 3.]
- Finite-field linear deterministic network [Avestimehr et al., 2011]: $Y_3 = X_1 + X_2$ over $\mathrm{GF}(p^n)$ for some finite field $\mathrm{GF}(p^n)$.
- MMN consisting of independent DMCs [Köetter et al., 2011]: $Y_3 = (X_1 + Z_1, X_2 + Z_2)$ where $Z_1$ and $Z_2$ are independent noises. The linear network coding model is the special case where $Z_1 = Z_2 = 0$ [Li et al., 2003].
- Wireless erasure network [Dana et al., 2006]: $Y_3 = (\hat{X}_1, \hat{X}_2)$ where $\hat{X}_i = X_i$ with prob. $1 - \varepsilon_i$ and $\hat{X}_i = $ erasure with prob. $\varepsilon_i$.
- The direct part uses random coding. The converse part uses Fano's inequality, which yields a weak converse.
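To make the erasure-link model above concrete, here is a minimal numerical sketch (my own illustration, not part of the slides; the function names are mine) that computes $I(X;Y)$ for a single erasure link directly from the joint pmf and recovers the well-known closed form $(1-\varepsilon)H(X)$:

```python
import math

def mutual_information(pxy):
    """I(X;Y) in bits from a joint pmf given as a dict {(x, y): prob}."""
    px, py = {}, {}
    for (x, y), p in pxy.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in pxy.items() if p > 0)

def erasure_joint(eps, px):
    """Joint pmf of (X, Y) for an erasure link: Y = X w.p. 1-eps, 'e' w.p. eps."""
    pxy = {}
    for x, p in px.items():
        pxy[(x, x)] = p * (1 - eps)
        pxy[(x, 'e')] = p * eps
    return pxy

pxy = erasure_joint(0.25, {0: 0.5, 1: 0.5})
print(mutual_information(pxy))  # 0.75 = (1 - 0.25) * H(X), with H(X) = 1 bit
```

The same per-link quantity is the building block of the cut values in the wireless erasure network.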
5 Weak Converse
- Assumption: The probability of decoding error vanishes as the blocklength increases.
- If the rate tuple of a code falls outside the capacity region, the probability of decoding error must be bounded away from 0 as the blocklength increases.
- For the DMC example, consider any sequence of optimal length-$n$ schemes with rate $R$ which minimize the error probability.
6 Strong Converse
- Weaker assumption: The asymptotic probability of decoding error is upper bounded by some $\varepsilon \in [0, 1)$ as the blocklength increases.
- If the rate tuple of a code falls outside the capacity region, the probability of decoding error must tend to 1 as the blocklength increases.
7 Motivation of This Work
- Although the capacity regions of the aforementioned DM-MMNs are well known, only the weak converse has been proved, using Fano's inequality.
- Therefore we are motivated to prove the strong converse using Rényi divergence:
  - A powerful technique for establishing strong converse theorems.
  - It has been employed previously to establish the strong converse for the DMC with output feedback [Polyanskiy and Verdú, 2010], classical-quantum channels [Ogawa and Nagaoka, 1999], and entanglement-breaking quantum channels [Wilde et al., 2013].
8 Network Model
- Let $\mathcal{I} \triangleq \{1, 2, \ldots, N\}$ be the index set of the nodes. Let $\mathcal{S} \subseteq \mathcal{I}$ and $\mathcal{D} \subseteq \mathcal{I}$ denote the sets of source and destination nodes respectively.
- The sources in $\mathcal{S}$ transmit information to the destinations in $\mathcal{D}$ in $n$ time slots:
  - Each source $i \in \mathcal{S}$ chooses a message $W_i$ to transmit. $W_i$ is uniform on $\{1, 2, \ldots, 2^{nR_i}\}$, where $R_i$ denotes the rate of $W_i$.
  - Each destination $j \in \mathcal{D}$ wants to decode $W_\mathcal{S}$.
- Each node $i$ transmits $X_{i,k}$ and receives $Y_{i,k}$ in time slot $k$; $X_{i,k}$ is a function of $(W_i, Y_i^{k-1})$.
- The channel is characterized by $q_{Y_\mathcal{I}|X_\mathcal{I}}$: for each $k \in \{1, 2, \ldots, n\}$,
$$\Pr\{Y_{\mathcal{I},k} = y_{\mathcal{I},k} \mid W_\mathcal{I} = w_\mathcal{I}, X_\mathcal{I}^k = x_\mathcal{I}^k, Y_\mathcal{I}^{k-1} = y_\mathcal{I}^{k-1}\} = q_{Y_\mathcal{I}|X_\mathcal{I}}(y_{\mathcal{I},k} \mid x_{\mathcal{I},k}).$$
9 ε-capacity Region Define R I (R 1, R 2,..., R N ) to be a rate tuple. Assume wlog R i = 0 i / S. A length-n code operating at rate R I is called an (n, R I, ε n )-code if the average probability of decoding error is ε n. R I is ε-achievable if a sequence of (n, R I, ε n )-codes such that lim sup ε n ε. n ε-capacity region C ε {R I : R I is ε-achievable} Fano s inequality yields an outer bound for only C 0 Silas Fong (NUS) Strong Converse for MMNs January 21, / 20
10 Cut-Set Outer Bound
A well-known outer bound on the capacity region of the DM-MMN, developed by El Gamal in 1981:
$$\mathcal{C}_0 \subseteq \bigcup_{p_{X_\mathcal{I}}} \ \bigcap_{T \subseteq \mathcal{I}:\, T^c \cap \mathcal{D} \ne \emptyset} \Big\{ R_\mathcal{I} : \sum_{i \in T} R_i \le I_{p_{X_\mathcal{I}} q_{Y_\mathcal{I}|X_\mathcal{I}}}(X_T; Y_{T^c} \mid X_{T^c}) \Big\}$$
Applying it to the relay channel, we have
$$R \le \max_{p_{X_1, X_2}} \min\Big\{ \underbrace{I(X_1; Y_2, Y_3 \mid X_2)}_{\text{cut } T_1},\ \underbrace{I(X_1, X_2; Y_3)}_{\text{cut } T_2} \Big\}$$
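The max-min structure of the relay-channel cut-set bound can be evaluated numerically. The sketch below is an added illustration: the toy deterministic relay channel ($Y_2 = X_1$, so the relay observes the source, and $Y_3 = X_1 \oplus X_2$) and all helper names are my own assumptions, not from the slides. It grid-searches product input distributions and computes both cut values from the joint pmf:

```python
import itertools
import math

def marginal(joint, idxs):
    """Marginal pmf over the coordinates in idxs of a joint pmf {tuple: prob}."""
    m = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in idxs)
        m[key] = m.get(key, 0.0) + p
    return m

def cond_mi(joint, A, B, C):
    """I(A; B | C) in bits; A, B, C are lists of coordinate indices."""
    pabc = marginal(joint, A + B + C)
    pac, pbc, pc = marginal(joint, A + C), marginal(joint, B + C), marginal(joint, C)
    total = 0.0
    for key, p in pabc.items():
        if p <= 0:
            continue
        a = key[:len(A)]
        b = key[len(A):len(A) + len(B)]
        c = key[len(A) + len(B):]
        total += p * math.log2(p * pc[c] / (pac[a + c] * pbc[b + c]))
    return total

def relay_joint(p1, p2):
    """Joint pmf of (X1, X2, Y2, Y3) for the toy relay: Y2 = X1, Y3 = X1 XOR X2."""
    joint = {}
    for x1, x2 in itertools.product((0, 1), repeat=2):
        px = (p1 if x1 else 1 - p1) * (p2 if x2 else 1 - p2)
        joint[(x1, x2, x1, x1 ^ x2)] = px
    return joint

# max over product input distributions of the min over the two cuts.
best = 0.0
for p1 in (i / 20 for i in range(1, 20)):
    for p2 in (i / 20 for i in range(1, 20)):
        joint = relay_joint(p1, p2)
        cut1 = cond_mi(joint, [0], [2, 3], [1])  # I(X1; Y2, Y3 | X2)
        cut2 = cond_mi(joint, [0, 1], [3], [])   # I(X1, X2; Y3)
        best = max(best, min(cut1, cut2))
print(best)  # 1.0 bit: both cuts are tight at uniform inputs
```

For this toy channel both cuts equal 1 bit at uniform independent inputs, so the cut-set bound evaluates to 1 bit, which is also achievable since the relay sees $X_1$ noiselessly.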
11 Simplified Noisy Network Coding (NNC) Inner Bound
Simplified noisy network coding (NNC) inner bound [Lim et al., 2011]:
$$\mathcal{C}_0 \supseteq \bigcup_{p_{X_\mathcal{I}}:\, p_{X_\mathcal{I}} = \prod_{i=1}^N p_{X_i}} \ \bigcap_{T \subseteq \mathcal{I}:\, T^c \cap \mathcal{D} \ne \emptyset} \Big\{ R_\mathcal{I} : \sum_{i \in T} R_i \le I(X_T; Y_{T^c} \mid X_{T^c}) - H(Y_T \mid X_\mathcal{I}, Y_{T^c}) \Big\}$$
Similar to the cut-set bound:
$$\mathcal{C}_0 \subseteq \bigcup_{p_{X_\mathcal{I}}} \ \bigcap_{T \subseteq \mathcal{I}:\, T^c \cap \mathcal{D} \ne \emptyset} \Big\{ R_\mathcal{I} : \sum_{i \in T} R_i \le I_{p_{X_\mathcal{I}} q_{Y_\mathcal{I}|X_\mathcal{I}}}(X_T; Y_{T^c} \mid X_{T^c}) \Big\}$$
For the finite-field linear deterministic network, the MMN consisting of independent channels and the wireless erasure network, the NNC inner bound = the cut-set bound.
12 Main Theorem
Theorem (Strong Converse Outer Bound). For each $\varepsilon \in [0, 1)$,
$$\mathcal{C}_\varepsilon \subseteq \bigcap_{T \subseteq \mathcal{I}:\, T^c \cap \mathcal{D} \ne \emptyset} \ \bigcup_{p_{X_\mathcal{I}}} \Big\{ R_\mathcal{I} : \sum_{i \in T} R_i \le I_{p_{X_\mathcal{I}} q_{Y_\mathcal{I}|X_\mathcal{I}}}(X_T; Y_{T^c} \mid X_{T^c}) \Big\}$$
Compare with the cut-set outer bound:
$$\mathcal{C}_0 \subseteq \bigcup_{p_{X_\mathcal{I}}} \ \bigcap_{T \subseteq \mathcal{I}:\, T^c \cap \mathcal{D} \ne \emptyset} \Big\{ R_\mathcal{I} : \sum_{i \in T} R_i \le I_{p_{X_\mathcal{I}} q_{Y_\mathcal{I}|X_\mathcal{I}}}(X_T; Y_{T^c} \mid X_{T^c}) \Big\}$$
Reason for the gap: both proofs first fix an arbitrary $T$ and then find a distribution $p^{(T)}_{X_\mathcal{I}}$ such that $\sum_{i \in T} R_i \le I_{p^{(T)}_{X_\mathcal{I}} q_{Y_\mathcal{I}|X_\mathcal{I}}}(X_T; Y_{T^c} \mid X_{T^c})$ for the $\varepsilon$-achievable $R_\mathcal{I}$.
- In the proof of the cut-set bound, the $p^{(T)}_{X_\mathcal{I}}$ are the same for all $T$.
- In the proof of the strong converse bound, $p^{(T)}_{X_\mathcal{I}}$ can be different for different $T$.
13 Strong Converse for Classes of MMNs
Proposition. For the finite-field linear deterministic network, the MMN consisting of independent channels and the wireless erasure network,
strong converse bound = cut-set bound = NNC inner bound.
Corollary (Strong converse). Consider a network belonging to one of the above three classes. For each $\varepsilon \in [0, 1)$, our main theorem implies that $\mathcal{C}_\varepsilon \subseteq$ strong converse bound. Combining with the proposition above, we have $\mathcal{C}_\varepsilon \subseteq$ NNC inner bound. Since NNC inner bound $\subseteq \mathcal{C}_\varepsilon$, we have $\mathcal{C}_\varepsilon =$ NNC inner bound.
14 Rényi Divergence
Definition. The conditional Rényi divergence with parameter $\lambda \in [1, \infty)$ between $p_{X|Z}$ and $q_{X|Z}$ given $r_Z$ is
$$D_\lambda(p_{X|Z} \| q_{X|Z} \mid r_Z) \triangleq \begin{cases} \dfrac{1}{\lambda - 1} \log \displaystyle\sum_{z \in \mathcal{Z}} r_Z(z) \sum_{x \in \mathcal{X}} \dfrac{(p_{X|Z}(x|z))^\lambda}{(q_{X|Z}(x|z))^{\lambda - 1}} & \text{if } \lambda > 1, \\[2ex] D(p_{X|Z} \| q_{X|Z} \mid r_Z) & \text{if } \lambda = 1, \end{cases}$$
where
$$D(p_{X|Z} \| q_{X|Z} \mid r_Z) \triangleq \sum_{z \in \mathcal{Z}} r_Z(z) \sum_{x \in \mathcal{X}} p_{X|Z}(x|z) \log \frac{p_{X|Z}(x|z)}{q_{X|Z}(x|z)}$$
is the relative entropy.
Data processing inequality (DPI): $D_\lambda(p_X \| q_X) \ge D_\lambda(p_{g(X)} \| q_{g(X)})$ for any function $g$. In particular, $D_\lambda(p_{X,Y} \| q_{X,Y}) \ge D_\lambda(p_X \| q_X)$.
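The definition and the DPI above can be checked numerically. Here is a small sketch of the unconditional case (i.e., $Z$ degenerate); the implementation and the example pmfs are my own illustration, not from the slides:

```python
import math

def renyi_div(p, q, lam):
    """Renyi divergence D_lambda(p || q) in bits for pmfs p, q, with lam > 1."""
    s = sum(pi ** lam / qi ** (lam - 1) for pi, qi in zip(p, q) if pi > 0)
    return math.log2(s) / (lam - 1)

def kl_div(p, q):
    """Relative entropy D(p || q) in bits (the lam = 1 case)."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]
q = [0.2, 0.3, 0.5]

# As lambda -> 1, the Renyi divergence approaches the relative entropy from above.
print(renyi_div(p, q, 1.0001) - kl_div(p, q))  # small positive gap

# Data processing: applying a deterministic g to both distributions cannot
# increase D_lambda; here g merges the first two symbols.
gp = [p[0] + p[1], p[2]]
gq = [q[0] + q[1], q[2]]
assert renyi_div(gp, gq, 2.0) <= renyi_div(p, q, 2.0)
```

The monotonicity of $D_\lambda$ in $\lambda$ (so $D_\lambda \ge D$ for $\lambda > 1$) is exactly what makes the one-sided approximation in the next slide's Lemma 1 useful.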
15 Properties of Rényi Divergence
Lemma 1 (Mutual information approximation).
$$D_\lambda(p_{XY|Z} \| p_{X|Z} p_{Y|Z} \mid r_Z) \le D(p_{XY|Z} \| p_{X|Z} p_{Y|Z} \mid r_Z) + 8(\lambda - 1)|\mathcal{X}|^5 |\mathcal{Y}|^5 = I_{r_Z p_{XY|Z}}(X; Y \mid Z) + 8(\lambda - 1)|\mathcal{X}|^5 |\mathcal{Y}|^5$$
Lemma 2 (Chain rule). Given $\prod_{k=1}^n p_{Y_k|X^k}$, $\prod_{k=1}^n q_{Y_k|X_k}$ and $r_{X^n_\mathcal{I}}$, we have
$$D_\lambda\Big( \prod_{k=1}^n p_{Y_k|X^k} \,\Big\|\, \prod_{k=1}^n q_{Y_k|X_k} \,\Big|\, r_{X^n_\mathcal{I}} \Big) = \sum_{k=1}^n D_\lambda\big( p_{Y_k|X^k} \big\| q_{Y_k|X_k} \big| r^{(\lambda)}_{X^k} \big),$$
where $r^{(\lambda)}_{X^k}$ is determined by $\lambda$, $\prod_{m=1}^{k-1} p_{Y_m|X^m}$, $\prod_{m=1}^{k-1} q_{Y_m|X_m}$ and $r_{X^k_\mathcal{I}}$.
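The behavior behind Lemma 1 can be observed numerically in the unconditional case ($Z$ degenerate): with $\lambda = 1 + 1/n$, the divergence $D_\lambda(p_{XY} \| p_X p_Y)$ exceeds $I(X;Y)$ by a gap that shrinks as $n$ grows. The sketch below is my own illustration with an arbitrary joint pmf, not the slides' (much looser) constant:

```python
import math

def renyi_joint_vs_product(pxy, lam):
    """D_lambda(p_XY || p_X p_Y) in bits; pxy is a joint pmf {(x, y): prob}."""
    px, py = {}, {}
    for (x, y), p in pxy.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    s = sum(p ** lam * (px[x] * py[y]) ** (1 - lam)
            for (x, y), p in pxy.items() if p > 0)
    return math.log2(s) / (lam - 1)

def mutual_information(pxy):
    """I(X; Y) in bits from the joint pmf (the lam = 1 value)."""
    px, py = {}, {}
    for (x, y), p in pxy.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in pxy.items() if p > 0)

pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
i_xy = mutual_information(pxy)
# With lambda = 1 + 1/n, the overshoot above I(X;Y) vanishes as n grows,
# mirroring the O(lambda - 1) correction term in Lemma 1.
for n in (10, 100, 1000):
    gap = renyi_joint_vs_product(pxy, 1 + 1 / n) - i_xy
    print(n, gap)
```

This is the regime used later in the proof sketch, where $\lambda = 1 + 1/n$ makes the per-letter penalty contribute only a vanishing rate loss.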
16 Recalling Proof Steps for Cut-Set Bound
1. Lower bounding the error probability in terms of mutual information using Fano's inequality:
$$n \sum_{i \in T} R_i \le I(W_T; \hat{W}_{T,d} \mid W_{T^c}) + \varepsilon_n \, n \sum_{i \in T} R_i$$
2. Using the DPI of the mutual information to introduce the channel output:
$$I(W_T; \hat{W}_{T,d} \mid W_{T^c}) \le I(W_T; Y^n_{T^c} \mid W_{T^c})$$
3. Single-letterizing the mutual information:
$$I(W_T; Y^n_{T^c} \mid W_{T^c}) \le \sum_{k=1}^n I(X_{T,k}; Y_{T^c,k} \mid X_{T^c,k})$$
4. Introduction of a time-sharing random variable $Q_n$:
$$\sum_{k=1}^n I(X_{T,k}; Y_{T^c,k} \mid X_{T^c,k}) \le n I(X_{T,Q_n}; Y_{T^c,Q_n} \mid X_{T^c,Q_n})$$
5. Combining the above inequalities, using $\lim_{n \to \infty} \varepsilon_n = 0$:
$$\sum_{i \in T} R_i \le I(X_T; Y_{T^c} \mid X_{T^c})$$
17 Proof Steps Using Rényi Divergence
1) Lower bounding the error probability in terms of the Rényi divergence:
$$n \sum_{i \in T} R_i \le D_\lambda\big( p_{W_T, \hat{W}_{T,d}} \big\| p_{W_T} s_{\hat{W}_{T,d}} \big) + \frac{\lambda}{\lambda - 1} \log \frac{1}{1 - \varepsilon_n}$$
for any choice of $s_{\hat{W}_{T,d}}$.
2) Using the DPI of the Rényi divergence to introduce the channel input and output, with a proper choice of $s_{X^n_\mathcal{I}, Y^n_\mathcal{I}, \hat{W}_{T,d}}$:
$$D_\lambda\big( p_{W_T, \hat{W}_{T,d}} \big\| p_{W_T} s_{\hat{W}_{T,d}} \big) \le D_\lambda\Big( \prod_{k=1}^n q_{Y_{T^c,k}|X_{\mathcal{I},k}} \,\Big\|\, \prod_{k=1}^n s_{Y_{T^c,k}|X_{\mathcal{I},k}} \,\Big|\, p_{X^n_\mathcal{I}} \Big)$$
3) Single-letterizing the Rényi divergence using the chain rule:
$$D_\lambda\Big( \prod_{k=1}^n q_{Y_{T^c,k}|X_{\mathcal{I},k}} \,\Big\|\, \prod_{k=1}^n s_{Y_{T^c,k}|X_{\mathcal{I},k}} \,\Big|\, p_{X^n_\mathcal{I}} \Big) = \sum_{k=1}^n D_\lambda\big( q_{Y_{T^c}|X_\mathcal{I}} \big\| s_{Y_{T^c,k}|X_{T^c,k}} \big| p^{(\lambda)}_{X_{\mathcal{I},k}} \big)$$
18 Proof Steps Using Rényi Divergence
4) Representing the distributions in the Rényi divergence by a single distribution. Due to the careful choice of $s_{X^n_\mathcal{I}, Y^n_\mathcal{I}, \hat{W}_{T,d}}$, we can define $p^{(\lambda)}_{X_{\mathcal{I},k}, Y_{T^c,k}}$ such that
$$\sum_{k=1}^n D_\lambda\big( q_{Y_{T^c}|X_\mathcal{I}} \big\| s_{Y_{T^c,k}|X_{T^c,k}} \big| p^{(\lambda)}_{X_{\mathcal{I},k}} \big) \le \sum_{k=1}^n D_\lambda\big( p^{(\lambda)}_{Y_{T^c,k}|X_{\mathcal{I},k}} \big\| p^{(\lambda)}_{Y_{T^c,k}|X_{T^c,k}} \big| p^{(\lambda)}_{X_{\mathcal{I},k}} \big)$$
5) Introduction of a time-sharing variable followed by approximating $I$ with $D_\lambda$. Using a time-sharing variable $Q_n$ and letting $\lambda = 1 + \frac{1}{n}$:
$$\sum_{k=1}^n D_{1+\frac{1}{n}}\big( p^{(1+\frac{1}{n})}_{Y_{T^c,k}|X_{\mathcal{I},k}} \big\| p^{(1+\frac{1}{n})}_{Y_{T^c,k}|X_{T^c,k}} \big| p^{(1+\frac{1}{n})}_{X_{\mathcal{I},k}} \big) \le n D_{1+\frac{1}{n}}\big( p^{(1+\frac{1}{n})}_{Y_{T^c,Q_n}|X_{\mathcal{I},Q_n}} \big\| p^{(1+\frac{1}{n})}_{Y_{T^c,Q_n}|X_{T^c,Q_n}} \big| p^{(1+\frac{1}{n})}_{X_{\mathcal{I},Q_n}} \big) \le n I(X_T; Y_{T^c} \mid X_{T^c}) + 8|\mathcal{X}|^5 |\mathcal{Y}|^5$$
Combining the steps and letting $n \to \infty$:
$$\sum_{i \in T} R_i \le I(X_T; Y_{T^c} \mid X_{T^c}).$$
19 Conclusion
- In a multimessage multicast network (MMN), every source node transmits a message and every destination node decodes all the messages.
- A strong converse outer bound for the discrete memoryless MMN has been established using Rényi divergence, i.e., an outer bound on $\mathcal{C}_\varepsilon$.
- For any sequence of codes that operate at a rate tuple outside the strong converse bound, the average probability of decoding error must tend to 1.
- Our strong converse bound implies the strong converse for some classes of MMNs, including:
  - the finite-field linear deterministic network;
  - the MMN consisting of independent DMCs;
  - the wireless erasure network.
- For the aforementioned MMNs, we have fully characterized their $\varepsilon$-capacity regions, which coincide with the NNC inner bound and the cut-set bound.
- Open problem: prove or disprove that the cut-set bound contains $\mathcal{C}_\varepsilon$.
20 References
- Avestimehr, A. S., Diggavi, S. N., and Tse, D. N. (2011). Wireless network information flow: A deterministic approach. IEEE Trans. Inf. Theory, 57(4).
- Dana, A. F., Gowaikar, R., Palanki, R., Hassibi, B., and Effros, M. (2006). Capacity of wireless erasure networks. IEEE Trans. Inf. Theory, 52(3).
- Köetter, R., Effros, M., and Médard, M. (2011). A theory of network equivalence Part I: Point-to-point channels. IEEE Trans. Inf. Theory, 57(2).
- Li, S.-Y. R., Yeung, R. W., and Cai, N. (2003). Linear network coding. IEEE Trans. Inf. Theory, 49(2).
- Lim, S. H., Kim, Y.-H., El Gamal, A., and Chung, S.-Y. (2011). Noisy network coding. IEEE Trans. Inf. Theory, 57(5).
- Ogawa, T. and Nagaoka, H. (1999). Strong converse to the quantum channel coding theorem. IEEE Trans. Inf. Theory, 45(7).
- Polyanskiy, Y. and Verdú, S. (2010). Arimoto channel coding converse and Rényi divergence. In Proc. Allerton Conference on Communication, Control and Computing.
- Wilde, M. M., Winter, A., and Yang, D. (2013). Strong converse for the classical capacity of entanglement-breaking and Hadamard channels via a sandwiched Rényi relative entropy. ArXiv preprint.
An Extended Fano s Inequality for the Finite Bloclength Coding Yunquan Dong, Pingyi Fan {dongyq8@mails,fpy@mail}.tsinghua.edu.cn Department of Electronic Engineering, Tsinghua University, Beijing, P.R.
More informationAmobile satellite communication system, like Motorola s
I TRANSACTIONS ON INFORMATION THORY, VOL. 45, NO. 4, MAY 1999 1111 Distributed Source Coding for Satellite Communications Raymond W. Yeung, Senior Member, I, Zhen Zhang, Senior Member, I Abstract Inspired
More informationIET Commun., 2009, Vol. 3, Iss. 4, pp doi: /iet-com & The Institution of Engineering and Technology 2009
Published in IET Communications Received on 10th March 2008 Revised on 10th July 2008 ISSN 1751-8628 Comprehensive partial decoding approach for two-level relay networks L. Ghabeli M.R. Aref Information
More informationInformation Theory Meets Game Theory on The Interference Channel
Information Theory Meets Game Theory on The Interference Channel Randall A. Berry Dept. of EECS Northwestern University e-mail: rberry@eecs.northwestern.edu David N. C. Tse Wireless Foundations University
More information2014 IEEE International Symposium on Information Theory. Two-unicast is hard. David N.C. Tse
Two-unicast is hard Sudeep Kamath ECE Department, University of California, San Diego, CA, USA sukamath@ucsd.edu David N.C. Tse EECS Department, University of California, Berkeley, CA, USA dtse@eecs.berkeley.edu
More informationRelay Networks With Delays
Relay Networks With Delays Abbas El Gamal, Navid Hassanpour, and James Mammen Department of Electrical Engineering Stanford University, Stanford, CA 94305-9510 Email: {abbas, navid, jmammen}@stanford.edu
More informationSimple Channel Coding Bounds
ISIT 2009, Seoul, Korea, June 28 - July 3,2009 Simple Channel Coding Bounds Ligong Wang Signal and Information Processing Laboratory wang@isi.ee.ethz.ch Roger Colbeck Institute for Theoretical Physics,
More informationCapacity of 1-to-K Broadcast Packet Erasure Channels with Channel Output Feedback
Capacity of 1-to-K Broadcast Packet Erasure Channels with Channel Output Feedback Chih-Chun Wang Center of Wireless Systems and Applications (CWSA School of Electrical and Computer Engineering, Purdue
More informationThe Generalized Degrees of Freedom of the Interference Relay Channel with Strong Interference
The Generalized Degrees of Freedom of the Interference Relay Channel with Strong Interference Soheyl Gherekhloo, Anas Chaaban, and Aydin Sezgin Chair of Communication Systems RUB, Germany Email: {soheyl.gherekhloo,
More informationLinear Codes, Target Function Classes, and Network Computing Capacity
Linear Codes, Target Function Classes, and Network Computing Capacity Rathinakumar Appuswamy, Massimo Franceschetti, Nikhil Karamchandani, and Kenneth Zeger IEEE Transactions on Information Theory Submitted:
More informationLecture 2. Capacity of the Gaussian channel
Spring, 207 5237S, Wireless Communications II 2. Lecture 2 Capacity of the Gaussian channel Review on basic concepts in inf. theory ( Cover&Thomas: Elements of Inf. Theory, Tse&Viswanath: Appendix B) AWGN
More informationOn the Capacity of the Two-Hop Half-Duplex Relay Channel
On the Capacity of the Two-Hop Half-Duplex Relay Channel Nikola Zlatanov, Vahid Jamali, and Robert Schober University of British Columbia, Vancouver, Canada, and Friedrich-Alexander-University Erlangen-Nürnberg,
More informationCapacity Region of the Permutation Channel
Capacity Region of the Permutation Channel John MacLaren Walsh and Steven Weber Abstract We discuss the capacity region of a degraded broadcast channel (DBC) formed from a channel that randomly permutes
More informationOn the Capacity of the Binary-Symmetric Parallel-Relay Network
On the Capacity of the Binary-Symmetric Parallel-Relay Network Lawrence Ong, Sarah J. Johnson, and Christopher M. Kellett arxiv:207.3574v [cs.it] 6 Jul 202 Abstract We investigate the binary-symmetric
More informationEE/Stat 376B Handout #5 Network Information Theory October, 14, Homework Set #2 Solutions
EE/Stat 376B Handout #5 Network Information Theory October, 14, 014 1. Problem.4 parts (b) and (c). Homework Set # Solutions (b) Consider h(x + Y ) h(x + Y Y ) = h(x Y ) = h(x). (c) Let ay = Y 1 + Y, where
More informationInformation Theoretic Limits of Randomness Generation
Information Theoretic Limits of Randomness Generation Abbas El Gamal Stanford University Shannon Centennial, University of Michigan, September 2016 Information theory The fundamental problem of communication
More informationAchieving Shannon Capacity Region as Secrecy Rate Region in a Multiple Access Wiretap Channel
Achieving Shannon Capacity Region as Secrecy Rate Region in a Multiple Access Wiretap Channel Shahid Mehraj Shah and Vinod Sharma Department of Electrical Communication Engineering, Indian Institute of
More informationECE598: Information-theoretic methods in high-dimensional statistics Spring 2016
ECE598: Information-theoretic methods in high-dimensional statistics Spring 06 Lecture : Mutual Information Method Lecturer: Yihong Wu Scribe: Jaeho Lee, Mar, 06 Ed. Mar 9 Quick review: Assouad s lemma
More informationThe Capacity Region for Multi-source Multi-sink Network Coding
The Capacity Region for Multi-source Multi-sink Network Coding Xijin Yan Dept. of Electrical Eng. - Systems University of Southern California Los Angeles, CA, U.S.A. xyan@usc.edu Raymond W. Yeung Dept.
More informationSHARED INFORMATION. Prakash Narayan with. Imre Csiszár, Sirin Nitinawarat, Himanshu Tyagi, Shun Watanabe
SHARED INFORMATION Prakash Narayan with Imre Csiszár, Sirin Nitinawarat, Himanshu Tyagi, Shun Watanabe 2/40 Acknowledgement Praneeth Boda Himanshu Tyagi Shun Watanabe 3/40 Outline Two-terminal model: Mutual
More informationImproving on the Cutset Bound via a Geometric Analysis of Typical Sets
Imroving on the Cutset Bound via a Geometric Analysis of Tyical Sets Ayfer Özgür Stanford University CUHK, May 28, 2016 Joint work with Xiugang Wu (Stanford). Ayfer Özgür (Stanford) March 16 1 / 33 Gaussian
More informationLecture 6 Channel Coding over Continuous Channels
Lecture 6 Channel Coding over Continuous Channels I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw November 9, 015 1 / 59 I-Hsiang Wang IT Lecture 6 We have
More informationDistributed Lossless Compression. Distributed lossless compression system
Lecture #3 Distributed Lossless Compression (Reading: NIT 10.1 10.5, 4.4) Distributed lossless source coding Lossless source coding via random binning Time Sharing Achievability proof of the Slepian Wolf
More information