The Hallucination Bound for the BSC
1 The Hallucination Bound for the BSC
Anant Sahai and Stark Draper
Wireless Foundations, Department of Electrical Engineering and Computer Sciences, University of California at Berkeley
ECE Department, University of Wisconsin at Madison
Major support from NSF CISE
July 8th 2008: ISIT Tu-AM-3.3
Sahai/Draper (UC Berkeley) Hallucination Bound Jul 8, / 17
2 Outline
1. Motivation and review
2. Block-coding converse: minimum error probability
3. Streaming converse: bit error probability
3 Review: Fixed blocks
Feedback is pointless at high rates (Dobrushin and Haroutunian). Hard decision regions cover the space.
4-6 Review: Fixed blocks: Forney-68
Decision regions catch only the typical sets; refuse to decide when ambiguous. One bit of feedback can request retransmissions. Can interpret in terms of expected block-length. No converse except at zero rate.
7-8 Review: Fixed blocks, soft deadlines: Burnashev
[Figure: error exponent vs. average communication rate (bits/channel use), comparing the Burnashev bound, Forney performance, sphere-packing bound, and random-signature performance exponents.]
Showed that C_1(1 - R/C) is a bound, where C_1 = max_{i,j} D(p_i || p_j). Considered expected stopping time and used martingale arguments.
[Diagram: a block of length n is split into lambda*n channel uses for data transmission and (1 - lambda)*n for the Ack/Nak decision feedback, with possible retransmissions.]
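Burnashev's straight-line bound is easy to evaluate numerically. A minimal sketch with assumed example values p = 0.1 and R = 0.3 (the slides fix no numbers; the closed form C_1 = D(p || 1-p) for the BSC is standard, not stated on the slide):

```python
import math

def h2(x):
    """Binary entropy in bits."""
    return 0.0 if x in (0.0, 1.0) else -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def burnashev_exponent(p, rate):
    """Burnashev's bound C_1 * (1 - R/C) for a BSC(p)."""
    cap = 1.0 - h2(p)  # BSC capacity C
    # For a BSC, C_1 = max_{i,j} D(p_i || p_j) = D(p || 1-p)
    c1 = (1 - 2 * p) * math.log2((1 - p) / p)
    return c1 * (1.0 - rate / cap)

p, rate = 0.1, 0.3  # example values, not from the talk
print(burnashev_exponent(p, rate))  # about 1.103
```

The bound equals C_1 at rate zero and hits zero exactly at capacity, which is the straight-line shape in the exponent plot.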
9 Streaming: an opportunity presents itself
[Diagram: a stream of data blocks; a decoding error triggers a Nak among the Acks.]
What if we only sent NAKs when needed?
10 Sliding blocks with collective punishment only (Kudryashov-79)
[Diagram: data blocks; an emergency message of fixed length equal to the decision delay; final decision on each message after the decision delay, unless a Nak is detected.]
Make the packet length n much smaller than the soft deadline. A NAK collectively denies the past n_1 packets. Error only if all n_1 NAKs are missed.
11 Reason for hope: Csiszár's result (1980)
Can pack in control messages at lower rates and give each subset its own random-coding bound!
12-13 Even simpler: need only one special message
[Figure: typical noise sphere; unequal error protection.]
Hypothesis-testing threshold: is it a Nak or is it data?
14-16 Specialize to the BSC case
[Figure: codeword composition q passes through BSC(p) to give output composition q_y = q(1-p) + (1-q)p, leaving a gap above p.]
Use the all-zero codeword for the NAK. Use a composition-q code for the data: R < H(q_y) - H(p). Probability of a missed NAK is 2^{-n D(q_y || p)}.
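These BSC quantities are quick to compute. A sketch with assumed example values p = 0.1 and q = 0.4 (the slides do not fix numbers):

```python
import math

def h2(x):
    """Binary entropy in bits."""
    return 0.0 if x in (0.0, 1.0) else -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def d2(a, b):
    """Binary KL divergence D(a || b) in bits."""
    return a * math.log2(a / b) + (1 - a) * math.log2((1 - a) / (1 - b))

p, q = 0.1, 0.4                   # BSC crossover and data-codeword composition
q_y = q * (1 - p) + (1 - q) * p   # output composition of the data code
max_rate = h2(q_y) - h2(p)        # the data rate must satisfy R < H(q_y) - H(p)
nak_exponent = d2(q_y, p)         # missed-NAK probability ~ 2^{-n D(q_y || p)}

print(q_y, max_rate, nak_exponent)  # 0.42, about 0.512, about 0.502
```

Pushing the composition q toward 1/2 moves q_y toward 1/2, which raises both the supportable rate H(q_y) - H(p) and the missed-NAK exponent D(q_y || p), since the data outputs drift further from the all-zero NAK's output composition p.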
17 Resulting exponents
[Figure: error exponent vs. average communication rate (bits/channel use), comparing the delay, Burnashev, Forney, and sphere-packing exponents.]
18 Outline
1. Motivation and review
2. Block-coding converse: minimum error probability
3. Streaming converse: bit error probability
19-24 Towards the Hallucination Bound
[Diagram: the B(p) noise of a BSC(p) decomposes as a B(2p) "disconnect" event followed by uniform B(0.5) noise added to the input X.]
No converse for Forney, unlike Burnashev and sphere-packing. What about for the minimum probability of error with feedback?
Trivial bound: let the channel disconnect: probability (2p)^n.
What is the chance we land on something? 1/2^n each.
How many outputs decode to normal codewords? > 2^{nR}.
Exponent at most: log_2(1/p) - R.
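The counting on this slide can be traced numerically. A hedged sketch, under the reading that a BSC(p) "disconnects" (acts like a fair coin) with probability (2p)^n, every output is then uniform with probability 2^{-n}, and at least 2^{nR} outputs decode to normal messages:

```python
import math

def counting_exponent(p, rate):
    """Converse sketch: the special-message exponent is at most log2(1/p) - R,
    since P(err) >= (2p)^n * 2^{-n} * 2^{nR} = 2^{-n (log2(1/p) - R)}."""
    return math.log2(1.0 / p) - rate

p, rate, n = 0.1, 0.3, 20  # example values, not from the talk
bound = counting_exponent(p, rate)
prob_floor = (2 * p) ** n * 2.0 ** (-n) * 2.0 ** (n * rate)
# the identity underlying the exponent:
assert abs(prob_floor - 2.0 ** (-n * bound)) < 1e-9 * prob_floor
print(bound)  # log2(10) - 0.3, about 3.022
```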
25-27 Towards the Hallucination Bound (continued)
No converse for Forney, unlike Burnashev and sphere-packing. What about for the minimum probability of error with feedback?
[Figure: codeword composition q through BSC(p) gives output composition q_y = q(1-p) + (1-q)p, with a gap above p.]
Each normal message needs 2^{nH(p)} outputs that decode to it. Together the normal messages must claim 2^{n(R + H(p))} volume. Place the special message as far away as possible. Matches achievability!
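The volume bookkeeping above can be sanity-checked: each of the 2^{nR} normal messages needs about 2^{nH(p)} outputs, and their combined claim of 2^{n(R + H(p))} fits inside the 2^n output sequences exactly when R < 1 - H(p) = C. A small sketch (illustrative values only):

```python
import math

def h2(x):
    """Binary entropy in bits."""
    return 0.0 if x in (0.0, 1.0) else -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def log2_volume_claimed(p, rate):
    """Per-channel-use exponent of the volume the normal messages must claim:
    each of the 2^{nR} messages needs about 2^{nH(p)} outputs to decode to it."""
    return rate + h2(p)

p = 0.1
cap = 1.0 - h2(p)
for rate in (0.2, 0.4, cap):
    room = 1.0 - log2_volume_claimed(p, rate)  # log2 of the unclaimed fraction
    print(rate, room)  # the room shrinks to 0 exactly at capacity
```

The unclaimed room is what the code designer can spend on pushing the special message's decoding region far from the normal ones.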
28-33 Generalizes to general symmetric channels
Consider fixed block-length and a moderate probability of correct decoding for many regular codewords.
Berger's source-coding game reveals that typical channel outputs can only reach output types in a set P_y.
The 2^{nR} regular messages could have their normal (non-erased) decoding regions packed into any of those types.
Typical footprint inside: at least min_p 2^{nH(Y|X)}.
Lower bound: pick the most distant p̃_y in P_y:
2^{-n(D(p_y || p̃_y) + H(p_y))} · 2^{nR + nH(Y|X)}.
Allow the convex hull of
E_hal(R) = max_{I(p,P) >= R} max_{p̃_y in P_y} [ D(P(p) || p̃_y) + (I(p,P) - R) ].
On Friday, Borade et al. will show a more general proof that holds with expected block-length.
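The E_hal expression can be evaluated numerically. A hedged BSC(p) reading (an interpretation, not from the slides): an input composition q gives output distribution P(p) = Bernoulli(q_y) with q_y = q(1-p) + (1-q)p, so I(p,P) = H(q_y) - H(p), and the most distant p̃_y is the all-zero NAK's output Bernoulli(p). A grid search then evaluates the maximization:

```python
import math

def h2(x):
    """Binary entropy in bits."""
    return 0.0 if x in (0.0, 1.0) else -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def d2(a, b):
    """Binary KL divergence D(a || b) in bits."""
    return a * math.log2(a / b) + (1 - a) * math.log2((1 - a) / (1 - b))

def e_hal_bsc(p, rate, steps=100_000):
    """Grid-search max over output compositions q_y in (p, 1/2] of
    D(q_y || p) + (I - R), subject to I = H(q_y) - H(p) >= R."""
    best = 0.0
    for i in range(1, steps + 1):
        q_y = p + (0.5 - p) * i / steps
        mutual = h2(q_y) - h2(p)
        if mutual >= rate:
            best = max(best, d2(q_y, p) + (mutual - rate))
    return best

p, rate = 0.1, 0.3  # example values, not from the talk
print(e_hal_bsc(p, rate))  # about 0.968
```

In this BSC sketch the objective increases with q_y, so the grid maximum lands at q_y = 1/2 and the bound falls off linearly in R.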
34 Outline
1. Motivation and review
2. Block-coding converse: minimum error probability
3. Streaming converse: bit error probability
35-36 The basic intuition
[Timeline: bit enters; bit is decoded.]
If we hallucinate for , we will miss our opportunity to decode.
Challenge: overlaps with other bit-footprints.
37-41 The idea of the proof
[Timeline: bit enters; bit is decoded; we can guess early.]
View a block of many n. Count the volume of normal decoding regions in two different ways:
1. As a big block code, based on correct decoding.
2. With a tree structure, based on the desired exponent.
Technical condition: impose sequentiality on the code, so that we can usually guess the answer even before the deadline runs out.
42-45 Conclusions
The Hallucination Bound is the probability that the decoder imagines that everything is normal despite your best efforts to tell it otherwise. This corresponds to the best probability of error for a special message in the fixed block-code setting.
Future work:
- Eliminate the technical condition for the streaming case.
- Get a two-way Hallucination bound for the case of noisy feedback.
CSCI-B609: A Theorist s Toolkit, Fall 2016 Oct 4 Lecture 14: Schwartz-Zippel Lemma and Intro to Coding Theory Lecturer: Yuan Zhou Scribe: Haoyu Zhang 1 Roots of polynomials Theorem 1. A non-zero, univariate
More informationLecture 6 I. CHANNEL CODING. X n (m) P Y X
6- Introduction to Information Theory Lecture 6 Lecturer: Haim Permuter Scribe: Yoav Eisenberg and Yakov Miron I. CHANNEL CODING We consider the following channel coding problem: m = {,2,..,2 nr} Encoder
More informationRandom Access: An Information-Theoretic Perspective
Random Access: An Information-Theoretic Perspective Paolo Minero, Massimo Franceschetti, and David N. C. Tse Abstract This paper considers a random access system where each sender can be in two modes of
More informationCSE 123: Computer Networks
CSE 123: Computer Networks Total points: 40 Homework 1 - Solutions Out: 10/4, Due: 10/11 Solutions 1. Two-dimensional parity Given below is a series of 7 7-bit items of data, with an additional bit each
More informationEECS Components and Design Techniques for Digital Systems. Lec 26 CRCs, LFSRs (and a little power)
EECS 150 - Components and esign Techniques for igital Systems Lec 26 CRCs, LFSRs (and a little power) avid Culler Electrical Engineering and Computer Sciences University of California, Berkeley http://www.eecs.berkeley.edu/~culler
More informationMultichannel liar games with a fixed number of lies
Multichannel liar games with a fixed number of lies Robert Ellis 1 Kathryn Nyman 2 1 Illinois Institute of Technology 2 Loyola University SIAM Discrete Mathematics 2006 Ellis, Nyman (June 27, 2006) Multichannel
More informationExact Probability of Erasure and a Decoding Algorithm for Convolutional Codes on the Binary Erasure Channel
Exact Probability of Erasure and a Decoding Algorithm for Convolutional Codes on the Binary Erasure Channel Brian M. Kurkoski, Paul H. Siegel, and Jack K. Wolf Department of Electrical and Computer Engineering
More informationList Decoding of Reed Solomon Codes
List Decoding of Reed Solomon Codes p. 1/30 List Decoding of Reed Solomon Codes Madhu Sudan MIT CSAIL Background: Reliable Transmission of Information List Decoding of Reed Solomon Codes p. 2/30 List Decoding
More informationOn Common Information and the Encoding of Sources that are Not Successively Refinable
On Common Information and the Encoding of Sources that are Not Successively Refinable Kumar Viswanatha, Emrah Akyol, Tejaswi Nanjundaswamy and Kenneth Rose ECE Department, University of California - Santa
More informationCyclic Redundancy Check Codes
Cyclic Redundancy Check Codes Lectures No. 17 and 18 Dr. Aoife Moloney School of Electronics and Communications Dublin Institute of Technology Overview These lectures will look at the following: Cyclic
More informationInteractive Decoding of a Broadcast Message
In Proc. Allerton Conf. Commun., Contr., Computing, (Illinois), Oct. 2003 Interactive Decoding of a Broadcast Message Stark C. Draper Brendan J. Frey Frank R. Kschischang University of Toronto Toronto,
More informationFundamental rate delay tradeoffs in multipath routed and network coded networks
Fundamental rate delay tradeoffs in multipath routed and network coded networks John Walsh and Steven Weber Drexel University, Dept of ECE Philadelphia, PA 94 {jwalsh,sweber}@ecedrexeledu IP networks subject
More informationCommon Information of Random Linear Network Coding Over A 1-Hop Broadcast Packet Erasure Channel
Wang, ISIT 2011 p. 1/15 Common Information of Random Linear Network Coding Over A 1-Hop Broadcast Packet Erasure Channel Chih-Chun Wang, Jaemin Han Center of Wireless Systems and Applications (CWSA) School
More informationCommunication by Regression: Sparse Superposition Codes
Communication by Regression: Sparse Superposition Codes Department of Statistics, Yale University Coauthors: Antony Joseph and Sanghee Cho February 21, 2013, University of Texas Channel Communication Set-up
More informationPERFECT SECRECY AND ADVERSARIAL INDISTINGUISHABILITY
PERFECT SECRECY AND ADVERSARIAL INDISTINGUISHABILITY BURTON ROSENBERG UNIVERSITY OF MIAMI Contents 1. Perfect Secrecy 1 1.1. A Perfectly Secret Cipher 2 1.2. Odds Ratio and Bias 3 1.3. Conditions for Perfect
More informationQuantum Wireless Sensor Networks
Quantum Wireless Sensor Networks School of Computing Queen s University Canada ntional Computation Vienna, August 2008 Main Result Quantum cryptography can solve the problem of security in sensor networks.
More informationSecond Order Reed-Muller Decoding Algorithm in Quantum Computing
Second Order Reed-Muller Decoding Algorithm in Quantum Computing Minji Kim June 2, 2011 1 Contents 1 Introduction 2 1.1 How to identify the order...................... 2 2 First order Reed Muller Code
More informationNational University of Singapore Department of Electrical & Computer Engineering. Examination for
National University of Singapore Department of Electrical & Computer Engineering Examination for EE5139R Information Theory for Communication Systems (Semester I, 2014/15) November/December 2014 Time Allowed:
More informationResearch Collection. The relation between block length and reliability for a cascade of AWGN links. Conference Paper. ETH Library
Research Collection Conference Paper The relation between block length and reliability for a cascade of AWG links Authors: Subramanian, Ramanan Publication Date: 0 Permanent Link: https://doiorg/0399/ethz-a-00705046
More informationX 1 : X Table 1: Y = X X 2
ECE 534: Elements of Information Theory, Fall 200 Homework 3 Solutions (ALL DUE to Kenneth S. Palacio Baus) December, 200. Problem 5.20. Multiple access (a) Find the capacity region for the multiple-access
More informationSecond-Order Asymptotics in Information Theory
Second-Order Asymptotics in Information Theory Vincent Y. F. Tan (vtan@nus.edu.sg) Dept. of ECE and Dept. of Mathematics National University of Singapore (NUS) National Taiwan University November 2015
More information6.3 Bernoulli Trials Example Consider the following random experiments
6.3 Bernoulli Trials Example 6.48. Consider the following random experiments (a) Flip a coin times. We are interested in the number of heads obtained. (b) Of all bits transmitted through a digital transmission
More informationLast few slides from last time
Last few slides from last time Example 3: What is the probability that p will fall in a certain range, given p? Flip a coin 50 times. If the coin is fair (p=0.5), what is the probability of getting an
More informationNotes 3: Stochastic channels and noisy coding theorem bound. 1 Model of information communication and noisy channel
Introduction to Coding Theory CMU: Spring 2010 Notes 3: Stochastic channels and noisy coding theorem bound January 2010 Lecturer: Venkatesan Guruswami Scribe: Venkatesan Guruswami We now turn to the basic
More informationCodes on graphs and iterative decoding
Codes on graphs and iterative decoding Bane Vasić Error Correction Coding Laboratory University of Arizona Funded by: National Science Foundation (NSF) Seagate Technology Defense Advanced Research Projects
More informationNew communication strategies for broadcast and interference networks
New communication strategies for broadcast and interference networks S. Sandeep Pradhan (Joint work with Arun Padakandla and Aria Sahebi) University of Michigan, Ann Arbor Distributed Information Coding
More informationCode design: Computer search
Code design: Computer search Low rate codes Represent the code by its generator matrix Find one representative for each equivalence class of codes Permutation equivalences? Do NOT try several generator
More informationApproximating MAX-E3LIN is NP-Hard
Approximating MAX-E3LIN is NP-Hard Evan Chen May 4, 2016 This lecture focuses on the MAX-E3LIN problem. We prove that approximating it is NP-hard by a reduction from LABEL-COVER. 1 Introducing MAX-E3LIN
More informationChapter 7. Error Control Coding. 7.1 Historical background. Mikael Olofsson 2005
Chapter 7 Error Control Coding Mikael Olofsson 2005 We have seen in Chapters 4 through 6 how digital modulation can be used to control error probabilities. This gives us a digital channel that in each
More informationReadings and Exercises
Nebraska Wesleyan University Math 4800: Research Experience Section 1 Spring 02016 Readings and Exercises Tuesday, January 19 Section 1 Exercise 1 Section 2 Exercise 1, 4ce Thursday, January 21 Section
More information