National University of Singapore
Department of Electrical & Computer Engineering

Examination for EE5139R Information Theory for Communication Systems
(Semester I, 2014/15)
November/December 2014
Time Allowed: 3.0 hours

INSTRUCTIONS FOR CANDIDATES:
- This paper contains FOUR (4) questions, printed on FIVE (5) pages.
- Answer all questions.
- Programmable calculators are NOT allowed.
- Electronic communicating devices MUST be turned off and inaccessible throughout the examination. They CANNOT be used as calculators, timers or clocks.
- You are allowed to bring SEVEN (7) HANDWRITTEN A4 size sheets. You may write on both sides of the sheet. No other material is allowed.
Problem 1

1(a) (7 points) State whether the following statement is TRUE or FALSE: there exists a discrete memoryless channel (DMC) with a binary (i.e., $|\mathcal{X}| = 2$ symbols) input alphabet and a ternary (i.e., $|\mathcal{Y}| = 3$ symbols) output alphabet whose capacity is equal to $C = 1.5$ bits/channel use. Please justify your answer carefully.

1(b) (8 points) Prove a list-decoding version of Fano's inequality. In particular, show that if $(W, Y) \in \mathcal{W} \times \mathcal{Y}$ are jointly distributed as $P_{W,Y}$, and $\mathcal{L}(Y)$ is a subset of $\mathcal{W}$ of size $l \ge 1$, then the probability of error $P_e := \Pr(W \notin \mathcal{L}(Y))$ can be lower bounded as
$$P_e \ge \frac{H(W \mid \mathcal{L}(Y)) - 1 - \log l}{\log(|\mathcal{W}|/l)}.$$
Hint: Define an error random variable $E$ that takes the value $1$ when $W \notin \mathcal{L}(Y)$ and $0$ otherwise. Then expand $H(W, E \mid \mathcal{L}(Y))$ in two different ways using the chain rule for joint entropy.

1(c) (3 points) For a DMC $P_{Y|X}(y|x)$ used over $n$ channel uses, find an upper bound on $\frac{1}{n} I(X^n; Y^n)$ in terms of the channel capacity $C = \max_{P_X} I(X; Y)$. State where you use the assumption of memorylessness in your derivation.

1(d) (7 points) A $(2^{nR}, 2^{nL}, n)$-list code for a DMC $P_{Y|X}(y|x)$ with capacity $C$ consists of an encoder that assigns a codeword $x^n(w)$ to each message $w \in \mathcal{W} := \{1, \ldots, 2^{nR}\}$ and a decoder that, upon receiving $y^n$, outputs a list of messages $\mathcal{L}(y^n) \subseteq \mathcal{W} = \{1, \ldots, 2^{nR}\}$ of size $|\mathcal{L}(y^n)| \le 2^{nL}$ intended to contain the transmitted message. An error occurs if the list does not contain the transmitted message $W$, i.e., $P_e^{(n)} := \Pr(W \notin \mathcal{L}(Y^n))$. A rate-list exponent pair $(R, L)$ is said to be achievable if there exists a sequence of $(2^{nR}, 2^{nL}, n)$-list codes with $P_e^{(n)} \to 0$ as $n \to \infty$. Show using parts (b) and (c) that there exists an upper bound $R^+$ such that every sequence of $(2^{nR}, 2^{nL}, n)$-list codes with $P_e^{(n)} \to 0$ (i.e., every achievable $(R, L)$ pair) must satisfy $R \le R^+$. Find this upper bound $R^+$ in terms of $C$ and $L$.
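The quantity at the heart of part 1(a) can also be explored numerically. The following minimal Python sketch uses a hypothetical binary-input, ternary-output transition matrix (not taken from the paper) and evaluates $I(X;Y)$ over a grid of input distributions; comparing the values it produces with $\log_2 |\mathcal{X}| = 1$ bit is a useful sanity check when reasoning about 1(a).

```python
import numpy as np

def mutual_information(p_x, W):
    """I(X;Y) in bits for an input pmf p_x and a channel transition matrix W[x, y]."""
    p_xy = p_x[:, None] * W                      # joint pmf P(x, y)
    p_y = p_xy.sum(axis=0)                       # output pmf P(y)
    prod = p_x[:, None] * p_y[None, :]           # product of marginals P(x)P(y)
    mask = p_xy > 0
    return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / prod[mask])))

# Hypothetical binary-input (|X| = 2), ternary-output (|Y| = 3) transition matrix.
W = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.2, 0.7]])

# Brute-force search over binary input distributions; the maximum approximates the capacity C.
best = max(mutual_information(np.array([a, 1.0 - a]), W)
           for a in np.linspace(0.001, 0.999, 999))
print(f"approximate capacity: {best:.4f} bits/channel use (compare with log2|X| = 1)")
```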
Problem 2

2(a) (10 points) Consider the Gaussian channel shown in the figure (not reproduced here), in which the transmitted signal $X$ with $E[X^2] = P$ is received by two antennas as
$$Y_1 = X + Z_1, \qquad Y_2 = X + Z_2,$$
where $Z_1$ and $Z_2$ are independent, with $E[Z_i^2] = \sigma_i^2$ and $\sigma_1^2 < \sigma_2^2$. Moreover, the signals at the two antennas are combined as $Y = \alpha Y_1 + (1 - \alpha) Y_2$ before decoding, where $0 \le \alpha \le 1$. Find the capacity of the channel for a given $\alpha$. Please provide the units of capacity.

2(b) (10 points) The following parts are separate from part (a). Now assume that you have two parallel Gaussian channels with inputs $X_1$ and $X_2$ and noises $Z_1$ and $Z_2$, respectively. Assume that the noise powers are $E[Z_1^2] = 0.3$ and $E[Z_2^2] = 0.6$, while the total available power is $P = E[X_1^2] + E[X_2^2] = 0.1$. Find the optimal power allocation (waterfilling) and the corresponding capacity. Please provide the units of capacity.
Hint: You may assume the waterfilling conditions derived in class (without proving them again). Consider separately the case $P_1, P_2 > 0$ and the case in which one of the powers is $0$.

2(c) (5 points) Now let $S^n$ be a binary memoryless source (BMS) with entropy $0.15$ bits per source symbol. Is it possible to transmit $S^n$ over the channel in part (b)? If it is possible, explain how to send the BMS over the Gaussian channel. If it is not possible, explain clearly why not.
Hint: Even if you did not solve part (b), you can still do part (c) by assuming that the capacity of the channel in part (b) is some value $C > 0$.
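The waterfilling computation in part 2(b) can be cross-checked numerically. The sketch below is a generic bisection on the water level (the routine and its tolerance are illustrative choices, not part of the paper); it is run here with the noise powers and power budget stated in 2(b).

```python
import numpy as np

def waterfill(noise_powers, total_power, tol=1e-12):
    """Return the allocation P_k = max(nu - N_k, 0) whose sum equals total_power (bisection on nu)."""
    noise = np.asarray(noise_powers, dtype=float)
    lo, hi = 0.0, noise.max() + total_power       # the water level nu lies in this interval
    while hi - lo > tol:
        nu = 0.5 * (lo + hi)
        if np.maximum(nu - noise, 0.0).sum() > total_power:
            hi = nu
        else:
            lo = nu
    return np.maximum(0.5 * (lo + hi) - noise, 0.0)

# Noise powers and power budget from part 2(b); the two channels are used in parallel.
N = [0.3, 0.6]
P = waterfill(N, 0.1)
C = sum(0.5 * np.log2(1.0 + p / n) for p, n in zip(P, N))
print("power allocation:", P, " capacity:", round(C, 4), "bits per channel use")
```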
Problem 3

In this problem we use minimum distance decoding to establish achievability of the capacity of a Binary Symmetric Channel (BSC) with crossover probability $p < 1/2$. Define the Hamming distance $d(x^n, y^n)$ between two binary sequences $x^n, y^n \in \{0,1\}^n$ as the number of positions in which they differ, i.e.,
$$d(x^n, y^n) = |\{ i \in \{1, 2, \ldots, n\} : x_i \ne y_i \}| = \sum_{i=1}^n \mathbf{1}\{x_i \ne y_i\} = \sum_{i=1}^n x_i \oplus y_i.$$
Here $x \oplus y = 0$ if $x = y$ and $x \oplus y = 1$ if $x \ne y$. We fix a rate $R$ and generate $2^{nR}$ codewords $X^n(w)$, each drawn from the uniform distribution on $\{0,1\}^n$, so each $X_i(w)$ is uniform on $\{0,1\}$, i.e., $\Pr(X_i(w) = 0) = \Pr(X_i(w) = 1) = 1/2$. The random codebook is $\mathcal{C} = \{X^n(1), X^n(2), \ldots, X^n(2^{nR})\}$. For any codebook, the decoder chooses the message $\hat{w}$ whose codeword has the strictly smallest Hamming distance to the channel output $Y^n$, i.e.,
$$d(X^n(\hat{w}), Y^n) < d(X^n(\tilde{w}), Y^n) \quad \text{for all } \tilde{w} \ne \hat{w}.$$

Advice: Each part below is designed so that it may be done independently of earlier parts. So if you get stuck, move on.

3(a) (2 points) Assume that message $W = 1$ was sent. Argue that the probability of error averaged over the random code is
$$P_e^{(n)} := \Pr(\mathcal{A} \mid W = 1), \quad \text{where} \quad \mathcal{A} := \bigcup_{w=2}^{2^{nR}} \big\{ d(X^n(w), Y^n) \le d(X^n(1), Y^n) \big\}.$$
Hint: How does an error occur under the minimum distance decoding rule above?

3(b) (4 points) Recall the law of total probability $\Pr(\mathcal{A}) = \Pr(\mathcal{A}, \mathcal{E}) + \Pr(\mathcal{A}, \mathcal{E}^c)$. Let
$$\mathcal{E} := \{ d(X^n(1), Y^n) > n(p + \epsilon) \}, \quad \text{for some } 0 < \epsilon < \tfrac{1}{2} - p.$$
Now show the upper bound
$$P_e^{(n)} \le \Pr\big( d(X^n(1), Y^n) > n(p+\epsilon) \mid W = 1 \big) + 2^{nR} \Pr\big( d(X^n(2), Y^n) \le n(p+\epsilon) \mid W = 1 \big). \qquad (1)$$
Hint: Use Bayes' rule and the fact that probabilities are no larger than one.

3(c) (2 points) Let $Z_i = Y_i \oplus X_i(1)$. What is the distribution of $Z_i$?

3(d) (4 points) Write the first term on the right-hand side of (1), namely $\Pr(d(X^n(1), Y^n) > n(p+\epsilon) \mid W = 1)$, in terms of the $Z_i$ random variables of part (c) and argue using Chebyshev's inequality that this first term tends to zero as $n$ tends to infinity.

3(e) (2 points) Let $U_i = Y_i \oplus X_i(2)$. What is the distribution of $U_i$?

3(f) (8 points) Write the second probability in (1), namely $\Pr(d(X^n(2), Y^n) \le n(p+\epsilon) \mid W = 1)$, in terms of the $U_i$ random variables of part (e) and hence show, using Sanov's theorem, that
$$\Pr\big( d(X^n(2), Y^n) \le n(p+\epsilon) \mid W = 1 \big) \le (n+1)^2 \, 2^{-n D(p+\epsilon \,\|\, \frac{1}{2})},$$
where $D(p \| q)$ is the relative entropy between a Bernoulli($p$) and a Bernoulli($q$) random variable.

3(g) (3 points) Using the above, show that the capacity of the BSC with crossover probability $p$ is at least $1 - H_b(p)$, where $H_b(p) = -p \log p - (1-p) \log(1-p)$ is the entropy of a Bernoulli($p$) random variable.
Hint: Use parts (d) and (f) to show that $P_e^{(n)}$ goes to zero under some condition on $R$.
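The random-coding experiment described above can be simulated directly. The following Monte Carlo sketch (block length, rate, crossover probability and trial count are hypothetical illustration values) draws a uniformly random codebook, transmits codeword 1 over a BSC($p$), decodes by minimum Hamming distance, and estimates the probability of the event $\mathcal{A}$ of part 3(a).

```python
import numpy as np

rng = np.random.default_rng(0)

def empirical_error_rate(n=60, R=0.2, p=0.11, trials=200):
    """Monte Carlo estimate of the error probability of minimum-distance decoding
    with a uniformly random codebook of 2^{nR} codewords, message W = 1 sent over a BSC(p)."""
    num_messages = int(round(2 ** (n * R)))
    errors = 0
    for _ in range(trials):
        codebook = rng.integers(0, 2, size=(num_messages, n))      # X^n(1), ..., X^n(2^{nR})
        noise = (rng.random(n) < p).astype(np.int64)                # i.i.d. Bernoulli(p) flips
        y = codebook[0] ^ noise                                     # channel output when W = 1
        distances = (codebook ^ y).sum(axis=1)                      # d(X^n(w), Y^n) for every w
        errors += int(np.any(distances[1:] <= distances[0]))        # event A of part 3(a)
    return errors / trials

# Hypothetical parameters: R = 0.2 is below 1 - H_b(0.11) (about 0.5), so errors should be rare.
print("empirical P_e ~", empirical_error_rate())
```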
Problem 4

Let $X^n := (X_1, X_2, \ldots, X_n)$ be independent (but not identically distributed) discrete random variables on a finite set $\mathcal{X}$, drawn according to the following rule:
$$\Pr(X_i = x) = \begin{cases} P_1(x), & i \text{ odd}, \\ P_2(x), & i \text{ even}. \end{cases} \qquad (2)$$
Let $Q$ be any other distribution on $\mathcal{X}$ (not the same as $P_1$ or $P_2$). Let $r(n)$ be the remainder when $n$ is divided by 2, i.e., $r(n) = 0$ if $n$ is even and $r(n) = 1$ otherwise.

4(a) (7 points) Fix $\epsilon > 0$. Define the relative entropy typical set
$$A_\epsilon^{(n)}(P_1, P_2 \| Q) := \left\{ x^n : D - \epsilon < \frac{1}{n} \log \frac{P_1(x_1, x_3, \ldots, x_{n+r(n)-1}) \, P_2(x_2, x_4, \ldots, x_{n-r(n)})}{Q(x_1, x_2, \ldots, x_n)} < D + \epsilon \right\}$$
for some $D > 0$. Note that $1, 3, \ldots, n + r(n) - 1$ is simply the collection of odd numbers up to and including $n$. Similarly, $2, 4, \ldots, n - r(n)$ is the collection of even numbers up to and including $n$. Under hypothesis $H_0$, let $X^n$ be distributed as in (2). Find $D$ in terms of $P_1$, $P_2$, $Q$ such that
$$\Pr\big( X^n \in A_\epsilon^{(n)}(P_1, P_2 \| Q) \mid H_0 \big) \to 1 \quad \text{as } n \to \infty.$$
Hint: By independence, $P_1(X_1, X_3, \ldots, X_{n+r(n)-1}) = P_1(X_1) P_1(X_3) \cdots P_1(X_{n+r(n)-1})$ and similarly for the other probabilities, where $X^n$ follows (2). Apply the weak law of large numbers or Chebyshev's inequality.

4(b) (10 points) Under hypothesis $H_1$, let $X^n$ be distributed i.i.d. according to $Q$. Find the largest $E$ such that
$$\Pr\big( X^n \in A_\epsilon^{(n)}(P_1, P_2 \| Q) \mid H_1 \big) \le 2^{-nE}$$
for every integer $n \ge 1$. The number $E$ should be stated in terms of $D$ and $\epsilon$.

4(c) (8 points) Now consider the binary hypothesis test
$$H_0 : X^n \text{ distributed according to (2)} \qquad \text{versus} \qquad H_1 : X^n \text{ i.i.d. } Q.$$
Let $A_n \subseteq \mathcal{X}^n$ be an acceptance region for hypothesis $H_0$, and let the probabilities of error be
$$\alpha_n(A_n) := \Pr(A_n^c \mid H_0), \qquad \beta_n(A_n) := \Pr(A_n \mid H_1).$$
Define
$$\beta_n^\epsilon := \min_{A_n \subseteq \mathcal{X}^n : \, \alpha_n(A_n) < \epsilon} \beta_n(A_n).$$
Find an upper bound on
$$\lim_{n \to \infty} \frac{1}{n} \log \beta_n^\epsilon$$
in terms of $P_1$, $P_2$, $Q$ and $\epsilon$.

END OF PAPER
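The concentration behaviour behind part 4(a) can be visualised with a short simulation. The sketch below uses hypothetical distributions $P_1$, $P_2$, $Q$ on a three-letter alphabet (chosen only for illustration, not taken from the paper), samples $X^n$ under $H_0$, and evaluates the normalised log-likelihood ratio that defines $A_\epsilon^{(n)}(P_1, P_2 \| Q)$; as $n$ grows the statistic settles around a constant, which is the role played by $D$ in the question.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical distributions on a three-letter alphabet (illustration only).
P1 = np.array([0.6, 0.3, 0.1])
P2 = np.array([0.2, 0.5, 0.3])
Q  = np.array([1/3, 1/3, 1/3])

def normalized_llr(n):
    """Sample X^n under H0 (odd positions ~ P1, even positions ~ P2) and return
    (1/n) * log2[ P1(odd part) * P2(even part) / Q(x^n) ]."""
    x = np.empty(n, dtype=int)
    x[0::2] = rng.choice(len(P1), size=x[0::2].shape[0], p=P1)   # positions 1, 3, 5, ...
    x[1::2] = rng.choice(len(P2), size=x[1::2].shape[0], p=P2)   # positions 2, 4, 6, ...
    log_ratio = (np.log2(P1[x[0::2]]).sum()
                 + np.log2(P2[x[1::2]]).sum()
                 - np.log2(Q[x]).sum())
    return log_ratio / n

# By the weak law of large numbers the statistic concentrates as n grows,
# which is the mechanism behind Pr(X^n in A_eps^(n) | H0) -> 1 in part 4(a).
for n in (10, 100, 10_000):
    print(n, round(normalized_llr(n), 4))
```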