ECE 534 Information Theory - Midterm 2


1 ECE 534 Information Theory - Midterm 2. Nov. 4, 2009, 3:30-4:45 in LH03. You will be given the full class time: 75 minutes. Use it wisely! Many of the problems have short answers; try to find shortcuts. You may bring and use two 8.5x11 double-sided crib sheets. No other notes or books are permitted. No calculators are permitted. Talking, passing notes, copying (and all other forms of cheating) is forbidden. Make sure you explain your answers in a way that illustrates your understanding of the problem. Ideas are important, not just the calculation. Partial marks will be given. Write all answers directly on this exam.

Your name: Your UIN: Your signature:

The exam has 4 questions, for a total of 65 points.

Question: 1 2 3 4 Total
Points: 18 17 12 18 65
Score:

2 ECE534 Fall 2009 Midterm

1. A sum channel. Let X = Y = {A, B, C, D} be the input and output alphabets of a discrete memoryless channel with transition probability matrix p(y|x), for 0 ≤ ε, δ ≤ 1, given by

    p(y|x) =
    [ 1-ε    ε     0     0  ]
    [  ε    1-ε    0     0  ]
    [  0     0    1-δ    δ  ]
    [  0     0     δ    1-δ ]

Notice that this channel with 4 inputs and outputs looks like the sum or union of two parallel subchannels with transition probability matrices

    p1(y|x) = [ 1-ε  ε ; ε  1-ε ],    p2(y|x) = [ 1-δ  δ ; δ  1-δ ],

with alphabets X1 = Y1 = {A, B} and X2 = Y2 = {C, D} respectively.

(a) (2 points) Draw the transition probability diagram of this channel.

Solution: [Figure: inputs A, B, C, D on the left, outputs A, B, C, D on the right; A and B cross with probability ε and stay with probability 1-ε, while C and D cross with probability δ and stay with probability 1-δ.]

(b) (3 points) Find the capacity of this channel if ε = δ = 1/2.

Solution: If ε = δ = 1/2 we have a symmetric channel, whose capacity we know is achieved by a uniform input distribution and equals

    C = log |Y| - H(a row of the transition probability matrix) = log(4) - H(1/2, 1/2, 0, 0) = 2 - 1 = 1 (bit per channel use).

(c) (5 points) Let p(x) be the probability mass function on X, and let p(A) + p(B) = α, p(C) + p(D) = 1 - α. Show that the mutual information between the input X and the output Y may be expressed as

    I(X; Y) = H(α) + α I(X; Y | X ∈ {A, B}) + (1 - α) I(X; Y | X ∈ {C, D}).

out of a possible 10 points

3 ECE534 Fall 2009 Midterm

Solution: Let θ be a random variable with the following probability mass function:

    θ = 1 if X ∈ {A, B},  p(θ = 1) = α;
    θ = 0 if X ∈ {C, D},  p(θ = 0) = 1 - α.

Since θ is also determined by Y (the output lies in {A, B} exactly when the input does), we can express the mutual information between X and Y as

    I(X; Y) = I(X; Y, θ) = I(X; θ) + I(X; Y | θ)
            = H(θ) - H(θ | X) + I(X; Y | θ)
            = H(α) - 0 + p(θ = 1) I(X; Y | θ = 1) + p(θ = 0) I(X; Y | θ = 0)
            = H(α) + α I(X; Y | X ∈ {A, B}) + (1 - α) I(X; Y | X ∈ {C, D}).

(d) (2 points) Let C1 and C2 be the capacities of the subchannels described by p1(y|x) and p2(y|x). Argue why

    max_{p(x)} I(X; Y) = max_α [H(α) + α C1 + (1 - α) C2].

Solution: Notice that

    C1 = max_{p(x): X ∈ {A,B}} I(X; Y),    C2 = max_{p(x): X ∈ {C,D}} I(X; Y).

Then, since x ∈ {A, B} implies y ∈ {A, B} and x ∈ {C, D} implies y ∈ {C, D}, we see that

    max_{p(x) on {A,B,C,D}} I(X; Y)
      = max_{p(x), 0 ≤ α ≤ 1} [H(α) + α I(X; Y | X ∈ {A, B}) + (1 - α) I(X; Y | X ∈ {C, D})]
      = max_{0 ≤ α ≤ 1} [H(α) + α max_{p(x): X ∈ {A,B}} I(X; Y | X ∈ {A, B}) + (1 - α) max_{p(x): X ∈ {C,D}} I(X; Y | X ∈ {C, D})]
      = max_{0 ≤ α ≤ 1} [H(α) + α C1 + (1 - α) C2].

(e) (6 points) Find the capacity C of the sum channel in terms of the capacities C1 and C2 of the sub-channels and NO other parameters.

Solution: Part (d) makes obtaining the capacity significantly easier, since we know the capacities of the two binary symmetric channels are C1 = 1 - H(ε) and C2 = 1 - H(δ). Finding the capacity of the sum channel then amounts to a 1-D optimization over α, which may be achieved by setting the derivative of f(α) := H(α) + α C1 + (1 - α) C2 to zero and solving for α:

    df(α)/dα = H'(α) + C1 - C2 = log((1 - α)/α) + C1 - C2 = 0
    ⟹ α/(1 - α) = 2^(C1 - C2)
    ⟹ α* = 2^C1 / (2^C1 + 2^C2),   1 - α* = 2^C2 / (2^C1 + 2^C2).

Substituting this optimal value of α back into f(α), we obtain, after simplification,

    C = log2(2^C1 + 2^C2).

out of a possible 8 points
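
Both the ε = δ = 1/2 capacity in part (b) and the closed form in part (e) are easy to verify numerically. A minimal Python sketch, assuming example crossover probabilities ε = 0.1 and δ = 0.25 for the part-(e) check (any values work):

    import numpy as np

    def Hb(p):
        # binary entropy in bits
        return 0.0 if p in (0.0, 1.0) else -p*np.log2(p) - (1-p)*np.log2(1-p)

    def mutual_information(px, P):
        # I(X;Y) = H(Y) - H(Y|X) in bits, for input pmf px and channel matrix P
        py = px @ P
        H_y = -sum(p * np.log2(p) for p in py if p > 0)
        H_y_x = -sum(px[i] * P[i, j] * np.log2(P[i, j])
                     for i in range(len(px))
                     for j in range(P.shape[1]) if P[i, j] > 0)
        return H_y - H_y_x

    def sum_channel(eps, delta):
        return np.array([[1-eps, eps,   0,       0],
                         [eps,   1-eps, 0,       0],
                         [0,     0,     1-delta, delta],
                         [0,     0,     delta,   1-delta]])

    # Part (b): eps = delta = 1/2, uniform input -> 1.0 bit per channel use.
    print(mutual_information(np.ones(4)/4, sum_channel(0.5, 0.5)))

    # Part (e): brute-force over alpha matches C = log2(2^C1 + 2^C2).
    eps, delta = 0.1, 0.25                  # assumed example values
    C1, C2 = 1 - Hb(eps), 1 - Hb(delta)
    f = [Hb(a) + a*C1 + (1-a)*C2 for a in np.linspace(1e-6, 1-1e-6, 100001)]
    print(max(f), np.log2(2**C1 + 2**C2))   # the two agree to high precision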


5 ECE534 Fall 2009 Midterm

2. Gaussian channels with interference. Consider a channel with 2 independent transmitters and a single two-antenna receiver: user 1 transmits X1, which is independent of the signal X2 transmitted by user 2. The signals of the two users are received at antennas 1 and 2 as Y1 and Y2 respectively:

    Y1 = X1 + 2X2 + Z1,
    Y2 = X1 + Z2,

where Z1 and Z2 are i.i.d. N(0, σ²) additive white Gaussian noise. The signals of the two users X1 and X2 are independent and distributed as X1 ~ N(0, P1), X2 ~ N(0, P2). Our goal will be to determine how much X1 can reliably communicate to the 2-antenna receiver while treating X2 as noise / interference - which depends on how the receiver processes the signals from the two receive antennas.

Solution: All these questions rely on being able to calculate the different covariance matrices - all of which will be denoted by K with the appropriate subscripts - and on the fact that for an n-dimensional Gaussian random vector X with covariance matrix K_X:

    h(X) = (1/2) log((2πe)^n |K_X|).

(a) (5 points) Compute I(X1; Y1, Y2).

Solution: Here we need K_{Y1,Y2} and K_{Y1,Y2|X1}, which can be obtained as

    K_{Y1,Y2} = [ E[Y1²]  E[Y1 Y2] ; E[Y2 Y1]  E[Y2²] ]
              = [ E[(X1 + 2X2 + Z1)²]  E[(X1 + 2X2 + Z1)(X1 + Z2)] ; E[(X1 + Z2)(X1 + 2X2 + Z1)]  E[(X1 + Z2)²] ]
              = [ P1 + 4P2 + σ²   P1 ; P1   P1 + σ² ],

    K_{Y1,Y2|X1} = [ 4P2 + σ²   0 ; 0   σ² ]    (given X1, Y1 - X1 = 2X2 + Z1 and Y2 - X1 = Z2).

Then

    I(X1; Y1, Y2) = h(Y1, Y2) - h(Y1, Y2 | X1)
                  = (1/2) log((2πe)² |K_{Y1,Y2}|) - (1/2) log((2πe)² |K_{Y1,Y2|X1}|)
                  = (1/2) log( ((P1 + 4P2 + σ²)(P1 + σ²) - P1²) / ((4P2 + σ²) σ²) )
                  = (1/2) log( 1 + (P1/σ²) (4P2 + 2σ²)/(4P2 + σ²) ).

out of a possible 5 points
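
A quick numeric check of part (a), with example values P1 = 4, P2 = 2, σ² = 1 assumed purely for illustration; the determinant form and the simplified form agree:

    import numpy as np

    P1, P2, s2 = 4.0, 2.0, 1.0                      # assumed example values

    K  = np.array([[P1 + 4*P2 + s2, P1],
                   [P1,             P1 + s2]])      # Cov(Y1, Y2)
    Kc = np.diag([4*P2 + s2, s2])                   # Cov(Y1, Y2 | X1)

    I_det    = 0.5 * np.log2(np.linalg.det(K) / np.linalg.det(Kc))
    I_closed = 0.5 * np.log2(1 + (P1/s2) * (4*P2 + 2*s2) / (4*P2 + s2))
    print(I_det, I_closed)                          # identical up to rounding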

6 ECE534 Fall 2009 Midterm

(b) (5 points) The receiver now decides to process the received signals Y1 and Y2 and see if/how this affects the optimal communication rate with X1. Let Y_b = Y1 + Y2 (the receiver sums the received signals). Compute I(X1; Y_b).

Solution: Here we need K_{Y1+Y2} and K_{Y1+Y2|X1}, which can be obtained as

    K_{Y1+Y2} = E[(Y1 + Y2)²] = E[(2X1 + 2X2 + Z1 + Z2)²] = 4P1 + 4P2 + 2σ²,
    K_{Y1+Y2|X1} = E[(Y1 + Y2)² | X1] = E[(2X2 + Z1 + Z2)²] = 4P2 + 2σ².

Then

    I(X1; Y1 + Y2) = h(Y1 + Y2) - h(Y1 + Y2 | X1)
                   = (1/2) log(2πe K_{Y1+Y2}) - (1/2) log(2πe K_{Y1+Y2|X1})
                   = (1/2) log( (4P1 + 4P2 + 2σ²) / (4P2 + 2σ²) )
                   = (1/2) log( 1 + 2P1/(2P2 + σ²) ).

(c) (4 points) Is Y_b a sufficient statistic for decoding X1?

Solution: Y_b is a sufficient statistic if I(X1; Y1, Y2) = I(X1; Y_b) (and we know by the data processing inequality that I(X1; Y_b) ≤ I(X1; Y1, Y2) - sufficient statistics lose no information!). However, comparing the SNR-like terms of parts (a) and (b),

    (4P2 + 2σ²)/(σ²(4P2 + σ²)) - 2/(2P2 + σ²) = 8P2² / (σ²(4P2 + σ²)(2P2 + σ²)) > 0 for all P2 > 0.

Hence I(X1; Y_b) < I(X1; Y1, Y2) strictly, and Y_b is NOT a sufficient statistic.

(d) (2 points) The receiver now decides to try and decode X1 using only Y2, ignoring Y1. Compute I(X1; Y2).

Solution:

    I(X1; Y2) = h(Y2) - h(Y2 | X1)
              = (1/2) log(2πe K_{Y2}) - (1/2) log(2πe K_{Y2|X1})
              = (1/2) log( (P1 + σ²)/σ² )
              = (1/2) log( 1 + P1/σ² ).

out of a possible 11 points
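
Evaluating the three rates side by side (same assumed example values as above, chosen only for illustration) confirms the strict gap from part (c):

    import numpy as np

    P1, P2, s2 = 4.0, 2.0, 1.0                                    # assumed example values
    I_both = 0.5*np.log2(1 + (P1/s2)*(4*P2 + 2*s2)/(4*P2 + s2))   # part (a): (Y1, Y2)
    I_sum  = 0.5*np.log2(1 + 2*P1/(2*P2 + s2))                    # part (b): Y_b = Y1 + Y2
    I_y2   = 0.5*np.log2(1 + P1/s2)                               # part (d): Y2 alone
    print(I_both, I_sum, I_y2)    # I_sum < I_both strictly, as part (c) shows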

7 ECE534 Fall 2009 Midterm

(e) (1 point) Find an example of powers P1 and P2 for which I(X1; Y2) > I(X1; Y_b).

Solution: For the rate in part (d) to be larger than that in part (b) we need

    2/(2P2 + σ²) < 1/σ²  ⟺  P2 > σ²/2,

so any P1 > 0 together with any P2 > σ²/2 works.

out of a possible 1 point
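
A small scan over P2 (with σ² = 1 and P1 = 4 assumed, as before) confirms the threshold:

    import numpy as np

    P1, s2 = 4.0, 1.0                                 # assumed example values
    for P2 in [0.1, 0.3, 0.5, 0.7, 1.0]:
        I_sum = 0.5*np.log2(1 + 2*P1/(2*P2 + s2))     # rate through Y_b, part (b)
        I_y2  = 0.5*np.log2(1 + P1/s2)                # rate through Y2 alone, part (d)
        print(P2, I_y2 > I_sum)                       # True exactly when P2 > s2/2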

8 ECE534 Fall 2009 Midterm

3. True or false (T/F) and short answer.

(a) (2 points) Find the 4-ary Huffman code (D = 4) for the source with probability mass function (8/36, 7/36, 6/36, 5/36, 4/36, 3/36, 2/36, 1/36).

Solution: For this one, the key is to realize that the optimal Huffman code will have 2 dummy (0 probability) extra symbols, so that the number of symbols, 10, is congruent to 1 mod (D - 1). This results in the following Huffman code (a Python sketch of the construction appears after this question):

    probability: 8/36  7/36  6/36  5/36  4/36  3/36  2/36  1/36  0 (dummy)  0 (dummy)
    codeword:     1     2     3    00    01    02    030   031   032        033

(b) (2 points) Describe the meaning and use of the rate-distortion function in two sentences - I'm looking for meaning and intuition rather than formulas.

Solution: Each rate-distortion pair (R, D) on the rate-distortion function R(D) describes the minimal achievable rate (number of bits per source symbol) needed to represent the source under consideration to within an expected distortion of D. The R(D) function is useful in lossy (non-perfect) compression of sources.

(c) (2 points) Compare the capacities C1 and C2 of the channels where, for the first channel, Y = (X mod 10) for X ∈ {1, 2, ..., 100}, and for the second channel, Y = (X mod 9) for X ∈ {1, 2, ..., 90}.

Solution: By symmetry, uniform inputs achieve uniform outputs, and so the capacity C1 = log |Y1| = log(10), which is greater than the capacity C2 = log |Y2| = log(9).

(d) (1 point) T/F: perfect feedback can increase the capacity of a discrete channel with memory.

Solution: True.

(e) (2 points) T/F: the differential entropy h(X1) of a continuous random variable X1 which is uniform on [2a, 2b] is twice the differential entropy h(X2) of a continuous random variable X2 which is uniform on [a, b].

Solution: False. h(X1) = log(2b - 2a) = log(2(b - a)) = 1 + log(b - a), while h(X2) = log(b - a). The statement is only true if b - a = 2, but not in general.

(f) (3 points) Outline, in about 3-4 sentences, the main points/techniques used in the achievability proof of the channel coding theorem.

Solution: The main proof techniques are random coding, joint typicality decoding, and bounding the probability of error using what we know about how likely it is for two independently chosen sequences to be jointly typical. The achievability proof follows these main lines:

out of a possible 12 points

9 ECE534 Fall 2009 Midterm

- Generate a random codebook, which consists of 2^{nR} sequences of length n, where each element is generated i.i.d. according to p(x). Encoding of message w takes place by looking up the w-th sequence in the codebook and sending that.
- The receiver decodes using a joint typicality decoder, whereby it declares that ŵ was sent if there exists one and only one ŵ such that x^n(ŵ) and the received y^n are jointly typical. Otherwise it declares an error.
- Analyze the probability of error in this scheme - we need to show that the maximal probability of error decays to 0 as the blocklength n tends to infinity. This will follow from arguing that independently generated x^n and y^n have a probability of about 2^{-n(I(X;Y) - 3ε)} of being jointly typical - and there are at most 2^{nR} independently generated typical x^n which are not the correct one.
- The last part of the proof involves showing that the *average* probability of error decaying to 0 over all randomly chosen codebooks implies that there exists a codebook whose maximal probability of error will also decay to zero.

out of a possible 0 points
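
As referenced in part 3(a), here is a compact D-ary Huffman construction; a minimal sketch, noting that the digit assigned at each merge is arbitrary, so the codewords may differ from the table above while the code lengths agree:

    import heapq
    from itertools import count

    def dary_huffman(probs, D=4):
        # Pad with zero-probability dummy symbols so the number of leaves is
        # congruent to 1 mod (D-1), then repeatedly merge the D least likely
        # nodes, prepending a code digit at each merge.
        tie = count()                      # tiebreaker so the heap never compares dicts
        heap = [(p, next(tie), {i: ''}) for i, p in enumerate(probs)]
        pad = (1 - len(probs)) % (D - 1)
        heap += [(0.0, next(tie), {}) for _ in range(pad)]   # dummies carry no symbol
        heapq.heapify(heap)
        while len(heap) > 1:
            total, merged = 0.0, {}
            for digit in range(D):
                p, _, codes = heapq.heappop(heap)
                total += p
                for sym, suffix in codes.items():
                    merged[sym] = str(digit) + suffix
            heapq.heappush(heap, (total, next(tie), merged))
        return heap[0][2]

    pmf = [8/36, 7/36, 6/36, 5/36, 4/36, 3/36, 2/36, 1/36]
    print(dary_huffman(pmf))   # symbol index -> 4-ary codeword; lengths 1,1,1,2,2,2,3,3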

10 ECE534 Fall 2009 Midterm

4. Capacity of several simple channels. Find the capacity of the following channels AND the input distribution which achieves this capacity.

(a) (4 points) Consider 2 parallel Gaussian channels Y1 = X1 + Z1 and Y2 = X2 + Z2, where Z1 and Z2 are independent, zero-mean additive white Gaussian noise, Z1 ~ N(0, σ1² = 3) and Z2 ~ N(0, σ2² = 7), subject to a total power constraint of P = 10.

Solution: This is the classical, simplest waterfilling solution. Draw it out and see that the water level is 10, so we will fill 7 units of power into channel 1 and 3 units of power into channel 2, thereby obtaining a capacity of

    C = (1/2) log(1 + 7/3) + (1/2) log(1 + 3/7),

which is achieved by an input distribution X1 ~ N(0, 7) and X2 ~ N(0, 3), with X1, X2 independent.

[Figure: waterfilling diagram - noise variances 3 (channel 1) and 7 (channel 2) shown as bars, water level at 10.]

(b) (6 points) Again consider 2 parallel Gaussian channels, but now you're constrained to send the same signal X on both channels, i.e. Y1 = X + Z1 and Y2 = X + Z2, where Z1 and Z2 are independent, zero-mean additive white Gaussian noise, Z1 ~ N(0, σ1² = 3) and Z2 ~ N(0, σ2² = 7), subject to a total power constraint of P = 10.

Solution: You can proceed with waterfilling, but it's easier to go directly from the capacity, since due to our power constraint, and due to the fact that we are forced to send the same signal on both channels, we know that the power in each channel is P/2 = 5. Then, keeping things symbolic as long as possible for illustration, and noting that we are inputting Gaussian random variables (so the outputs are Gaussian as well),

    K_{Y1,Y2} = [ P/2 + σ1²   P/2 ; P/2   P/2 + σ2² ],    K_{Y1,Y2|X} = [ σ1²  0 ; 0  σ2² ].

So

    C = max_{p(x)} I(X; Y1, Y2) = h(Y1, Y2) - h(Y1, Y2 | X)
      = (1/2) log( ((P/2 + σ1²)(P/2 + σ2²) - P²/4) / (σ1² σ2²) )
      = (1/2) log( 1 + (P/2)(σ1² + σ2²)/(σ1² σ2²) )
      = (1/2) log( 1 + 5(3 + 7)/(3 · 7) ) = (1/2) log(1 + 50/21),

achieved by X ~ N(0, 5).

out of a possible 10 points
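
For part (a), a minimal generic waterfilling sketch that bisects on the water level; with noise variances (3, 7) and P = 10 it recovers the power allocation (7, 3):

    import numpy as np

    def waterfill(noise_vars, P, iters=60):
        # Bisect on the water level nu so that sum_i max(nu - N_i, 0) = P.
        lo, hi = 0.0, max(noise_vars) + P
        for _ in range(iters):
            nu = 0.5 * (lo + hi)
            if sum(max(nu - N, 0.0) for N in noise_vars) > P:
                hi = nu
            else:
                lo = nu
        powers = [max(nu - N, 0.0) for N in noise_vars]
        C = sum(0.5 * np.log2(1 + p/N) for p, N in zip(powers, noise_vars))
        return nu, powers, C

    print(waterfill([3.0, 7.0], 10.0))   # water level 10, powers [7, 3], C as in (a)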

11 ECE534 Fall 2009 Midterm

(c) (4 points) Cascade of two binary symmetric channels with crossover probability p, without encoding between stages: X → BSC(p) → BSC(p) → Y.

Solution: A cascade of 2 binary symmetric channels is again a binary symmetric channel, with a new crossover probability

    p' = p(1 - p) + (1 - p)p = 2p(1 - p).

Then the optimal input distribution is uniform, p(x) = {1/2, 1/2}, and the capacity is C = 1 - H(2p(1 - p)).

(d) (4 points) Cascade of two binary symmetric channels with crossover probability p, with encoding between stages: X → BSC(p) → ENCODING → BSC(p) → Y.

Solution: When we can re-encode the symbols after the first BSC, the capacity becomes C = min(C1, C2), where C1 and C2 are the capacities of the first and second BSCs, respectively. So we see that, by symmetry, C = 1 - H(p), achieved for p(x) = {1/2, 1/2}, uniform.

out of a possible 8 points
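
A two-line numeric comparison of (c) and (d), with p = 0.1 assumed as an example; without re-encoding the cascade is strictly noisier:

    import numpy as np

    def Hb(p):
        # binary entropy in bits
        return 0.0 if p in (0.0, 1.0) else -p*np.log2(p) - (1-p)*np.log2(1-p)

    p = 0.1                          # example crossover probability (assumed)
    print(1 - Hb(2*p*(1-p)))         # (c) no re-encoding: C = 1 - H(2p(1-p)) ~ 0.32
    print(1 - Hb(p))                 # (d) re-encoding:    C = 1 - H(p)       ~ 0.53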
