Information Theory and Coding


1. The capacity of a band-limited additive white Gaussian noise (AWGN) channel is given by C = W log₂(1 + P/(σ²W)) bits per second (bps), where W is the channel bandwidth, P is the average power received and σ² is the one-sided power spectral density of the AWGN. For a fixed P/σ² = 1000, the channel capacity (in kbps) with infinite bandwidth (W → ∞) is approximately
(a) 1.44   (b) 1.08   (c) 0.72   (d) 0.36
[GATE 2014: 1 Mark]

Sol.
C = W log₂(1 + P/(σ²W))
  = (P/σ²) · (σ²W/P) log₂(1 + P/(σ²W))
  = (P/σ²) · x log₂(1 + 1/x),   where x = σ²W/P
As W → ∞, x → ∞ and x log₂(1 + 1/x) → log₂e, so
C → (P/σ²) log₂e = 1.44 × P/σ²
  = 1.44 × 1000 bps ≈ 1.44 kbps
Option (a)
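Not part of the original solution: a minimal Python check of the W → ∞ limit, with P/σ² = 1000 as given and a few arbitrary bandwidth values.

import math

P_over_sigma2 = 1000  # given P/sigma^2 ratio

def capacity(W):
    # Shannon capacity C = W*log2(1 + P/(sigma^2*W)) in bits per second
    return W * math.log2(1 + P_over_sigma2 / W)

for W in (1e3, 1e5, 1e7):
    print(f"W = {W:.0e} Hz  ->  C = {capacity(W):.1f} bps")
print("W -> infinity:", P_over_sigma2 * math.log2(math.e), "bps")  # ≈ 1443 bps ≈ 1.44 kbps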

2. A fair coin is tossed repeatedly until a Head appears for the first time. Let L be the number of tosses required to get this first Head. The entropy H(L) in bits is ______
[GATE 2014: 2 Marks]

Sol. If 1 toss is required to get the first head, the probability P1 = 1/2
If 2 tosses are required, P2 = 1/2 × 1/2 = 1/4
If 3 tosses are required, P3 = 1/2 × 1/2 × 1/2 = 1/8, and so on.
Entropy H = Σ Pi log₂(1/Pi)
  = (1/2) log₂2 + (1/4) log₂4 + (1/8) log₂8 + (1/16) log₂16 + …
  = 1/2 + 2/2² + 3/2³ + 4/2⁴ + … = 2 bits
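An added numerical check (not from the original): summing the series with P(L = k) = (1/2)^k term by term in Python, truncated at k = 60.

import math

# H(L) = sum over k of P(L = k) * log2(1 / P(L = k)), with P(L = k) = (1/2)^k
H = sum((0.5 ** k) * math.log2(1 / (0.5 ** k)) for k in range(1, 60))
print(H)  # ≈ 2.0 bits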

3. The capacity of a Binary Symmetric Channel (BSC) with cross-over probability 0.5 is ______
[GATE 2014: 1 Mark]

[Figure: BSC transition diagram — inputs X ∈ {0, 1}, outputs Y ∈ {0, 1}; each input is received correctly with probability (1 - p) and flipped with probability p.]

Sol. Given cross-over probability p = 0.5 and equally likely inputs P(x1) = P(x2) = 1/2,
Channel capacity of the BSC:
C = log₂2 + p log₂p + (1 - p) log₂(1 - p)
  = 1 + (1/2) log₂(1/2) + (1/2) log₂(1/2)
  = 1 - 1/2 - 1/2 = 0
C = 0. Capacity = 0
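As an added illustration (not in the original), the same expression evaluated for a few cross-over probabilities; p = 0.5 reproduces the zero-capacity answer.

import math

def bsc_capacity(p):
    # C = 1 + p*log2(p) + (1 - p)*log2(1 - p); p = 0 and p = 1 are noiseless/inverting channels
    if p in (0.0, 1.0):
        return 1.0
    return 1 + p * math.log2(p) + (1 - p) * math.log2(1 - p)

for p in (0.0, 0.1, 0.5):
    print(f"p = {p}:  C = {bsc_capacity(p):.3f} bits per channel use")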

4. In a digital communication system, transmissions of successive bits through a noisy channel are assumed to be independent events with error probability p. The probability of at most one error in the transmission of an 8-bit sequence is
(a) 7(1 - p)/8 + p/8   (b) (1 - p)^8 + 8p(1 - p)^7   (c) (1 - p)^8 + (1 - p)^7   (d) (1 - p)^8 + p(1 - p)^7
[GATE 1988: 2 Marks]

Sol. Each bit error is an independent event with probability p.
P(at most one error) = P(X = 0) + P(X = 1)
Note that the probability that an event occurs r times in n independent trials is given by the binomial probability mass function
P(X = r) = nCr p^r (1 - p)^(n - r)
P(X = 0) + P(X = 1) = 8C0 p^0 (1 - p)^8 + 8C1 p^1 (1 - p)^(8 - 1)
  = (1 - p)^8 + 8p(1 - p)^7
Option (b)

5. Consider a Binary Symmetric Channel (BSC) with probability of error being p. To transmit a bit, say 1, we transmit a sequence of three 1s. The receiver interprets the received sequence as 1 if at least two of the bits are 1. The probability that the transmitted bit will be received in error is
(a) p^3 + 3p^2(1 - p)   (b) p^3   (c) (1 - p)^3   (d) p^3 + p^2(1 - p)
[GATE 2008: 2 Marks]

Sol. P(1|0) = P(0|1) = p and P(1|1) = P(0|0) = 1 - p
Reception is in error when at most one of the three received bits is a 1, i.e. when at least two of the three bits are flipped.
Using the binomial probability mass function with X = number of correctly received bits,
P(reception in error) = P(X = 0) + P(X = 1)
  = 3C0 (1 - p)^0 p^3 + 3C1 (1 - p)^1 p^2
  = p^3 + 3p^2(1 - p)
Option (a)
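Both answers can be cross-checked by brute-force enumeration of error patterns (an added sketch, not part of the original solutions; p = 0.1 is an arbitrary test value).

from itertools import product

p = 0.1  # arbitrary test value for the bit-error probability

# Q.4: at most one error in an 8-bit sequence, summed over all error patterns
q4 = sum(p ** sum(e) * (1 - p) ** (8 - sum(e))
         for e in product((0, 1), repeat=8) if sum(e) <= 1)
print(q4, (1 - p) ** 8 + 8 * p * (1 - p) ** 7)

# Q.5: three-bit repetition code is decoded in error when 2 or 3 bits are flipped
q5 = sum(p ** sum(e) * (1 - p) ** (3 - sum(e))
         for e in product((0, 1), repeat=3) if sum(e) >= 2)
print(q5, p ** 3 + 3 * p ** 2 * (1 - p))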

6. During transmission over a certain binary communication channel, bit errors occur independently with probability p. The probability of at most one bit in error in a block of n bits is given by
(a) p^n   (b) 1 - p^n   (c) np(1 - p)^(n - 1) + (1 - p)^n   (d) 1 - (1 - p)^n
[GATE 2007: 2 Marks]

Sol. Probability of at most one bit in error = P(no error) + P(one bit in error)
Using the binomial probability mass function:
  = nC0 p^0 (1 - p)^n + nC1 p^1 (1 - p)^(n - 1)
  = (1 - p)^n + np(1 - p)^(n - 1)
Note, nC0 = 1 and nC1 = n
Option (c)
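A short added check of the closed form against a term-by-term binomial sum (n = 16, p = 0.05 are arbitrary test values).

from math import comb

n, p = 16, 0.05  # arbitrary test values

pmf_sum = sum(comb(n, r) * p ** r * (1 - p) ** (n - r) for r in (0, 1))
closed_form = (1 - p) ** n + n * p * (1 - p) ** (n - 1)
print(pmf_sum, closed_form)  # the two agree, matching option (c)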

7. Let U and V be two independent and identically distributed random variables such that P(U = +1) = P(U = -1) = 1/2. The entropy H(U + V) in bits is
(a) 3/4   (b) 1   (c) 3/2   (d) log₂3
[GATE 2013: 2 Marks]

Sol. U and V are independent and identically distributed random variables with
P(U = +1) = P(U = -1) = 1/2 and P(V = +1) = P(V = -1) = 1/2
So U + V can take the values
  +2 when U = V = +1
   0 when U = +1, V = -1 or U = -1, V = +1
  -2 when U = V = -1
P(U + V = +2) = 1/2 × 1/2 = 1/4
P(U + V = 0) = 1/2 × 1/2 + 1/2 × 1/2 = 1/2
P(U + V = -2) = 1/2 × 1/2 = 1/4
Entropy of (U + V):
H(U + V) = Σ P(U + V) log₂(1/P(U + V))
  = (1/4) log₂4 + (1/2) log₂2 + (1/4) log₂4
  = 2/4 + 1/2 + 2/4 = 3/2
Option (c)
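The same result can be obtained by enumerating the four joint outcomes (an added sketch, not part of the original solution).

import math
from collections import Counter
from itertools import product

# Distribution of U + V for U, V i.i.d. uniform on {+1, -1}
counts = Counter(u + v for u, v in product((+1, -1), repeat=2))
probs = [c / 4 for c in counts.values()]  # each of the 4 joint outcomes has probability 1/4
H = sum(p * math.log2(1 / p) for p in probs)
print(dict(counts), H)  # {2: 1, 0: 2, -2: 1} and H = 1.5 bits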

8. A source alphabet consists of N symbols with the probability of the first two symbols being the same. A source encoder increases the probability of the first symbol by a small amount ε and decreases that of the second by ε. After encoding, the entropy of the source
(a) increases   (b) remains the same   (c) increases only if N = 2   (d) decreases
[GATE 2012: 1 Mark]

Sol. Entropy is maximum when symbols are equally probable. When the two equal probabilities are changed to unequal ones, the entropy decreases.
Option (d)

9. A communication channel with AWGN operating at a signal to noise ratio SNR >> 1 and bandwidth B has capacity C1. If the SNR is doubled keeping B constant, the resulting capacity C2 is given by
(a) C2 ≈ 2C1   (b) C2 ≈ C1 + B   (c) C2 ≈ C1 + 2B   (d) C2 ≈ C1 + 0.3B
[GATE 2009: 2 Marks]

Sol. When SNR >> 1, the channel capacity
C1 = B log₂(1 + S/N) ≈ B log₂(S/N)
When the SNR is doubled,
C2 ≈ B log₂(2S/N) = B log₂2 + B log₂(S/N)
  = B log₂(S/N) + B = C1 + B
Option (b)
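Two brief numerical checks (added, not from the original solutions; the probabilities, ε, B and S/N below are arbitrary test values).

import math

def entropy(probs):
    # Shannon entropy in bits
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# Q.8: perturbing two equal probabilities by +/- eps lowers the entropy
eps = 0.01
print(entropy([0.2, 0.2, 0.3, 0.3]), entropy([0.2 + eps, 0.2 - eps, 0.3, 0.3]))

# Q.9: doubling a large SNR adds about B bits/sec to the capacity
B, snr = 1.0e6, 1000.0
C1 = B * math.log2(1 + snr)
C2 = B * math.log2(1 + 2 * snr)
print((C2 - C1) / B)  # ≈ 1, i.e. C2 ≈ C1 + B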

10. A memoryless source emits n symbols each with a probability p. The entropy of the source as a function of n
(a) increases as log n   (b) decreases as log(1/n)   (c) increases as n   (d) increases as n log n
[GATE 2008: 2 Marks]

Sol. Entropy H(m) of the memoryless source
H(m) = Σ Pi log₂(1/Pi) bits
Pi = probability of an individual symbol = p = 1/n
H(m) = Σ (1/n) log₂n = log₂n
Entropy H(m) increases as a function of log₂n
Option (a)

11. A source generates three symbols with probabilities 0.25, 0.25, 0.50 at a rate of 3000 symbols per second. Assuming independent generation of symbols, the most efficient source encoder would have an average bit rate of
(a) 6000 bits/sec   (b) 4500 bits/sec   (c) 3000 bits/sec   (d) 1500 bits/sec
[GATE 2006: 2 Marks]

Sol. Three symbols with probabilities 0.25, 0.25 and 0.50 at a rate of r = 3000 symbols per second.
Entropy H = 0.25 log₂(1/0.25) + 0.25 log₂(1/0.25) + 0.5 log₂(1/0.5)
  = 0.25 × 2 + 0.25 × 2 + 0.5 × 1 = 1.5 bits/symbol
Rate of information R = r·H = 3000 × 1.5 = 4500 bits/sec
Option (b)
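For Q.11 the entropy bound is attainable with a simple prefix code because the probabilities are dyadic (an added sketch; the symbol names and the code {0.50 → '0', 0.25 → '10', 0.25 → '11'} are illustrative choices, not from the original).

import math

probs = {'a': 0.50, 'b': 0.25, 'c': 0.25}
code = {'a': '0', 'b': '10', 'c': '11'}  # a prefix-free code matched to the dyadic probabilities

H = sum(p * math.log2(1 / p) for p in probs.values())  # 1.5 bits/symbol
L = sum(probs[s] * len(code[s]) for s in probs)        # average codeword length, also 1.5
print(H, L, 3000 * H)                                  # 1.5 1.5 4500.0 bits/sec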

12. An image uses 512 × 512 picture elements. Each of the picture elements can take any of the 8 distinguishable intensity levels. The maximum entropy in the above image will be
(a) 2097152 bits   (b) 786432 bits   (c) 648 bits   (d) 144 bits
[GATE 1990: 2 Marks]

Sol. For 8 distinguishable intensity levels, the maximum entropy per picture element is
log₂L = log₂8 = 3 bits
Maximum entropy of the image = 512 × 512 × 3 = 786432 bits
Option (b)

13. A source produces 4 symbols with probabilities 1/2, 1/4, 1/8 and 1/8. For this source, a practical coding scheme has an average codeword length of 2 bits/symbol. The efficiency of the code is
(a) 1   (b) 7/8   (c) 1/2   (d) 1/4
[GATE 1989: 2 Marks]

Sol. Four symbols with probabilities 1/2, 1/4, 1/8 and 1/8
Entropy H = Σ Pi log₂(1/Pi)
  = (1/2) log₂2 + (1/4) log₂4 + (1/8) log₂8 + (1/8) log₂8
  = 1/2 + 2/4 + 3/8 + 3/8 = 1 + 3/4 = 7/4 bits/symbol
Code efficiency η = H/L = (7/4)/2 = 7/8
Option (b)
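A closing added check (not from the original) of the last two answers.

import math

# Q.12: maximum entropy of a 512 x 512 image with 8 equally likely intensity levels
print(512 * 512 * math.log2(8))  # 786432.0 bits

# Q.13: code efficiency for the given source with a 2 bit/symbol average codeword length
probs = [1/2, 1/4, 1/8, 1/8]
H = sum(p * math.log2(1 / p) for p in probs)  # 7/4 bits/symbol
print(H, H / 2)                               # 1.75 and 0.875 = 7/8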