Channel capacity
Exercise session 11

Outline:
1. Source entropy
2. Discrete memoryless channel
3. Mutual information
4. Channel capacity
5. Exercises

1. Source entropy

Let X be a memoryless symbol source. The source alphabet contains J different symbols x_0, x_1, ..., x_{J-1}, each emitted with probability p(x_0), p(x_1), ..., p(x_{J-1}), with

    \sum_{j=0}^{J-1} p(x_j) = 1

To each symbol we associate its self-information:

    i(x_j) = -\log_2 p(x_j)

The source entropy is then defined by

    H(X) = -\sum_{j=0}^{J-1} p(x_j) \log_2 p(x_j)

expressed in bit/symbol: it is the average information per symbol.

ENTROPY = UNCERTAINTY = INFORMATION
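To make the definition concrete, here is a minimal Python sketch (not part of the original slides; the function name entropy is ours) that evaluates H(X) for a list of symbol probabilities:

import math

def entropy(probs):
    """Source entropy H(X) = -sum_j p(x_j) log2 p(x_j), in bit/symbol."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Example: a 4-symbol source; the uniform source reaches the maximum log2(J) = 2 bit/symbol.
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bit/symbol
print(entropy([0.25] * 4))                 # 2.0 bit/symbol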

2. Discrete memoryless channel

[Figure: channel diagram with inputs x_0, x_1, ..., x_{J-1} (X) on the left, outputs y_0, y_1, ..., y_{K-1} (Y) on the right, and the channel in between, characterised by p(y_k | x_j).]

Because of the noise on the channel, the source and destination alphabets may differ.

The p(y_k | x_j) are the transition probabilities.
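As an illustration (ours, not from the slides), a discrete memoryless channel can be stored as a J x K matrix of transition probabilities whose rows sum to 1; the output distribution then follows from p(y_k) = \sum_j p(x_j) p(y_k | x_j):

def output_distribution(p_x, channel):
    """p(y) for input probabilities p_x and transition matrix channel[j][k] = p(y_k | x_j)."""
    J, K = len(p_x), len(channel[0])
    return [sum(p_x[j] * channel[j][k] for j in range(J)) for k in range(K)]

# Example: binary symmetric channel with error probability 0.1 and uniform input.
bsc = [[0.9, 0.1],
       [0.1, 0.9]]
print(output_distribution([0.5, 0.5], bsc))  # [0.5, 0.5]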

3. Mutual information

We observe Y = y_k. Which uncertainty remains on X? We define the entropy of X conditioned on Y = y_k:

    H(X | Y = y_k) = -\sum_{j=0}^{J-1} p(x_j | y_k) \log_2 p(x_j | y_k)

Averaging over Y:

    H(X | Y) = \sum_{k=0}^{K-1} p(y_k) H(X | Y = y_k)
             = -\sum_{k=0}^{K-1} \sum_{j=0}^{J-1} p(y_k) p(x_j | y_k) \log_2 p(x_j | y_k)
             = -\sum_{k=0}^{K-1} \sum_{j=0}^{J-1} p(x_j, y_k) \log_2 p(x_j | y_k)

The average mutual information is defined by

    I(X;Y) = H(X) - H(X | Y)
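These definitions translate directly into a short computation; the sketch below (our own, assuming the channel-matrix convention of the previous snippet) evaluates H(X), H(X|Y) and I(X;Y):

import math

def mutual_information(p_x, channel):
    """I(X;Y) in bits, for input probabilities p_x and channel[j][k] = p(y_k | x_j)."""
    J, K = len(p_x), len(channel[0])
    p_y = [sum(p_x[j] * channel[j][k] for j in range(J)) for k in range(K)]
    h_x = -sum(p * math.log2(p) for p in p_x if p > 0)
    h_x_given_y = 0.0
    for j in range(J):
        for k in range(K):
            p_joint = p_x[j] * channel[j][k]                               # p(x_j, y_k)
            if p_joint > 0:
                h_x_given_y -= p_joint * math.log2(p_joint / p_y[k])       # log2 p(x_j | y_k)
    return h_x - h_x_given_y

# Noiseless channel: I(X;Y) = H(X) = 1 bit; completely noisy channel: I(X;Y) = 0.
print(mutual_information([0.5, 0.5], [[1, 0], [0, 1]]))          # 1.0
print(mutual_information([0.5, 0.5], [[0.5, 0.5], [0.5, 0.5]]))  # 0.0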

    I(X;Y) = H(X) - H(X | Y)

Two particular cases:

1. Noiseless channel: H(X | Y) = 0, so I(X;Y) = H(X); the channel conveys all the source information.

2. Very noisy channel: H(X | Y) = H(X), so I(X;Y) = 0; the channel does not convey any useful information.

Remark: the mutual information is symmetric, I(X;Y) = I(Y;X).

4. Channel capacity

Definition:

    C_s = \max_{p(x_j)} I(X;Y)

expressed in bit/symbol. If s is the symbol transmission rate (symbol/s), then C = s C_s is the channel capacity in bit/s.

Binary symmetric channel case

[Figure: BSC diagram. Input x_0 = 0 with probability p(x_0) = 1 - \alpha and x_1 = 1 with probability p(x_1) = \alpha; outputs y_0 = 0 and y_1 = 1. Correct transitions occur with probability 1 - p_e, crossed transitions with probability p_e. Here J = K = 2.]

The mutual information is given by

    I(X;Y) = H(Y) - H(Y | X)
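The maximisation over the input probabilities can be checked numerically; the rough sketch below (our own) sweeps \alpha = p(x_1) for the binary symmetric channel and keeps the largest mutual information, reusing mutual_information() from the previous snippet:

def bsc_capacity_numerical(p_e, steps=1000):
    """Brute-force estimate of C_s = max over p(x_j) of I(X;Y) for a BSC."""
    channel = [[1 - p_e, p_e], [p_e, 1 - p_e]]
    return max(mutual_information([1 - i / steps, i / steps], channel)
               for i in range(1, steps))

# For p_e = 0.1 the maximum is reached at alpha = 0.5 and is about 0.531 bit/symbol.
print(bsc_capacity_numerical(0.1))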

Computation of H(Y | X):

    H(Y | X) = -\sum_{k=0}^{1} \sum_{j=0}^{1} p(x_j) p(y_k | x_j) \log_2 p(y_k | x_j)
             = -(1 - \alpha)(1 - p_e) \log_2(1 - p_e) - (1 - \alpha) p_e \log_2 p_e
               - \alpha (1 - p_e) \log_2(1 - p_e) - \alpha p_e \log_2 p_e
             = -(1 - p_e) \log_2(1 - p_e) - p_e \log_2 p_e

It is independent of the p(x_j) and may be considered as a channel entropy. Therefore,

    I(X;Y) = H(Y) + (1 - p_e) \log_2(1 - p_e) + p_e \log_2 p_e

and, since H(Y) reaches its maximum of 1 bit for equiprobable inputs,

    C_s = \max_{p(x_j)} I(X;Y) = 1 + (1 - p_e) \log_2(1 - p_e) + p_e \log_2 p_e

NRZ baseband transmission case:

    p_e = \frac{1}{2} \mathrm{erfc}\left(\sqrt{\frac{E_b}{N_0}}\right)
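As a small numerical check (ours, not part of the slides), the closed-form capacity can be evaluated together with the NRZ error probability p_e = (1/2) erfc(sqrt(E_b/N_0)):

import math

def bsc_capacity(p_e):
    """C_s = 1 + (1 - p_e) log2(1 - p_e) + p_e log2(p_e), in bit/symbol."""
    if p_e in (0.0, 1.0):
        return 1.0
    return 1 + (1 - p_e) * math.log2(1 - p_e) + p_e * math.log2(p_e)

def p_e_nrz(eb_n0_db):
    """Bit error probability of bipolar NRZ at a given Eb/N0 in dB."""
    return 0.5 * math.erfc(math.sqrt(10 ** (eb_n0_db / 10)))

for snr_db in (0, 5, 10):
    pe = p_e_nrz(snr_db)
    print(f"Eb/N0 = {snr_db} dB  ->  p_e = {pe:.2e},  C_s = {bsc_capacity(pe):.4f} bit/symbol")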

[Figure: C_s (bit/symbol) versus E_b/N_0 (dB) for modulations with 2, 4, 8 and 16 states.]

Shannon Theorem

Continuous input and output alphabets. Example:

    Y = X + N(0, \sigma_N^2)

Then

    C_s = \frac{1}{2} \log_2\left(1 + \frac{\sigma_X^2}{\sigma_N^2}\right)  [bit/symbol]

where \sigma_X^2 is the input power. If the channel bandwidth is equal to B, its capacity is given by

    C = B \log_2\left(1 + \frac{\sigma_X^2}{\sigma_N^2}\right)  [bit/second]

(the Shannon-Hartley relation).
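The Shannon-Hartley relation is easy to evaluate; the sketch below (ours) computes C for an AWGN channel of bandwidth B, signal power \sigma_X^2 and noise power \sigma_N^2 = N_0 B:

import math

def awgn_capacity(bandwidth_hz, signal_power_w, n0_w_per_hz):
    """C = B log2(1 + S/N) with N = N0 * B, in bit/s."""
    noise_power = n0_w_per_hz * bandwidth_hz
    return bandwidth_hz * math.log2(1 + signal_power_w / noise_power)

# Example with the data of exercise 3 below: B = 4 kHz, S = 0.1 mW, N0/2 = 1e-12 W/Hz.
print(awgn_capacity(4e3, 1e-4, 2e-12))  # about 54.4e3 bit/s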

Information rate:

    R = s H(X)

If R < C, we can find a source encoding and a channel encoding that allow a transmission with an arbitrarily small error probability.

5. Exercises

1. Determine the capacity of the discrete channel (a binary erasure channel with erasure symbol 2) whose transition probabilities are

       p(y = 0 | x = 0) = 1 - p,   p(y = 2 | x = 0) = p,
       p(y = 1 | x = 1) = 1 - p,   p(y = 2 | x = 1) = p.

2. Two binary symmetric transmission channels with error probability p are cascaded. Determine the capacity of the overall channel.

3. We consider a channel with additive white Gaussian noise, whose bandwidth is 4 kHz and whose noise power spectral density is N_0/2 = 10^{-12} W/Hz. The signal power available at the receiver is 0.1 mW. Compute the channel capacity.

4. An analog signal with a bandwidth of 4 kHz is sampled at 1.25 times the Nyquist frequency, and each sample is quantized into 256 equiprobable levels. We assume that the samples are statistically independent.

   (a) What is the source information rate?
   (b) Is it possible to transmit the signals from this source without errors over a channel with additive white Gaussian noise, a bandwidth of 10 kHz and a signal-to-noise ratio of 20 dB?
   (c) Compute the signal-to-noise ratio required to ensure an error-free transmission under the conditions stated in (b).
   (d) Compute the bandwidth required to transmit the signals from the same source without errors through a channel with additive white Gaussian noise and a signal-to-noise ratio of 20 dB.

5. We want to design a transmission system for packets of 1500 bytes. We impose the use of a two-state digital phase modulation (PSK-2) and require that 99 % of the packets arrive entirely correct at the receiver (i.e. the packet error rate must be smaller than 1 %).

   (a) If the noise power spectral density N_0/2 is 10^{-2} W/Hz, what is the required energy per bit E_b?
   (b) Determine the maximum theoretical value of the channel capacity.
   (c) Determine the actual value of the channel capacity under the conditions of this question.

   Remark: error probability for a bipolar NRZ signal: p_e = \frac{1}{2} \mathrm{erfc}\left(\sqrt{\frac{E_b}{N_0}}\right).

Answers

1. C_s = 1 - p.
2. C_s = 1 + 2p(1 - p) \log_2[2p(1 - p)] + (1 - 2p + 2p^2) \log_2(1 - 2p + 2p^2).
3. C = 54.44 kb/s.
4. (a) 80 kb/s.
   (b) C = 66.6 kb/s; an error-free transmission is not possible.
   (c) 24.1 dB.
   (d) 12 kHz.
5. (a) E_b = 0.252 J.
   (b) C_{s,max} = 1.88 bit/symbol.
   (c) C_s = 0.919 bit/symbol.
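For completeness, a short numerical check of answers 3 and 4 (our own sketch, reusing the Shannon-Hartley relation):

import math

def awgn_capacity(b_hz, s_w, n0):
    return b_hz * math.log2(1 + s_w / (n0 * b_hz))

# Exercise 3: B = 4 kHz, N0/2 = 1e-12 W/Hz, S = 0.1 mW  ->  about 54.44 kb/s.
print(awgn_capacity(4e3, 1e-4, 2e-12))

# Exercise 4(a): sampling at 1.25 * 2 * 4 kHz = 10 kHz, 256 equiprobable levels = 8 bit/sample.
R = 1.25 * 2 * 4e3 * math.log2(256)   # 80 kb/s
print(R)

# Exercise 4(b): B = 10 kHz, S/N = 20 dB = 100  ->  C = 66.6 kb/s < R: error-free transmission impossible.
print(10e3 * math.log2(1 + 100))

# Exercise 4(c): S/N needed so that C = R over 10 kHz: 2**(R/B) - 1 = 255, i.e. about 24.1 dB.
print(10 * math.log10(2 ** (R / 10e3) - 1))

# Exercise 4(d): bandwidth needed at S/N = 20 dB so that C = R: about 12 kHz.
print(R / math.log2(1 + 100))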