Revision of Lecture 5

- Information transfer across channels
- Channel characteristics and the binary symmetric channel
- Average mutual information

Average mutual information tells us what happens to information transmitted across the channel, i.e. it characterises the channel. But average mutual information is rather mathematical (too abstract). As an engineer, one would rather characterise the channel by physical quantities such as bandwidth, signal power and noise power (SNR). Also, given a source with information rate R, one would intuitively like to know whether the channel is capable of carrying the amount of information transferred across it. In other words, what is the channel capacity? This lecture answers this fundamental question.

A Closer Look at the Channel

[Schematic of a communication system: source, MODEM, analogue channel, sink; the discrete-time channel carries the symbol sequences {X_i}, {Y_i} (samples X(k), Y(k)), while the continuous-time (analogue) channel carries the waveforms x(t), y(t)]

Depending on which part of the system is considered, we distinguish discrete-time and continuous-time channels. We will start with the discrete-time channel and then move to the continuous-time channel. The source has an information rate, and we would like to know the capacity of the channel. Moreover, we would like to know under what condition we can achieve error-free transfer of information across the channel, i.e. no information loss. According to Shannon: information rate ≤ capacity implies error-free transfer is possible.

Review of Channel Assumptions

- No amplitude or phase distortion by the channel, and the only disturbance is due to additive white Gaussian noise (AWGN), i.e. an ideal channel
- In the simplest case, this can be modelled by a binary symmetric channel (BSC)
- The channel error probability $p_e$ of the BSC depends on the noise power $N_P$ relative to the signal power $S_P$, i.e. $\text{SNR} = S_P/N_P$; hence $p_e$ can be made arbitrarily small by increasing the signal power
- The channel noise power can be shown to be $N_P = N_0 B$, where $N_0/2$ is the power spectral density of the noise and $B$ the channel bandwidth

[Figure: AWGN power spectral density $N_0/2$ over frequency; the noise power within the channel bandwidth $B$ is $N_P = N_0 B$]

Our aim is to determine the channel capacity $C$, the maximum possible error-free information transmission rate across the channel.

Channel Capacity for Discrete Channels

Shannon's channel capacity $C$ is based on the average mutual information (the average information conveyed across the channel), and one possible definition is
$$C = \max\{I(X,Y)\} = \max\{H(Y) - H(Y|X)\} \quad \text{(bits/symbol)}$$
where $H(Y)$ is the average information per symbol at the channel output (destination entropy) and $H(Y|X)$ is the error entropy. The maximisation is over the source symbol probabilities.

Let $t_i$ be the symbol duration for $X_i$ and $t_{av}$ the average time for transmission of a symbol; the channel capacity can also be defined as
$$C = \max\{I(X,Y)/t_{av}\} \quad \text{(bits/second)}$$

$C$ reaches its maximum when $H(Y|X) = 0$ (no errors) and the symbols are equiprobable (assuming constant symbol durations $t_i$). Channel capacity can thus be expressed either in (bits/symbol) or in (bits/second).
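As an illustration of the definition above, the following minimal sketch (our own, not part of the lecture) numerically maximises $I(X,Y) = H(Y) - H(Y|X)$ over the input distribution of a binary-input channel given by its transition matrix; the function names (entropy, mutual_information, capacity_binary_input) are our own choices.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits, treating 0*log2(0) as 0."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(p_x, P):
    """I(X;Y) = H(Y) - H(Y|X), with P[i, j] = P(Y = j | X = i)."""
    H_y = entropy(p_x @ P)                                              # destination entropy H(Y)
    H_y_given_x = sum(p_x[i] * entropy(P[i]) for i in range(len(p_x)))  # error entropy H(Y|X)
    return H_y - H_y_given_x

def capacity_binary_input(P, grid=10001):
    """C = max over p(x) of I(X;Y), by brute-force grid search over P(X_0)."""
    return max(mutual_information(np.array([a, 1.0 - a]), P)
               for a in np.linspace(1e-9, 1.0 - 1e-9, grid))

# Binary symmetric channel with error probability 0.1:
# capacity 1 - H_b(0.1) ~ 0.531 bits/symbol, attained at equiprobable inputs.
pe = 0.1
P_bsc = np.array([[1 - pe, pe], [pe, 1 - pe]])
print(capacity_binary_input(P_bsc))
```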

Channel Capacity: Noise-Free Case

In the noise-free case, the error entropy $H(Y|X) = 0$ and $I(X,Y) = H(Y) = H(X)$.

The entropy of the source is given by
$$H(X) = -\sum_{i=1}^{q} P(X_i)\log_2 P(X_i) \quad \text{(bits/symbol)}$$

Let $t_i$ be the symbol duration for $X_i$; the average time for transmission of a symbol is
$$t_{av} = \sum_{i=1}^{q} P(X_i)\,t_i \quad \text{(seconds/symbol)}$$

By definition, the channel capacity is
$$C = \max\{H(X)/t_{av}\} \quad \text{(bits/second)}$$

Assuming constant symbol durations $t_i = T_s$, the maximum, i.e. the capacity, is obtained for a memoryless $q$-ary source with equiprobable symbols:
$$C = \log_2 q / T_s$$

This is the maximum achievable information transmission rate.
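A small numerical sketch (our own example, with assumed values for $q$ and $T_s$) of the quantities above: it computes $H(X)$, $t_{av}$ and $H(X)/t_{av}$ for an arbitrary source, and shows that with equiprobable symbols and constant duration $T_s$ the rate reaches $\log_2 q / T_s$.

```python
import numpy as np

def noise_free_rate(p, t):
    """Information rate H(X)/t_av in bits/second, for symbol probabilities p
    and symbol durations t (seconds)."""
    H = -np.sum(p * np.log2(p))      # source entropy (bits/symbol)
    t_av = np.sum(p * t)             # average symbol duration (s/symbol)
    return H / t_av

q, Ts = 4, 1e-3                                   # assumed 4-ary source, 1 ms per symbol
p_equi = np.full(q, 1 / q)
print(noise_free_rate(p_equi, np.full(q, Ts)))    # = log2(4)/1e-3 = 2000 bits/s
print(noise_free_rate(np.array([0.5, 0.25, 0.125, 0.125]),
                      np.full(q, Ts)))            # non-equiprobable source: 1750 bits/s
```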

Channel Capacity for BSC

Consider a BSC with equiprobable source symbols $P(X_0) = P(X_1) = 0.5$ and variable channel error probability $p_e$ (due to the symmetry of the BSC, $P(Y_0) = P(Y_1) = 0.5$).

The channel capacity $C$ (in bits/symbol) is given by
$$C = 1 + (1 - p_e)\log_2(1 - p_e) + p_e\log_2 p_e$$

[Figure: capacity $C$ plotted against the channel error probability $p_e$ on a logarithmic axis from $10^{-4}$ upwards]

If $p_e = 0.5$ (worst case), $C = 0$; if $p_e = 0$ (best case), $C = 1$.
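The closed-form expression above is easy to evaluate; the sketch below (our own illustration) reproduces the shape of the plotted curve and the two limiting cases.

```python
import numpy as np

def bsc_capacity(pe):
    """C = 1 + (1-pe)*log2(1-pe) + pe*log2(pe) in bits/symbol, with 0*log2(0) -> 0."""
    pe = np.asarray(pe, dtype=float)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = np.where(pe > 0, pe * np.log2(pe), 0.0) + \
                np.where(pe < 1, (1 - pe) * np.log2(1 - pe), 0.0)
    return 1.0 + terms

print(bsc_capacity(0.0))            # 1.0  (best case, error-free)
print(bsc_capacity(0.5))            # 0.0  (worst case, output independent of input)
for pe in (1e-4, 1e-3, 1e-2, 1e-1):
    print(pe, bsc_capacity(pe))     # capacity falls as pe grows
```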

Channel Capacity for BSC (Derivation)

[BSC diagram: inputs $X_0$, $X_1$ with $P(X_0) = P(X_1) = 1/2$; transition probabilities $P(Y_0|X_0) = P(Y_1|X_1) = 1 - p_e$ and $P(Y_1|X_0) = P(Y_0|X_1) = p_e$; outputs with $P(Y_0) = P(Y_1) = 1/2$]

Joint probabilities:
$$P(X_0,Y_0) = P(X_0)P(Y_0|X_0) = (1 - p_e)/2, \qquad P(X_0,Y_1) = P(X_0)P(Y_1|X_0) = p_e/2$$
$$P(X_1,Y_0) = p_e/2, \qquad P(X_1,Y_1) = (1 - p_e)/2$$

Average mutual information:
$$I(X,Y) = P(X_0,Y_0)\log_2\frac{P(Y_0|X_0)}{P(Y_0)} + P(X_0,Y_1)\log_2\frac{P(Y_1|X_0)}{P(Y_1)} + P(X_1,Y_0)\log_2\frac{P(Y_0|X_1)}{P(Y_0)} + P(X_1,Y_1)\log_2\frac{P(Y_1|X_1)}{P(Y_1)}$$
$$= \tfrac{1}{2}(1 - p_e)\log_2 2(1 - p_e) + \tfrac{1}{2}p_e\log_2 2p_e + \tfrac{1}{2}p_e\log_2 2p_e + \tfrac{1}{2}(1 - p_e)\log_2 2(1 - p_e)$$
$$= 1 + (1 - p_e)\log_2(1 - p_e) + p_e\log_2 p_e \quad \text{(bits/symbol)}$$
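As a quick sanity check (our own, not part of the lecture), the sketch below evaluates the four-term sum from the derivation and compares it with the closed-form BSC capacity.

```python
import numpy as np

def bsc_mutual_info_sum(pe):
    """Four-term expansion of I(X,Y) for the BSC with equiprobable inputs."""
    P_x = [0.5, 0.5]                         # input probabilities
    P_y = [0.5, 0.5]                         # output probabilities (by symmetry)
    P = [[1 - pe, pe], [pe, 1 - pe]]         # transition matrix P(Y=j | X=i)
    return sum(P_x[i] * P[i][j] * np.log2(P[i][j] / P_y[j])
               for i in range(2) for j in range(2))

def bsc_capacity(pe):
    """Closed-form C = 1 + (1-pe)*log2(1-pe) + pe*log2(pe), valid for 0 < pe < 1."""
    return 1 + (1 - pe) * np.log2(1 - pe) + pe * np.log2(pe)

pe = 0.05
print(bsc_mutual_info_sum(pe), bsc_capacity(pe))   # both ~0.7136 bits/symbol
```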

Channel Capacity and Channel Coding

Shannon's theorem: if the information rate $R \le C$, there exists a coding technique such that information can be transmitted over the channel with arbitrarily small error probability; if $R > C$, error-free transmission is impossible.

- $C$ is the maximum possible error-free information transmission rate
- Even on a noisy channel there is no obstruction to reliable transmission, only a limitation of the rate at which transmission can take place
- Shannon's theorem does not tell us how to construct such a capacity-approaching code
- Most practical channel coding schemes are far from optimal, but capacity-approaching codes exist, e.g. turbo codes and low-density parity-check codes
- Practical communication systems have long operated far from capacity, but near-capacity techniques have recently been developed, e.g. iterative turbo detection and decoding

Entropy of an Analogue Source

The entropy of a continuous-valued (analogue) source, whose output $x$ is described by the PDF $p(x)$, is defined by
$$H(x) = -\int_{-\infty}^{+\infty} p(x)\log_2 p(x)\,dx$$

According to Shannon, for a given variance this entropy attains its maximum for a Gaussian PDF $p(x)$ (the analogue of equiprobable symbols in the discrete case).

For the Gaussian PDF with zero mean and variance $\sigma_x^2$,
$$p(x) = \frac{1}{\sqrt{2\pi}\,\sigma_x}\, e^{-x^2/2\sigma_x^2}$$
the maximum entropy can be shown to be
$$H_{max}(x) = \log_2\sqrt{2\pi e}\,\sigma_x = \tfrac{1}{2}\log_2 2\pi e \sigma_x^2$$
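To make the formula concrete, the following sketch (our own check, with an assumed value of $\sigma_x$) evaluates the differential entropy integral numerically for a Gaussian PDF and compares it with $\tfrac{1}{2}\log_2 2\pi e \sigma_x^2$.

```python
import numpy as np

sigma = 2.0                                   # assumed standard deviation
x = np.linspace(-12 * sigma, 12 * sigma, 200001)
dx = x[1] - x[0]
p = np.exp(-x**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)

# H(x) = -integral of p(x)*log2 p(x) dx, approximated by a Riemann sum
H_numeric = -np.sum(p * np.log2(p)) * dx
H_closed_form = 0.5 * np.log2(2 * np.pi * np.e * sigma**2)
print(H_numeric, H_closed_form)               # both ~3.047 bits
```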

Gaussian Channel

[Diagram: source output $x$ enters the channel, AWGN $\varepsilon$ is added ($\Sigma$), and $y = x + \varepsilon$ is delivered to the sink]

- A signal with a Gaussian PDF attains maximum entropy, thus we consider the Gaussian channel
- The channel output $y$ is linked to the channel input $x$ by $y = x + \varepsilon$
- The channel AWGN $\varepsilon$ is independent of the channel input $x$ and has noise power $N_P = \sigma_\varepsilon^2$
- We assume a Gaussian channel, i.e. the channel input $x$ has a Gaussian PDF with signal power $S_P = \sigma_x^2$
- Basic linear system theory: a Gaussian signal passed through a linear operator remains Gaussian
- The channel output $y$ is therefore also Gaussian, i.e. it has a Gaussian PDF with power
$$\sigma_y^2 = S_P + N_P = \sigma_x^2 + \sigma_\varepsilon^2$$

Capacity for Continuous Channels

Thus the channel output $y$ has a Gaussian PDF
$$p(y) = \frac{1}{\sqrt{2\pi}\,\sigma_y}\, e^{-y^2/2\sigma_y^2}$$
with power $\sigma_y^2 = S_P + N_P$, and the entropy of $y$ attains the maximum value
$$H_{max}(y) = \tfrac{1}{2}\log_2 2\pi e (S_P + N_P)$$

Since $I(x,y) = H(y) - H(y|x)$, and $H(y|x) = H(\varepsilon)$ with $\varepsilon$ being AWGN,
$$H(y|x) = \tfrac{1}{2}\log_2 2\pi e N_P$$

Therefore, the average mutual information is
$$I(x,y) = \tfrac{1}{2}\log_2\left(1 + \frac{S_P}{N_P}\right) \quad \text{[bits/symbol]}$$
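The last step follows because the entropy difference collapses to a ratio of powers; the short sketch below (our own check, with assumed values of $S_P$ and $N_P$) evaluates both sides.

```python
import numpy as np

S_P, N_P = 4.0, 0.5                                    # assumed signal and noise powers

H_y = 0.5 * np.log2(2 * np.pi * np.e * (S_P + N_P))    # maximum output entropy H(y)
H_y_given_x = 0.5 * np.log2(2 * np.pi * np.e * N_P)    # noise entropy H(y|x) = H(eps)
I_closed_form = 0.5 * np.log2(1 + S_P / N_P)           # closed-form mutual information

print(H_y - H_y_given_x, I_closed_form)                # both ~1.585 bits/symbol
```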

Shannon-Hartley Law

With a sampling rate of $f_s = 2B$, the Gaussian channel capacity is given by
$$C = f_s\, I(x,y) = B\log_2\left(1 + \frac{S_P}{N_P}\right) \quad \text{(bits/second)}$$
where $B$ is the signal bandwidth.

- For digital communications, the signal bandwidth $B$ (Hz) is the channel bandwidth
- The sampling rate $f_s$ is the symbol rate (symbols/second)
- The channel noise power is $N_P = N_0 B$, where $N_0$ is the power spectral density of the channel AWGN
- The two basic resources of communication are bandwidth and signal power
- Increasing the SNR $S_P/N_P$ increases the channel capacity
- Increasing the channel bandwidth $B$ increases the channel capacity
- The Gaussian channel capacity is often a good approximation for practical digital communication channels
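A minimal numerical sketch (our own example) of the Shannon-Hartley law: the capacity of a channel with an assumed bandwidth of 3 kHz at a few SNR values.

```python
import numpy as np

def shannon_hartley(B_hz, snr_linear):
    """Gaussian channel capacity C = B*log2(1 + SNR) in bits/second."""
    return B_hz * np.log2(1.0 + snr_linear)

B = 3000.0                                    # assumed 3 kHz channel bandwidth
for snr_db in (0, 10, 20, 30):
    snr = 10 ** (snr_db / 10)                 # convert dB to linear SNR
    print(snr_db, "dB ->", shannon_hartley(B, snr), "bits/s")
# e.g. 30 dB SNR over 3 kHz gives roughly 29.9 kbit/s
```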

Bandwidth and SNR Trade-off

- From the definition of channel capacity, we can trade the channel bandwidth $B$ for the SNR or signal power $S_P$, and vice versa
- Depending on whether $B$ or $S_P$ is more precious, we can increase one and reduce the other, and yet maintain the same channel capacity
- A noiseless analogue channel ($S_P/N_P = \infty$) has an infinite capacity
- $C$ increases as $B$ increases, but it does not go to infinity as $B \to \infty$; rather, $C$ approaches an upper limit:
$$C = B\log_2\left(1 + \frac{S_P}{N_0 B}\right) = \frac{S_P}{N_0}\log_2\left(1 + \frac{S_P}{N_0 B}\right)^{N_0 B/S_P}$$
Recall that $\lim_{x\to 0}(1 + x)^{1/x} = e$. We have
$$C_\infty = \lim_{B\to\infty} C = \frac{S_P}{N_0}\log_2 e = 1.44\,\frac{S_P}{N_0}$$
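The sketch below (our own illustration, with assumed values of $S_P$ and $N_0$) shows the capacity approaching the limit $1.44\,S_P/N_0$ as the bandwidth grows.

```python
import numpy as np

S_P, N_0 = 1.0, 1e-3               # assumed signal power (W) and noise PSD (W/Hz)

def capacity(B):
    """Shannon-Hartley capacity with noise power N_P = N_0*B."""
    return B * np.log2(1 + S_P / (N_0 * B))

for B in (1e2, 1e3, 1e4, 1e5, 1e6):
    print(f"B = {B:9.0f} Hz   C = {capacity(B):8.1f} bits/s")

# upper limit as B -> infinity: S_P/N_0 * log2(e) ~ 1443 bits/s
print("limit:", S_P / N_0 * np.log2(np.e), "bits/s")
```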

Bandwidth and SNR Trade-off Example

Q: A channel has an SNR of 15. If the channel bandwidth is reduced by half, determine the increase in signal power required to maintain the same channel capacity.

A: Let $S_P$ be the original signal power and $S_P'$ the new one. Equating the two capacities and using $S_P/(N_0 B) = 15$:
$$B\log_2\left(1 + \frac{S_P}{N_0 B}\right) = \frac{B}{2}\log_2\left(1 + \frac{S_P'}{N_0 B/2}\right)$$
$$4B = \frac{B}{2}\log_2\left(1 + 30\,\frac{S_P'}{S_P}\right) \qquad \text{since } \frac{S_P'}{N_0 B/2} = 2\,\frac{S_P'}{S_P}\cdot\frac{S_P}{N_0 B} = 30\,\frac{S_P'}{S_P}$$
$$8 = \log_2\left(1 + 30\,\frac{S_P'}{S_P}\right) \;\Longrightarrow\; 256 = 1 + 30\,\frac{S_P'}{S_P} \;\Longrightarrow\; S_P' = 8.5\,S_P$$
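A quick numerical check of this worked example (our own sketch, with an assumed bandwidth and noise PSD): it solves for the new signal power over half the bandwidth and confirms that both configurations give the same capacity.

```python
import numpy as np

B, N_0 = 1.0e4, 1.0          # assumed bandwidth (Hz) and noise PSD (arbitrary units)
S_P = 15 * N_0 * B           # original power chosen so that SNR = S_P/(N_0*B) = 15

C = B * np.log2(1 + S_P / (N_0 * B))                        # original capacity

# required power over half the bandwidth: C = (B/2)*log2(1 + S_P_new/(N_0*B/2))
S_P_new = (2 ** (C / (B / 2)) - 1) * N_0 * B / 2

print(S_P_new / S_P)                                        # 8.5
print(C, (B / 2) * np.log2(1 + S_P_new / (N_0 * B / 2)))    # equal capacities
```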

Summary

- Channel capacity, a fundamental physical quantity, defines the maximum rate at which information can be transferred across a channel error-free
- It is based on the concept of the maximum achievable mutual information between channel input and output, defined either in [bits/symbol] or in [bits/second]
- Channel capacity for discrete channels, e.g. the channel capacity of the BSC
- Shannon's theorem
- Channel capacity for continuous channels
- A continuous-valued signal attains maximum entropy if its PDF is Gaussian
- Gaussian channel capacity, the Shannon-Hartley law:
$$C = B\log_2\left(1 + \frac{S_P}{N_P}\right) \quad \text{(bits/second)}$$
where $B$ is the channel bandwidth and $\text{SNR} = S_P/N_P$
- Bandwidth and signal power trade-off

Shannon's information theory provides the underlying principles for communication and information processing systems.