Basics of Information Theory: Markku Juntti. Basic concepts and tools 1 Introduction 2 Entropy, relative entropy and mutual information


Markku Juntti, Telecomm. Laboratory

Overview
The properties of bandlimited Gaussian channels are studied further; parallel Gaussian channels and Gaussian channels with feedback are solved. The material is mainly based on Sections 10.4-10.6 of the course book [1] and on Benedetto & Biglieri [3, Sect. 3.3].

Course Overview
Basic concepts and tools: 1 Introduction; 2 Entropy, relative entropy and mutual information; 3 Asymptotic equipartition property; 4 Entropy rates of a stochastic process
Source coding or data compression: 5 Data compression
Channel capacity: 8 Channel capacity; 9 Differential entropy; 10 The Gaussian channel
Other applications: 12 Maximum entropy and spectral estimation; 13 Rate distortion theory; 14 Network information theory

Outline of the Lecture
- Review of the last lecture
- Communication theoretic bounds
- Parallel Gaussian channels
- Colored Gaussian noise channels
- Gaussian channels with feedback
- Summary

Review of Last Lecture
Gaussian noise at time i: Z_i ~ N(0, N). The channel input X_i at time i is independent of Z_i, and the channel output at time i is Y_i = X_i + Z_i. This is the Gaussian channel with discrete time index and continuous amplitude; it is the most important continuous-alphabet channel. Without a constraint on the signal-to-noise ratio (SNR), the capacity would be infinite. Usually a power constraint is applied to each codeword of length n:
(1/n) sum_{i=1}^{n} x_i^2 <= P.
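The channel model above can be illustrated with a short simulation. This is a minimal Python sketch (the function names and parameter values are my own, not from the lecture): a codeword meeting the average power constraint is passed through the memoryless Gaussian channel Y_i = X_i + Z_i.

```python
import math
import random

def simulate_gaussian_channel(x, noise_var, rng):
    """Pass a codeword x through the memoryless Gaussian channel Y_i = X_i + Z_i."""
    return [xi + rng.gauss(0.0, math.sqrt(noise_var)) for xi in x]

def satisfies_power_constraint(x, power):
    """Check the average-power constraint (1/n) sum x_i^2 <= P."""
    return sum(xi * xi for xi in x) / len(x) <= power + 1e-12

rng = random.Random(0)
P, N, n = 1.0, 0.25, 10_000

# A Gaussian codebook entry: i.i.d. N(0, P) samples, rescaled so the
# empirical average power equals exactly P.
x = [rng.gauss(0.0, math.sqrt(P)) for _ in range(n)]
scale = math.sqrt(P / (sum(xi * xi for xi in x) / n))
x = [scale * xi for xi in x]

y = simulate_gaussian_channel(x, N, rng)
out_power = sum(yi * yi for yi in y) / n  # should be close to P + N
```

The empirical output power is close to P + N, as expected when the input and the noise are independent.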

Capacity of Gaussian Channel
The (operational) capacity of a Gaussian channel with power constraint P and noise variance N is
C = (1/2) log(1 + P/N).
In other words, all rates below the channel capacity C are achievable: for every rate R < C there exists a sequence of (2^{nR}, n) codes whose error probability tends to zero as n -> infinity. Converse: for any sequence of achievable (2^{nR}, n) codes, R <= C.

Band-Limited Channel Model
X(t) -> (+ Z(t)) -> h(t) -> Y(t). Assume an ideal bandlimited channel impulse response h(t) with bandwidth W and frequency response H(f). The channel output is
Y(t) = (X(t) + Z(t)) * h(t) = X(t) * h(t) + Z(t) * h(t).
In other words, both the encoded information signal and the noise are band-limited.

Capacity of Band-Limited Gaussian Channel
Recall the capacity of the Gaussian channel per one single-sample transmission:
C = (1/2) log(1 + P/N).
Capacity per sample:
C = (1/2) log(1 + P/(N0 W)).
Capacity per time unit (2W samples per second):
C = W log(1 + P/(N0 W)) bits/s.

Communication Theoretic Bounds
Assume that we have binary data, called information data symbols, to be transmitted over a bandlimited Gaussian channel, and that the information bit rate of the source is R_b [bits/s]. The energy per information bit is E_b = P/R_b, i.e., P = R_b E_b. The information data is then encoded by a forward error control (FEC) code with rate R_c = k/n to yield channel data bits transmitted over the communication channel. The encoded bit rate transmitted into the channel is R_b/R_c, where R_c is the code rate.
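The two capacity formulas above are straightforward to evaluate numerically; a minimal sketch with hypothetical helper names:

```python
import math

def gaussian_capacity(P, N):
    """Per-use capacity of the Gaussian channel: C = (1/2) log2(1 + P/N) bits."""
    return 0.5 * math.log2(1.0 + P / N)

def bandlimited_capacity(P, N0, W):
    """Capacity per time unit of the band-limited channel (2W samples/s):
    C = W log2(1 + P/(N0*W)) bits/s."""
    return W * math.log2(1.0 + P / (N0 * W))
```

For example, P/N = 3 gives exactly 1 bit per channel use, and for fixed P/N0 the band-limited capacity grows with W toward the finite limit (P/N0) log e.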

Capacity with Nyquist Bandwidth
The minimum bandwidth for reliable (intersymbol-interference-free) communication [Digital Communications] is the Nyquist bandwidth W_Nyq = R_b/(2 R_c). The capacity per time unit with P = R_b E_b becomes
C = W log(1 + P/(N0 W)) = W log(1 + R_b E_b/(N0 W)).
The capacity per single channel use with W = W_Nyq = R_b/(2 R_c) is
C = (1/2) log(1 + R_b E_b/(N0 W_Nyq)) = (1/2) log(1 + 2 R_c E_b/N0).

Rate and SNR Bounds
By Fano's inequality for the binary symmetric channel, H(X|Y) <= H(P_e). It can be shown (for details, see [3, Sect. 3.3]) that the code rate must satisfy
R_c (1 - H(P_e)) <= C.
Applying this to the bandlimited Gaussian channel, we get
R_c (1 - H(P_e)) <= (1/2) log(1 + 2 R_c E_b/N0).
By setting equality, bound curves can be drawn.

[Figures: probability-of-error bound vs. SNR per bit; required SNR per bit vs. code rate.]
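Setting equality in the bound above, E_b/N0 can be solved in closed form: E_b/N0 = (2^{2 R_c (1 - H(P_e))} - 1)/(2 R_c). A minimal sketch for computing the bound curves (the helper names are my own, and P_e is taken here as the bit error probability):

```python
import math

def binary_entropy(p):
    """H(p) = -p log2 p - (1 - p) log2 (1 - p)."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def required_ebno_db(Rc, Pe):
    """Eb/N0 [dB] at which Rc*(1 - H(Pe)) = (1/2) log2(1 + 2*Rc*Eb/N0).

    Solving the equality gives Eb/N0 = (2**(2*Rc*(1 - H(Pe))) - 1) / (2*Rc).
    """
    ebno = (2.0 ** (2.0 * Rc * (1.0 - binary_entropy(Pe))) - 1.0) / (2.0 * Rc)
    return 10.0 * math.log10(ebno)
```

For instance, a rate-1/2 code with vanishing error probability requires E_b/N0 of 0 dB, and letting R_c -> 0 recovers the -1.59 dB limit derived on the next slide.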

Minimum SNR for Reliable Communications
Taking the limiting case of infinite bandwidth (W -> infinity):
C = W log(1 + R_b E_b/(N0 W)) -> (R_b E_b/N0) log e as W -> infinity.
The famous lower bound on the SNR per bit for reliable communications follows from R_b <= C:
E_b/N0 >= 1/log e = ln 2 ≈ 0.693 ≈ -1.59 dB.

Spectral Efficiency
One can similarly derive a bound for the spectral efficiency r = R_b/W [(bits/s)/Hz]:
E_b/N0 >= (2^r - 1)/r.

Parallel Gaussian Channels
Consider k independent parallel Gaussian channels (or subchannels) with
- a common power constraint
- joint encoding and power allocation
- joint decoding
- independent additive Gaussian noise in each channel (generalized soon to the correlated-noise case).
In other words, we do not have k different communication problems, but k parallel channels to carry the same message.

Parallel Gaussian Channel Model
Channel input vector: X = (X_1, ..., X_k). Noise vector: Z = (Z_1, ..., Z_k) with Z_j ~ N(0, N_j). Channel output vector: Y = X + Z. Noise covariance: K_Z = diag(N_1, ..., N_k), i.e., the noise components are uncorrelated for the time being. The common power constraint is
E sum_{j=1}^{k} X_j^2 <= P.
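The spectral-efficiency bound is easy to evaluate; a minimal sketch (the helper name is my own):

```python
import math

def ebno_bound_db(r):
    """Minimum Eb/N0 [dB] for spectral efficiency r = Rb/W [(bits/s)/Hz]:
    Eb/N0 >= (2**r - 1)/r."""
    return 10.0 * math.log10((2.0 ** r - 1.0) / r)
```

At r = 1 (bits/s)/Hz the bound is 0 dB, and as r -> 0 it tends to ln 2, i.e., the -1.59 dB Shannon limit above.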

Mutual Information Bound for Parallel Gaussian Channels
Similarly to the single-channel case, the mutual information becomes
I(X; Y) = h(Y) - h(Y|X) = h(Y) - h(Z) <= sum_j [h(Y_j) - h(Z_j)].
Since the noise components are independent, we get
I(X; Y) <= sum_j (1/2) log(1 + P_j/N_j), where P_j = E X_j^2.

Capacity of Parallel Gaussian Channels
Equality is achieved in the mutual information bound if X ~ N(0, diag(P_1, ..., P_k)). The capacity of parallel Gaussian channels is thus
C = max_{f(x): sum_j P_j <= P} I(X; Y) = sum_j (1/2) log(1 + P_j/N_j).
The remaining problem: how to allocate the powers to the subchannels so that the above maximum is achieved? Water-filling.

Power Allocation Optimization: Problem Set-Up
Maximize C = sum_j (1/2) log(1 + P_j/N_j) subject to sum_j P_j = P. The Lagrange functional is
J(P_1, ..., P_k) = sum_j (1/2) log(1 + P_j/N_j) + lambda (sum_j P_j - P).

Power Allocation Optimization: Differentiation
Differentiate the Lagrange functional and set the derivative to zero:
dJ/dP_j = 1/(2 (P_j + N_j)) + lambda = 0,
which gives P_j = nu - N_j with the water level nu constant over the subchannels. The level nu is selected so that the power constraint is satisfied, i.e., sum_j P_j = P. Since the powers cannot be negative, the final solution follows from the Kuhn-Tucker conditions:
P_j = (nu - N_j)^+, where (x)^+ = max(x, 0).
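The Kuhn-Tucker water-filling solution can be computed by bisecting on the water level nu until the power budget is met. A minimal sketch (the implementation choices are my own, not from the lecture):

```python
import math

def water_filling(noise_vars, total_power, tol=1e-12):
    """Maximize sum_j (1/2) log2(1 + P_j/N_j) s.t. sum_j P_j = P, P_j >= 0.

    Bisect on the water level nu, with P_j = max(nu - N_j, 0).
    Returns (powers, capacity in bits per vector channel use).
    """
    lo, hi = min(noise_vars), max(noise_vars) + total_power

    def power_used(nu):
        return sum(max(nu - N, 0.0) for N in noise_vars)

    while hi - lo > tol:
        nu = 0.5 * (lo + hi)
        if power_used(nu) < total_power:
            lo = nu
        else:
            hi = nu

    nu = 0.5 * (lo + hi)
    powers = [max(nu - N, 0.0) for N in noise_vars]
    capacity = sum(0.5 * math.log2(1.0 + P / N)
                   for P, N in zip(powers, noise_vars))
    return powers, capacity
```

For noise levels (1, 2, 3) and P = 3 the water level settles at nu = 3, so the powers are (2, 1, 0): the noisiest subchannel gets nothing, exactly as in the water-filling figure.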

Power Allocation Optimization: Water-Filling Solution
P_j = (nu - N_j)^+, where (x)^+ = max(x, 0).
[Figure: water-filling over subchannels Ch #1 to Ch #4 with noise levels N_1, ..., N_4; power is poured up to the common water level nu, and a subchannel whose noise level exceeds nu gets zero power.]

Applications of Parallel Channels
Subchannels in multicarrier communications: power and data rate adaptation (often called bit loading) over the subcarriers. Spatial channels in multiantenna multiple-input multiple-output (MIMO) communications: similar power, data and phase adaptation is possible. In wireless communications, the parallel channels can be used either for increased reliability by diversity (the same data over several parallel subchannels) or for increased data rate by multiplexing (different data over different subchannels).

Colored Gaussian Noise Channels
We have so far assumed that the noise is white (uncorrelated), both temporally (bandlimited Gaussian channel) and over the subchannels (parallel Gaussian channels). The treatment can be generalized to colored (correlated) Gaussian noise in both cases. Temporal correlation (noise with memory) is modelled with a block of n consecutive uses of the channel; the parallel-channel case is modelled with parallel dependent noise processes. The same mathematical treatment applies to both cases. K_Z and K_X are the covariance matrices of the noise and the input, respectively.

Mutual Information
Power constraint: (1/n) E sum_i X_i^2 = (1/n) tr(K_X) <= P, where tr() denotes the trace of a matrix, i.e., the sum of its diagonal elements. Note that the constraint now depends on n. The mutual information is as with white noise, since the noise and the input are still independent of each other:
I(X; Y) = h(Y) - h(Y|X) = h(Y) - h(Z).
The above is maximized when Y is Gaussian, which requires X to be Gaussian.

Differential Entropy and Eigenvalue Decomposition
The covariance of the output is K_Y = K_X + K_Z. The differential entropy is
h(Y) = h(Y_1, ..., Y_n) = (1/2) log((2 pi e)^n |K_X + K_Z|).
Apply the eigenvalue decomposition K_Z = Q Lambda Q^T, where Q is orthogonal, i.e., Q Q^T = Q^T Q = I, and Lambda is a diagonal matrix with the eigenvalues of K_Z on its diagonal. Then
|K_X + K_Z| = |K_X + Q Lambda Q^T| = |Q (Q^T K_X Q + Lambda) Q^T| = |A + Lambda|,
where A = Q^T K_X Q.

Optimal Power Allocation Solution
Since tr(BC) = tr(CB) for any matrices B and C, the trace in the power constraint becomes tr(A) = tr(Q^T K_X Q) = tr(K_X Q Q^T) = tr(K_X). The problem is therefore: maximize |A + Lambda| subject to tr(A) <= nP. Apply Hadamard's inequality:
|A + Lambda| <= prod_i (A_ii + lambda_i),
with equality if and only if A is diagonal. Similarly to the water-filling above, the solution is
A_ii = (nu - lambda_i)^+.

Channels with Memory: Spectral Waterfilling
Consider a channel with colored noise, i.e., noise with memory or a non-uniform power spectral density (PSD). If the noise process is stationary, the covariance matrix is Toeplitz, its eigenvalues have a limit as n -> infinity, and their density tends to the PSD of the noise process. Water-filling is then performed in the spectral domain.

Gaussian Channels with Feedback
Z_i ~ N(0, K_Z), Y_i = X_i + Z_i, and the outputs Y_i are fed back to the transmitter of the message W: the Gaussian channel with feedback. We showed that feedback does not increase the capacity of memoryless discrete channels; the same is true for memoryless Gaussian channels. For channels with memory (noise correlated from one time instant to another), the capacity can be increased.
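The eigenvalue water-filling above can be sketched in code. To stay dependency-free, this toy version (the names and the closed-form 2x2 eigendecomposition are my own choices) computes the per-use capacity C_n for a 2x2 noise covariance K_Z:

```python
import math

def sym2x2_eigenvalues(K):
    """Eigenvalues of a symmetric 2x2 matrix [[a, b], [b, c]] in closed form."""
    (a, b), (_, c) = K
    tr, det = a + c, a * c - b * b
    disc = math.sqrt(max(tr * tr / 4.0 - det, 0.0))
    return [tr / 2.0 - disc, tr / 2.0 + disc]

def colored_noise_capacity(KZ, power):
    """C_n = (1/n) sum_i (1/2) log2(1 + A_ii/lam_i), A_ii = (nu - lam_i)^+,
    with water-filling over the eigenvalues lam_i of K_Z and sum_i A_ii <= n*P."""
    lams = sym2x2_eigenvalues(KZ)
    n = len(lams)
    budget = n * power
    lo, hi = min(lams), max(lams) + budget
    while hi - lo > 1e-12:                      # bisect on the water level nu
        nu = 0.5 * (lo + hi)
        if sum(max(nu - l, 0.0) for l in lams) < budget:
            lo = nu
        else:
            hi = nu
    nu = 0.5 * (lo + hi)
    return sum(0.5 * math.log2(1.0 + max(nu - l, 0.0) / l) for l in lams) / n
```

For white noise (K_Z = I) the result reduces to the scalar formula (1/2) log(1 + P/N); in this example, correlated noise of the same total power yields a higher capacity, since water-filling exploits the low-noise eigendirection.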

Problem Definition: Feedback Codes
The channel outputs are assumed to be available at the transmitter. A (2^{nR}, n) feedback code is a sequence of mappings x_i(W, Y^{i-1}), where each symbol x_i is a function only of the message W and the previously received values Y_1, Y_2, ..., Y_{i-1}. In addition, we have the power constraint
(1/n) E sum_i x_i^2(w, Y^{i-1}) <= P, w in {1, ..., 2^{nR}}.
Because of the feedback, X^n and Z^n depend on each other.

Key Results on Feedback Capacity
The capacity with feedback per one transmission is
C_{n,FB} = max_{tr(K_X) <= nP} (1/(2n)) log(|K_{X+Z}|/|K_Z|) <= C_n + 1/2 bits,
where the capacity without feedback is
C_n = max_{tr(K_X) <= nP} (1/(2n)) log(|K_X + K_Z|/|K_Z|).
Any achievable feedback code rate thus satisfies R <= C_{n,FB} <= C_n + 1/2.

Summary
- Band-limited Gaussian channel capacity and the resulting communication-theoretic bounds on E_b/N0.
- Parallel Gaussian channels: water-filling power allocation P_j = (nu - N_j)^+.
- Colored Gaussian noise: water-filling over the eigenvalues of K_Z, performed spectrally for stationary noise.
- Feedback increases the per-transmission capacity by at most half a bit: C_{n,FB} <= C_n + 1/2.