Discrete Memoryless Channels


Source → Channel → Sink. The amount of information available at the source is the source entropy. The channel is generally noisy, distorted and may be time varying; it introduces errors and limits the rate at which data can be transferred. How much information is received? How much is lost? What is the maximum capacity C? (C.1)

In general, the channel includes the modulator, transmission medium, demodulator and channel decoder.

Source → Channel → Sink: an alphabet of M input symbols; channel impairment causes errors in the detected symbol; the same alphabet of M symbols at the sink. The discrete channel is modeled by

p(x_i), i = 1, 2, 3, ..., M : the source symbol probability, i.e. the probability that the input to the channel is the symbol x_i. Therefore, we have \sum_{i=1}^{M} p(x_i) = 1. (C.2)

p(y_j | x_i), i, j = 1, 2, 3, ..., M : the probability that the j-th symbol of the alphabet is received at the output of the channel given that the i-th symbol is sent. Therefore, for each i = 1, 2, ..., M we have \sum_{j=1}^{M} p(y_j | x_i) = 1. These probabilities depend on the parameters of the modulator, transmission media, noise, and demodulator. (C.3)

Two-symbol source. For a two-symbol source, the model is as follows: Source: x_1, x_2; Sink: y_1, y_2. The paths x_1 → y_1 and x_2 → y_2 represent correct reception; the paths x_1 → y_2 and x_2 → y_1, due to noise, represent erroneous reception. (C.4)

Binary Symmetric Channel (BSC). Let p = p(y_2 | x_1) = p(y_1 | x_2) be the error probability. Assumption: the occurrence of an error during a bit interval does not affect the system during other bit intervals, i.e. we assume the channel to be memoryless — a Discrete Memoryless Channel. If p(y_2 | x_1) = p(y_1 | x_2) = p, the channel is a binary symmetric channel (BSC), i.e. p(y_1 | x_1) = p(y_2 | x_2) = 1 - p. (C.5)

M-symbol source. The model for an M-symbol source is: Source: x_1, x_2, x_3, ..., x_M; Sink: y_1, y_2, y_3, ..., y_M. (C.6)

M-symbol source. Note that for a particular value of i, \sum_j p(y_j | x_i) = 1, and \sum_{j \ne i} p(y_j | x_i) = p(error | x_i). The model might have m input symbols and n output symbols, and usually n ≥ m. (C.7)

Source entropy. The entropy of the input (the source entropy) is defined as

H(X) = -\sum_{i} p(x_i) \log_2 p(x_i)   bits/symbol

where p(x_i) = source symbol probability. (C.8)
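As a quick numerical check of this definition, a minimal Python sketch (not part of the original notes; the probability vectors are illustrative):

```python
import math

def source_entropy(p):
    """H(X) = -sum_i p(x_i) * log2 p(x_i), in bits/symbol (0*log 0 taken as 0)."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# An equiprobable binary source attains the maximum of 1 bit/symbol:
print(source_entropy([0.5, 0.5]))            # 1.0
# A skewed source carries less information per symbol:
print(round(source_entropy([2/3, 1/3]), 3))  # 0.918
```

The second value reappears later in the notes as the H(X) of Example 3.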

Sink entropy. The entropy of the output (the sink entropy) is defined as

H(Y) = -\sum_{j} p(y_j) \log_2 p(y_j)   bits/symbol

where p(y_j) = sink symbol probability (usually an unknown). (C.9)

Conditional entropy. The effect of noise on the symbols is to cause uncertainty in the received symbol. The amount of uncertainty is given by the conditional entropies: the error entropy H(Y|X) and the equivocation H(X|Y). The first is

H(Y|X) = -\sum_{i} \sum_{j} p(x_i, y_j) \log_2 p(y_j | x_i). (C.10)

Conditional entropy. H(Y|X) measures the uncertainty about a received bit based on a transmitted bit. (C.11)

Conditional entropy. It is also possible to define another conditional entropy in terms of the conditional probabilities p(x_i | y_j):

H(X|Y) = -\sum_{i} \sum_{j} p(x_i, y_j) \log_2 p(x_i | y_j). (C.12)

Conditional entropy. H(X|Y) represents how uncertain we are of X, on the average, when we know Y. In other words, it represents the amount of uncertainty remaining about the channel input after the channel output has been observed. (C.13)

Example 1. Source: x_1, x_2, x_3; Sink: y_1, y_2, y_3.

H(Y|X) = -[ p(x_1, y_1)\log_2 p(y_1|x_1) + p(x_1, y_2)\log_2 p(y_2|x_1) + p(x_1, y_3)\log_2 p(y_3|x_1)
          + p(x_2, y_1)\log_2 p(y_1|x_2) + p(x_2, y_2)\log_2 p(y_2|x_2) + p(x_2, y_3)\log_2 p(y_3|x_2)
          + p(x_3, y_1)\log_2 p(y_1|x_3) + p(x_3, y_2)\log_2 p(y_2|x_3) + p(x_3, y_3)\log_2 p(y_3|x_3) ]. (C.14)
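Both conditional entropies can be computed directly from the source probabilities and the transition matrix. A small Python sketch (the 2-input/2-output channel and its numbers are illustrative, not from the notes):

```python
import math

log2 = math.log2

# Hypothetical symmetric channel for illustration.
px = [0.5, 0.5]            # source probabilities p(x_i)
pyx = [[0.9, 0.1],         # rows are p(y_j | x_i); each row sums to 1
       [0.1, 0.9]]

# Joint probabilities p(x_i, y_j) = p(y_j | x_i) * p(x_i)
joint = [[pyx[i][j] * px[i] for j in range(2)] for i in range(2)]
# Sink probabilities p(y_j) = sum_i p(x_i, y_j)
py = [sum(joint[i][j] for i in range(2)) for j in range(2)]

# Error entropy H(Y|X) = -sum p(x_i,y_j) * log2 p(y_j|x_i)
hyx = -sum(joint[i][j] * log2(pyx[i][j])
           for i in range(2) for j in range(2) if joint[i][j] > 0)
# Equivocation H(X|Y) = -sum p(x_i,y_j) * log2 p(x_i|y_j)
hxy = -sum(joint[i][j] * log2(joint[i][j] / py[j])
           for i in range(2) for j in range(2) if joint[i][j] > 0)

print(round(hyx, 3), round(hxy, 3))  # symmetric set-up: both 0.469
```

With equiprobable inputs and a symmetric channel the two conditional entropies coincide; in general they differ.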

Example 1 (cont.).

H(Y) = -[ p(y_1)\log_2 p(y_1) + p(y_2)\log_2 p(y_2) + p(y_3)\log_2 p(y_3) ]

where p(y_j) = p(x_1, y_j) + p(x_2, y_j) + p(x_3, y_j) for j = 1, 2, 3. (C.15)

Example 2. When the channel is noiseless, the symbols are received without error:

p(y_j | x_i) = 1 if i = j, and 0 if i ≠ j.

Then H(Y|X) = 0 (since 1 · \log_2 1 = 0): there is NO uncertainty about the output when the input is known, and NO information is lost. (C.16)

Example 2 (cont.). Noiseless channel: Source: x_1, x_2; Sink: y_1, y_2 with p(y_1|x_1) = p(y_2|x_2) = 1. (C.17)

Example 2 (cont.). When the channel is so noisy that the output is statistically independent of the input,

p(y_j | x_i) = p(y_j) for all i and j,

that is, the conditional probabilities p(y_j | x_i) are equal for all i. For the binary case with equiprobable outputs, p(y_j | x_i) = 1/2. (C.18)

Example 2 (cont.).

H(Y|X) = -\sum_i \sum_j p(x_i) p(y_j) \log_2 p(y_j) = -\sum_j p(y_j)\log_2 p(y_j) \{ \sum_i p(x_i) \} = H(Y). (C.19)

Example 2 (cont.). If the channel is a BSC with p = 1/2, H(Y|X) = H(Y) = 1 bit/symbol: every received bit carries 1 bit of uncertainty. Therefore, no information is conveyed. (C.20)

Example 3. Given: p(x_1) = 2/3, p(x_2) = 1/3, and transition probabilities p(y_1|x_1) = p(y_2|x_2) = 3/4, p(y_3|x_1) = p(y_3|x_2) = 1/4.

H(X) = -(2/3)\log_2(2/3) - (1/3)\log_2(1/3) = 0.918 bits

H(X|Y) = -\sum_i \sum_j p(x_i, y_j)\log_2 p(x_i|y_j). (C.21)

Example 3 (cont.). Similarly, p(x_1|y_1) = 1 and p(x_2|y_2) = 1, but p(x_1|y_3) = 2/3 and p(x_2|y_3) = 1/3. Thus, if y_1 or y_2 is received, there is no uncertainty about X, but if y_3 is received, we have uncertainty about X. (C.22)

Rate of information transmission. A discrete memoryless channel accepts symbols from an M-symbol source at a rate of r_s symbols/second, with

H(X) = -\sum_i p(x_i)\log_2 p(x_i)   bits/symbol.

The average rate at which information is going into the channel is

D_in = r_s H(X)   bits/sec.

However, some information is lost due to noise in practice. (C.23)

Example 4. Suppose two symbols {x_1, x_2} are transmitted at 1000 symbols/sec with p(x_1) = 1/2, p(x_2) = 1/2, so D_in = 1000 bits/sec. Let the channel be symmetric with probability of errorless transmission 0.95, i.e. p(y_1|x_1) = p(y_2|x_2) = 0.95 and p(y_2|x_1) = p(y_1|x_2) = 0.05. What is the rate of transmission of information? a. 950 bits/sec? b. > 950 bits/sec? c. < 950 bits/sec? (C.24)

Example 5. For example, if p = 0.5, then p(y_1) = 1/2, p(y_2) = 1/2 irrespective of what is actually being transmitted — the very noisy condition. Now the symbols received correctly are correct due to chance alone. In fact, we could disconnect the channel path and simply guess the received symbol to be either a '0' or a '1'. (C.25)

Mutual Information. H(Y|X) represents the amount of information in Y that one cannot rely on. Thus the amount of information at the sink must be reduced by this amount of uncertainty to give the true amount of information received at the sink. We define the mutual information as

I(X;Y) = H(Y) - H(Y|X).

Alternatively, I(X;Y) can be defined by noting that the information emitted by the source, H(X), is reduced by the loss of information (the equivocation) caused by noise in the channel:

I(X;Y) = H(X) - H(X|Y). (C.26)
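Both forms of the mutual information can be computed from the same joint distribution, which also settles the Example 4 question. A Python sketch (not part of the original notes):

```python
import math

def entropy(p):
    return -sum(v * math.log2(v) for v in p if v > 0)

def mutual_information(px, pyx):
    """Return I(X;Y) computed both ways: H(Y)-H(Y|X) and H(X)-H(X|Y)."""
    n, m = len(px), len(pyx[0])
    joint = [[px[i] * pyx[i][j] for j in range(m)] for i in range(n)]
    py = [sum(joint[i][j] for i in range(n)) for j in range(m)]
    hyx = -sum(joint[i][j] * math.log2(pyx[i][j])
               for i in range(n) for j in range(m) if joint[i][j] > 0)
    hxy = -sum(joint[i][j] * math.log2(joint[i][j] / py[j])
               for i in range(n) for j in range(m) if joint[i][j] > 0)
    return entropy(py) - hyx, entropy(px) - hxy

# The Example 4 set-up: BSC with correct-reception probability 0.95.
i1, i2 = mutual_information([0.5, 0.5], [[0.95, 0.05], [0.05, 0.95]])
print(round(i1, 4), round(i2, 4))  # both 0.7136 bit/symbol
```

At 1000 symbols/sec this gives about 714 bits/sec, well below 950: answer (c). Losing 5% of the bits costs far more than 5% of the information, because the receiver does not know *which* bits are wrong.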

Example 6. Recall from Example 2:

Noiseless channel: H(X|Y) = 0, so I(X;Y) = H(X); or H(Y|X) = 0, so I(X;Y) = H(Y).
Very noisy channel: H(X|Y) = H(X), so I(X;Y) = 0; or H(Y|X) = H(Y), so I(X;Y) = 0. (C.27)

Example 7. A binary symmetric channel is shown below. Find the rate of information transmission over this channel when the probability of correct reception is q = 0.9, 0.8 and 0.6; assume that the symbol rate (bit rate in this case) is 1000 symbols/sec and that p(x_1) = p(x_2) = 1/2, so

H(X) = -(1/2)\log_2(1/2) - (1/2)\log_2(1/2) = 1 bit/symbol. (C.28)

Example 7 (cont.). The average rate at which information enters the channel is r_s H(X) = 1000 bits/sec. The rate of information transmission over the channel is given by the mutual information I(X;Y):

rate = r_s I(X;Y), where I(X;Y) = H(Y) - H(Y|X) or I(X;Y) = H(X) - H(X|Y). (C.29)

Example 7 (cont.). Using I(X;Y) = H(Y) - H(Y|X) with p(x_1) = p(x_2) = 1/2:

H(Y) = -[ p(y_1)\log_2 p(y_1) + p(y_2)\log_2 p(y_2) ]

H(Y|X) = -[ p(x_1,y_1)\log_2 p(y_1|x_1) + p(x_1,y_2)\log_2 p(y_2|x_1) + p(x_2,y_1)\log_2 p(y_1|x_2) + p(x_2,y_2)\log_2 p(y_2|x_2) ]. (C.30)

Example 7 (cont.).

q           0.9     0.8     0.6
I(X;Y)      0.531   0.278   0.029   bits/symbol
r_s I(X;Y)  531     278     29      bits/sec

r_s I(X;Y) decreases rapidly as the probability of error approaches 1/2. Note that the data rate r_s = 1000 bits/sec is not the same as the information rate r_s I(X;Y). (C.31)

Summary.

Source entropy: H(X) = -\sum_i p(x_i)\log_2 p(x_i)   bits/symbol
Sink entropy:   H(Y) = -\sum_j p(y_j)\log_2 p(y_j)   bits/symbol
Error entropy (conditional entropy): H(Y|X) = -\sum_i \sum_j p(x_i,y_j)\log_2 p(y_j|x_i), where p(x_i,y_j) = p(y_j|x_i) p(x_i). (C.32)
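The tabulated Example 7 figures can be reproduced with a few lines of Python (a check, not part of the original notes):

```python
import math

def bsc_info_rate(q, symbol_rate=1000):
    """r_s * I(X;Y) for a BSC with equiprobable inputs: I = 1 - H(p), p = 1 - q."""
    p = 1 - q
    h = -(p * math.log2(p) + q * math.log2(q))  # binary entropy of the error prob.
    return symbol_rate * (1 - h)

for q in (0.9, 0.8, 0.6):
    print(q, round(bsc_info_rate(q)))  # 531, 278 and 29 bits/sec
```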

where p(x_i) are the source symbol probabilities, p(y_j) = \sum_i p(x_i, y_j) are the sink symbol probabilities, and the equivocation is

H(X|Y) = -\sum_i \sum_j p(x_i, y_j)\log_2 p(x_i|y_j). (C.33)

Channel Capacity: Discrete Memoryless Channel (DMC). The concept of mutual information can be summarized as follows:

I(X;Y) = H(X) - H(X|Y).

In practice, the transition probabilities p(y_j|x_i), i, j = 1, 2, ..., M, are fixed for a given discrete channel unless the noise characteristics are time-varying. The mutual information of a channel depends not only on the channel but also on the way in which the channel is used: the input probability distribution {p(x_i)} is independent of the channel. We can then maximize the mutual information of the channel with respect to {p(x_i)}. (C.34)

Channel capacity. The capacity of a noisy discrete, memoryless channel is defined as the maximum possible rate of information transmission over the channel. The maximum rate of transmission occurs when the source is "matched" to the channel:

C = \max_{\{p(x_i)\}} I(X;Y)   bits/symbol.

Note that the channel capacity is a function only of the transition probabilities p(y_j|x_i), which define the channel. The calculation of C involves maximization of the mutual information over the M variables {p(x_i)} subject to two constraints:

p(x_i) ≥ 0 for all i, and \sum_i p(x_i) = 1. (C.35)

In general, I(X;Y) is a function of both the source and the noise statistics. The channel redundancy and channel efficiency are defined as

Channel redundancy = C - I(X;Y)   bits/symbol
Redundancy % = (C - I(X;Y)) / C × 100%
Channel efficiency % = I(X;Y) / C × 100%.

Note that C is expressed in bits/symbol. If the symbol rate is τ_s symbols/s, C can be expressed in bits/s by multiplying by τ_s. (C.36)
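For a binary-input channel the constrained maximization has a single free variable, α = p(x_1), so capacity can be estimated by a brute-force grid search. A Python sketch (the asymmetric transition matrix is hypothetical, chosen so the maximizing input is *not* equiprobable):

```python
import math

def mutual_info(alpha, pyx):
    """I(X;Y) = H(Y) - H(Y|X) for a binary-input channel with p(x1) = alpha."""
    px = [alpha, 1 - alpha]
    m = len(pyx[0])
    joint = [[px[i] * pyx[i][j] for j in range(m)] for i in range(2)]
    py = [sum(joint[i][j] for i in range(2)) for j in range(m)]
    hy = -sum(v * math.log2(v) for v in py if v > 0)
    hyx = -sum(joint[i][j] * math.log2(pyx[i][j])
               for i in range(2) for j in range(m) if joint[i][j] > 0)
    return hy - hyx

# Grid search over alpha in (0, 1); finer grids or the Arimoto-Blahut
# algorithm would give sharper estimates.
pyx = [[0.9, 0.1],
       [0.2, 0.8]]
cap, best_alpha = max((mutual_info(a / 1000, pyx), a / 1000)
                      for a in range(1, 1000))
print(round(cap, 4), best_alpha)
```

For a symmetric channel the search lands on α = 1/2, matching the closed-form BSC result derived below in Example 8.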

Example 8: Binary Symmetric Channel. The probability of a symbol being received correctly is q — the same for each symbol. The probability of a symbol being received incorrectly is p = 1 - q — the same for each symbol. (C.37)

Let p(x_1) = α and p(x_2) = 1 - α. Then

p(y_1) = αq + (1 - α)p,  p(y_2) = αp + (1 - α)q

H(Y|X) = -\sum_i \sum_j p(x_i, y_j)\log_2 p(y_j|x_i) = -q\log_2 q - p\log_2 p

I(X;Y) = H(Y) - H(Y|X) = H(Y) + q\log_2 q + p\log_2 p. (C.38)

I(X;Y) = H(Y) + q\log_2 q + p\log_2 p, where H(Y) = -p(y_1)\log_2 p(y_1) - p(y_2)\log_2 p(y_2). (C.39)

Thus

C = \max_{\{p(x_i)\}} I(X;Y) = \max_{\{p(x_i)\}} H(Y) + q\log_2 q + p\log_2 p

since the term q\log_2 q + p\log_2 p is independent of the input distribution. H(Y) is maximum (= 1 bit) when p(y_1) = p(y_2) = 1/2, which occurs when α = 1 - α = 1/2. Then

C = 1 + q\log_2 q + p\log_2 p = 1 + p\log_2 p + (1 - p)\log_2(1 - p)   bits/symbol.

The results show that maximum capacity is achieved when the two input symbols are equiprobable. (C.40)
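The closed-form BSC capacity is easy to evaluate; a Python sketch (not part of the original notes) that traces the curve plotted on the next slide:

```python
import math

def bsc_capacity(p):
    """C = 1 + p*log2(p) + (1-p)*log2(1-p) bits/symbol for a BSC with error prob p."""
    if p in (0.0, 1.0):
        return 1.0  # limit value: 0*log2(0) is taken as 0
    return 1 + p * math.log2(p) + (1 - p) * math.log2(1 - p)

for p in (0.0, 0.05, 0.1, 0.5):
    print(p, round(bsc_capacity(p), 3))  # 1.0, 0.714, 0.531, 0.0
```

Capacity falls from 1 bit/symbol at p = 0 to zero at p = 1/2, where the output is independent of the input.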

[Plot: BSC capacity C (bits/symbol, vertical axis 0 to 1) versus error probability p (horizontal axis 0 to 0.5), falling from C = 1 at p = 0 to C = 0 at p = 0.5.]

The BSC is a usual model which approximates the behaviour of many practical binary channels. (C.41)

Other channel models. Binary Erasure Channel (BEC): each input symbol is received correctly with probability q, and with probability 1 - q the output is the erasure symbol e. (C.42)

Other channel models. Symmetric Erasure Channel (SEC), a combination of BSC and BEC: each symbol is received correctly with probability q, in error with probability r, and erased (output e) with probability 1 - q - r. (C.43)

Continuous Channel. [Modulator → continuous channel → demodulator.]

Shannon's Theorem. Given a source of M equally likely messages, with M >> 1, which is generating information at a rate of R bits per second: if R ≤ C, the channel capacity, there exists a channel coding technique such that the communication system will transmit information with an arbitrarily small probability of error. (C.44)

Shannon-Hartley Theorem. For a white, bandlimited Gaussian channel, the channel capacity is

C = B \log_2(1 + S/N)   bits/sec.

Note: S/N is the power ratio, not the 'dB' value. S = average signal power at the output of the continuous channel; N = average noise power at the output of the continuous channel. (C.45)

Shannon-Hartley Theorem (cont.). The noise power is

N = \int_{-B}^{B} (η/2) \, df = ηB   Watts

where η/2 is the two-sided power spectral density (psd) of the noise in Watt/Hz and B is the channel bandwidth. [Figure: flat noise psd of amplitude η/2 over the frequency range -B to B.] (C.46)

Some special cases. C = B\log_2(1 + S/N) gives the upper limit for reliable data transmission over a Gaussian channel. For example, for a telephone line with bandwidth 3 kHz and S/N = 30 dB (a power ratio of 1000):

C = 3000 \log_2(1 + 1000) ≈ 30 kb/sec.

Exchange of S/N for bandwidth B: as N → 0, S/N → ∞ and C → ∞; therefore with no noise the capacity is unlimited. (C.47)

As B → ∞, does C → ∞? No. As B increases so does the noise power N = ηB. In the presence of noise, C reaches a finite upper limit as bandwidth increases, for a fixed signal power:

C = B \log_2\!\left(1 + \frac{S}{ηB}\right) = \frac{S}{η}\,\frac{ηB}{S}\log_2\!\left(1 + \frac{S}{ηB}\right) = \frac{S}{η}\log_2\!\left(1 + \frac{S}{ηB}\right)^{ηB/S}. (C.48)
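Both the telephone-line figure and the finite large-bandwidth limit can be checked numerically; a Python sketch (not part of the original notes):

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """C = B * log2(1 + S/N) bits/sec; S/N is a power ratio, not a dB value."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Telephone-line figure from the notes: B = 3 kHz, S/N = 30 dB = 10**3 = 1000.
print(round(shannon_capacity(3000, 10 ** (30 / 10))))  # 29902 (~30 kb/s)

# Fixed S and noise density eta: C approaches (S/eta)*log2(e) as B grows.
S, eta = 1.0, 1.0
for B in (10, 100, 10_000):
    print(B, round(shannon_capacity(B, S / (eta * B)), 4))
print(round((S / eta) * math.log2(math.e), 4))  # limiting value 1.4427
```

The loop shows C creeping up toward, but never exceeding, 1.44 S/η as bandwidth grows.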

Since \lim_{x→0}(1 + x)^{1/x} = e,

\lim_{B→∞} C = \frac{S}{η}\log_2 e ≈ 1.44\,\frac{S}{η}. (C.49)

Example 9. Consider C = 10 kbit/s.

Bandwidth 3000 Hz, S/N? 10000 = 3000\log_2(1 + S/N), so S/N = 2^{3.333} - 1 ≈ 9.
Bandwidth 10000 Hz, S/N? 10000 = 10000\log_2(1 + S/N), so S/N = 1.

For the same C, the bandwidth can be reduced from 10 kHz to 3 kHz if we increase S/N 9 times. (C.50)
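The Example 9 trade-off follows from inverting the capacity formula for S/N; a Python check (not part of the original notes):

```python
import math

def required_snr(c_bits_per_sec, bandwidth_hz):
    """Invert C = B*log2(1 + S/N) for the S/N power ratio."""
    return 2 ** (c_bits_per_sec / bandwidth_hz) - 1

# Example 9: the same C = 10 kbit/s over 3 kHz versus 10 kHz of bandwidth.
print(round(required_snr(10_000, 3_000), 2))   # 9.08
print(round(required_snr(10_000, 10_000), 2))  # 1.0
```

Halving-and-more of the bandwidth costs roughly a ninefold increase in signal-to-noise ratio, because S/N enters the formula only through a logarithm.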

Example 10. If S/N = 7 and B = 4 kHz, C = 4000\log_2(1 + 7) = 12 kbit/s. If S/N = 15 and B = 3 kHz, C = 3000\log_2(1 + 15) = 12 kbit/s.

With a 3 kHz bandwidth, the noise power ηB will be 3/4 as large as with 4 kHz. Since N_3 = (3/4)N_4, S_3/N_3 = 15 and S_4/N_4 = 7:

\frac{S_3}{S_4} = \frac{15 × 3}{7 × 4} = \frac{45}{28} ≈ 1.6.

The signal power is increased by 1.6 times to give the same capacity when the bandwidth is reduced by 25%, from 4 to 3 kHz. (C.51)

Eale Sa M 64, R 36 f bt/s Let C R but the channel bandwdth s f / 36 f 7 log S N f log 7 S N S N 7dB C.53