Source-Channel-Sink

Some questions

Source → Channel → Sink

- Source: what amount of information is available? (The source entropy.)
- Channel: generally noisy, and may be time varying; it introduces errors and limits the rate at which data can be transferred.
- Sink: how much information is received? Is it equal to the source entropy? How much is lost? What is the maximum capacity C?

In general, the channel includes the modulator, transmission medium, demodulator and channel decoder.

The source emits symbols from an alphabet of input symbols; channel impairment causes errors in the detected symbols; the sink receives symbols from the same alphabet. Example, a binary system: {0, 1} → {0, 1}. Message emitted from the source: X. Message received by the sink: Y.

Discrete Memoryless Channels

Discrete: X and Y have finite alphabets, X ∈ {x_1, …, x_M}, Y ∈ {y_1, …, y_M}.
Memoryless: the current output symbol depends only on the current input symbol and not on any of the previous symbols.

Transition probability p(y_j | x_i): the probability that symbol y_j is received at the output of the channel given that symbol x_i was sent. p(y_j | x_i) depends on the parameters of the modulator, transmission medium, noise, and demodulator.

Two-symbol source

For a two-symbol source, the model is as follows: source symbols x_0, x_1; sink symbols y_0, y_1. The paths x_0 → y_0 and x_1 → y_1 represent correct reception; the paths x_0 → y_1 and x_1 → y_0 represent erroneous reception due to noise.

Binary Symmetric Channel (BSC): the two error probabilities are equal, p(y_1 | x_0) = p(y_0 | x_1) = q, and p(y_0 | x_0) = p(y_1 | x_1) = 1 − q.

Example: a BSC with P(x_0) = P(x_1) = 0.5, probability of correct reception 0.9 and error probability 0.1:

x_0 →(0.9)→ y_0 and x_0 →(0.1)→ y_1 (error); x_1 →(0.1)→ y_0 (error) and x_1 →(0.9)→ y_1.

M-symbol source: source symbols x_1, …, x_M; sink symbols y_1, …, y_M, with every path x_i → y_j possible.

M-symbol source

Note that for a particular value of i,

Σ_{j=1}^{M} p(y_j | x_i) = 1.

With the error paths removed (a noiseless M-symbol channel), only the direct paths x_i → y_i remain.
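The channel model above can be captured directly in code. A minimal sketch (Python is my choice here, not something from the slides): a DMC is fully described by a row-stochastic transition matrix, and the row-sum property Σ_j p(y_j | x_i) = 1 is the validity check.

```python
# A DMC is fully described by its transition matrix: row i holds p(y_j | x_i).
# The BSC of the earlier example, error probability 0.1:
bsc = [[0.9, 0.1],
       [0.1, 0.9]]

def is_valid_dmc(trans, tol=1e-12):
    """Check that every row is a probability distribution over the outputs."""
    return all(
        all(p >= 0 for p in row) and abs(sum(row) - 1.0) < tol
        for row in trans
    )

print(is_valid_dmc(bsc))            # True
print(is_valid_dmc([[0.9, 0.2]]))   # False: row sums to 1.1
```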

Source entropy

The entropy of the input (the source entropy) is defined as

H(X) = −Σ_i p(x_i) log2 p(x_i) bits/symbol.

It represents the average amount of information emitted from the source.

Example (BSC, equiprobable inputs): H(X) = −[0.5 log2 0.5 + 0.5 log2 0.5] = 1 bit/symbol.

Sink entropy

The entropy of the output (the sink entropy) is defined as

H(Y) = −Σ_j p(y_j) log2 p(y_j) bits/symbol, where p(y_j) = Σ_i p(y_j | x_i) p(x_i).
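These two definitions are easy to check numerically. A short sketch (Python, assumed here; the slide values are the BSC with equiprobable inputs and error probability 0.1):

```python
import math

def entropy(probs):
    """H = -sum p * log2(p), in bits/symbol (zero-probability terms skipped)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# BSC of the slides: equiprobable inputs, error probability 0.1
px = [0.5, 0.5]
trans = [[0.9, 0.1],   # p(y_j | x_0)
         [0.1, 0.9]]   # p(y_j | x_1)

H_X = entropy(px)   # source entropy: 1 bit/symbol

# sink probabilities p(y_j) = sum_i p(y_j | x_i) p(x_i)
py = [sum(trans[i][j] * px[i] for i in range(len(px))) for j in range(2)]
H_Y = entropy(py)   # sink entropy

print(H_X, py, H_Y)
```

Both entropies come out as 1 bit/symbol, matching the slide's H(X) and H(Y).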

Example (BSC with error probability 0.1, equiprobable inputs):

p(y_0) = 0.9 × 0.5 + 0.1 × 0.5 = 0.5, p(y_1) = 0.1 × 0.5 + 0.9 × 0.5 = 0.5,
H(Y) = log2 2 = 1 bit/symbol.

Conditional entropy

The effect of noise on the symbols is to cause uncertainty in the received symbol. The amount of uncertainty is given by the conditional entropy

H(Y | x_i) = −Σ_j p(y_j | x_i) log2 p(y_j | x_i).

It measures the uncertainty at the sink if x_i was sent.

Averaging over the input symbols,

H(Y | X) = Σ_i p(x_i) H(Y | x_i) = −Σ_i Σ_j p(x_i) p(y_j | x_i) log2 p(y_j | x_i),

which measures the uncertainty about a received symbol given the transmitted symbol.

Example (BSC, error probability 0.1):

H(Y | x_0) = −[0.9 log2 0.9 + 0.1 log2 0.1] = 0.47 bits/symbol,
H(Y | x_1) = 0.47 bits/symbol,
H(Y | X) = 0.5 × 0.47 + 0.5 × 0.47 = 0.47 bits/symbol.
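The 0.47 bits/symbol figure can be reproduced directly from the definition. A small sketch (Python, my own illustration of the slide's calculation):

```python
import math

px = [0.5, 0.5]
trans = [[0.9, 0.1],
         [0.1, 0.9]]

def H_Y_given_x(row):
    """H(Y | x_i) = -sum_j p(y_j|x_i) log2 p(y_j|x_i)."""
    return -sum(t * math.log2(t) for t in row if t > 0)

per_input = [H_Y_given_x(row) for row in trans]          # 0.469 for each input
H_Y_given_X = sum(p * h for p, h in zip(px, per_input))  # average over inputs
print([round(h, 3) for h in per_input], round(H_Y_given_X, 3))
# [0.469, 0.469] 0.469  -- the slides round this to 0.47 bits/symbol
```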

Conditional entropy

It is also possible to define another conditional entropy, in terms of the backward conditional probabilities:

H(X | Y) = −Σ_j Σ_i p(y_j) p(x_i | y_j) log2 p(x_i | y_j).

It measures the uncertainty about the source if y_j was received. H(X | Y) represents how uncertain we are of X, on average, when we know Y. In other words, it represents the amount of uncertainty remaining about the channel input after the channel output has been observed.

Example: noiseless channel

When the channel is noiseless, the symbols are received without error:

p(y_j | x_i) = 1 if j = i, and 0 if j ≠ i.

Then H(Y | X) = −Σ_i Σ_j p(x_i) p(y_j | x_i) log2 p(y_j | x_i) = 0, since every term has either p(y_j | x_i) = 0 or log2 1 = 0. There is NO uncertainty about the output when the input is known, and NO information is lost.

Example: very noisy channel

When the channel is so noisy that the output is statistically independent of the input, p(y_j | x_i) = p(y_j) for all i and j. Then

H(Y | X) = −Σ_i Σ_j p(x_i) p(y_j) log2 p(y_j)
         = −[Σ_i p(x_i)] Σ_j p(y_j) log2 p(y_j)
         = −Σ_j p(y_j) log2 p(y_j)   {since Σ_i p(x_i) = 1}
         = H(Y).

For a BSC this occurs at q = 0.5: H(Y | X) = 1 bit/symbol, i.e. 1 bit of uncertainty per received bit. Therefore no information is conveyed.

Example: binary erasure channel

Inputs 0 and 1 are received correctly with probability 3/4 and erased (received as '?') with probability 1/4. Given p(x_0) = 1/3 and p(x_1) = 2/3:

H(X) = (1/3) log2 3 + (2/3) log2 (3/2) = 0.918 bits.

If y = 0 is received, then p(x_0 | y = 0) = 1 and p(x_1 | y = 0) = 0, so

H(X | y = 0) = −[1 log2 1 + 0] = 0.

Similarly, H(X | y = 1) = 0, but H(X | y = ?) = H(X). Thus, if y = 0 or y = 1 there is no uncertainty about X, but if y = ? we still have uncertainty about X.

Rate of information transmission

A discrete memoryless channel accepts symbols from an M-symbol source at a rate of r_s symbols/second, with source entropy

H(X) = −Σ_i p(x_i) log2 p(x_i) bits/symbol.

The average rate at which information is going into the channel is

D_in = r_s H(X) bits/sec.

However, some information is lost due to noise.
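The binary erasure channel example above can be checked numerically. A sketch (Python; the helper below is my own, the numbers are the slide's p(x_0) = 1/3, p(x_1) = 2/3 and erasure probability 1/4):

```python
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

px = [1/3, 2/3]   # source probabilities from the slide
e = 1/4           # erasure probability (correct reception with probability 3/4)

H_X = entropy(px)                  # ~0.918 bits

# A received 0 or 1 identifies the input exactly: H(X|y=0) = H(X|y=1) = 0.
# An erasure is equally likely for either input, so p(x | y='?') = p(x) and
# H(X | y='?') = H(X).  The equivocation is therefore
H_X_given_Y = e * entropy(px)      # P(y='?') = e for any input
print(round(H_X, 3), round(H_X_given_Y, 3))   # 0.918 0.23
```

So on average about a quarter of the source uncertainty survives at the sink, exactly the fraction of symbols that get erased.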

Example: BSC

Suppose two symbols {0, 1} are transmitted at 100 symbols/sec with p(x_0) = p(x_1) = 1/2, so D_in = 100 bits/sec. Let the channel be symmetric with probability of errorless transmission 0.95. What is the rate of transmission of information?

a. 95 bits/sec? b. > 95 bits/sec? c. < 95 bits/sec?

Example: very noisy BSC

If, for example, q = 0.5, then p(y_0) = 1/2 and p(y_1) = 1/2 irrespective of what is actually being transmitted (the very noisy condition). Any symbols received correctly are now correct due to chance alone: in fact, we could disconnect the channel and simply guess the received symbol to be either a '0' or a '1'.

Mutual information

H(Y | X) represents the amount of information in Y that one cannot rely on. Thus the amount of information at the sink, H(Y), must be reduced by this amount of uncertainty to give the true amount of information received at the sink. We define the mutual information as

I(X; Y) = H(Y) − H(Y | X).

Alternatively, I(X; Y) can be defined by noting that the information emitted by the source, H(X), is reduced by the loss of information caused by noise in the channel:

I(X; Y) = H(X) − H(X | Y).

Example: recall from the previous examples:

- Noiseless channel: H(Y | X) = 0, so I(X; Y) = H(Y) = H(X).
- Very noisy channel: H(Y | X) = H(Y), so I(X; Y) = 0.

Example: BSC

Find the rate of information transmission over this channel when the probability of correct transmission is p = 0.9, 0.8, 0.6; assume that the symbol rate is 1000 symbols/sec and p(x_0) = p(x_1) = 1/2.

Then p(y_0) = p(y_1) = 1/2 and H(Y) = 1 bit/symbol.

The average information transmission over the channel is r_s I(X; Y) bits/second: the rate of information transmission over the channel is given by the mutual information

I(X; Y) = H(Y) − H(Y | X) = H(X) − H(X | Y).

Using I(X; Y) = H(Y) − H(Y | X) with p(x_0) = p(x_1) = 1/2, so that H(Y) = 1 bit/symbol:

H(Y | X) = −[p log2 p + (1 − p) log2 (1 − p)],

I(X; Y) = 1 + p log2 p + (1 − p) log2 (1 − p) bits/symbol.

For p = 0.9, 0.8, 0.6: I(X; Y) = 0.531, 0.278, 0.029 bits/symbol, giving r_s I(X; Y) = 531, 278 and 29 bits/sec respectively.

I(X; Y) decreases rapidly as the probability of error approaches 1/2.
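The three rates can be verified with a few lines of code. A sketch (Python; the binary entropy helper H2 is mine, the values p = 0.9, 0.8, 0.6 and r_s = 1000 symbols/sec are the slide's):

```python
import math

def H2(p):
    """Binary entropy function, in bits."""
    if p <= 0 or p >= 1:
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

r_s = 1000  # symbols/sec
for p in (0.9, 0.8, 0.6):              # probability of correct transmission
    I = 1 - H2(p)                      # = 1 + p log2 p + (1-p) log2 (1-p)
    print(f"p={p}: I={I:.3f} bits/symbol, rate={r_s * I:.0f} bits/sec")
# prints 531, 278 and 29 bits/sec, matching the slide
```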

Summary

Sink symbol probabilities: p(y_j) = Σ_i p(y_j | x_i) p(x_i).
Source entropy: H(X) = −Σ_i p(x_i) log2 p(x_i) bits/symbol.
Sink entropy: H(Y) = −Σ_j p(y_j) log2 p(y_j) bits/symbol.
Conditional entropies:
H(Y | X) = −Σ_i Σ_j p(x_i) p(y_j | x_i) log2 p(y_j | x_i),
H(X | Y) = −Σ_j Σ_i p(y_j) p(x_i | y_j) log2 p(x_i | y_j), where p(x_i | y_j) = p(y_j | x_i) p(x_i) / p(y_j).

Channel capacity: Discrete Memoryless Channel (DMC)

The concept of mutual information can be summarized as follows: I(X; Y) = H(X) − H(X | Y). In practice, the transition probabilities p(y_j | x_i), i, j = 1, …, M, are fixed for a given discrete channel unless the noise characteristics are time-varying. The mutual information of a channel depends not only on the channel but also on the way in which the channel is used: the input probability distribution {p(x_i)} is independent of the channel. We can therefore maximize the mutual information of the channel with respect to {p(x_i)}.

Channel capacity

The capacity of a noisy discrete, memoryless channel is defined as the maximum possible rate of information transmission over the channel. The maximum rate of transmission occurs when the source is "matched" to the channel:

C = max over {p(x_i)} of I(X; Y) bits/symbol, subject to p(x_i) ≥ 0 for all i and Σ_i p(x_i) = 1.

Channel capacity

Note that the channel capacity is a function only of the transition probabilities p(y_j | x_i), which define the channel. The calculation of C involves maximization of the mutual information over the M variables p(x_1), …, p(x_M) subject to two constraints: p(x_i) ≥ 0 for all i, and Σ_i p(x_i) = 1.

The channel redundancy and channel efficiency are defined as

Channel redundancy = C − I(X; Y) bits/symbol, or [C − I(X; Y)] / C × 100%.
Channel efficiency = I(X; Y) / C × 100%.

Note that C is expressed in bits/symbol. If the symbol rate is τ_s symbols/s, C can be expressed in bits/s by multiplying by τ_s.

Example: BSC

Let the error probability be q: p(y_1 | x_0) = p(y_0 | x_1) = q and p(y_0 | x_0) = p(y_1 | x_1) = 1 − q. Then

H(Y | X) = −Σ_i p(x_i) [q log2 q + (1 − q) log2 (1 − q)]
         = −[q log2 q + (1 − q) log2 (1 − q)].

It depends on the channel characteristic q only, not on the input distribution. Hence

I(X; Y) = H(Y) + q log2 q + (1 − q) log2 (1 − q).

Since H(Y | X) is independent of {p(x_i)},

C = max over {p(x_i)} of I(X; Y) = max H(Y) + q log2 q + (1 − q) log2 (1 − q).

H(Y) reaches its maximum of 1 bit when p(y_0) = p(y_1) = 1/2, which occurs when p(x_0) = p(x_1) = 1/2. Thus

C = 1 + q log2 q + (1 − q) log2 (1 − q) bits/symbol.

The results show that maximum capacity is achieved when the input symbols are equiprobable.
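The claim that equiprobable inputs achieve the maximum can be checked by brute force. A sketch (Python; the grid search over p(x_0) is my own illustration, q = 0.1 is taken from the earlier BSC examples):

```python
import math

def H2(q):
    """Binary entropy function, in bits."""
    if q <= 0 or q >= 1:
        return 0.0
    return -(q * math.log2(q) + (1 - q) * math.log2(1 - q))

def bsc_capacity(q):
    """C = 1 + q log2 q + (1-q) log2 (1-q) = 1 - H2(q) bits/symbol."""
    return 1 - H2(q)

# Brute-force check that equiprobable inputs achieve the maximum (q = 0.1):
q = 0.1
def mutual_info(px0):
    py0 = px0 * (1 - q) + (1 - px0) * q   # sink symbol probability p(y_0)
    return H2(py0) - H2(q)                # I = H(Y) - H(Y|X)

best = max(mutual_info(a / 1000) for a in range(1001))
print(bsc_capacity(0.1), best)   # both ~0.531 bits/symbol
```

The scanned maximum lands exactly on the closed-form C because H(Y | X) does not depend on the input distribution, so only H(Y) is being maximized.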

[Figure: C = 1 + q log2 q + (1 − q) log2 (1 − q) plotted against q; C falls from 1 bit/symbol at q = 0 to 0 at q = 0.5.]

The BSC is a usual model which approximates the behaviour of many practical binary channels.

Other channel models

Binary Erasure Channel (BEC): each input symbol is received correctly with probability 1 − q and erased (output symbol 'e') with probability q.

Other channel models

Symmetric Erasure Channel (SEC), a combination of the BSC and the BEC: each input symbol is received correctly with probability 1 − q − r, in error with probability r, and erased (output 'e') with probability q.

Continuous channel

Modulator → continuous channel (noise added) → demodulator.

Shannon's theorem

Given a source of M equally likely messages, with M >> 1, which is generating information at a rate of R bits per second: if R ≤ C, the channel capacity, there exists a channel coding technique such that the communication system will transmit information with an arbitrarily small probability of error.

Shannon-Hartley theorem

For a white, band-limited Gaussian channel, the channel capacity is

C = B log2 (1 + S/N) bit/sec

where B is the bandwidth, S the average signal power at the output of the continuous channel, and N the average noise power at the output of the continuous channel. Note: S/N is the power ratio, not the 'dB' value.

Gaussian white noise has a flat two-sided power spectral density of η/2 watt/Hz, so over the channel bandwidth B the noise power is N = ηB watts.
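Since feeding the formula a dB value is the classic mistake the slide warns about, here is a small sketch (Python, my own) that does the conversion explicitly:

```python
import math

def shannon_hartley(B_hz, snr_db):
    """Capacity C = B log2(1 + S/N).  snr_db is converted to a linear
    power ratio first -- the formula must NOT be fed the dB value."""
    snr = 10 ** (snr_db / 10)
    return B_hz * math.log2(1 + snr)

# At high S/N, each extra 3 dB adds roughly B bits/s:
print(shannon_hartley(3000, 30))   # ~29.9 kb/s
print(shannon_hartley(3000, 33))   # ~32.9 kb/s
```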

Some special cases

Upper limit for reliable data transmission over a Gaussian channel. For example, for a telephone line with bandwidth 3 kHz and S/N = 30 dB (a power ratio of 1000):

C = 3000 log2 (1 + 1000) ≈ 30 kb/sec.

C and the S/N ratio: from C = B log2 (1 + S/N), for a fixed bandwidth we can increase the channel capacity by reducing the noise power.

C and B: with N = ηB,

C = B log2 (1 + S/(ηB)) = (S/η) (ηB/S) log2 (1 + S/(ηB)).

As B → ∞, using lim_{x→0} (1/x) log2 (1 + x) = log2 e,

C → (S/η) log2 e ≈ 1.44 S/η.

As B increases, so does the noise power N = ηB. So in the presence of noise, C reaches a finite upper limit as the bandwidth increases, for a fixed signal power.
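The finite bandwidth limit is easy to see numerically. A sketch (Python; the value S/η = 1 is an illustrative assumption of mine, not a slide value):

```python
import math

S_over_eta = 1.0          # illustrative S/eta (signal power over noise psd)

def capacity(B):
    """C = B log2(1 + S/(eta*B)), using noise power N = eta*B."""
    return B * math.log2(1 + S_over_eta / B)

limit = S_over_eta * math.log2(math.e)   # ~1.44 * S/eta
for B in (0.5, 1, 10, 100, 10_000):
    print(f"B={B}: C={capacity(B):.4f}   (limit {limit:.4f})")
```

C grows with B but creeps up on 1.4427·S/η rather than increasing without bound, which is the slide's point.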

Example

Consider C = 10,000 bit/s.

Bandwidth 3 kHz, S/N? 10000 = 3000 log2 (1 + S/N), so log2 (1 + S/N) = 3.333 and S/N = 2^3.333 − 1 ≈ 9.
Bandwidth 10 kHz, S/N? 10000 = 10000 log2 (1 + S/N), so S/N = 2 − 1 = 1.

For the same C, the bandwidth can be reduced from 10 kHz to 3 kHz if we increase S/N 9 times.

Example

Can we transmit an analogue signal of bandwidth f_m over a channel having a bandwidth less than f_m Hz? Suppose the sampling rate is 3 × the Nyquist rate (i.e. 6 f_m samples/s) and the number of quantization levels is M, so each sample carries log2 M bits. The data rate is

R = 6 f_m log2 M bits/s.

Say M = 64, so R = 36 f_m bit/s; given the channel bandwidth B, we can then work out the required S/N, provided R ≤ C.
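The bandwidth/S-N trade in the first example above can be reproduced by inverting the capacity formula. A sketch (Python, my own; C = 10,000 bit/s and the 3 kHz / 10 kHz bandwidths are the slide's):

```python
C_target = 10_000   # bit/s

def snr_needed(B_hz):
    """Invert C = B log2(1 + S/N) for the required linear S/N ratio."""
    return 2 ** (C_target / B_hz) - 1

print(snr_needed(3000))    # ~9.1: about 9x the S/N needed at 10 kHz
print(snr_needed(10_000))  # 1.0
```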

Example

Say M = 64, so R = 36 f_m bit/s. Let C = R, but with the channel bandwidth only f_m/2:

36 f_m = (f_m/2) log2 (1 + S/N)  ⇒  log2 (1 + S/N) = 72  ⇒  S/N = 2^72 − 1, i.e. S/N ≈ 217 dB.

So transmission over the narrower channel is possible in principle, but only at an impractically large S/N.
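Working the numbers for this last example (Python sketch, my own; the channel bandwidth is taken as f_m/2 as in the worked figures above):

```python
import math

f_m = 1.0                      # analogue signal bandwidth (normalised)
fs = 3 * (2 * f_m)             # sampling at 3x the Nyquist rate 2*f_m
M = 64                         # quantization levels -> log2(M) = 6 bits/sample
R = fs * math.log2(M)          # data rate: 36 f_m bit/s

B = f_m / 2                    # channel bandwidth, half the signal bandwidth
snr_linear = 2 ** (R / B) - 1  # from R = C = B log2(1 + S/N)
snr_db = 10 * math.log10(snr_linear)
print(R, round(snr_db))        # 36.0 and about 217 dB
```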