A Mathematical Theory of Communication. Claude Shannon's paper, presented by Kate Jenkins, 2/19/00


Transcription:

A Mathematical Theory of Communication. Claude Shannon's paper, presented by Kate Jenkins, 2/19/00

Published in two parts, July 1948 and October 1948, in the Bell System Technical Journal. Founding paper of Information Theory. First person to use a probabilistic model of communication. Developed around the same time as Coding Theory. Huge impact: now the mathematical theory of communication; follow-on papers; the idea that all information is essentially digital; telecommunications, CD players, computer networks; applications to biology, artificial intelligence...

Questions: How much information is produced by a source? (info/symbol or info/sec) How quickly can information be transmitted through a channel? (info/sec) What is the best achievable transmission rate (source symbols/sec)? If the channel has noise, under what conditions can the sent message be reconstructed from the received message?

What is information? Acquiring information = reducing uncertainty; the amount of information is the level of surprise. Let {s_1, ..., s_n} be the set of all possible events, with p_i the probability that s_i occurs. Define Information(s_i) = log2(1/p_i) bits. Example: over {0,1} with p_0 = p_1 = 1/2, Info(0) = Info(1) = 1 bit; a string s in {0,1}^N of independent fair bits has Info(s) = N bits. With p_0 = 1/16 and p_1 = 15/16, Info(0) = 4 bits while Info(1) = log2(16/15) ≈ 0.09 bits, nearly 0.
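The surprise measure on this slide, Info(s) = log2(1/p), is easy to check numerically. A minimal sketch (the function name `self_information` is my own, not from the slides):

```python
import math

def self_information(p):
    """Information (surprise) of an event with probability p, in bits."""
    return math.log2(1 / p)

# Fair bit: each outcome carries 1 bit.
assert self_information(1 / 2) == 1.0
# A string of N = 16 independent fair bits carries 16 bits.
assert self_information((1 / 2) ** 16) == 16.0
# Skewed source: the rare symbol (p = 1/16) carries 4 bits,
# the common one (p = 15/16) carries almost none.
print(self_information(1 / 16))              # 4.0
print(round(self_information(15 / 16), 3))   # ~0.093
```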

Channel capacity, measured in bits/sec: C = lim_{T→∞} log2 N(T) / T, where N(T) is the number of allowed signals of duration T. Example: a digital channel in which all {0,1} sequences are allowed at r symbols/sec produces N(T) = 2^{rT}, so C = r bits/sec. This definition also allows more complicated channel structures: varying time per symbol, or restrictions on the allowed sequences of symbols.
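The counting definition C = lim log2 N(T) / T can be evaluated directly. A sketch under my own illustrative setup (the constrained "no two 1s in a row" channel is a standard example, not one from the slides; its N(T) grows like the Fibonacci numbers):

```python
import math

def capacity_estimate(count_signals, T):
    """Estimate C = lim_{T->inf} log2(N(T)) / T from a signal-counting function."""
    return math.log2(count_signals(T)) / T

# Unconstrained digital channel: r symbols/sec, all {0,1} sequences allowed,
# so N(T) = 2^(rT) and the estimate recovers C = r exactly.
r = 3
n_binary = lambda T: 2 ** (r * T)
print(capacity_estimate(n_binary, 10))       # 3.0 bits/sec

# Constrained channel (no two consecutive 1s): fewer allowed sequences,
# so the capacity is lower; it approaches log2(golden ratio) ~ 0.694.
def n_run_limited(T):
    a, b = 1, 2                              # counts for lengths 0 and 1
    for _ in range(T - 1):
        a, b = b, a + b                      # Fibonacci recurrence
    return b

print(capacity_estimate(n_run_limited, 40))  # ~0.70, converging slowly to 0.694
```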

Define the information generated by a source (measured in bits/symbol) to be the expected amount of information generated per symbol. Recall Info(s_i) = log2(1/p_i), so E(Info) = Σ_i p_i log2(1/p_i) = −Σ_i p_i log2 p_i. Call this quantity the Entropy of the source, and use the symbol H(x), where x is a random variable representing our signal.
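The entropy formula H = Σ p_i log2(1/p_i) is a one-liner; a minimal sketch (function name mine):

```python
import math

def entropy(probs):
    """Shannon entropy H = sum p_i * log2(1/p_i), in bits/symbol."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

assert entropy([0.5, 0.5]) == 1.0     # fair coin: 1 bit/symbol
assert entropy([1.0]) == 0.0          # certain event: no information
print(entropy([1 / 16, 15 / 16]))     # ~0.337 bits/symbol for the skewed source
```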

Nice properties of Entropy: H(x) ≥ 0, with H(x) = 0 only if p_i = 1 for some i. H is maximized when p_i = 1/n for all i. H(x, y) ≤ H(x) + H(y), with equality only if x and y are independent. Define the conditional entropy (the uncertainty of y given the value of x): H_x(y) = −Σ_{i,j} p(i,j) log2 p_i(j). Then H(x, y) = H(x) + H_x(y), and H_x(y) ≤ H(y).
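These identities can be verified on a small joint distribution. A sketch with a correlated pair I made up for illustration (x and y agree with probability 0.8):

```python
import math

def H(probs):
    """Shannon entropy in bits of a probability list."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# Correlated joint distribution p(i, j) over x in {0,1}, y in {0,1}.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
px = [0.5, 0.5]   # marginal of x
py = [0.5, 0.5]   # marginal of y

H_xy = H(list(joint.values()))
# Conditional entropy H_x(y) = -sum p(i,j) log2 p_i(j), with p_i(j) = p(i,j)/p(i).
H_y_given_x = -sum(p * math.log2(p / px[i]) for (i, j), p in joint.items())

# Subadditivity: H(x,y) <= H(x) + H(y), strict here since x, y are dependent.
assert H_xy < H(px) + H(py)
# Chain rule: H(x, y) = H(x) + H_x(y).
assert abs(H_xy - (H(px) + H_y_given_x)) < 1e-12
# Conditioning reduces entropy: H_x(y) <= H(y).
assert H_y_given_x <= H(py)
print(H_xy, H_y_given_x)   # ~1.722 and ~0.722
```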

Now consider messages of length N, and suppose the source produces each symbol independently at random. Then with high probability, for a message m, the number of occurrences of s_i is approximately p_i N for N large, so p_m ≈ Π_i p_i^{p_i N} = 2^{−HN}, i.e. log2 p_m ≈ −HN. So for large N there are approximately 2^{HN} probable, roughly equally likely messages of length N. This result also holds for more complicated source models: for ergodic Markov processes, use the entropy H = Σ_i P_i H_i, where P_i is the probability of state i and H_i is the entropy of its outgoing transitions.
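The concentration log2 p_m ≈ −HN is easy to see by simulation. A minimal sketch for the skewed binary source from the earlier slide (seed and sample size are arbitrary choices of mine):

```python
import math
import random

random.seed(0)

# Source: p(0) = 15/16, p(1) = 1/16, entropy H per symbol (~0.337 bits).
p0 = 15 / 16
H = -(p0 * math.log2(p0) + (1 - p0) * math.log2(1 - p0))

# Draw one long random message and compute log2 of its probability.
N = 10_000
msg = [0 if random.random() < p0 else 1 for _ in range(N)]
log_p_msg = sum(math.log2(p0) if s == 0 else math.log2(1 - p0) for s in msg)

# A typical message has probability close to 2^(-HN),
# i.e. -log2(p_m)/N concentrates around H.
print(-log_p_msg / N, H)   # the two numbers should be close
assert abs(-log_p_msg / N - H) < 0.05
```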

A channel with noise: consider two distinct signals, x = the signal input into the channel and y = the signal received at the other end. The equivocation H_y(x) measures the remaining uncertainty about x given y. The rate of actual transmission is R = H(x) − H_y(x) bits/sec, and the channel capacity is C = max over info sources of (H(x) − H_y(x)).
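The rate R = H(x) − H_y(x) can be computed for a concrete noisy channel. A sketch using the binary symmetric channel with crossover probability eps (my example; by symmetry of this channel, the equivocation for a uniform input equals the binary entropy H2(eps), and the uniform input attains the maximum, so C = 1 − H2(eps)):

```python
import math

def H2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

eps = 0.1                 # crossover probability of the binary symmetric channel
H_x = 1.0                 # entropy of a uniform input over {0, 1}
equivocation = H2(eps)    # H_y(x): uncertainty left about x after seeing y
R = H_x - equivocation    # rate of actual transmission, bits per symbol
print(R)                  # ~0.531 bits/symbol; this input attains C = 1 - H2(eps)
```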

The Fundamental Theorem for a Discrete Channel with Noise: Let a discrete channel have capacity C, and a discrete source have entropy H bits/second. If H < C, there exists a coding system such that the output of the source can be transmitted over the channel with arbitrarily small errors. Proof: Recall C = max over info sources of (H(x) − H_y(x)); suppose the encoding attains this maximum (or comes arbitrarily close), with input entropy H(x) and output entropy H(y). Then there are about 2^{H(x)T} probable input messages of duration T, 2^{H(y)T} probable received messages of duration T, and so 2^{H_y(x)T} probable inputs for a given output.

Construct a bipartite graph in which each node is a probable input or output message of duration T. Connect nodes A and B by an edge if input message A is likely to produce output B. Let R be the source we are interested in, with entropy H_R < C. Encode R by randomly assigning its messages of duration T to nodes in the left column of the graph. Given an output message, the probability that it is connected to more than one R-input message is at most (2^{H_R T} / 2^{H(x)T}) · 2^{H_y(x)T} = 2^{(H_R − H(x) + H_y(x))T} = 2^{(H_R − C)T} → 0 as T → ∞.

Extensions to Shannon's work: the continuous source/channel (in the 2nd part of the paper); the multi-terminal case; multi-way channels (like telephone lines!); more complicated source structures (non-ergodic!) and different memory models for transmitters. Kolmogorov applied Shannon's ideas to solve long-standing problems in ergodic theory. Applications to biology: entropy of DNA to identify binding sites; intra-organism communication.

Discussion Topics: Any questions? Any Shannon anecdotes? (Required reading at the NSA; wrote a good article on the mathematics of juggling; made a maze-learning mouse out of phone relays; married a numerical analyst from Bell Labs.) Shannon says (p. 413) that no explicit description is known of approximations to the ideal coding for a noisy channel; I understand this is still the case. Comments on what is done in practice? Other applications/impact of information theory? Any ideas about the entropy of English and crossword puzzles (p. 399)? How would one go about proving such a result?