Information Theory and Coding Techniques: Chapter 1.1. What is Information Theory? Why should you take this course?


What is Information Theory? Information Theory answers two fundamental questions in communication theory: What is the ultimate data compression? (the entropy H) What is the ultimate transmission rate of communication? (the channel capacity C) These results form the most basic theoretical foundations of communication theory.
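
To make the first quantity concrete, here is a minimal sketch (my own illustration, not part of the original slides; the example distribution is arbitrary) that computes the entropy H of a discrete source in bits:

```python
import math

def shannon_entropy(probs):
    """Entropy H = -sum(p * log2 p) of a discrete distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A source emitting four symbols with these probabilities cannot be
# losslessly compressed below about 1.75 bits per symbol on average.
print(shannon_entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75
```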

Moreover, Information Theory intersects Physics (Statistical Mechanics), Mathematics (Probability Theory), Electrical Engineering (Communication Theory), Computer Science (Algorithmic Complexity), and Economics (Portfolio / Game Theory). This is why you should learn Information Theory.

Electrical Engineering (Communication Theory). In the early 1940s, Shannon proved that the probability of transmission error could be made nearly zero for all communication rates below the channel capacity. [Diagram: Source (entropy H) → Channel (capacity C) → Destination, requiring H < C.] The capacity C can be computed simply from the noise characteristics of the channel (described by conditional probabilities).
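
For instance, for a binary symmetric channel that flips each transmitted bit with probability p, the capacity works out to C = 1 - H(p). A small self-contained sketch under that standard channel model (the channel choice and numbers are my own illustration, not taken from the slide):

```python
import math

def binary_entropy(p):
    """H(p) = -p log2 p - (1-p) log2 (1-p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p:
    C = 1 - H(p) bits per channel use."""
    return 1.0 - binary_entropy(p)

# A channel that flips 11% of its bits can still carry about half a bit
# of information per transmitted bit, given suitable coding.
print(bsc_capacity(0.11))  # ~0.50
```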

Shannon further argued that random processes (signals) such as music and speech have an irreducible complexity below which the signal cannot be compressed. This he named the Entropy. He then argued that if the entropy of the source is less than the capacity of the channel, asymptotically (in a probabilistic sense) error-free communication can be achieved.
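
As a rough illustration (the numbers are chosen here for concreteness, not taken from the slide): a binary source with symbol probabilities (0.11, 0.89) has entropy of about 0.5 bits per symbol, so it can in principle be transmitted essentially error-free over any channel offering more than 0.5 bits of capacity per source symbol, for example a binary symmetric channel with crossover probability a little below 0.11 used once per symbol.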

Computer Science (Kolmogorov Complexity). Kolmogorov, Chaitin, and Solomonoff put forward the idea that the complexity of a string of data can be defined as the length of the shortest binary computer program that computes the string. The complexity is the minimum description length! This definition of complexity is universal, that is, computer independent, and is of fundamental importance. Kolmogorov Complexity lays the foundation for the theory of descriptive complexity.

Gratifyingly, the Kolmogorov complexity K is approximately equal to the Shannon entropy H if the sequence is drawn at random from a distribution that has entropy H. Kolmogorov complexity is considered to be more fundamental than Shannon entropy: it is the ultimate data compression and leads to a logically consistent procedure for inference. (For example, if one can describe a picture by a program of length L, then L and the associated program can be treated as a more meaningful representation of the picture than its raw bit count.)
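
Kolmogorov complexity itself is uncomputable, but any lossless compressor gives an upper bound on it. The sketch below (my own illustration, using Python's standard zlib module) packs a long i.i.d. binary sequence into bytes and checks that the compressed length per symbol is not far above the Shannon entropy H(p):

```python
import random
import zlib

def compressed_bits_per_symbol(bits):
    """Upper-bound the description length per symbol by packing the 0/1
    sequence into bytes, compressing with zlib, and measuring the output."""
    packed = bytes(
        int("".join(map(str, bits[i:i + 8])), 2)
        for i in range(0, len(bits) - 7, 8)
    )
    return 8 * len(zlib.compress(packed, 9)) / len(bits)

random.seed(0)
p, n = 0.11, 200_000
sequence = [1 if random.random() < p else 0 for _ in range(n)]
# The source entropy is H(0.11), roughly 0.5 bits per symbol; the compressed
# length per symbol should land reasonably close to (but above) that figure.
print(compressed_bits_per_symbol(sequence))
```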

One can think of computational complexity (time complexity) and Kolmogorov complexity (program length, or descriptive complexity) as two axes, corresponding to program running time and program length. Kolmogorov complexity focuses on minimizing along the second axis, while computational complexity focuses on minimizing along the first. Little work has been done on minimizing the two simultaneously.

Mathematics (Probability Theory and Statistics). The fundamental quantities of Information Theory, namely Entropy, Relative Entropy, and Mutual Information, are defined as functionals of probability distributions. In turn, they characterize the behavior of long sequences of random variables, allow us to estimate the probabilities of rare events, and yield the best error exponent in hypothesis tests.
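
As a small, self-contained illustration (my own sketch, with an arbitrary example joint distribution), the relative entropy D(P||Q) and the mutual information I(X;Y) can be evaluated directly from their defining sums:

```python
import math

def relative_entropy(p, q):
    """D(P||Q) = sum_i p_i log2(p_i / q_i), in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def mutual_information(joint):
    """I(X;Y) = D(P_XY || P_X * P_Y) for a joint pmf given as a 2-D list."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    return sum(
        pxy * math.log2(pxy / (px[i] * py[j]))
        for i, row in enumerate(joint)
        for j, pxy in enumerate(row)
        if pxy > 0
    )

# A noiseless binary channel with a uniform input carries exactly 1 bit.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))  # 1.0
print(relative_entropy([0.5, 0.5], [0.9, 0.1]))      # ~0.74 bits
```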

Computation vs. Communication. As we build larger computers out of smaller components, we encounter both a computation limit and a communication limit. Computation is communication-limited and communication is computation-limited. The two become intertwined, and thus all of the developments in communication theory via information theory should have a direct impact on the theory of computation. Real examples of this dual-limit effect can be found in VLSI, multi-core processors, and cloud computing.

New Trends in Information Theory. If one compresses each of many correlated sources separately and then puts the compressed descriptions together into a joint reconstruction of the sources, what rates are achievable? The Wyner-Ziv and Slepian-Wolf theorems. ------- Distributed Source Coding! If one has many senders sending information independently to a common receiver, what is the channel capacity of this multiple-access channel? The Liao and Ahlswede theorem.
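
For reference, the Slepian-Wolf theorem states that two correlated sources X and Y can be encoded separately at rates R1 and R2 and still be reconstructed jointly with vanishing error probability whenever R1 ≥ H(X|Y), R2 ≥ H(Y|X), and R1 + R2 ≥ H(X,Y); in total, the joint entropy H(X,Y) suffices even though the encoders do not cooperate.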

If one has one sender and many receivers and wishes to communicate (perhaps different) information simultaneously to each of the receivers, what is the channel capacity of this broadcast channel? ------- Scalable Video Coding! If one has an arbitrary number of senders and receivers in an environment of interference and noise, what is the capacity region of achievable rates from the various senders to the receivers? -------- Network Information Theory!