Lecture 1. Introduction

What is the course about? Logistics. Questionnaire. (Dr. Yao Xie, ECE587, Information Theory, Duke University)

What is information?

!"#$%&'&($%&$#&'&)*(+$,'#-&*,.$&'&/0'(1&"$023& %"*("&"'4&"*5"2#&*,6$#7'+$,&0$448& 9&!$7&:$;2#& Dr. Yao Xie, ECE587, Information Theory, Duke University 2

How to quantify information? [Two example images: one with small information content, one with large information content.]

What is the fundamental limit of data transfer rate? [Figure comparing the data rates, in bits/s, of different links.]

Some people think information theory (IT) is about...

But IT is also about these... data compression, coding, computation, Kolmogorov complexity, data communication.

And even these... gambling, investment, bioinformatics, ...

Where IT all begins... Shannon's 1948 paper in the Bell System Technical Journal. Claude Shannon, 1916-2001.

Information Theory. Shannon's information theory deals with limits on data compression (source coding) and reliable data transmission (channel coding). How much can data be compressed? How fast can data be reliably transmitted over a noisy channel? Two basic point-to-point communication theorems (Shannon 1948). Source coding theorem: the minimum rate at which data can be compressed losslessly is the entropy rate of the source. Channel coding theorem: the maximum rate at which data can be reliably transmitted is the capacity of the channel.
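To make the two limits concrete, here is a minimal Python sketch (an illustration, not from the slides) that evaluates both for the binary case: the entropy H(p) of a Bernoulli(p) source is its lossless compression limit in bits per symbol, and C = 1 - H(p) is the standard closed form for the capacity of a binary symmetric channel with crossover probability p.

import math

def binary_entropy(p):
    # H(p) in bits: the lossless compression limit of a Bernoulli(p) source.
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    # Capacity of a binary symmetric channel with crossover probability p.
    return 1.0 - binary_entropy(p)

p = 0.1
print(f"Compression limit of Bernoulli({p}): {binary_entropy(p):.3f} bits/symbol")
print(f"Capacity of BSC({p}): {bsc_capacity(p):.3f} bits/channel use")

So a source that emits 1 with probability 0.1 can be compressed to about 0.47 bits per symbol, and a channel that flips each bit with probability 0.1 can reliably carry about 0.53 bits per use.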

Since Shannon's 1948 paper, many extensions: rate distortion theory; source coding and channel capacity for more complex sources; capacity for more complex channels (multiuser networks). For a long time, information theory was considered (by most) an esoteric theory with no apparent relation to the real world. Recently, advances in technology (algorithms, hardware, software) have led to practical schemes for data compression, transmission and modulation, error-correcting coding, compressed sensing techniques, and information security.

IT encompasses many fields.

In this class we will cover the basics, the nuts and bolts. Entropy, the uncertainty of a single random variable: H(X) = - Σ_x p(x) log2 p(x) (bits). Conditional entropy: H(X|Y). Mutual information, the reduction in uncertainty due to another random variable: I(X;Y) = H(X) - H(X|Y). Channel capacity: C = max_{p(x)} I(X;Y). Relative entropy: D(p||q) = Σ_x p(x) log (p(x)/q(x)).
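These quantities are all easy to compute for a small discrete example. Below is a short Python sketch (the 2x2 joint distribution is a made-up illustration) that evaluates H(X), H(X|Y), I(X;Y), and a relative entropy, using the chain rule H(X|Y) = H(X,Y) - H(Y).

import numpy as np

# Hypothetical 2x2 joint pmf p(x, y), chosen only for illustration.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

p_x = p_xy.sum(axis=1)  # marginal p(x)
p_y = p_xy.sum(axis=0)  # marginal p(y)

def entropy(p):
    # H(p) = -sum p log2 p, skipping zero-probability outcomes.
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

H_X = entropy(p_x)
H_XY = entropy(p_xy.ravel())       # joint entropy H(X, Y)
H_X_given_Y = H_XY - entropy(p_y)  # chain rule: H(X|Y) = H(X,Y) - H(Y)
I_XY = H_X - H_X_given_Y           # mutual information

def kl_divergence(p, q):
    # D(p||q) = sum p log2 (p/q), over the support of p.
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

row0 = p_xy[0] / p_xy[0].sum()  # p(y | x = 0)
row1 = p_xy[1] / p_xy[1].sum()  # p(y | x = 1)

print(f"H(X)   = {H_X:.3f} bits")
print(f"H(X|Y) = {H_X_given_Y:.3f} bits")
print(f"I(X;Y) = {I_XY:.3f} bits")
print(f"D(p(y|x=0) || p(y|x=1)) = {kl_divergence(row0, row1):.3f} bits")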

[Block diagram: Source -> Encoder -> Transmitter -> Physical Channel -> Receiver -> Decoder.] Data compression limit (lossless source coding); data transmission limit (channel capacity); tradeoff between rate and distortion (lossy compression). Data compression limit: min I(X; X̂). Data transmission limit: max I(X; Y).
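The rate-distortion tradeoff has a standard closed form in the binary case: for a Bernoulli(p) source under Hamming distortion, R(D) = H(p) - H(D) for 0 <= D <= min(p, 1-p), and 0 beyond that. A minimal Python sketch of that formula:

import math

def h2(x):
    # Binary entropy in bits.
    return 0.0 if x in (0.0, 1.0) else -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def rate_distortion_bernoulli(p, D):
    # R(D) = H(p) - H(D) for 0 <= D <= min(p, 1-p); beyond that, zero rate suffices.
    if D >= min(p, 1 - p):
        return 0.0
    return h2(p) - h2(D)

p = 0.5
for D in (0.0, 0.05, 0.1, 0.25):
    print(f"D = {D:4.2f}: R(D) = {rate_distortion_bernoulli(p, D):.3f} bits/symbol")

At D = 0 this recovers the lossless limit H(p); allowing more distortion buys a lower rate.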

Important functionals and notation. Upper case X, Y, ... refer to random variables; script 𝒳, 𝒴 denote the alphabets of the random variables. Probability mass functions: p(x) = P(X = x), p(x, y) = P(X = x, Y = y). Probability density function: f(x).

Expectation: µ = E{X} = Σ_x x p(x). Why is this of particular interest? It appears in the Law of Large Numbers (LLN): if the x_n are independent and identically distributed, then (1/N) Σ_{n=1}^N x_n → E{X} with probability 1. Variance: σ² = E{(X - µ)²} = E{X²} - µ². Why is this of particular interest? It appears in the Central Limit Theorem (CLT): (1/√(Nσ²)) Σ_{n=1}^N (x_n - µ) → N(0, 1).
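Both limit theorems are easy to see numerically. A small Python simulation (the exponential distribution and the sample sizes are arbitrary illustrative choices):

import numpy as np

rng = np.random.default_rng(0)

# i.i.d. draws from an exponential distribution with mean 1 and variance 1.
N, trials = 10_000, 2_000
x = rng.exponential(scale=1.0, size=(trials, N))
mu, sigma2 = 1.0, 1.0

# LLN: each row's sample mean is close to E{X} = 1 for large N.
sample_means = x.mean(axis=1)
print("LLN: average sample mean =", round(sample_means.mean(), 4))

# CLT: the normalized sums (1/sqrt(N*sigma^2)) * sum(x_n - mu) look standard normal.
z = (x - mu).sum(axis=1) / np.sqrt(N * sigma2)
print("CLT: mean of z =", round(z.mean(), 3), ", variance of z =", round(z.var(), 3))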

Information theory: is it all about theory? Yes and no. Yes, it's theory, and we will see many proofs. But it is also preparation for other subjects: coding theory (Prof. R. Calderbank), wireless communications, compressed sensing, stochastic networks. Many proof ideas come in handy in other areas of research.

No. Hopefully you will walk out of this classroom understanding the basic concepts people talk about on the streets: entropy, mutual information, ...; channel capacity, which everyone in wireless should know; the Huffman code (the optimal prefix-free lossless code; see the preview sketch below); the Hamming code (a commonly used single-error-correcting code); water-filling (power allocation in all communication systems); the rate-distortion function (if you want to talk with data compression people).
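As a preview of the first of those constructions, here is a minimal Python sketch of Huffman coding (the input string and its symbol frequencies are an arbitrary example; the algorithm is covered properly later in the course):

import heapq
from collections import Counter

def huffman_code(freqs):
    # Build a prefix-free code (symbol -> bitstring) by repeatedly merging
    # the two least-frequent subtrees; a subtree is a symbol or a (left, right) pair.
    heap = [(w, i, sym) for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        w1, _, t1 = heapq.heappop(heap)
        w2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (w1 + w2, count, (t1, t2)))
        count += 1
    code = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:
            code[tree] = prefix or "0"  # single-symbol edge case
    walk(heap[0][2], "")
    return code

print(huffman_code(Counter("abracadabra")))  # 'a' gets the shortest codeword

More frequent symbols receive shorter codewords, which is exactly how the code approaches the entropy limit.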

Course Logistics. Schedule: Tue/Thu, 1:25-2:40pm, Hudson 207. TA: Miao Liu, email: miao.liu@duke.edu. Homework: handed out Thursday in class, due the following Thursday in class. Grading: homework 30%, two in-class midterms (20% each), final 30%. Midterms: 10/4 and 11/6, in class. Final: 11/29, in class. Prerequisite: basic probability and statistics.