Digital Communications III (ECE 154C) Introduction to Coding and Information Theory


Digital Communications III (ECE 154C): Introduction to Coding and Information Theory
Tara Javidi
These lecture notes were originally developed by the late Prof. J. K. Wolf.
UC San Diego, Spring 2014

Outline: I. Overview; II. Overview of ECE 154C

I: Digital Communications Block Diagram

Note that the Source Encoder converts all types of information to a stream of binary digits.

Note that the Channel Encoder, in an attempt to protect the source-coded (binary) stream, judiciously adds redundant bits.

Sometimes the output of the source decoder must be an exact replica of the information (e.g. computer data); this is called NOISELESS CODING (aka lossless compression).

Other times the output of the source decoder can be approximately equal to the information (e.g. music, TV, speech); this is called CODING WITH DISTORTION (aka lossy compression).

Overview II: What will we cover? (Reference: Chapter 10, Ziemer & Tranter)

SOURCE CODING - NOISELESS CODES
The basic idea is to use as few binary digits as possible and still be able to recover the information exactly.
Topics include: Huffman Codes, Shannon-Fano Codes, Tunstall Codes, Entropy of a Source, Lempel-Ziv Codes

Overview II: What will we cover? (Reference: Chapter 10, Ziemer & Tranter)

SOURCE CODING WITH DISTORTION
Again the idea is to use the minimum number of binary digits for a given value of distortion.
Topics include: Gaussian Source, Optimal Quantizing
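To get a feel for this trade-off, the following is a minimal sketch (not from the original slides; the clipping range and the choice of a uniform mid-rise quantizer are arbitrary illustrative assumptions). It quantizes samples of a unit-variance Gaussian source and measures the mean-squared distortion as the number of binary digits per sample grows.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)      # unit-variance Gaussian source samples

for bits in (1, 2, 3, 4):
    levels = 2 ** bits                # number of quantizer output levels
    lo, hi = -3.0, 3.0                # clipping range (about +/- 3 sigma)
    step = (hi - lo) / levels
    # uniform mid-rise quantizer: map each sample to the center of its cell
    idx = np.clip(np.floor((x - lo) / step), 0, levels - 1)
    xq = lo + (idx + 0.5) * step
    mse = np.mean((x - xq) ** 2)      # distortion per sample
    print(f"{bits} bits/sample -> MSE distortion {mse:.4f}")
```

At moderate rates each additional bit per sample cuts the distortion by roughly a factor of four; quantifying and optimizing this trade-off is what this part of the course is about.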

Overview II: What will we cover? (Reference: Chapter 10, Ziemer & Tranter)

CHANNEL CAPACITY OF A NOISY CHANNEL
Even if the channel is noisy, messages can be sent essentially error free provided extra digits are transmitted. The basic idea is to use as few extra digits as possible.
Topics covered: Channel Capacity, Mutual Information

Overview II: What will we cover? (Reference: Chapter 10, Ziemer & Tranter)

CHANNEL CODING
Basic idea: detect errors that occurred on the channel and then correct them.
Topics covered: Hamming Code, General Theory of Block Codes (Parity Check Matrix, Generator Matrix, Minimum Distance, etc.), LDPC Codes, Turbo Codes, Code Performance
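As a small preview of these channel-coding ideas, here is a sketch (mine, not from the notes) of the (7,4) Hamming code: 4 data bits are encoded with a generator matrix, the channel flips one bit, and the decoder locates and corrects the error from the syndrome computed with the parity-check matrix.

```python
import numpy as np

# Systematic (7,4) Hamming code: codeword = [data | parity], arithmetic mod 2.
P = np.array([[1, 1, 0],
              [1, 0, 1],
              [0, 1, 1],
              [1, 1, 1]])
G = np.hstack([np.eye(4, dtype=int), P])    # generator matrix (4 x 7)
H = np.hstack([P.T, np.eye(3, dtype=int)])  # parity-check matrix (3 x 7)

data = np.array([1, 0, 1, 1])
codeword = data @ G % 2

received = codeword.copy()
received[2] ^= 1                            # channel flips bit 2

syndrome = received @ H.T % 2               # nonzero syndrome flags an error
# For a single error, the syndrome equals the column of H at the error position.
for pos in range(7):
    if np.array_equal(H[:, pos], syndrome):
        received[pos] ^= 1
        break

print("decoded data:", received[:4], "matches:", np.array_equal(received[:4], data))
```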

A Few Examples

Example 1: 4-Letter DMS

The basic concepts came from one paper by one man, Claude Shannon! Shannon used simple models that capture the essence of the problem.

EXAMPLE 1: Simple model of a source (called a DISCRETE MEMORYLESS SOURCE, or DMS)
I.I.D. (Independent and Identically Distributed) source letters
Alphabet size of 4 (A, B, C, D)
P(A) = p1, P(B) = p2, P(C) = p3, P(D) = p4, with p1 + p2 + p3 + p4 = 1

Simplest code: A -> 00, B -> 01, C -> 10, D -> 11

Average length of code words: L = 2(p1 + p2 + p3 + p4) = 2

Q: Can we use fewer than 2 binary digits per source letter (on average) and still recover the information from the binary sequence?

A: It depends on the values of (p1, p2, p3, p4).
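The quantity that answers this question is the source entropy H = -(p1 log2 p1 + p2 log2 p2 + p3 log2 p3 + p4 log2 p4): by Shannon's source coding theorem, the average number of binary digits per letter can be driven down to H but no lower. A minimal sketch (the probability values below are illustrative choices, not from the slides):

```python
import math

def entropy(probs):
    """Entropy in bits per letter of a discrete memoryless source."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform 4-letter DMS: 2 bits/letter is already optimal.
print(entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0

# Skewed DMS: fewer than 2 bits/letter are achievable on average.
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75
```

For the skewed case, the prefix code A -> 0, B -> 10, C -> 110, D -> 111 uses exactly 0.5(1) + 0.25(2) + 0.125(3) + 0.125(3) = 1.75 binary digits per letter on average, matching the entropy.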

Example 2: Binary Symmetric Channel

EXAMPLE 2: Simple model for a noisy channel

Channels, as you saw in ECE 154B, can be viewed as follows. If s0(t) = -s1(t) and the two signals are equally likely,

P(error) = Q(sqrt(2E/N0)) = p

Shannon considered a simpler channel called the binary symmetric channel (or BSC for short). Pictorially: (BSC transition diagram, not reproduced here). Mathematically:

P_{Y|X}(y|x) = 1 - p if y = x, and p if y != x.

Q: Can we send information error-free over such a channel even though p != 0, 1?

A: It depends on the rate of transmission (how many channel uses are allowed per information bit). Essentially, for a small enough transmission rate (to be defined precisely), the answer is YES!
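The precise threshold is the capacity of the BSC, C = 1 - H2(p) bits per channel use, where H2 is the binary entropy function: rates below C can be made essentially error-free, rates above C cannot. A small sketch (my own illustration) that computes the capacity and simulates sending bits through a BSC:

```python
import math
import random

def binary_entropy(p):
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1 - binary_entropy(p)

def bsc(bits, p, rng=random):
    """Flip each transmitted bit independently with probability p."""
    return [b ^ (rng.random() < p) for b in bits]

p = 0.1
print("capacity:", bsc_capacity(p))        # about 0.531 bits per channel use
tx = [random.randint(0, 1) for _ in range(10)]
print("sent    :", tx)
print("received:", bsc(tx, p))
```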

Example 3: DMS with Alphabet Size 8

EXAMPLE 3: Discrete Memoryless Source with an alphabet of 8 letters: {A, B, C, D, E, F, G, H}
Probabilities: {p_A, p_B, p_C, p_D, p_E, p_F, p_G, p_H}
See the following codes (the code tables are not reproduced in this transcription):
Q: Which codes are uniquely decodable? Which ones are instantaneously decodable? Compute the average length of the codewords for each code.
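The code tables from this slide did not survive the transcription, so the codewords in the sketch below are made up purely to illustrate the checks involved: a code is instantaneously decodable exactly when no codeword is a prefix of another, and any uniquely decodable binary code must satisfy Kraft's inequality, sum_i 2^(-l_i) <= 1.

```python
def is_prefix_free(code):
    """True if no codeword is a prefix of another (instantaneously decodable)."""
    words = list(code.values())
    return not any(a != b and b.startswith(a) for a in words for b in words)

def kraft_sum(code):
    """Kraft sum; must be <= 1 for any uniquely decodable binary code."""
    return sum(2 ** -len(w) for w in code.values())

def average_length(code, probs):
    return sum(probs[s] * len(w) for s, w in code.items())

# Hypothetical 4-letter examples (not the codes from the slide):
probs = {"A": 0.5, "B": 0.25, "C": 0.125, "D": 0.125}
prefix_code = {"A": "0", "B": "10", "C": "110", "D": "111"}
bad_code    = {"A": "0", "B": "01", "C": "011", "D": "111"}

for name, code in [("prefix_code", prefix_code), ("bad_code", bad_code)]:
    print(name, is_prefix_free(code), kraft_sum(code), average_length(code, probs))
```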

Example 3: DMS with Alphabet Size 8

EXAMPLE 4: Can you optimally design a code?

L = (1/2)(1) + (1/4)(2) + (1/8)(3) + (1/16)(4) + 4 x (1/64)(6) = 2

We will see that this is an optimal code (not only among single-letter constructions but overall).
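One systematic way to arrive at such an optimal code is Huffman's algorithm. The sketch below assumes the probabilities {1/2, 1/4, 1/8, 1/16, and four letters at 1/64}, inferred from the average-length calculation above since the slide's table is not reproduced, and confirms that the resulting codeword lengths give L = 2.

```python
import heapq
from itertools import count

def huffman_lengths(probs):
    """Return the codeword lengths produced by Huffman's algorithm."""
    tie = count()                       # tie-breaker so heap tuples stay comparable
    heap = [(p, next(tie), [sym]) for sym, p in probs.items()]
    heapq.heapify(heap)
    lengths = {sym: 0 for sym in probs}
    while len(heap) > 1:
        p1, _, syms1 = heapq.heappop(heap)
        p2, _, syms2 = heapq.heappop(heap)
        for sym in syms1 + syms2:       # every merge adds one bit to these symbols
            lengths[sym] += 1
        heapq.heappush(heap, (p1 + p2, next(tie), syms1 + syms2))
    return lengths

probs = {"A": 1/2, "B": 1/4, "C": 1/8, "D": 1/16,
         "E": 1/64, "F": 1/64, "G": 1/64, "H": 1/64}
lengths = huffman_lengths(probs)
L = sum(probs[s] * lengths[s] for s in probs)
print(lengths, "average length =", L)   # lengths 1,2,3,4,6,6,6,6 -> L = 2.0
```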

Example 3: DMS with Alphabet Size 8

EXAMPLE 5 (code table not reproduced in the transcription):

L = 0.1 + 0.1 + 0.2 + 0.2 + 0.3 + 0.5 + 1 = 2.4

But here we can do better by encoding two (or more) source letters at a time.
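To see why coding pairs of letters helps, the sketch below compares Huffman coding on single letters with Huffman coding on the second extension of the source (pairs of i.i.d. letters). Example 5's actual probabilities are not reproduced here, so a made-up skewed three-letter source stands in; the helper uses the fact that the average Huffman codeword length equals the sum of the merge weights.

```python
import heapq
from itertools import product

def huffman_avg_length(probs):
    """Average codeword length of a Huffman code for the given distribution."""
    heap = list(probs.values())
    heapq.heapify(heap)
    total = 0.0
    while len(heap) > 1:
        a, b = heapq.heappop(heap), heapq.heappop(heap)
        total += a + b                  # each merge weight adds one bit, on average
        heapq.heappush(heap, a + b)
    return total

# Hypothetical skewed source (not Example 5's actual probabilities).
single = {"A": 0.7, "B": 0.2, "C": 0.1}
# Second extension: pairs of i.i.d. letters with product probabilities.
pairs = {x + y: single[x] * single[y] for x, y in product(single, repeat=2)}

print("one letter at a time :", huffman_avg_length(single))      # about 1.3 bits/letter
print("two letters at a time:", huffman_avg_length(pairs) / 2)   # about 1.17 bits/letter
```

Coding longer and longer blocks pushes the average toward the source entropy (about 1.157 bits/letter for this hypothetical source).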