Information Theory and Coding Techniques


Lecture 1.2: Introduction and Course Outlines

Information Theory and Coding Techniques
Prof. Ja-Ling Wu
Department of Computer Science and Information Engineering
National Taiwan University

Digital Communication System

[Block diagram] Source → Source Encoder (source codeword) → Channel Encoder (channel codeword) → Modulator → Channel (noise added) → Demodulator → Channel Decoder (received codeword) → Source Decoder (estimated source codeword) → Destination. The encoder/decoder pairs realize Source Coding and Channel Coding; the modulator/demodulator pair realizes Modulation.

Source Coding
Based on the characteristics/features of a source, a Source Encoder-Decoder pair is designed to reduce the source output to a minimal representation. [Shannon 1948]
- How to model a signal source? As a random process.
- How to measure the information content of a source? Entropy.
- How to represent a source? Code design (see the sketch below).
- How to model the behavior of a channel? As a stochastic mapping (channel capacity).
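Where the source statistics are known, "code design" can be made concrete. Below is a minimal, illustrative Python sketch of Huffman coding (one of the variable-length codes covered later in this course); the symbols and probabilities are invented for the example.

```python
import heapq

def huffman_code(probs):
    """Build a binary prefix code for a {symbol: probability} dict.

    Classic greedy construction: repeatedly merge the two least
    probable subtrees until a single tree remains.
    """
    # Heap entries: (probability, tie-breaker, {symbol: codeword-so-far}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        # Prefix a bit that distinguishes the two merged subtrees.
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, tie, merged))
        tie += 1
    return heap[0][2]

# Hypothetical three-symbol source; probabilities are made up.
code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.25})
print(code)  # {'a': '0', 'b': '10', 'c': '11'} -> average length 1.5 bits/symbol
```

For this particular distribution the average codeword length (1.5 bits) equals the source entropy, so the code is optimal; in general Huffman coding gets within one bit of entropy per symbol.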

Source Coding (cont.)
Redundancy reduction: Data Compression / Data Compaction.
Modalities of sources: text, image, speech/audio, video, graphics, hybrid.

Channel Coding
Channel coding introduces redundancy at the channel encoder and uses this redundancy at the decoder to reconstitute the input sequences as accurately as possible; i.e., channel coding is designed to minimize the effect of channel noise.

[Diagram] Channel Encoder → Channel (with noise) → Channel Decoder; the encoder-channel-decoder chain behaves as a noiseless channel.
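As a toy illustration of redundancy combating noise, the following Python sketch (not from the lecture) sends random bits through a binary symmetric channel with and without a 3-fold repetition code; the crossover probability p = 0.1 is an arbitrary choice for the demo.

```python
import random

def bsc(bits, p):
    """Binary symmetric channel: flip each bit independently with probability p."""
    return [b ^ (random.random() < p) for b in bits]

def rep3_encode(bits):
    return [b for b in bits for _ in range(3)]  # send every bit three times

def rep3_decode(bits):
    # Majority vote over each group of 3 received bits.
    return [int(sum(bits[i:i + 3]) >= 2) for i in range(0, len(bits), 3)]

random.seed(0)
msg = [random.randint(0, 1) for _ in range(100_000)]
p = 0.1  # crossover probability (arbitrary for the demo)

uncoded_errs = sum(a != b for a, b in zip(msg, bsc(msg, p)))
coded_errs = sum(a != b for a, b in zip(msg, rep3_decode(bsc(rep3_encode(msg), p))))

print(uncoded_errs / len(msg))  # ~0.10
print(coded_errs / len(msg))    # ~0.028 = 3p^2(1-p) + p^3
```

The error rate drops from about 10% to about 2.8%, at the cost of tripling the number of transmitted bits; better codes trade rate for reliability far more efficiently, which is the subject of channel coding theory.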

Modulation
Physical channels may require electrical, radio, or optical signals. The modulator takes the channel encoder/source encoder output and transforms it into waveforms that suit the physical nature of the channel, chosen to yield either system simplicity or optimal detection performance.

What is information?
What is meant by the information contained in an event? If we are to formally define a quantitative measure of the information contained in an event, this measure should have some intuitive properties, such as:
1. Information contained in events ought to be defined in terms of some measure of the uncertainty of the events.
2. Less certain events ought to contain more information than more certain events.
3. The information of unrelated/independent events, taken as a single event, should equal the sum of the information of the individual events.

A natural measure of the uncertainty of an event E is its probability, denoted P(E). Once we agree to define the information of an event in terms of P(E), properties (2) and (3) are satisfied if the information in E is defined as

I(E) = -log P(E)    (self-information)

*The base of the logarithm depends on the unit of information to be used.

Information Units
log_2 : bit
log_e : nat
log_10 : Hartley

Base conversions:
log_10 2 = 0.30103,  log_2 10 = 3.3219
log_10 e = 0.43429,  log_e 10 = 2.30259
log_e 2 = 0.69315,   log_2 e = 1.44270

Change of base: log_a X = log_b X / log_b a
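A small sanity-check sketch in Python for the definitions above: the self-information of an arbitrary example event (P = 0.25) in each unit, the change-of-base identity, and additivity over independent events.

```python
import math

def self_information(p, base=2):
    """I(E) = -log_base P(E); base 2 -> bits, e -> nats, 10 -> Hartleys."""
    return -math.log(p, base)

p = 0.25  # arbitrary example probability
print(self_information(p, 2))        # 2.0 bits
print(self_information(p, math.e))   # ~1.386 nats
print(self_information(p, 10))       # ~0.602 Hartleys

# Change of base: log_a X = log_b X / log_b a
print(math.log(p, 2), math.log(p) / math.log(2))  # identical values

# Additivity for independent events: I(E1 E2) = I(E1) + I(E2)
p1, p2 = 0.5, 0.25
assert math.isclose(self_information(p1 * p2),
                    self_information(p1) + self_information(p2))
```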

Information (Source)
Facts: let S_1, S_2, ..., S_q be the source alphabet with probabilities P_1, P_2, ..., P_q.
1) The information content (surprise) is somewhat inversely related to the probability of occurrence.
2) The information content of two different independent symbols is the sum of the information content of each separately. Since the probabilities of two independent choices are multiplied together to get the probability of the compound event, it is natural to define the amount of information as

I(S_i) = log (1/P_i) = -log P_i

As a result, since P(S_1 S_2) = P_1 P_2 for independent symbols, we have

I(S_1 S_2) = log (1/(P_1 P_2)) = I(S_1) + I(S_2)

Entropy: average information content over the whole alphabet of symbols,

H_r(S) = Σ_{i=1}^{q} P_i log_r (1/P_i) = -Σ_{i=1}^{q} P_i log_r P_i

and, in bits, H(S) = H_2(S) = Σ_{i=1}^{q} P_i log_2 (1/P_i), for a source S with symbols S_1, ..., S_q and probabilities P_1, ..., P_q.

*The entropy of a source has no meaning unless a model of the source is included. For a sequence of numbers, if we cannot recognize that they are pseudo-random numbers, then we would probably compute the entropy based on the frequency of occurrence of the individual numbers.
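A direct transcription of the entropy formula into Python (a minimal sketch; `base` selects the unit, 2 for bits):

```python
import math

def entropy(probs, base=2):
    """H(S) = -sum_i P_i log P_i, skipping zero-probability symbols
    (by the convention 0 log 0 = 0)."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

print(entropy([0.5, 0.25, 0.25]))  # 1.5 bits/symbol
```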

*The entropy function involves only the distribution of the probabilities: it is a function of a probability distribution {P_i} and does not involve the symbols S_i themselves.

Ex: Weather of Taipei. X = {rain, fine, cloudy, snow} = {R, f, c, s}, with P(R) = 1/4, P(f) = 1/2, P(c) = 1/4, P(s) = 0. Then H_2(X) = 1.5 bits/symbol. If instead P_i = 1/4 for each i (equiprobable events), H_2(X) = 2 bits/symbol (> 1.5). H(X) = 0 for a certain event, i.e., when every P(a_i) is 0 or 1.
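Checking the weather example numerically (the entropy helper is repeated from the previous sketch so this snippet runs on its own):

```python
import math

def entropy(probs, base=2):
    return -sum(p * math.log(p, base) for p in probs if p > 0)

print(entropy([0.25, 0.5, 0.25, 0.0]))    # 1.5 bits/symbol (Taipei weather)
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits/symbol (equiprobable)
print(entropy([1.0, 0.0, 0.0, 0.0]))      # 0.0 -- a certain event
```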

The logarithmic measure is more convenient for various reasons:
1. It is practically more useful. Parameters of engineering importance such as time, bandwidth, number of relays, etc., tend to vary linearly with the logarithm of the number of possibilities. For example, adding one relay to a group doubles the number of possible states of the relays; it adds 1 to the base-2 logarithm of this number.

2. It is nearer to our intuitive feeling as human beings: perceived intensity (eye) and volume (ear) vary roughly logarithmically with the physical stimulus.
3. It is mathematically more suitable: log_2 gives bits, log_10 gives decimal digits, log_e gives natural units. Changing from base a to base b merely requires multiplication by log_b a.

Course Contents
- Basic information theory: entropy, relative entropy, and mutual information
- Data compression/compaction: Kraft inequality, the prefix condition, and instantaneously decodable codes
- Variable-length codes: Huffman code, arithmetic code, and Lempel-Ziv (L-Z) code
- Coding techniques:
  - DPCM (predictive coding)
  - Transform coding (Discrete Cosine Transform)
  - JPEG (JPEG2000)
  - Motion estimation and compensation
  - MPEG-1, 2, 4; H.26x, H.264, HEVC
  - Distributed video coding
- Steganography and information hiding
- Digital watermarking

References:
1. Elements of Information Theory, Thomas M. Cover & Joy A. Thomas, Wiley, 2nd ed., 2006.
2. Introduction to Data Compression, Khalid Sayood, 1996.
3. JPEG / MPEG-1 coding standards.
4. Introduction to Information Theory and Data Compression, Greg A. Harris & Peter D. Johnson, CRC Press, 1998.

5. The Minimum Description Length Principle, Peter D. Grünwald, MIT Press, 2007.
6. IEEE Transactions on IT, CSVT, SP, IP, MM, IFS, Comm, Comput.

Requirements:
1. Homework
2. Midterm
3. Programming assignments: variable-length codes; a JPEG compressor. Grade: from C+ to B+ (midterm dependent).

4. Final project:
- Real-time MPEG-1 decoder (A- to A)
- Real-time MPEG-2 / H.264 decoder (A to A+)
- Near-real-time MPEG-1 encoder (A to A+)
- H.264 encoder / H.265 codecs (A+)

Bonuses (for grade A+):
1. Application of information theory in a specific research field
2. Compression algorithms for 3D videos
3. Encryption-domain compression algorithms
4. Parallelized (cloud) video codecs
5. Scalable video codecs
6. Distributed video codecs
7. Others: propose your own, subject to confirmation