Digital communication system. Shannon's separation principle


Bernd Girod: EE368b Image and Video Compression, Lossless Compression

Digital communication system

Representation of the source signal by a stream of (binary) symbols. Adaptation to the properties of the transmission channel.

[Block diagram: information source -> source coder -> channel coder -> modulation -> channel (with noise) -> demodulation -> channel decoder -> source decoder -> information sink. The source coder emits binary symbols; the chain from channel coder to channel decoder acts as a digital channel. We will be looking at the source coding part.]

Shannon's separation principle

Assume:
1. Point-to-point communication
2. Ergodic channel
3. Delay

[Block diagram: source -> source coder -> channel coder -> channel -> channel decoder -> source decoder. The source coder and decoder adapt to the source statistics (and the distortion measure); the channel coder and decoder adapt to the channel statistics.]

How does compression work?

Exploit redundancy:
Take advantage of patterns in the signal.
Describe frequently occurring events efficiently.
Lossless coding: completely reversible.

Introduce acceptable deviations:
Remove information that humans cannot perceive.
Match the signal resolution (in space, time, amplitude) to the application.
Lossy coding: irreversible distortion of the signal.

Lossless compression in lossy compression systems

Almost every lossy compression system contains a lossless compression system.

[Block diagram: the lossy compression system consists of a quantizer (encoder), a lossless encoder, a lossless decoder, and a quantizer (decoder); the lossless encoder/decoder pair in the middle is itself a complete lossless compression system.]

We will discuss the basics of lossless compression first, then move on to lossy compression.
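The nesting described above is easy to see in code. Below is a minimal sketch, not from the original slides: a uniform quantizer (the lossy, irreversible step) feeds a lossless coder, with zlib standing in for the entropy coder; the step size, sample values, and function names are illustrative assumptions.

```python
import zlib

def lossy_encode(samples, step):
    """Quantize (lossy, irreversible) and then entropy-code (lossless, reversible)."""
    indices = bytes(round(s / step) for s in samples)   # uniform quantizer -> small integer indices
    return zlib.compress(indices)                       # zlib stands in for the lossless encoder

def lossy_decode(bitstream, step):
    indices = zlib.decompress(bitstream)                 # lossless decoder: exact recovery of the indices
    return [i * step for i in indices]                   # inverse quantization (reconstruction)

signal = [0.3, 2.9, 3.1, 3.2, 3.0, 0.1, 0.0, 0.2]
reconstructed = lossy_decode(lossy_encode(signal, step=1.0), step=1.0)
print(reconstructed)   # close to the original signal, but the quantization error is not recoverable
```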

Topics in lossless compression

Binary decision trees and variable length coding
Entropy and bit-rate
Huffman codes
Statistical dependencies in image signals
Sources with memory
Arithmetic coding
Redundancy reduction by prediction

Example: 20 Questions

Alice thinks of an outcome (from a finite set), but does not disclose her selection. Bob asks a series of yes-no questions to uniquely determine the outcome chosen. The goal of the game is to ask as few questions as possible on average.

Our goal: Design the best strategy for Bob.

Example: 20 Questions (cont.)

Observation: The collection of questions and answers yields a binary code for each outcome.

[Two code trees over the outcomes A, B, C, D, E, F.]

Which strategy (= code) is better?

Fixed length codes

[Balanced code tree over the outcomes A, B, C, D, E, F, G, H.]

Average description length for K outcomes: $l_{av} = \log_2 K$. Optimum for equally likely outcomes (verify by modifying the tree).
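To make the comparison concrete, the sketch below computes the average number of questions (bits) for a fixed-length strategy and for one possible variable-length prefix code; the outcome probabilities and code words are illustrative assumptions, not the values from the original trees.

```python
import math

# Assumed outcome probabilities for A..F (illustration only, not the slide's values).
probs = {'A': 0.5, 'B': 0.25, 'C': 0.0625, 'D': 0.0625, 'E': 0.0625, 'F': 0.0625}

# Strategy 1: fixed-length code -- ceil(log2 K) questions for every outcome.
fixed_length = math.ceil(math.log2(len(probs)))

# Strategy 2: a variable-length prefix code that spends fewer bits on likely outcomes.
code = {'A': '0', 'B': '10', 'C': '1100', 'D': '1101', 'E': '1110', 'F': '1111'}
average_length = sum(p * len(code[s]) for s, p in probs.items())

print(fixed_length)     # 3 questions per outcome
print(average_length)   # 2.0 questions on average -- the better strategy for this source
```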

Variable length codes

If outcomes are NOT equally probable:
Use shorter descriptions for likely outcomes.
Use longer descriptions for less likely outcomes.

Intuition: Optimum balanced code trees, i.e., with equally likely outcomes, can be pruned to yield unbalanced trees with unequal probabilities. The unbalanced code trees thus obtained are also optimum. Hence, an outcome of probability p should require about $-\log_2 p$ bits.

Entropy of a memoryless source

Let a memoryless source be characterized by an ensemble $U$ with:
Alphabet $\{a_0, a_1, a_2, \ldots, a_{K-1}\}$
Probabilities $\{P(a_0), P(a_1), P(a_2), \ldots, P(a_{K-1})\}$

Shannon: information conveyed by message $a_k$: $I(a_k) = -\log(P(a_k))$

Entropy of the source is the average information content:
$H(U) = E\{I(a_k)\} = -\sum_{k=0}^{K-1} P(a_k) \log(P(a_k)) = -\sum_{u} P(u) \log(P(u))$

For $\log = \log_2$ the unit is bits/symbol.
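As a quick illustration of the definition, this small sketch evaluates $H(U) = -\sum_k P(a_k)\log_2 P(a_k)$ for two example distributions; the distributions themselves are made up for illustration.

```python
import math

def entropy(probs):
    """H(U) = -sum_k P(a_k) * log2(P(a_k)), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0 bits/symbol: equally likely outcomes
print(entropy([0.7, 0.1, 0.1, 0.1]))       # about 1.36 bits/symbol: a skewed source is more predictable
```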

Entropy and bit-rate

Properties of entropy:
$\max\{H(U)\} = \log(K) \ge H(U)$, with equality for $P(a_j) = P(a_k) \;\forall\, j, k$

The entropy $H(U)$ is a lower bound for the average word length $l_{av}$ of a decodable variable-length code for the symbols. Conversely, the average word length $l_{av}$ can approach $H(U)$ if sufficiently large blocks of symbols are encoded jointly.

Redundancy of a code: $R = l_{av} - H(U)$

Encoding with variable word length

A code without redundancy, i.e. $l_{av} = H(U)$, is achieved if all individual code word lengths are $l_{cw}(a_k) = -\log(P(a_k))$. For binary code words, all probabilities would then have to be binary fractions: $P(a_k) = 2^{-l_{cw}(a_k)}$

Example (comparing a redundant code with an optimum code):

a_i     P(a_i)
a_0     0.5
a_1     0.25
a_2     0.125
a_3     0.125

$H(U) = 1.75$ bits/symbol; for the optimum code, $l_{av} = 1.75$ bits/symbol and $R = 0$.
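The zero-redundancy condition can be checked numerically for the example above; the sketch below assigns each symbol the ideal length $-\log_2 P(a_k)$ and confirms that $l_{av} = H(U)$.

```python
import math

# Probabilities from the example above (all binary fractions).
probs = {'a0': 0.5, 'a1': 0.25, 'a2': 0.125, 'a3': 0.125}

H = -sum(p * math.log2(p) for p in probs.values())
lengths = {s: -math.log2(p) for s, p in probs.items()}   # l_cw(a_k) = -log2 P(a_k)
l_av = sum(probs[s] * lengths[s] for s in probs)

print(H, l_av, l_av - H)   # 1.75, 1.75, 0.0 -> redundancy R = 0
```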

Huffman code

Design algorithm for variable length codes proposed by Huffman (1952); it always finds a code with minimum redundancy. Obtain the code tree as follows:
1. Pick the two symbols with the lowest probabilities and merge them into a new auxiliary symbol.
2. Calculate the probability of the auxiliary symbol.
3. If more than one symbol remains, repeat steps 1 and 2 for the new auxiliary alphabet.
4. Convert the code tree into a prefix code.

Huffman code: example

Fixed length coding: $l = 3.0$ bits/symbol
Huffman code: $l_{av} = 2.71$ bits/symbol
Entropy: $H(U) = 2.68$ bits/symbol
Redundancy of the Huffman code: $R = 0.03$ bits/symbol
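A compact implementation of the merge procedure above might look as follows; the heap-based structure and the example probabilities are assumptions made for illustration, not the code or the distribution from the slide's example.

```python
import heapq
from itertools import count

def huffman_code(probs):
    """Repeatedly merge the two least probable (auxiliary) symbols into a code tree."""
    tie = count()   # tiebreaker so equal probabilities never force comparing the dict payloads
    heap = [(p, next(tie), {symbol: ''}) for symbol, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, left = heapq.heappop(heap)    # symbol with the lowest probability
        p2, _, right = heapq.heappop(heap)   # symbol with the second lowest probability
        merged = {s: '0' + c for s, c in left.items()}
        merged.update({s: '1' + c for s, c in right.items()})
        heapq.heappush(heap, (p1 + p2, next(tie), merged))   # new auxiliary symbol
    return heap[0][2]

# Assumed example source (not the distribution from the slide's example).
probs = {'a': 0.4, 'b': 0.2, 'c': 0.2, 'd': 0.1, 'e': 0.1}
code = huffman_code(probs)
l_av = sum(probs[s] * len(code[s]) for s in probs)
print(code)
print(l_av)   # 2.2 bits/symbol, close to the entropy of about 2.12 bits/symbol
```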

Probability density function of the luminance signal Y

Images: 3 EBU test slides, 3 SMPTE test slides, uniform quantization with 256 levels (8 bits/pixel).

$H(U_Y) = 7.34$ bits/pixel
Image with lowest entropy: $H_L(U_Y) = 6.97$ bits/pixel
Image with highest entropy: $H_H(U_Y) = 7.35$ bits/pixel

Probability density function of the color difference signals R-Y and B-Y

$H(U_{R-Y}) = 5.57$ bits/pixel, $H_L(U_{R-Y}) = 4.65$ bits/pixel, $H_H(U_{R-Y}) = 5.72$ bits/pixel
$H_L(U_{B-Y}) = 4.$ bits/pixel, $H_H(U_{B-Y}) = 5.34$ bits/pixel
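Such per-image entropies can be estimated directly from a gray-level histogram. The sketch below does this for an 8-bit image; since the EBU/SMPTE test slides are not reproduced here, a synthetic random image (and the numpy dependency) are assumptions.

```python
import numpy as np

def image_entropy(image, levels=256):
    """Entropy estimate (bits/pixel) from the gray-level histogram of an image."""
    hist = np.bincount(image.ravel(), minlength=levels)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Synthetic stand-in for a test image (the EBU/SMPTE slides are not available here).
rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(576, 720), dtype=np.uint8)
print(image_entropy(image))   # close to 8 bits/pixel for nearly uniformly distributed pixel values
```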

Joint sources

Joint sources generate N symbols simultaneously. A coding gain can be achieved by encoding those symbols jointly. The lower bound for the average code word length is the joint entropy:

$H(U_1, U_2, \ldots, U_N) = -\sum_{u_1} \sum_{u_2} \cdots \sum_{u_N} P(u_1, u_2, \ldots, u_N)\,\log\big(P(u_1, u_2, \ldots, u_N)\big)$

It generally holds that

$H(U_1, U_2, \ldots, U_N) \le H(U_1) + H(U_2) + \ldots + H(U_N)$

with equality, if $U_1, U_2, \ldots, U_N$ are statistically independent.

Statistical dependencies between the video signal components Y, R-Y, B-Y

Data: 3 EBU, 3 SMPTE test slides; each component Y, R-Y, B-Y uniformly quantized to 64 levels.

$H_0 = 3 \times 6$ bits/sample $= 18$ bits/sample
$H(U_Y, U_{R-Y}, U_{B-Y}) = 9.44$ bits/sample
$H(U_Y) + H(U_{R-Y}) + H(U_{B-Y}) = .28$ bits/sample
$\Delta H = 2.74$ bits/sample

The statistical dependency between R, G, B is much stronger. If the joint source Y, R-Y, B-Y is treated as a source with memory, the possible gain by joint coding is much smaller.
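The inequality between the joint entropy and the sum of the marginal entropies is easy to verify numerically; the joint distribution below is an assumed toy example, not the Y, R-Y, B-Y statistics from the slide.

```python
import numpy as np

def entropy(p):
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Assumed joint distribution of two correlated 2-bit symbols (illustration only).
joint = np.array([[0.20, 0.05, 0.00, 0.00],
                  [0.05, 0.20, 0.05, 0.00],
                  [0.00, 0.05, 0.20, 0.05],
                  [0.00, 0.00, 0.05, 0.10]])

H_joint = entropy(joint.ravel())
H_marginals = entropy(joint.sum(axis=1)) + entropy(joint.sum(axis=0))
print(H_joint, H_marginals)   # joint entropy <= sum of the marginal entropies
```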

Markov process

Neighboring samples of the video signal are not statistically independent: source with memory.

$P(u_T) \ne P(u_T \mid u_{T-1}, u_{T-2}, \ldots, u_{T-N})$

A source with memory can be modeled by a Markov random process. Conditional probabilities of the source symbols $u_T$ of a Markov source of order N:

$P(u_T \mid Z_T) = P(u_T \mid u_{T-1}, u_{T-2}, \ldots, u_{T-N})$

where $Z_T$ is the state of the Markov source at time T.

Entropy of a source with memory

Markov source of order N: conditional entropy

$H(U_T \mid Z_T) = H(U_T \mid U_{T-1}, U_{T-2}, \ldots, U_{T-N}) = E\{-\log\big(P(u_T \mid u_{T-1}, u_{T-2}, \ldots, u_{T-N})\big)\}$
$= -\sum_{u_T} \cdots \sum_{u_{T-N}} p(u_T, u_{T-1}, u_{T-2}, \ldots, u_{T-N})\,\log\big(p(u_T \mid u_{T-1}, u_{T-2}, \ldots, u_{T-N})\big)$

$H(U_T \mid Z_T) \le H(U_T)$ (equality for memoryless sources)

The average code word length can approach $H(U_T \mid Z_T)$, e.g. with a switched Huffman code.

Number of states for an 8-bit video signal:
N = 1: 256 states
N = 2: 65536 states
N = 3: 16777216 states
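The gap between $H(U_T)$ and $H(U_T \mid Z_T)$ can be estimated from sample data of a source with memory. The sketch below uses an assumed order-1 toy source (a bounded random walk over 16 levels), not video data.

```python
import numpy as np

def entropies_order1(seq, levels):
    """Estimate H(U_T) and H(U_T | U_{T-1}) from a sample sequence (order-1 model)."""
    def h(p):
        p = p[p > 0]
        return float(-(p * np.log2(p)).sum())
    joint = np.zeros((levels, levels))
    for prev, cur in zip(seq[:-1], seq[1:]):
        joint[prev, cur] += 1                                   # count transitions (u_{T-1}, u_T)
    joint /= joint.sum()
    H_marginal = h(joint.sum(axis=0))                           # H(U_T)
    H_conditional = h(joint.ravel()) - h(joint.sum(axis=1))     # H(U_{T-1}, U_T) - H(U_{T-1})
    return H_marginal, H_conditional

# Assumed toy source with memory: a bounded random walk over 16 levels (not video data).
rng = np.random.default_rng(0)
x = [8]
for _ in range(100000):
    x.append(int(min(15, max(0, x[-1] + rng.integers(-1, 2)))))   # each sample stays near its predecessor
H_U, H_U_given_Z = entropies_order1(np.array(x), levels=16)
print(H_U, H_U_given_Z)   # the conditional entropy is clearly below the memoryless entropy
```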