ECE472/572 - Lecture 11: Image Compression Fundamentals and Lossless Compression Techniques (11/03/11)


Roadmap
Image acquisition -> preprocessing (low level: image enhancement, image restoration) -> image segmentation -> representation & description -> recognition & interpretation, all supported by a knowledge base. Related blocks: image compression / image coding, morphological image processing, wavelet analysis.

Roadmap
- Color image processing
  - Interpreting color
  - Pseudo-color IP: intensity slicing, transform in spatial domain, transform in frequency domain
  - Full-color IP: tonal correction, color correction, enhancement (I vs. RGB channels), restoration (I vs. RGB channels), color image edge detection
- Image compression
  - Data vs. information
  - Entropy
  - Data redundancy: coding redundancy, interpixel redundancy, psycho-visual redundancy
  - Fidelity measurement
  - Lossless compression: variable-length coding (Huffman coding, LZW), bit-plane coding, binary image compression (run-length coding), lossless predictive coding
  - Lossy compression

Questions
- What is the difference between data and information?
- How do we measure data redundancy?
- How many different types of data redundancy are there? Explain each.
- What is entropy? What is the first/second-order estimate of entropy?
- Understand the two criteria that all coding mechanisms should satisfy.
- Understand Huffman coding. How do you explain that the average code length from Huffman coding is always greater than the entropy?
- Which image format uses which coding scheme?
- Understand RLC and differential coding.

Data and Information
- Different data sets can represent the same kind of information.
- Data compression: reducing the amount of data required to represent a given quantity of information.

Data Redundancy
- Relative data redundancy: R_D = 1 - 1/C_R, where the compression ratio C_R = n1/n2, and n1, n2 are the numbers of information-carrying units in two data sets that represent the same information.
- Three types: coding redundancy, interpixel redundancy, psychovisual redundancy.
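As a quick sanity check on these two definitions, here is a minimal Python sketch (the image size and compressed size are made-up numbers for illustration):

```python
def compression_ratio(n1: int, n2: int) -> float:
    """C_R = n1/n2: information-carrying units before vs. after compression."""
    return n1 / n2

def relative_redundancy(n1: int, n2: int) -> float:
    """R_D = 1 - 1/C_R."""
    return 1.0 - 1.0 / compression_ratio(n1, n2)

# Example: a 256x256 8-bit image (65536 bytes) compressed to 16384 bytes.
n1, n2 = 256 * 256, 16384
print(compression_ratio(n1, n2))    # 4.0
print(relative_redundancy(n1, n2))  # 0.75 -> 75% of the data is redundant
```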

Coding Redundancy
In general, coding redundancy is present when the codes assigned to a set of events (such as gray-level values) have not been selected to take full advantage of the probabilities of the events. In most images, certain gray levels are more probable than others.

Coding Redundancy - Example
Average code length: L_avg = sum_{k=0}^{L-1} l(r_k) p_r(r_k)

r_k        p_r(r_k)   Code 1   l_1(r_k)   Code 2    l_2(r_k)
r_0 = 0      0.19     000      3          11        2
r_1 = 1/7    0.25     001      3          01        2
r_2 = 2/7    0.21     010      3          10        2
r_3 = 3/7    0.16     011      3          001       3
r_4 = 4/7    0.08     100      3          0001      4
r_5 = 5/7    0.06     101      3          00001     5
r_6 = 6/7    0.03     110      3          000001    6
r_7 = 1      0.02     111      3          000000    6

Interpixel Redundancy
Because the value of any pixel can be reasonably predicted from the values of its neighbors, much of the visual contribution of a single pixel to an image is redundant; it could have been guessed on the basis of its neighbors' values. Interpixel redundancy includes spatial redundancy, geometric redundancy, and interframe redundancy.
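A short Python check of the average code lengths for the two codes in the table above (a direct sketch of the L_avg formula):

```python
# Probabilities and code lengths from the table above.
p  = [0.19, 0.25, 0.21, 0.16, 0.08, 0.06, 0.03, 0.02]
l1 = [3] * 8                   # Code 1: fixed 3-bit code words
l2 = [2, 2, 2, 3, 4, 5, 6, 6]  # Code 2: variable-length code words

L_avg1 = sum(li * pi for li, pi in zip(l1, p))
L_avg2 = sum(li * pi for li, pi in zip(l2, p))
print(L_avg1)           # 3.0 bits/pixel
print(L_avg2)           # 2.7 bits/pixel
print(L_avg1 / L_avg2)  # C_R = 3/2.7, about 1.11
```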

Interpixel Redundancy - Example

Psychovisual Redundancy
The eye does not respond with equal sensitivity to all visual information; certain information simply has less relative importance than other information in normal visual processing. In general, an observer searches for distinguishing features such as edges or textural regions and mentally combines them into recognizable groupings.

Psychovisual Redundancy
Improved Gray Scale Quantization (IGS)

Evaluation Metrics
- Fidelity criteria
- Measure of information: entropy

Fidelity Criteria
Objective fidelity criteria:
- Root-mean-square error:
  e_rms = [ (1/MN) sum_{x=0}^{M-1} sum_{y=0}^{N-1} ( f^(x,y) - f(x,y) )^2 ]^{1/2}
- Mean-square signal-to-noise ratio:
  SNR_ms = sum_{x=0}^{M-1} sum_{y=0}^{N-1} f^(x,y)^2 / sum_{x=0}^{M-1} sum_{y=0}^{N-1} ( f^(x,y) - f(x,y) )^2
- Root-mean-square SNR_ms: SNR_rms = (SNR_ms)^{1/2}
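A minimal NumPy sketch of the two objective fidelity criteria (function names are illustrative, not from the lecture):

```python
import numpy as np

def e_rms(f: np.ndarray, f_hat: np.ndarray) -> float:
    """Root-mean-square error between original f and reconstruction f_hat."""
    diff = f_hat.astype(float) - f.astype(float)
    return float(np.sqrt(np.mean(diff ** 2)))

def snr_ms(f: np.ndarray, f_hat: np.ndarray) -> float:
    """Mean-square signal-to-noise ratio of the reconstruction."""
    diff = f_hat.astype(float) - f.astype(float)
    return float(np.sum(f_hat.astype(float) ** 2) / np.sum(diff ** 2))
```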

Measure Information
A random event E that occurs with probability P(E) is said to contain
  I(E) = log(1/P(E)) = -log P(E)
units of information. I(E) is the self-information of E. The amount of self-information attributed to event E is inversely related to the probability of E. If base 2 is used, the unit of information is called a bit.
Example: if P(E) = 1/2, then I(E) = -log2 P(E) = 1 bit.

Entropy
Average information per source output, or uncertainty, or entropy of the source:
  H(z) = - sum_{j=1}^{J} P(a_j) log P(a_j)
How do we interpret an increase of entropy? What happens if the events are equally probable?

Example
Estimate the information content (entropy) of the following image (8-bit image):
  21 21 21 95 169 243 243 243
  21 21 21 95 169 243 243 243
  21 21 21 95 169 243 243 243
  21 21 21 95 169 243 243 243

Gray level   Count   Probability
21           12      3/8
95           4       1/8
169          4       1/8
243          12      3/8

First-order estimate of the entropy:
  -3/8*log2(3/8) - 1/8*log2(1/8) - 1/8*log2(1/8) - 3/8*log2(3/8) = 1.81 bits/pixel

Gray-level pair   Count   Probability
(21,21)           8       2/7
(21,95)           4       1/7
(95,169)          4       1/7
(169,243)         4       1/7
(243,243)         8       2/7

Second-order estimate of the entropy:
  -2/7*log2(2/7)*2 - 1/7*log2(1/7)*3 = 2.2 bits every two pixels

What do these two numbers tell you?
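A Python sketch that reproduces both entropy estimates for this image (pairing horizontally adjacent pixels, which is consistent with the pair counts in the table):

```python
import numpy as np
from collections import Counter

img = np.array([[21, 21, 21, 95, 169, 243, 243, 243]] * 4)

def entropy(probs):
    return -sum(p * np.log2(p) for p in probs)

# First-order estimate: normalized gray-level histogram.
counts = Counter(img.flatten().tolist())
H1 = entropy([c / img.size for c in counts.values()])
print(H1)  # ~1.81 bits/pixel

# Second-order estimate: histogram of horizontally adjacent pixel pairs.
pairs = Counter((int(r[i]), int(r[i + 1])) for r in img for i in range(len(r) - 1))
total = sum(pairs.values())  # 28 pairs
H2 = entropy([c / total for c in pairs.values()])
print(H2)      # ~2.2 bits every two pixels
print(H2 / 2)  # ~1.12 bits/pixel
```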

Compression Approaches
- Error-free compression or lossless compression: variable-length coding, bit-plane coding, lossless predictive coding
- Lossy compression: lossy predictive coding, transform coding

Variable-length Coding
- Reduces only coding redundancy
- Assigns the shortest possible code words to the most probable gray levels
- Huffman coding can obtain the optimal coding scheme

Revisit Example
What if we simply rank the gray levels by probability and assign the shortest available code words?

r_k        p_r(r_k)   Code 1   l_1(r_k)   Code
r_0 = 0      0.19     000      3          00
r_1 = 1/7    0.25     001      3          0
r_2 = 2/7    0.21     010      3          1
r_3 = 3/7    0.16     011      3          01
r_4 = 4/7    0.08     100      3          10
r_5 = 5/7    0.06     101      3          11
r_6 = 6/7    0.03     110      3          000
r_7 = 1      0.02     111      3          001

Can we do this?
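The answer hinges on unique decodability: in the short code above, 0 is a prefix of 00 and 01, so a bit stream such as 000 cannot be parsed unambiguously. A small illustrative checker for the prefix condition (not from the lecture; a prefix code is instantaneously decodable):

```python
def is_prefix_code(codes):
    """True if no code word is a prefix of another code word."""
    for i, a in enumerate(codes):
        for j, b in enumerate(codes):
            if i != j and b.startswith(a):
                return False
    return True

print(is_prefix_code(["00", "0", "1", "01", "10", "11", "000", "001"]))  # False
print(is_prefix_code(["11", "01", "10", "001", "0001", "00001",
                      "000001", "000000"]))                              # True (Code 2)
```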

Huffman Coding
- Uniquely decodable
- Instantaneous coding
- Source reduction:
  1. Order the probabilities of the symbols
  2. Combine the two lowest-probability symbols into a single symbol that replaces them in the next source reduction
  3. Code each reduced source, starting with the smallest source and working back to the original source

Example
Symbol (probability, code) across successive source reductions:

r_1  0.25 (01)      0.25 (01)     0.25 (01)     0.25 (01)    0.35 (00)   0.40 (1)   0.60 (0)
r_2  0.21 (10)      0.21 (10)     0.21 (10)     0.21 (10)    0.25 (01)   0.35 (00)  0.40 (1)
r_0  0.19 (11)      0.19 (11)     0.19 (11)     0.19 (11)    0.21 (10)   0.25 (01)
r_3  0.16 (001)     0.16 (001)    0.16 (001)    0.19 (000)   0.19 (11)
r_4  0.08 (0001)    0.08 (0001)   0.11 (0000)   0.16 (001)
r_5  0.06 (00000)   0.06 (00000)  0.08 (0001)
r_6  0.03 (000010)  0.05 (00001)
r_7  0.02 (000011)

Entropy of the source:
-0.25*log2(0.25) - 0.21*log2(0.21) - 0.19*log2(0.19) - 0.16*log2(0.16) - 0.08*log2(0.08) - 0.06*log2(0.06) - 0.03*log2(0.03) - 0.02*log2(0.02) = 2.6508 bits/pixel

Average length of the Huffman code:
2*0.25 + 2*0.21 + 2*0.19 + 3*0.16 + 4*0.08 + 5*0.06 + 6*0.03 + 6*0.02 = 2.7 bits

LZW (Lempel-Ziv-Welch)
- Attacks interpixel redundancy (spatial redundancy)
- Assigns fixed-length code words to variable-length sequences of source symbols
- Requires no a priori knowledge of the probability of occurrence of the symbols
- Used in GIF, TIFF, PDF
- The code book (dictionary) is created while the data are being encoded
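A compact Huffman coder sketch in Python, implementing the source-reduction procedure with a heap (repeatedly combine the two lowest probabilities). Tie-breaking may produce a different but equally optimal code than the table above, with the same 2.7-bit average length:

```python
import heapq

def huffman_code(probs):
    """probs: {symbol: probability}. Returns {symbol: code string}."""
    # Heap entries: (probability, unique tie-breaker, {symbol: partial code}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)  # two lowest-probability groups
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}   # prepend a bit on merge
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (p1 + p2, count, merged))
        count += 1
    return heap[0][2]

probs = {"r1": 0.25, "r2": 0.21, "r0": 0.19, "r3": 0.16,
         "r4": 0.08, "r5": 0.06, "r6": 0.03, "r7": 0.02}
code = huffman_code(probs)
avg = sum(probs[s] * len(c) for s, c in code.items())
print(code)
print(avg)  # 2.7 bits/pixel, above the 2.6508-bit entropy
```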

Example
  39  39 126 126
  39  39 126 126
  39  39 126 126
  39  39 126 126

Bit-plane Coding
- Attacks interpixel redundancy
- First decompose the original image into bit planes
- Binary image compression approach: run-length coding (RLC)

Bit-plane Decomposition
- Bit-plane slicing
- Problem: small changes in gray level can have a significant impact on the complexity of the bit planes
  127 vs. 128 -> 0111 1111 vs. 1000 0000 (all eight bit planes change)
- Solution: Gray code
  g_i = a_i XOR a_{i+1}, 0 <= i <= m-2
  g_{m-1} = a_{m-1}
- Example: 127 -> 0111 1111 -> 0100 0000; 128 -> 1000 0000 -> 1100 0000
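A minimal LZW encoder sketch (illustrative; the dictionary is initialized with the single gray levels 0-255, as is usual for 8-bit images). Run on the 4x4 example image above in row order, it emits fixed-length dictionary indices for progressively longer repeated sequences:

```python
def lzw_encode(pixels):
    """LZW-encode a sequence of 8-bit gray levels into dictionary indices."""
    dictionary = {(i,): i for i in range(256)}  # single-symbol entries 0..255
    current, out = (), []
    for p in pixels:
        candidate = current + (p,)
        if candidate in dictionary:
            current = candidate                      # keep growing the match
        else:
            out.append(dictionary[current])          # emit longest match
            dictionary[candidate] = len(dictionary)  # add the new sequence
            current = (p,)
    out.append(dictionary[current])
    return out

pixels = [39, 39, 126, 126] * 4  # the 4x4 example image, row by row
print(lzw_encode(pixels))
# [39, 39, 126, 126, 256, 258, 260, 259, 257, 126]
```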

Binary Image Compression - RLC
- Developed in the 1950s
- Standard compression approach in FAX coding
- Approach:
  - Code each contiguous group of 0s or 1s encountered in a left-to-right scan of a row by its length
  - Establish a convention for determining the value of the run
  - Code black and white run lengths separately using variable-length coding

A Complete Example
[Worked example: a small gray-level image is converted to binary code, XORed row to row, sliced into 8 bit planes, each bit plane is run-length coded, and the run lengths are Huffman coded to produce the final code.]

Lossless Predictive Coding
- Does not need to decompose the image into bit planes
- Eliminates interpixel redundancy
- Codes only the new information in each pixel: the difference between the actual and the predicted value of that pixel
  e(x,y) = f(x,y) - f^(x,y)
  f^(x,y) = round[ sum_{i=1}^{m} alpha_i f(x, y-i) ]
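A minimal sketch of the run-length coding step for one binary row (the (value, length) pair representation is one common convention; the lecture's exact output format may differ):

```python
def run_lengths(row):
    """Run-length code one binary row as a list of (value, length) pairs."""
    runs, current, length = [], row[0], 1
    for bit in row[1:]:
        if bit == current:
            length += 1            # extend the current run
        else:
            runs.append((current, length))
            current, length = bit, 1
    runs.append((current, length))  # flush the final run
    return runs

print(run_lengths([0, 0, 0, 1, 1, 0, 1, 1, 1, 1]))
# [(0, 3), (1, 2), (0, 1), (1, 4)]
```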

Previous Pixel Coding (differential coding)
  f^(x,y) = round[ alpha * f(x, y-1) ]
A first-order linear predictor; the previous-pixel predictor performs differential coding.
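A sketch of previous-pixel differential coding with alpha = 1 (the row of gray levels is the earlier entropy example). The prediction-error sequence is far more peaked around 0 than the original gray levels, which is what makes it cheaper to code:

```python
import numpy as np

def differential_encode(row, alpha=1.0):
    """Previous-pixel prediction: e(x,y) = f(x,y) - round(alpha * f(x,y-1))."""
    row = row.astype(int)
    errors = np.empty_like(row)
    errors[0] = row[0]  # first pixel has no predecessor; send it as-is
    for y in range(1, len(row)):
        errors[y] = row[y] - round(alpha * row[y - 1])
    return errors

row = np.array([21, 21, 21, 95, 169, 243, 243, 243])
print(differential_encode(row))  # [21  0  0 74 74 74  0  0]
```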

Lossless Compression Approaches

Redundancy attacked      Coding approach
Coding redundancy        Variable-length coding (Huffman coding)
Interpixel redundancy    Bit-plane coding (run-length coding), LZW, predictive coding

On average, these use a shorter code to convey the same amount of information.