Entropy Coding

Entropy coding is also known as zero-error coding, data compression, or lossless compression. Entropy coding is widely used in virtually all popular international multimedia compression standards such as JPEG and MPEG.

A complete entropy codec, which is an encoder/decoder pair, consists of the process of encoding or compressing a random source (typically quantized transform coefficients) and the process of decoding or decompressing the compressed signal to perfectly regenerate the original random source. In other words, there is no loss of information due to the process of entropy coding. Thus, entropy coding does not introduce any distortion, and hence the combination of the entropy encoder and entropy decoder faithfully reconstructs the input to the entropy encoder.

[Figure: Random Source → Entropy Encoding → Compressed Source; Compressed Source → Entropy Decoding → Random Source]

Therefore, any possible loss of information or distortion that may be introduced in a signal compression system is not due to entropy encoding/decoding. As we discussed previously, a typical image compression system, for example, includes a transform process, a quantization process, and an entropy coding stage. In such a system, the distortion is introduced by quantization. Moreover, for such a system, and from the perspective of the entropy encoder, the input random source to that encoder is the quantized transform coefficients.

[Figure: Random Source → Transform (examples: KLT, DCT, wavelets) → Transform Coefficients → Quantization → Quantized Coefficients → Entropy Coding (examples: Huffman, arithmetic) → Compressed Source]

Code Design and Notations

In general, entropy coding (or "source coding") is achieved by designing a code, $C$, which provides a one-to-one mapping from any possible outcome of a random variable $X$ (the "source") to a codeword. There are two alphabets in this case: one is the traditional alphabet of the random source $X$, and the second is the one used for constructing the codewords. Based on the second alphabet, we can construct and define the set $D^*$, which is the set of all finite-length strings of symbols drawn from that alphabet.

The most common and popular codes are binary codes, where the alphabet of the codewords is simply the binary bits one and zero. Binary codes can be represented efficiently using binary trees. In this case, the first two branches of the root node represent the possible values of the first bit of a codeword. Once that first bit is known, and if the codeword has a second bit, then the second pair of branches represents the second bit, and so on.

[Figure: the alphabet $A = \{a, b, c, \dots\}$ of a random source $X$, the set of codewords $D^*$, and the alphabet $B = \{0, 1\}$ of code symbols used to construct the codewords.]

[Figure: binary tree representation of a binary ($D$-ary, $D = 2$) prefix code.]
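To make the tree representation concrete, here is a minimal Python sketch (not from the original notes) that stores a binary prefix code as a nested-dict trie and decodes a bit string by walking the branches; the code table is the illustrative prefix code used in the examples later in this section.

```python
# Minimal sketch: represent a binary prefix code as a trie and decode a
# bit string by following one branch per bit until a codeword leaf is hit.
code = {"1": "0", "2": "10", "3": "110", "4": "111"}  # illustrative code

def build_trie(code):
    root = {}
    for symbol, word in code.items():
        node = root
        for bit in word:
            node = node.setdefault(bit, {})
        node["symbol"] = symbol  # leaf: store the decoded symbol
    return root

def decode(bits, root):
    symbols, node = [], root
    for bit in bits:
        node = node[bit]          # follow the branch for this bit
        if "symbol" in node:      # reached a codeword leaf
            symbols.append(node["symbol"])
            node = root           # restart at the root for the next codeword
    return symbols

root = build_trie(code)
print(decode("0101100111", root))  # ['1', '2', '3', '1', '4']
```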

Definition: A source code, $C$, is a mapping from a random variable (source) $X$ with alphabet $\mathcal{X}$ to a finite-length string of symbols, where each string of symbols (codeword) is a member of the set $D^*$:
$$C: \mathcal{X} \to D^*$$

The codewords in $D^*$ are formed from an alphabet $B$ that has $D$ elements: $|B| = D$. We say that we have a $D$-ary code, or that $B$ is a $D$-ary alphabet. As discussed previously, the most common case is when the alphabet $B$ is the set $B = \{0, 1\}$; in this case, $D = 2$ and we have binary codewords.

Example: Let $X$ be a random source with $x \in \{1, 2, 3, 4\}$. Let $B = \{0, 1\}$, and hence $D = 2$. Then:
$$D^* = \{0, 1, 00, 01, 10, 11, 000, 001, \dots\}$$

We can define the code $C$, for example, as follows:

x_1: C(x_1) = 0     (length 1)
x_2: C(x_2) = 10    (length 2)
x_3: C(x_3) = 110   (length 3)
x_4: C(x_4) = 1110  (length 4)

Definition: For a random variable $X$ with a p.m.f. $(p_1, p_2, \dots, p_m)$, the expected length of a code $C(X)$ is:
$$L(C) = \sum_{i=1}^{m} p_i l_i$$

Code Types

The design of a good code follows the basic notion of entropy: for random outcomes with a high probability, a good code assigns short codewords, and vice versa. The overall objective is to make the average length $L(C)$ as small as possible.

In addition, we have to design codes that are uniquely decodable. In other words, if the source generates a sequence $x_1, x_2, x_3, \dots$ that is mapped into a sequence of codewords $C(x_1), C(x_2), C(x_3), \dots$, then we should be able to recover the original source sequence $x_1, x_2, x_3, \dots$ from the codeword sequence $C(x_1), C(x_2), C(x_3), \dots$.

In general, and as a start, we are interested in codes that map each random outcome $x$ into a unique codeword that differs from the codewords of all other outcomes. For a random source with alphabet $\{1, 2, \dots, m\}$, a non-singular code meets the following constraint:
$$C(x_i) \neq C(x_j) \quad \forall\, i \neq j$$
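As a numeric illustration of the expected-length definition above, the following sketch (illustrative values, using the dyadic distribution that appears in later examples) shows why short codewords should be assigned to likely outcomes.

```python
# Expected length L(C) = sum_i p_i * l_i. Assigning short codewords to
# likely outcomes beats the reverse assignment (illustrative numbers).
p = [0.5, 0.25, 0.125, 0.125]

def avg_len(p, lengths):
    return sum(pi * li for pi, li in zip(p, lengths))

print(avg_len(p, [1, 2, 3, 3]))  # 1.75: short words for likely outcomes
print(avg_len(p, [3, 3, 2, 1]))  # 2.625: the same lengths, reversed
```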

Although a non-singular code is uniquely decodable for a single symbol, it does not guarantee unique decodability for a sequence of outcomes of $X$.

Example (the codewords below are a standard illustration of this distinction):

x_1: Code C1 = 0,   Code C2 = 10
x_2: Code C1 = 010, Code C2 = 00
x_3: Code C1 = 01,  Code C2 = 11
x_4: Code C1 = 10,  Code C2 = 110

In the above example, the code C1 is non-singular; however, it is not uniquely decodable (for instance, the string 010 can be parsed as C1(x_2), as C1(x_1)C1(x_4), or as C1(x_3)C1(x_1)). Meanwhile, the code C2 is both non-singular and uniquely decodable. Therefore, not all non-singular codes are uniquely decodable; however, every uniquely decodable code is non-singular.

It is important to note that a uniquely decodable code may require the decoding of multiple codewords before the original source sequence can be uniquely identified. This is the case for the above code C2. (Can you give an example where the C2 decoder needs to wait for more codewords before being able to uniquely decode a sequence?)
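Unique decodability of a small code can be checked mechanically by enumerating parses; a brief sketch using the illustrative code C1 from the table above:

```python
# Enumerate every way a bit string can be split into codewords of C1.
# More than one parse means the code is not uniquely decodable.
C1 = {"x1": "0", "x2": "010", "x3": "01", "x4": "10"}

def parses(bits, code, prefix=()):
    if not bits:
        yield prefix
    for sym, word in code.items():
        if bits.startswith(word):
            yield from parses(bits[len(word):], code, prefix + (sym,))

print(list(parses("010", C1)))
# [('x1', 'x4'), ('x2',), ('x3', 'x1')] -- three distinct parses
```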

Therefore, it is highly desirable to design a uniquely decodable code that can be decoded instantaneously upon receiving each codeword. This type of code is known as an instantaneous, prefix-free, or simply prefix code. In a prefix code, no codeword can be used as a prefix of any other codeword.

Example: In the following code, no codeword is used as a prefix of any other codeword:

x_1: C(x_1) = 0
x_2: C(x_2) = 10
x_3: C(x_3) = 110
x_4: C(x_4) = 111

It should be rather intuitive that every prefix code is uniquely decodable, but the inverse is not always true. In summary, the three major types of codes (non-singular, uniquely decodable, and prefix codes) are related as shown in the following diagram.

[Figure: nested sets. All possible codes ⊃ non-singular codes ⊃ uniquely decodable codes ⊃ prefix (instantaneous) codes.]
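The prefix condition itself is a simple pairwise test; a small sketch:

```python
# A code is a prefix code iff no codeword is a prefix of another codeword.
def is_prefix_code(words):
    return not any(
        a != b and b.startswith(a) for a in words for b in words
    )

print(is_prefix_code(["0", "10", "110", "111"]))  # True
print(is_prefix_code(["10", "00", "11", "110"]))  # False: 11 prefixes 110
```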

Kraft Inequality

Based on the above discussion, it should be clear that uniquely decodable codes represent a subset of all possible codes. Also, prefix codes are a subset of uniquely decodable codes. Prefix codes meet a certain constraint, which is known as the Kraft inequality.

Theorem: For any prefix $D$-ary code $C$ with codeword lengths $l_1, l_2, \dots, l_m$, the following must be satisfied:
$$\sum_{i=1}^{m} D^{-l_i} \leq 1$$

Conversely, given a set of codeword lengths that meets the inequality $\sum_{i=1}^{m} D^{-l_i} \leq 1$, there exists a prefix code for this set of lengths.

Proof: A prefix code $C$ can be represented by a $D$-ary tree. Below we illustrate the proof using a binary code and a corresponding binary tree. (The same principles apply to higher-order codes/trees.) For illustration purposes, let us consider the code:

x_1: C(x_1) = 0
x_2: C(x_2) = 10
x_3: C(x_3) = 110
x_4: C(x_4) = 111

This code can be represented as follows.

[Figure: binary tree representation of the binary ($D$-ary, $D = 2$) prefix code, with the leaf nodes associated with each codeword marked.]

An important attribute of the above tree representation of codes is the number of leaf nodes associated with each codeword. For example, for the first codeword, $C(x_1) = 0$, there are four leaf nodes associated with it. Similarly, the codeword $C(x_2) = 10$ has two leaf nodes. The last two codewords are leaf nodes themselves, and hence each of them is associated with a single leaf node (itself).

[Figure: the same tree, annotated with 4 leaf nodes for codeword $C(x_1)$, 2 leaf nodes for $C(x_2)$, and 1 leaf node each for $C(x_3)$ and $C(x_4)$.]

Note that for a prefix code, no codeword can be an ancestor of any other codeword. Let $l_{\max}$ be the maximum length among all codeword lengths of a prefix code. Each codeword with length $l_i \leq l_{\max}$ sits at depth $l_i$ of the $D$-ary tree. Hence, the total number of leaf nodes (at depth $l_{\max}$) that are descendants of a codeword at level $l_i$ is $D^{l_{\max} - l_i}$. Furthermore, since the group of leaf nodes belonging to a codeword of length $l_i$ is disjoint from the group belonging to any other codeword of length $l_j$, then:
$$\sum_{i=1}^{m} D^{l_{\max} - l_i} \leq D^{l_{\max}}$$
which implies:
$$\sum_{i=1}^{m} D^{-l_i} \leq 1$$

By similar arguments, one can construct a prefix code for any set of lengths that satisfies the above constraint $\sum_{i=1}^{m} D^{-l_i} \leq 1$. QED
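The Kraft sum is straightforward to evaluate numerically; a small sketch checking the inequality for two sets of candidate lengths:

```python
# Kraft inequality: a prefix D-ary code with lengths l_1..l_m must satisfy
# sum(D ** -l) <= 1; conversely, such lengths always admit a prefix code.
def kraft_sum(lengths, D=2):
    return sum(D ** -l for l in lengths)

print(kraft_sum([1, 2, 3, 3]))  # 1.0  -> a (complete) prefix code exists
print(kraft_sum([1, 1, 2]))     # 1.25 -> no prefix code with these lengths
```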

Optimum Codes

Here we address the issue of finding minimum average length codes given the constraint imposed by the Kraft inequality. In particular, we are interested in finding codes that satisfy:
$$\min_{l_1, \dots, l_m} L(C) = \min_{l_1, \dots, l_m} \sum_{i=1}^{m} p_i l_i \quad \text{such that} \quad \sum_{i=1}^{m} D^{-l_i} \leq 1$$

If we assume that equality is satisfied, $\sum_{i=1}^{m} D^{-l_i} = 1$, we can formulate the problem using Lagrange multipliers. Consequently, we can minimize the following objective function:
$$J = \sum_{i=1}^{m} p_i l_i + \lambda \sum_{i=1}^{m} D^{-l_i}$$
$$\frac{\partial J}{\partial l_i} = p_i - \lambda D^{-l_i} \ln D = 0 \quad \Rightarrow \quad D^{-l_i} = \frac{p_i}{\lambda \ln D}$$

Using the constraint $\sum_{i=1}^{m} D^{-l_i} = 1$, we obtain $\lambda = 1 / \ln D$, and hence:
$$D^{-l_i} = p_i \quad \Rightarrow \quad l_i^* = -\log_D p_i$$

Therefore, the average length $L^*$ of an optimum code can be expressed as:
$$L^* = \sum_{i=1}^{m} p_i l_i^* = -\sum_{i=1}^{m} p_i \log_D p_i = H_D(X)$$
where $H_D(X)$ is the entropy of the original source $X$ (measured with a logarithmic base $D$). For a binary code, $D = 2$, and the average length is the same as the standard (base-2) entropy measured in bits.

Based on the above derivation, achieving an optimum prefix code $C$ with an average length $H_D(X)$ is only possible when:
$$l_i = -\log_D p_i \quad \text{for every } i$$
However, in general, the probability distribution values $p_i$ do not necessarily guarantee integer-valued lengths for the codewords.

Below, we state one of the most fundamental theorems in information theory, which relates the average length of any prefix code to the entropy of a random source with general distribution values $p_i$. This theorem, commonly known as the entropy bound theorem, shows that no code can have an average length smaller than the entropy of the random source.

Theorem (Entropy Bound): The expected length $L(C)$ of a prefix $D$-ary code $C$ for a random source $X$ with entropy $H_D(X)$ satisfies the following inequality:
$$L(C) \geq H_D(X)$$
with equality if and only if $D^{-l_i} = p_i$.

Observations from the Entropy Bound Theorem

The entropy bound theorem and its proof lead to important observations that we outline below. For random sources with distributions that satisfy $p_i = D^{-l_i}$, where $l_i$ is an integer for $i = 1, 2, \dots, m$, there exists a prefix code that achieves the entropy $H_D(X)$. Such distributions are known as D-adic. For the binary case, $D = 2$, we have a dyadic distribution (or a dyadic code). An example of a dyadic distribution is:
$$p_1 = \tfrac{1}{2},\; p_2 = \tfrac{1}{4},\; p_3 = \tfrac{1}{8},\; p_4 = \tfrac{1}{8}; \quad l_1 = 1,\; l_2 = 2,\; l_3 = 3,\; l_4 = 3$$
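For the dyadic distribution above, the optimal lengths $l_i^* = -\log_2 p_i$ are integers and the expected length equals the entropy; the following sketch verifies this numerically:

```python
import math

# Optimal codeword lengths l*_i = -log2(p_i) and entropy H(X) for the
# dyadic example p = (1/2, 1/4, 1/8, 1/8).
p = [0.5, 0.25, 0.125, 0.125]
lengths = [-math.log2(pi) for pi in p]        # [1.0, 2.0, 3.0, 3.0]
H = -sum(pi * math.log2(pi) for pi in p)      # entropy in bits
L = sum(pi * li for pi, li in zip(p, lengths))
print(lengths, H, L)                          # L == H == 1.75
```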

Entropy Coding Methods

Here, we will discuss leading examples of entropy coding methods that are broadly used in practice and have been adopted by leading international compression standards. In particular, we will discuss Huffman coding and arithmetic coding, both of which lead to optimal entropy coding.

Key Properties of Optimum Prefix Codes

Here, we outline a few key properties of optimum prefix codes that will lead to the Huffman coding procedure. We adopt the notation $C_i$ to represent the codeword (of length $l_i$) of a code $C$.

Property 1: If $C_j$ and $C_k$ are two codewords of an optimum prefix code $C$, then:
$$p_j > p_k \;\Rightarrow\; l_j \leq l_k$$

Property 2: Assuming $p_1 \geq p_2 \geq \dots \geq p_{m-1} \geq p_m$, the two longest codewords of an optimum code have the same length:
$$l_{m-1} = l_m$$

[Figure: a set of codewords in which the longest codeword $C_m$ is strictly longer than $C_{m-1}$.]

[Figures: if the longest codeword $C_m$ were strictly longer than $C_{m-1}$, its last symbol could be dropped, leaving an unused shorter codeword; this would contradict optimality.]

Property 3: There exists an optimum code $C$ where the longest codewords are siblings (i.e., they differ in one bit).

Property 4: For a binary random source, the optimum prefix code has lengths:
$$l_1 = l_2 = 1$$

The Huffman Entropy Coding Procedure

The above properties lead to the Huffman entropy coding procedure for generating prefix codes. A core notion in this procedure is the observation that optimizing a given code $C$ is equivalent to optimizing a shortened version $C'$ of it.

The Huffman coding procedure can be summarized by the following steps (a short code sketch follows the worked example below):

1. Sort the outcomes according to the probability distribution: $p_1 \geq p_2 \geq \dots \geq p_{m-1} \geq p_m$.
2. Merge the two least probable outcomes, and assign a zero to one of them and a one to the other (treat the pair as a binary source, and use an optimum binary code for it).
3. Repeat step 2 until we are left with a single merged outcome of probability one.

We now illustrate the Huffman procedure using a few examples.

Example: Find an optimum set of codewords $C_1 = ?$, $C_2 = ?$, $C_3 = ?$, $C_4 = ?$ for a source with probabilities, say, $p_1 = 0.4$, $p_2 = 0.2$, $p_3 = 0.2$, $p_4 = 0.2$. By the properties above, the optimum codewords must satisfy $l_3 = l_4$, with $C_3$ and $C_4$ siblings.

Combining the two least probable outcomes merges $p_3$ and $p_4$ into a single outcome of probability $p_3 + p_4 = 0.4$, leaving the shortened distribution $\{0.4, 0.2, 0.4\}$.

Next, merge the two least probable outcomes of the shortened source: $p_2 + (p_3 + p_4) = 0.2 + 0.4 = 0.6$, leaving $\{0.4, 0.6\}$.

Merging these final two outcomes gives probability one; there is nothing else to merge.

Reading the assigned bits back through the merges gives the codewords:
$$C_1 = 0, \quad C_2 = 10, \quad C_3 = 110, \quad C_4 = 111$$
What is the average length $L(C)$?
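The merge procedure translates directly into a few lines of code; the following is a minimal sketch built on Python's heapq (the tie-breaking rule here, insertion order, is arbitrary, which relates to the multiple-merge-choice issue discussed next):

```python
import heapq
import itertools

# Minimal binary Huffman coder: repeatedly merge the two least probable
# groups, prepending '0' to the codewords of one group and '1' to the other.
def huffman(pmf):
    counter = itertools.count()  # tie-breaker for equal probabilities
    heap = [(p, next(counter), {sym: ""}) for sym, p in pmf.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)  # least probable group
        p1, _, c1 = heapq.heappop(heap)  # second least probable group
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, next(counter), merged))
    return heap[0][2]

code = huffman({"x1": 0.4, "x2": 0.2, "x3": 0.2, "x4": 0.2})
print(code)  # either lengths (1, 2, 3, 3) or all-2-bit codewords,
             # depending on how ties are broken; both are optimal.
```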

In some cases, we may encounter more than one choice for merging the probability distribution values. (This was the case in the above example: after the first merge, the outcome of probability 0.2 can be merged with either outcome of probability 0.4.) One important question is: what is the impact of selecting one choice for combining the probabilities versus the other? We illustrate this below by selecting the alternative option for combining the probabilities.

[Figure: the alternative merges yield the codewords $C_1 = 00$, $C_2 = 01$, $C_3 = 10$, $C_4 = 11$, with average length $L(C) = 2$, the same as before.]

As can be seen in the above example, the Huffman procedure can lead to different prefix codes (if multiple options for merging are encountered). Hence, an important question is: does one option provide a better code (in terms of providing a smaller average code length $L(C)$)? It does not: every code produced by the Huffman procedure is optimum, so the average lengths are identical, although the individual codeword lengths may differ.

The Huffman procedure can also be used for the case when $D > 2$ (i.e., when the code is not binary anymore). Care should be taken, though, when dealing with a non-binary code design.

Arithmetic Coding

Although Huffman codes are optimal on a symbol-by-symbol basis, there is still room for improvement in terms of achieving lower overhead. For example, a binary source with entropy $H(X) < 1$ still requires one bit per symbol when using a Huffman code. Hence, if, for example, $H(X) = 0.5$, then a Huffman code spends double the number of bits per symbol relative to the true optimum limit of $H(X) = 0.5$.

Arithmetic coding is an approach that addresses this overhead issue by coding a continuous sequence of source symbols while trying to approach the entropy limit $H(X)$. Arithmetic coding has roots in a coding approach proposed by Shannon, Fano, and Elias, hence sometimes called the Shannon-Fano-Elias (SFE) codes. Therefore, we first outline the principles and procedures of SFE codes, and then describe arithmetic coding.

Shannon-Fano-Elias Coding

The SFE coding procedure is based on using the cumulative distribution function (CDF) $F(x)$ of a random source $X$: $F(x) = \Pr(X \leq x)$. The CDF provides a unique one-to-one mapping for the possible outcomes of any random source $X$.

In other words, if we denote the alphabet of a discrete random source $X$ by the integer index set $\{1, 2, \dots, m\}$, then:
$$F(i) \neq F(j) \quad \forall\, i \neq j$$
This can be illustrated by the following example of a typical CDF of a discrete random source.

[Figure: a staircase CDF $F(x)$ of a discrete random source with four outcomes.]

One important characteristic of the CDF of a discrete random source is that it defines a set of non-overlapping intervals in its range of possible values between zero and one. (Recall that the CDF provides a measure of probability, and hence it is always confined between zero and one.) Based on the above CDF example, we can have a well-defined set of non-overlapping intervals, as shown in the next figure.

[Figure: the non-overlapping intervals defined by the CDF on the vertical axis between zero and one.]

Another important observation is that the size of each (non-overlapping) interval in the range of the CDF $F(x)$ is given by the probability mass function (PMF) value $p_i = \Pr(X = i)$ of the corresponding outcome. This is the same as the height of the jumps that we observe in the staircase-like shape of the CDF of a discrete random source, as highlighted by the next figure.

[Figure: the same CDF with the interval sizes labeled $p_1, p_2, p_3, p_4$.]

Overall, by using the CDF of a random source, one can define a unique mapping between any possible outcome and a particular (unique) interval in the range between zero and one. Furthermore, one can select any value within the (unique) interval of a corresponding random outcome $X = i$ to represent that outcome.

This selected value serves as a codeword for that outcome. The SFE procedure, which is based on the above CDF-driven principle of unique mapping, can be defined as follows:

1. Map each outcome $X = i$ to the interval $[F(i-1), F(i))$; the lower end is inclusive and the upper end is exclusive.

2. Select a particular value within the interval $[F(i-1), F(i))$ to represent the outcome $X = i$. This value is known as the modified CDF and is denoted by $\bar{F}(i)$.

In principle, any value within the interval $[F(i-1), F(i))$ can be used for the modified CDF. A natural choice is the middle of the corresponding interval. Hence, the modified CDF can be expressed as follows:

$$\bar{F}(i) = F(i-1) + \frac{p_i}{2}$$
which, in turn, can be expressed as:
$$\bar{F}(i) = \sum_{j < i} p_j + \frac{p_i}{2}$$
This is illustrated by the next figure.

[Figure: the CDF with the modified CDF values $\bar{F}(i)$ marked at the midpoints of the four intervals.]

So far, it should be clear that $\bar{F}(i) \in [0, 1)$, and that it provides a unique mapping for the possible random outcomes of $X$.

3. Generate a codeword to represent $\bar{F}(i)$, and hence to represent the outcome $X = i$. Below we consider simple examples of such codewords according to the SFE coding procedure.

Examples of Modified CDF Values and Codewords

The following table outlines a dyadic set of example values that could be used for a modified CDF, and the corresponding codewords for such values.

Value    Binary representation    Codeword
0.5      .1                       1
0.25     .01                      01
0.125    .001                     001

The above values of the modified CDF can be combined to represent higher-precision values, as shown in the next table.

Value    Binary representation    Codeword
0.75     .11                      11
0.625    .101                     101

In general, the number of bits needed to code the modified CDF value $\bar{F}(i)$ could be infinite, since $\bar{F}(i)$ could be any real number. In practice, however, a finite number of bits is used to represent (i.e., approximate) $\bar{F}(i)$. The number of bits used must be sufficiently large to make sure that the codeword representing $\bar{F}(i)$ is unique (i.e., there should be no overlap between the intervals representing the random outcomes). By using a truncated value in place of the original value, we anticipate a loss in precision.
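The mapping from a value in $[0, 1)$ to its codeword is just the binary expansion of the fraction, truncated to the desired number of bits; a small sketch:

```python
# Binary expansion of a fraction v in [0, 1): the first `bits` binary
# digits after the point form the codeword for v.
def to_codeword(v, bits):
    word = ""
    for _ in range(bits):
        v *= 2
        word += str(int(v))  # next binary digit
        v -= int(v)
    return word

print(to_codeword(0.625, 3))  # '101'  (0.625 = .101 in binary)
print(to_codeword(0.75, 2))   # '11'   (0.75  = .11)
```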

Let $\lfloor \bar{F}(i) \rfloor_{l_i}$ be the truncated value used to represent the original modified CDF $\bar{F}(i)$ using $l_i$ bits. Naturally, the larger the number of bits $l_i$, the higher the precision, and the smaller the difference between $\bar{F}(i)$ and $\lfloor \bar{F}(i) \rfloor_{l_i}$.

It can be shown that the difference between the original modified CDF value and its approximation satisfies the following inequality:
$$\bar{F}(i) - \lfloor \bar{F}(i) \rfloor_{l_i} < \frac{1}{2^{l_i}}$$

Consequently, and based on the definition of the modified CDF value $\bar{F}(i) = F(i-1) + p_i/2$, in order to maintain a unique mapping, the maximum truncation error has to be smaller than $p_i/2$:
$$\frac{1}{2^{l_i}} \leq \frac{p_i}{2}$$

This leads to the following constraint on the length $l_i$:
$$2^{l_i} \geq \frac{2}{p_i} \quad \Rightarrow \quad l_i \geq \log_2 \frac{2}{p_i} = \log_2 \frac{1}{p_i} + 1$$

Therefore:
$$l_i = \left\lceil \log_2 \frac{1}{p_i} \right\rceil + 1$$

Example: The following table shows an example of a random source $X$ with four possible outcomes and the corresponding PMF, CDF, and modified CDF values, along with the lengths and codewords obtained from SFE coding.

i    p_i      F(i)     F̄(i)      F̄(i) binary    l_i    SFE code
1    0.5      0.5      0.25      .01            2      01
2    0.25     0.75     0.625     .101           3      101
3    0.125    0.875    0.8125    .1101          4      1101
4    0.125    1.0      0.9375    .1111          4      1111
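The entire SFE construction (CDF, modified CDF, lengths, codewords) fits in a short sketch that reproduces the table above:

```python
import math

# Shannon-Fano-Elias code: for each outcome, the codeword is the modified
# CDF Fbar(i) = F(i-1) + p_i/2 truncated to l_i = ceil(log2(1/p_i)) + 1 bits.
def sfe_code(p):
    F, codes = 0.0, {}
    for i, pi in enumerate(p, start=1):
        Fbar = F + pi / 2                      # midpoint of [F(i-1), F(i))
        li = math.ceil(math.log2(1 / pi)) + 1  # enough bits for uniqueness
        word, v = "", Fbar
        for _ in range(li):                    # binary expansion, truncated
            v *= 2
            word += str(int(v))
            v -= int(v)
        codes[i] = word
        F += pi                                # advance the CDF
    return codes

print(sfe_code([0.5, 0.25, 0.125, 0.125]))
# {1: '01', 2: '101', 3: '1101', 4: '1111'} -- matches the table above
```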

Arithmetic Coding

The advantages of the SFE coding procedure can be realized when it is used to code multiple outcomes of the random source under consideration. Arithmetic coding is basically SFE coding applied to multiple outcomes of the random source.

Under arithmetic coding (AC), we code a sequence of $n$ outcomes $x_1, x_2, \dots, x_n$, where each outcome $x_j \in \{1, 2, \dots, m\}$. Each possible vector $x^{(n)}$ of the random source $X$ is mapped to a unique value $\bar{F}(x^{(n)}) \in [0, 1)$.

The best way to illustrate arithmetic coding is through a couple of examples, as shown below.

Example: Arithmetic coding begins by dividing the zero-to-one range based on the CDF of the random source. In this example, the source can take one of three possible outcomes.

[Figure: the interval $[0, 1)$ divided into three subintervals, one per outcome, with widths given by the CDF $F(x)$.]

If we assume that we are interested in coding $n = 3$ outcomes, the following figures show the particular interval, and the corresponding value $\bar{F}(x)$, that arithmetic coding focuses on to code one particular three-outcome vector $(x_1, x_2, x_3)$.

[Figures: after the first outcome, coding focuses on that outcome's subinterval of $[0, 1)$; after the second outcome, on a subinterval of that interval; after the third outcome, on a final subinterval, whose representative value $\bar{F}(x)$ is marked. Transmit this number to represent the vector $(x_1, x_2, x_3)$.]

Similarly, the following figure shows the particular interval, and the corresponding value $\bar{F}(x)$, that arithmetic coding focuses on to code a different three-outcome vector.

[Figure: the nested subintervals and the value $\bar{F}(x)$ for the second example vector.]

Based on the above examples, we can define:
$$\bar{F}(x^{(n)}) = \frac{l^{(n)} + u^{(n)}}{2} \quad \text{and} \quad \Delta^{(n)} = u^{(n)} - l^{(n)}$$
where $u^{(n)}$ and $l^{(n)}$ are the upper and lower bounds of the unique interval $[l^{(n)}, u^{(n)})$ that $\bar{F}(x^{(n)})$ belongs to. Below, we use these expressions to illustrate the arithmetic coding procedure.

Example: The coding process starts with the initial values:
$$l^{(0)} = 0, \quad u^{(0)} = 1, \quad \Delta^{(0)} = u^{(0)} - l^{(0)} = 1$$

[Figure: the initial interval, with $l^{(0)} = 0$ at the bottom, $u^{(0)} = 1$ at the top, and $\Delta^{(0)} = u^{(0)} - l^{(0)} = 1$.]

After the initial step, the interval $[l^{(n)}, u^{(n)})$ and the corresponding value $\Delta^{(n)} = u^{(n)} - l^{(n)}$ are updated according to the particular outcomes that the random source generates. This is illustrated below for an example sequence of outcomes.

[Figure: after the first outcome $x_1$, the interval shrinks to the subinterval of $x_1$, giving new bounds $l^{(1)}$ and $u^{(1)}$.]

[Figures: at each step, the current interval $[l^{(n-1)}, u^{(n-1)})$ is subdivided in proportion to the source CDF, and the subinterval of the observed outcome $x_n$ becomes the new interval:
$$u^{(n)} = l^{(n-1)} + \Delta^{(n-1)} F(x_n), \qquad l^{(n)} = l^{(n-1)} + \Delta^{(n-1)} F(x_n - 1).]$$

The arithmetic coding procedure can be summarized by the steps outlined below.

1. Initialize: $l^{(0)} = 0$, $u^{(0)} = 1$.
2. For $n = 1, 2, \dots$, given the outcome $x_n$, update:
$$l^{(n)} = l^{(n-1)} + \Delta^{(n-1)} F(x_n - 1)$$
$$u^{(n)} = l^{(n-1)} + \Delta^{(n-1)} F(x_n)$$
where $\Delta^{(n)} = u^{(n)} - l^{(n)}$.
3. After the last outcome, set $\bar{F}(x^{(n)}) = \frac{l^{(n)} + u^{(n)}}{2}$.

Similar to SFE coding, after determining the value $\bar{F}(x^{(n)})$, we use $l(x^{(n)})$ bits to represent $\bar{F}(x^{(n)})$ according to the constraint:
$$l(x^{(n)}) = \left\lceil \log_2 \frac{1}{p(x^{(n)})} \right\rceil + 1$$
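Putting the update equations together gives the following illustrative floating-point encoder sketch (real arithmetic coders use integer arithmetic and emit bits incrementally; the example source below is assumed for demonstration):

```python
import math

# Illustrative arithmetic encoder: shrink [l, u) according to the CDF of
# each outcome, then emit the midpoint using ceil(log2(1/p(x^n))) + 1 bits.
def arithmetic_encode(outcomes, p):
    F = [0.0]
    for pi in p:                    # cumulative distribution F(0..m)
        F.append(F[-1] + pi)
    l, u = 0.0, 1.0
    for x in outcomes:              # x is a 1-based outcome index
        delta = u - l
        u = l + delta * F[x]        # u(n) = l(n-1) + delta(n-1) F(x_n)
        l = l + delta * F[x - 1]    # l(n) = l(n-1) + delta(n-1) F(x_n - 1)
    bits = math.ceil(math.log2(1 / (u - l))) + 1  # u - l equals p(x^n)
    Fbar, word = (l + u) / 2, ""
    for _ in range(bits):           # binary expansion of the midpoint
        Fbar *= 2
        word += str(int(Fbar))
        Fbar -= int(Fbar)
    return word

print(arithmetic_encode([1, 2, 3], [0.5, 0.25, 0.25]))  # '010111'
```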