Part II: Image Data Compression
Prof. Ja-Ling Wu
Department of Computer Science and Information Engineering
National Taiwan University
Contents
I. Introduction
II. Predictive Techniques
III. Transform Domain Coding Techniques
IV. Image Coding in Visual Telephony
V. Coding of Two-Tone Images
VI. References
Information Theory
Image Data Compression
I. Introduction:
Image data compression is concerned with minimizing the number of bits required to represent an image. Applications of data compression are primarily in the transmission and storage of information. Data compression is also applied in the development of fast algorithms, where the number of operations required to implement an algorithm is reduced by working with the compressed data.
Image data compression techniques:
- Pixel coding: PCM/quantization, run-length coding, bit-plane coding
- Predictive coding: delta modulation, line-by-line DPCM, 2-D DPCM, interframe techniques, adaptive variants
- Transform coding: zonal coding, threshold coding, multi-dimensional techniques, adaptive variants
- Others: hybrid coding, vector quantization
Image data compression methods fall into two common categories:
A. Redundancy coding: redundancy reduction; information lossless. Predictive coding: DM, DPCM.
B. Entropy coding: entropy reduction; inevitably results in some distortion. Transform coding.
For digitized data, distortionless compression techniques are possible.
Some methods for entropy reduction:
- Subsampling: reduce the sampling rate.
- Coarse quantization: reduce the number of quantization levels.
- Frame repetition / interlacing: reduce the refresh rate (number of frames per second), as in TV signals.
II. Predictive Techniques
Basic principle: remove the mutual redundancy between successive pixels and encode only the new information.
DPCM: consider a sampled sequence u(m), coded up to m = n-1, and let $\tilde{u}(0), \tilde{u}(1), \ldots, \tilde{u}(n-1)$ be the values of the reproduced (decoded) sequence.
At m = n, when u(n) arrives, a quantity $\bar{u}(n)$, an estimate of u(n), is predicted from the previously decoded samples $\tilde{u}(n-1), \tilde{u}(n-2), \ldots$, i.e.,
$\bar{u}(n) = \Psi\big(\tilde{u}(n-1), \tilde{u}(n-2), \ldots\big)$, where $\Psi$ is the "prediction rule".
Prediction error: $e(n) = u(n) - \bar{u}(n)$.
If $\tilde{e}(n)$ is the quantized value of e(n), then the reproduced value of u(n) is
$\tilde{u}(n) = \bar{u}(n) + \tilde{e}(n)$.
DPCM codec block diagram. Coder: the prediction $\bar{u}(n)$ is subtracted from the input u(n) to form e(n); the quantizer output $\tilde{e}(n)$ is sent over the communication channel, while the predictor (with delay) operates on $\tilde{u}(n) = \bar{u}(n) + \tilde{e}(n)$. Reconstruction filter/decoder: the received $\tilde{e}(n)$ is added to the output of an identical predictor to reproduce $\tilde{u}(n)$.
Note:
$\tilde{u}(n) - u(n) = \bar{u}(n) + \tilde{e}(n) - u(n) = \tilde{e}(n) - e(n) = q(n)$,
where q(n) is the quantization error in e(n).
Remarks: The pointwise coding error in the input sequence is exactly equal to q(n), the quantization error in e(n). With a reasonable predictor, the mean square value of the differential signal e(n) is much smaller than that of u(n).
Conclusion: for the same mean square quantization error, e(n) requires fewer quantization bits than u(n). The number of bits required for transmission is thereby reduced while the quantization error is kept the same.
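The DPCM loop described above can be sketched in a few lines. This is a minimal illustration, not the text's exact design: the unit-delay predictor $\bar{u}(n) = \tilde{u}(n-1)$ and the uniform quantizer step are assumptions chosen for simplicity.

```python
def dpcm(u, step=4):
    """DPCM codec with the predictor u_bar(n) = u_tilde(n-1).

    Returns the quantized prediction errors and the reconstructed
    sequence; the coding error u(n) - u_tilde(n) equals q(n), the
    quantization error in e(n)."""
    u_tilde = [float(u[0])]            # first sample sent verbatim
    e_tilde = []
    for x in u[1:]:
        u_bar = u_tilde[-1]            # prediction from the decoded past
        e = x - u_bar                  # prediction error e(n)
        eq = step * round(e / step)    # assumed uniform quantizer
        e_tilde.append(eq)
        u_tilde.append(u_bar + eq)     # decoder reconstruction
    return e_tilde, u_tilde

errors, recon = dpcm([100, 102, 120, 120, 120, 118, 116])
```

Because the predictor runs on decoded samples, the pointwise coding error stays bounded by the quantization error, as the note above states.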
Feedback Versus Feedforward Prediction
An important aspect of DPCM is that the prediction is based on the output (the quantized samples) rather than the input (the unquantized samples). This places the predictor in a feedback loop around the quantizer, so that the quantizer error at a given step is fed back to the quantizer input at the next step. This has a stabilizing effect that prevents DC drift and accumulation of error in the reconstructed signal $\tilde{u}(n)$.
If the prediction rule is based on the past inputs, the signal reconstruction error would depend on all the past and present quantization errors in the feedforward prediction-error sequence $\epsilon(n)$. Generally, the MSE of feedforward reconstruction will be greater than that of DPCM.
Feedforward coding block diagram: the predictor operates on the unquantized input u(n); the prediction error $\epsilon(n)$ is quantized and entropy coded/decoded, then added back to an identically predicted value at the receiver to form $\tilde{u}(n)$.
Example: The sequence 100, 102, 120, 120, 120, 118, 116 is to be predictively coded using the prediction rule $\bar{u}(n) = \tilde{u}(n-1)$ for the DPCM coder and $\bar{u}(n) = u(n-1)$ for the feedforward predictive coder. Assume a 2-bit quantizer (four output levels, with decision thresholds at -4, 0, and 4), as shown below, is used, except that the first sample is quantized separately by a 7-bit uniform quantizer, giving $\tilde{u}(0) = u(0) = 100$.
Table: for each n, the input u(n); for the DPCM coder, the prediction $\bar{u}(n)$, prediction error e(n), quantized error $\tilde{e}(n)$, and reconstruction $\tilde{u}(n)$; and for the feedforward predictive coder, the corresponding $\epsilon(n)$, $\tilde{\epsilon}(n)$, and $\tilde{u}(n)$.
Delta Modulation (DM):
Predictor: one-step delay function. Quantizer: 1-bit quantizer.
$\bar{u}(n) = \tilde{u}(n-1)$, $e(n) = u(n) - \tilde{u}(n-1)$.
Block diagram: the coder quantizes e(n) to $\pm\Delta$ and feeds $\tilde{u}(n)$ back through a unit delay; the decoder is an integrator built from the same unit-delay loop.
Figure: DM waveform showing the granularity and slope-overload regions of the staircase $\tilde{u}(n)$ tracking u(n).
Primary limitations of DM:
1) Slope overload: in large-jump regions; the maximum slope is (step size) x (sampling frequency).
2) Granularity noise: in almost-constant regions.
3) Instability to channel noise.
Step size effect: increasing the step size (for a given sampling frequency) reduces slope overload but increases granular noise.
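Both DM artifacts can be reproduced in a few lines (a sketch; the step size and the test signals are arbitrary choices):

```python
def delta_mod(u, step):
    """Delta modulation: 1-bit quantizer (sign of e(n)) plus a
    unit-delay predictor; the decoder is the same integrator."""
    rec = [float(u[0])]
    for x in u[1:]:
        bit = 1.0 if x >= rec[-1] else -1.0   # 1-bit quantizer
        rec.append(rec[-1] + bit * step)      # integrate +/- step
    return rec

# Slope overload: the staircase can climb at most `step` per sample,
# so it lags behind a ramp of slope 4.
rec_ramp = delta_mod([4 * n for n in range(10)], step=1)
# Granular noise: on a constant input the output oscillates by one step.
rec_flat = delta_mod([5] * 8, step=1)
```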
Adaptive Delta Modulation
Block diagram: a DM loop whose step size is controlled by an adaptation function. A common rule is
$E_k = \Delta_k \,\mathrm{sgn}\big(X_k - \tilde{X}_{k-1}\big)$, $\tilde{X}_k = \tilde{X}_{k-1} + E_k$,
with $\Delta_k = K\,\Delta_{k-1}$ if $E_k = E_{k-1}$ and $\Delta_k = \Delta_{k-1}/K$ if $E_k \neq E_{k-1}$, subject to a stored minimum step, $\Delta_k \ge \Delta_{\min}$.
This adaptive approach simultaneously minimizes the effects of both slope overload and granular noise.
DPCM Design
There are two components to design in a DPCM system:
1. The predictor
2. The quantizer
Ideally, the predictor and quantizer would be optimized together using a linear or nonlinear technique. In practice, a suboptimum design approach is adopted:
1. Linear predictor
2. Zero-memory quantizer
Remark: for this approach, the number of quantizing levels, M, must be relatively large (M >= 8) to achieve good performance.
Design of the linear predictor
Predict $S_0$ from the n previous samples: $\hat{S}_0 = a_1 S_1 + a_2 S_2 + \cdots + a_n S_n$, with prediction error $e = S_0 - \hat{S}_0$. Minimizing $E[e^2]$ over the coefficients, $\partial E[e^2]/\partial a_j = 0$ gives
$E\big[(S_0 - \textstyle\sum_i a_i S_i)\,S_j\big] = 0, \quad j = 1, \ldots, n$,
i.e., with $R_{ij} = E[S_i S_j]$,
$\sum_i a_i R_{ij} = R_{0j}, \quad j = 1, \ldots, n$,
or in matrix form
$\begin{bmatrix} R_{11} & \cdots & R_{1n} \\ \vdots & & \vdots \\ R_{n1} & \cdots & R_{nn} \end{bmatrix} \begin{bmatrix} a_1 \\ \vdots \\ a_n \end{bmatrix} = \begin{bmatrix} R_{01} \\ \vdots \\ R_{0n} \end{bmatrix}$.
When $\hat{S}_0$ comprises these optimized coefficients $a_i$, the mean square error signal is
$\sigma_e^2 = E\big[(S_0 - \hat{S}_0)^2\big] = E\big[(S_0 - \hat{S}_0)S_0\big] - E\big[(S_0 - \hat{S}_0)\hat{S}_0\big]$,
where $E[(S_0 - \hat{S}_0)\hat{S}_0] = 0$ (orthogonality principle), so
$\sigma_e^2 = R_{00} - \sum_i a_i R_{0i} = \sigma_S^2 - \sum_i a_i R_{0i} \le \sigma_S^2$,
where $\sigma_e^2$ is the variance of the difference signal and $\sigma_S^2$ the variance of the original signal. The variance of the error signal is less than the variance of the original signal.
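The normal equations above can be solved numerically. A sketch for an assumed first-order Markov model with correlation $\rho = 0.95$ (so $R_{ij} = \rho^{|i-j|}$ with unit variance); for this model the optimal order-2 predictor reduces to $a = (\rho, 0)$ and $\sigma_e^2 = 1 - \rho^2$:

```python
def solve2(R, r):
    """Solve the 2x2 normal equations R a = r by Cramer's rule."""
    det = R[0][0] * R[1][1] - R[0][1] * R[1][0]
    return [(r[0] * R[1][1] - r[1] * R[0][1]) / det,
            (R[0][0] * r[1] - R[1][0] * r[0]) / det]

rho = 0.95                       # assumed adjacent-sample correlation
R = [[1.0, rho], [rho, 1.0]]     # R_ij = E[S_i S_j], unit variance
r = [rho, rho ** 2]              # R_0j = E[S_0 S_j]
a = solve2(R, r)
# sigma_e^2 = R_00 - sum_i a_i R_0i  (orthogonality principle)
var_e = 1.0 - (a[0] * r[0] + a[1] * r[1])
```

Here var_e is about 0.0975, far below the unit signal variance, matching the inequality above.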
Remarks:
1. The complexity of the predictor depends on n.
2. n depends on the covariance properties of the original signal.
Design of the DPCM Quantizer
Review of the uniform quantizer. Quantization types:
1. Zero-memory quantization
2. Block quantization
3. Sequential quantization
A zero-memory quantizer maps an input X to an output Y; it may be midtread or midrise.
Quantization error: $q = y(x) - x$.
Average distortion: $D = \int \big(y(x) - x\big)^2 p(x)\, dx$.
SNR: $\mathrm{SNR} = 10 \log_{10}(\sigma^2 / D)$ in dB, where $\sigma^2$ is the variance of the input x.
Uniform quantizer: p(x) is constant within each interval.
Figure: quantization error for a midtread quantizer, showing the lower overload region, the granular region (decision levels $x_1, \ldots, x_6$), and the upper overload region.
Figure: uniform midtread quantizer, M = 9, with decision levels $x_0, \ldots, x_M$ and output levels $y_1, \ldots, y_M$.
The output level $y_i$ always lies at the midpoint of the input interval $\Delta_i = x_i - x_{i-1}$. Assume p(x) is constant in the interval $\Delta_i$ and equal to $p(x_i)$.
Lower overload region: $\Delta_1 = x_1 - x_0$, $x_1 \gg x_0$; granular region: $\Delta_i = x_i - x_{i-1}$, $i = 2, \ldots, M-1$; upper overload region: $\Delta_M = x_M - x_{M-1}$, $x_M \gg x_{M-1}$.
$D \approx \sum_{i=1}^{M} \int_{x_{i-1}}^{x_i} (y_i - x)^2\, p(x_i)\, dx = \sum_{i=1}^{M} p(x_i)\, \frac{\Delta_i^3}{12}$,
where we assume the contribution of the overload regions is negligible, i.e., $p(x_1) = p(x_M) = 0$.
Quantizer characteristic: if all steps are equal, $\Delta_i = \Delta = 2V/M$, where $(-V, V)$ is the quantizer range, then since $\sum_i p(x_i)\,\Delta = 1$,
$D = \frac{\Delta^2}{12} = \frac{V^2}{3M^2}$.
Source model: if p(x) is a uniform pdf over $(-V, V)$, the input variance is $\sigma^2 = V^2/3$.
Then
$\mathrm{SNR} = 10 \log_{10} \frac{\sigma^2}{D} = 10 \log_{10} \frac{V^2/3}{V^2/(3M^2)} = 10 \log_{10} M^2 = 20 \log_{10} M$.
For $M = 2^n$ (an n-bit quantizer),
$\mathrm{SNR} = 20\,n \log_{10} 2 \approx 6n$ (in dB) — valid only for the PCM quantizer.
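The 6 dB-per-bit rule can be checked numerically (a sketch; the 10,000-point grid stands in for a uniform input pdf):

```python
import math

def uniform_quantize(x, n_bits, V):
    """Uniform quantizer with M = 2**n_bits levels over (-V, V)."""
    M = 2 ** n_bits
    delta = 2 * V / M
    k = min(M - 1, max(0, int((x + V) / delta)))   # cell index, clipped
    return -V + (k + 0.5) * delta                  # midpoint output level

# Uniform input over (-V, V): sigma^2 = V^2/3 and D ~ delta^2/12,
# so SNR = 10 log10(sigma^2 / D) ~ 6n dB.
V, n = 1.0, 4
xs = [-V + (i + 0.5) * (2 * V / 10000) for i in range(10000)]
D = sum((uniform_quantize(x, n, V) - x) ** 2 for x in xs) / len(xs)
snr_db = 10 * math.log10((V ** 2 / 3) / D)
```

For n = 4 bits this yields roughly 24 dB, i.e., about 6 dB per bit.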
B. DPCM Quantizer
The pdf of the input signal to the DPCM quantizer is not at all uniform, since a good predictor would be expected to produce many near-zero differences between the predicted values and the actual values. A typical shape for this distribution is highly peaked around zero, with pdf p(d) (e.g., Laplacian). Hence a non-uniform quantizer is required.
Non-uniform quantizer = compressor + uniform quantizer + expander:
the input x is mapped by a compressor C(x) (whose slope dC/dx sets the local step size, of order $x_{\max}/M$), uniformly quantized by Q, and mapped back by the expander $C^{-1}$:
$y = C^{-1}\big(Q(C(x))\big)$.
Figure: the compressor characteristic C(X) over $(-X_{\max}, X_{\max})$, followed by a uniform quantizer, followed by the expander $C^{-1}(X)$, together forming the non-uniform quantizer.
For this model, the mean-square distortion can be approximately represented as
$D \approx \frac{x_{\max}^2}{3M^2} \int_{-x_{\max}}^{x_{\max}} \frac{p(x)}{[C'(x)]^2}\, dx$,
where $(-x_{\max}, x_{\max})$ is the quantizer range, $C' = dC/dx$ is the slope of the nonlinear function, and the local step size is $\Delta(x) \approx \dfrac{2\,x_{\max}}{M\,C'(x)}$.
Lloyd-Max quantizer: the most popular one.
1. Each interval limit should be midway between the neighboring levels: $x_i = (y_i + y_{i+1})/2$.
2. Each level should be at the centroid of the input probability density function over the interval for that level: $y_i = \int_{x_{i-1}}^{x_i} x\, p(x)\, dx \Big/ \int_{x_{i-1}}^{x_i} p(x)\, dx$.
Logarithmic quantizers (log PCM):
mu-law: $y = V\, \frac{\log\!\big(1 + \mu |x|/V\big)}{\log(1 + \mu)}\, \mathrm{sgn}(x)$ — US, Canada, Japan.
A-law: $y = \frac{A|x|}{1 + \log A}\, \mathrm{sgn}(x)$ for $0 \le |x| \le V/A$; $y = V\, \frac{1 + \log(A|x|/V)}{1 + \log A}\, \mathrm{sgn}(x)$ for $V/A \le |x| \le V$ — Europe.
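The mu-law compressor/expander pair follows directly from the formula above (taking the common value mu = 255, an assumption here); wrapping a uniform quantizer between them yields the non-uniform quantizer of the previous slide:

```python
import math

MU, V = 255.0, 1.0   # assumed companding constant and range

def mu_compress(x):
    """y = V ln(1 + mu|x|/V) / ln(1 + mu) * sgn(x)."""
    s = 1.0 if x >= 0 else -1.0
    return s * V * math.log(1 + MU * abs(x) / V) / math.log(1 + MU)

def mu_expand(y):
    """Inverse mapping: x = (V/mu) ((1 + mu)**(|y|/V) - 1) * sgn(y)."""
    s = 1.0 if y >= 0 else -1.0
    return s * (V / MU) * ((1 + MU) ** (abs(y) / V) - 1)

def nonuniform_quantize(x, n_bits=8):
    """Compressor + uniform quantizer + expander."""
    M = 2 ** n_bits
    delta = 2 * V / M
    k = min(M - 1, max(0, int((mu_compress(x) + V) / delta)))
    return mu_expand(-V + (k + 0.5) * delta)
```

The effective step size is small near zero and large near the range limits, matching the peaked pdf of DPCM prediction errors.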
If a Laplacian function is used to model p(e),
$p(e) = \frac{1}{\sqrt{2}\,\sigma_e} \exp\!\left(-\frac{\sqrt{2}\,|e|}{\sigma_e}\right)$ (the input pdf of the DPCM quantizer),
then the variance of the quantization error of the logarithmic quantizer is approximately
$\sigma_g^2 \approx \frac{9\,\sigma_e^2}{2M^2}$ (for $V \gg \sigma_e$),
and, since $M = 2^n$, the SNR for the non-uniform quantizer in DPCM becomes
$\mathrm{SNR} = 6n - 6.5 + 10 \log_{10} \frac{\sigma^2}{\sigma_e^2}$ (dB).
For the same pdf, PCM gives $\mathrm{SNR} = 6n - 6.5$ (dB); DPCM thus improves the SNR by $10 \log_{10}(\sigma^2/\sigma_e^2)$.
ADPCM:
1. Adaptive prediction
2. Adaptive quantization
DPCM for image coding: each scan line of the image is coded independently by the DPCM technique. For a very slowly varying image ($\rho = 0.95$) and a Laplacian-pdf quantizer, an 8 to 10 dB SNR improvement over PCM can be expected; that is, the SNR of 6-bit PCM can be achieved by 4-bit line-by-line DPCM for $\rho = 0.97$.
Two-dimensional DPCM: a two-D predictor, e.g.,
$\bar{u}(m,n) = a_1 \tilde{u}(m, n-1) + a_2 \tilde{u}(m-1, n) + a_3 \tilde{u}(m-1, n-1) + a_4 \tilde{u}(m-1, n+1)$.
III. Transform Domain Coding Techniques
Transform coding (block quantization): a block of data is unitarily transformed so that a large fraction of its total energy is packed into relatively few transform coefficients, which are quantized independently. The optimum transform coder is defined as the one that minimizes the mean square distortion of the reproduced data for a given number of total bits: the Karhunen-Loeve Transform (KLT). The function of the transformation is to decorrelate the original samples so that the subsequent operation of quantization may be done more efficiently.
In transform coding systems, the total number of bits available for quantizing a block of transformed samples is fixed, and it is necessary to allocate these bits to the quantized transformed samples in such a way as to minimize the overall quantization distortion.
The KLT:
u: input vector — an N x 1 random vector (zero mean) with covariance R.
A: N x N matrix, not necessarily unitary.
v: transformed vector, whose components v(k) are mutually uncorrelated.
B: N x N matrix; u': reconstructed vector.
Problem: find the optimum matrices A and B and the optimum quantizers such that the overall average mean square distortion
$D = \frac{1}{N} \sum_{n} E\big[(u(n) - u'(n))^2\big] = \frac{1}{N} E\big[(u - u')^T (u - u')\big]$
is minimized.
Solution:
1. For an arbitrary quantizer, the optimal reconstruction matrix B is given by $B = A^{-1}\Lambda$, where $\Lambda$ is a diagonal matrix of elements $\lambda_k$ defined as
$\lambda_k = \frac{E\big[v(k)\,\tilde{v}^*(k)\big]}{E\big[\tilde{v}(k)\,\tilde{v}^*(k)\big]}$,
with $\tilde{v}(k)$ the quantized coefficient.
2. The Lloyd-Max quantizer for each v(k) minimizes the overall mean square error, giving $\Lambda = I$ (that is, $B = A^{-1}$).
3. The optimal decorrelating matrix A is the KL transform of u; that is, the rows of A are the orthonormalized eigenvectors of the autocovariance matrix R. This gives $B = A^{-1} = A^{*T}$.
Simplification: assume there are no quantizers.
Image [u(z)]: N lines, N pixels per line; $u(x_i, y_j)$, $i = 1, \ldots, N$, denotes the N pixels in the jth line, $j = 1, \ldots, N$.
[u(z)] is the $N^2$-vector composed of all the pixels taken in the normal raster-scan sequence.
[v(w)] = [A][u(z)]: transformed pixels = ($N^2 \times N^2$ transform matrix) x (image vector).
$v(w_k) = \sum_z a_k(z)\, u(z)$; the target is that the $v(w_k)$, $k = 1, \ldots, N^2$, be uncorrelated.
A is a matrix whose columns are the normalized eigenvectors $\phi_i$, $i = 1, \ldots, N^2$, of the covariance matrix of the original pixels.
The covariance matrix of u(z):
$C_u = E\big[(u(z) - E[u(z)])\,(u(z) - E[u(z)])^T\big]$;
assume $E[u(z)] = 0$ and set
$C_u = E[u\,u^T] = \begin{bmatrix} E[u_1 u_1] & \cdots & E[u_1 u_{N^2}] \\ \vdots & & \vdots \\ E[u_{N^2} u_1] & \cdots & E[u_{N^2} u_{N^2}] \end{bmatrix}$.
Let $\phi_i$ denote the eigenvectors of $C_u$: $C_u \phi_i = \lambda_i \phi_i$, $\det[C_u - \lambda I] = 0$.
Arrange the $\lambda_i$'s in decreasing order, $\lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_{N^2}$, and substitute into $(C_u - \lambda_i I)\phi_i = 0$ to solve for $\phi_i$.
When the matrix [A] (whose columns are the $\phi_i$) is applied to [u(z)], the covariance of the resulting coefficients $v(w_k)$ is a diagonal matrix with diagonal elements $\lambda_1, \ldots, \lambda_{N^2}$:
$C_v = A^T C_u A = \mathrm{diag}(\lambda_1, \ldots, \lambda_{N^2})$,
so the $v(w_k)$ are uncorrelated. The KLT decorrelates the original input.
Remarks:
1. The KLT is input-data dependent; for an N x N image, the eigenvectors of an $N^2 \times N^2$ matrix have to be found.
2. Given a block of N samples, the KLT packs the maximum amount of variance into the first k coefficients (compared to any other transform), where k < N. This permits higher-order coefficients to be discarded to obtain compression.
3. The KLT minimizes the mean-square error between the original block of samples and the corresponding block of reconstructed samples. This mean-square error is equal to the sum of the variances of the discarded coefficients.
4. Not being a fast transform in general, the KLT can be replaced either by a fast unitary transform, such as the cosine, sine, DFT, WHT, or DHT, which is not a perfect decorrelator, or by a fast decorrelating transform, which is not unitary.
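A concrete KLT for the smallest nontrivial case, N = 2, with an assumed covariance C = [[1, rho], [rho, 1]]: the eigenvectors are (1, 1)/sqrt(2) and (1, -1)/sqrt(2) independent of rho, and the transformed coefficient variances are the eigenvalues 1 + rho and 1 - rho.

```python
import math

rho = 0.9                         # assumed adjacent-pixel correlation
s = 1 / math.sqrt(2)
A = [[s, s], [s, -s]]             # rows: orthonormal eigenvectors of C

def klt(block):
    """v = A u for a 2-sample block."""
    return [A[0][0] * block[0] + A[0][1] * block[1],
            A[1][0] * block[0] + A[1][1] * block[1]]

# C_v = A C A^T should be diagonal: diag(1 + rho, 1 - rho).
C = [[1.0, rho], [rho, 1.0]]
AC = [[sum(A[i][k] * C[k][j] for k in range(2)) for j in range(2)]
      for i in range(2)]
Cv = [[sum(AC[i][k] * A[j][k] for k in range(2)) for j in range(2)]
      for i in range(2)]
```

Most of the variance, (1 + rho)/2 = 95% for rho = 0.9, is packed into the first coefficient: the energy-compaction property exploited for compression.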
Transform Basis Vectors
T: N x N orthogonal transform, $T^t = [t_1, t_2, \ldots, t_N]$, where the basis column vectors $\{t_m\}$ are real-valued and orthonormal, i.e.,
$t_m^t t_k = \begin{cases} 1, & k = m \\ 0, & k \neq m \end{cases}$
where the row vector $t_m^t = [t_m(1), t_m(2), \ldots, t_m(N)]$.
The transform-domain coefficients are expressed by the matrix product $Y = T X T^t$.
DFT: $t_{mn} = \frac{1}{\sqrt{N}} \exp\!\left(-j\, \frac{2\pi (m-1)(n-1)}{N}\right)$, $m, n = 1, \ldots, N$
DHT: $t_{mn} = \frac{1}{\sqrt{N}} \left[\cos \frac{2\pi (m-1)(n-1)}{N} + \sin \frac{2\pi (m-1)(n-1)}{N}\right]$
DCT: $t_{mn} = \alpha(m) \cos \frac{(2n-1)(m-1)\pi}{2N}$, with $\alpha(1) = \sqrt{1/N}$ and $\alpha(m) = \sqrt{2/N}$ for $m \ge 2$
WHT: $t_{mn} = \frac{1}{\sqrt{N}} (-1)^{\sum_i b_i(m-1)\, b_i(n-1)}$, where the $b_i(\cdot)$ are the ith binary digits (0 or 1) of the argument
DST: $t_{mn} = \sqrt{\frac{2}{N+1}}\, \sin \frac{mn\pi}{N+1}$
The basis pictures: the reconstructed image can be represented as
$X = T^t Y T$,
i.e., the reconstructed image is a linear combination of the transform coefficients. Let $t_m^t$ denote a general basis row vector; then
$X = \sum_{m=1}^{N} \sum_{n=1}^{N} y_{mn}\, t_m t_n^t$,
where the $N^2$ matrices $B_{mn} = t_m t_n^t$, with elements $t_m(k)\,t_n(l)$, $k, l = 1, \ldots, N$, are called the "basis pictures". That is, the reconstructed image is the weighted sum of the basis pictures; an N x N image will be generated from $N^2$ basis pictures.
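Each basis picture is the outer product of two basis vectors. A sketch using the DCT definition above (written with 0-based indices):

```python
import math

def dct_matrix(N):
    """t[m][n] = alpha(m) cos((2n+1) m pi / (2N)), m, n = 0..N-1."""
    T = []
    for m in range(N):
        a = math.sqrt(1.0 / N) if m == 0 else math.sqrt(2.0 / N)
        T.append([a * math.cos((2 * n + 1) * m * math.pi / (2 * N))
                  for n in range(N)])
    return T

def basis_picture(T, m, n):
    """B_mn = t_m t_n^T, the (m, n) basis picture."""
    N = len(T)
    return [[T[m][i] * T[n][j] for j in range(N)] for i in range(N)]

T8 = dct_matrix(8)
B00 = basis_picture(T8, 0, 0)     # DC basis picture: constant 1/8
```

Summing y_mn * B_mn over all 64 pairs (m, n) reconstructs an 8x8 image exactly, since the rows of T8 are orthonormal.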
Fig. The 8x8 basis picture sets: (a) DHT, (b) DCT, (c) WHT, (d) DST.
Figure: mean square error performance of the transforms.
Bit Allocation
The transform coefficient variances are generally unequal, and therefore each coefficient requires a different number of quantizing bits.
Zonal sampling: define a fixed geometric zone of retained coefficients.
Threshold sampling: the coefficients are selected for retention according to their magnitude relative to a threshold value.
Note: zonal sampling takes the values in the zone of the k largest variances; threshold sampling takes the K values of largest magnitude.
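The two selection rules differ only in whether the retained set is fixed in advance or chosen per block. A sketch on a 1-D coefficient vector (the numbers are arbitrary):

```python
def zonal_sample(coeffs, K):
    """Zonal sampling: keep a fixed zone, here the first K
    (low-index, typically largest-variance) coefficients."""
    return [c if i < K else 0 for i, c in enumerate(coeffs)]

def threshold_sample(coeffs, K):
    """Threshold sampling: keep the K coefficients of largest
    magnitude, wherever they occur."""
    keep = set(sorted(range(len(coeffs)),
                      key=lambda i: -abs(coeffs[i]))[:K])
    return [c if i in keep else 0 for i, c in enumerate(coeffs)]

y = [50, -3, 21, 0.5, -7, 1, 0.2, 0.1]
yz = zonal_sample(y, 3)       # keeps positions 0, 1, 2
yt = threshold_sample(y, 3)   # keeps 50, 21, -7 by magnitude
```

Threshold sampling retains more energy per kept coefficient but must also code the positions of the retained values.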
Human Visual Model Effects: weighted MSE.
Adaptive transform coding:
1. Adaptation of the transform
2. Adaptation of the bit allocation
3. Adaptation of the quantizer levels
Hybrid Coding / Vector DPCM
If the image is vector scanned, e.g., a column at a time, then it is possible to generalize the DPCM technique by considering vector recursive predictors, i.e., to combine transform and predictive coding techniques. The image is transformed in one of its dimensions to decorrelate the pixels in that direction (directional filter). Each transformed coefficient is then sequentially coded in the other direction by 1-D DPCM.
IV. Image Coding in Visual Telephony
Voice signal: 3.4 kHz; video signal: 4.3 MHz — a bandwidth ratio of 4.3 MHz / 3.4 kHz, about 1265.
Fig. 10. Visual telephony system architecture: video coder/decoder, audio coder/decoder, data, and signaling are multiplexed through a transmission coder/decoder onto the ISDN.
ISDN Access:
Basic access: available to every household and business; suitable for desktop face-to-face communication. 2B+D channels: B = 64 kb/s, D = 16 kb/s.
Primary access (T1 line): for business applications; 23B+D channels, with both B and D at 64 kb/s; suitable for video teleconferencing.
ISDN: the bit rate for combined video and audio services is limited to p x 64 kb/s, where p = 1, 2, ..., 30 (for the North American network, p = 1, 2, ..., 23).
Video Signal Format:
CCITT Common Intermediate Format (CIF) and 1/4 CIF.
CIF: 352 x 288 pixels, 8 bits/pixel, 30 frames/sec, with a factor 1.5 for chrominance: 352 x 288 x 8 x 30 x 1.5 = 36.5 Mb/s, so CIF = 36.5 Mb/s = 570 x 64 kb/s.
1/4 CIF: 9.1 Mb/s = 142.5 x 64 kb/s.
1/4 CIF at 10 frames/sec: 142.5/3 = 47.5, i.e., a compression ratio of 47.5 fits it into a single 64 kb/s channel.
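The rate arithmetic above is easy to verify:

```python
# 8 bits/pixel, 30 frames/s; the factor 1.5 accounts for the two
# subsampled chrominance components (each at one quarter resolution).
cif_bps = 352 * 288 * 8 * 30 * 1.5            # ~36.5 Mb/s
qcif_bps = 176 * 144 * 8 * 30 * 1.5           # ~9.1 Mb/s
cif_channels = cif_bps / 64000                # ~570 ISDN B channels
qcif_channels = qcif_bps / 64000              # ~142.5
# quarter-CIF dropped to 10 frames/s, carried on one 64 kb/s channel:
compression_ratio = (qcif_bps / 3) / 64000    # ~47.5
```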
Video Coding Algorithm
A hybrid transform/DPCM coder with motion estimation.
Intra-frame coding: transform coding (DCT).
Inter-frame coding: predictive coding + motion estimation.
Motion Estimation: block matching.
$(V, H) = \arg\min_{(v,h)} \sum_{i=1}^{16} \sum_{j=1}^{16} \big| a(i,j) - b(i+v,\, j+h) \big|$,
where a(i,j) is the luminance pixel value in a 16x16 macroblock in the current frame, b(i+v, j+h) is the corresponding luminance pixel value in a 16x16 macroblock that is shifted v pixels vertically and h pixels horizontally in the previous frame, and (V, H) is the value of (v, h) which yields the minimum of the double sum of absolute differences over a tracking range (in practice, -8 to +7 pixels). (V, H) is the motion vector. Applications: teleconference, HDTV. The motion-compensated residual is then source coded / entropy coded.
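Full-search block matching, the minimization written out above, can be sketched as follows (scaled down to a 4x4 block and a +/-2-pixel range so it runs on a toy frame; practical coders use 16x16 macroblocks and a -8..+7 range):

```python
def block_match(cur, prev, top, left, search=2, B=4):
    """Find (V, H) minimizing the sum of absolute differences (SAD)
    between a BxB block of `cur` and shifted blocks of `prev`."""
    best = None
    for v in range(-search, search + 1):
        for h in range(-search, search + 1):
            if not (0 <= top + v <= len(prev) - B
                    and 0 <= left + h <= len(prev[0]) - B):
                continue                      # shifted block off-frame
            sad = sum(abs(cur[top + i][left + j]
                          - prev[top + v + i][left + h + j])
                      for i in range(B) for j in range(B))
            if best is None or sad < best[0]:
                best = (sad, v, h)
    return best

# Content that moved one pixel to the right is found one pixel to the
# left in the previous frame: motion vector (V, H) = (0, -1).
prev = [[(r * 7 + c * 13) % 50 for c in range(8)] for r in range(8)]
cur = [[row[-1]] + row[:-1] for row in prev]
sad, V, H = block_match(cur, prev, top=2, left=2)
```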
V. Coding of Two-Tone Images
Two-tone images: line drawings, letters, newsprint, maps, documents; transmitted over telephone lines and data lines.
CCITT (Comité Consultatif International Télégraphique et Téléphonique) standard sampling rates for Group 3 digital facsimile apparatus, A4 (8 1/2 in x 11 in):
3.85 lines/millimeter: normal resolution; 7.7 lines/millimeter: high resolution (vertical direction).
Horizontal sampling rate: 1728 pixels/line (at 7.7 lines/mm and 1728 pixels/line, about 200 points per inch).
For newspaper pages and other documents that contain text as well as halftone images, sampling rates of 400 to 1000 ppi are used.
For A4, about 1.87 x 10^6 bits are required at the (200 ppi x 100 lpi) sampling density; at 4.8 kb/s over a telephone line, one page takes about 6 minutes to transmit.
The facts that lead to the compressibility of binary images:
1. Most pixels are white.
2. The black pixels occur with a regularity that manifests itself in the form of characters, symbols, or connected boundaries.
Three basic concepts for coding binary images:
1. Coding only the transition points between black and white
2. Skipping white
3. Pattern recognition
Run-length coding (RLC): since white (1's) and black (0's) runs alternate, the color of the run need not be coded.
Ex: 11111111 000 11111 0000 1 → 8W 3B 5W 4B 1W.
The run lengths can be coded by fixed-length m-bit codewords, each representing a block of maximum run length M-1, M = 2^m, where M can be optimized to maximize compression.
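A sketch of the run-length extraction, including the convention of a leading zero-length white run when a line starts black:

```python
def run_lengths(bits):
    """Run-length code one binary line (1 = white, 0 = black).
    Runs alternate in colour, so only the lengths are coded; a line
    that starts black gets a zero-length white run first."""
    runs = []
    if bits and bits[0] == 0:
        runs.append(0)              # zero-length white run
    i = 0
    while i < len(bits):
        j = i
        while j < len(bits) and bits[j] == bits[i]:
            j += 1                  # extend the current run
        runs.append(j - i)
        i = j
    return runs

line = [1] * 8 + [0] * 3 + [1] * 5 + [0] * 4 + [1]
runs = run_lengths(line)            # 8W 3B 5W 4B 1W
```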
Huffman coding: variable-length coding. The code length of a symbol depends on the probability (frequency of occurrence) of that symbol: the more probable the symbol, the shorter its codeword ($L_i \approx -\log_2 P_i$).
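A minimal Huffman code construction (a sketch using a heap of partial codebooks; the probabilities are illustrative):

```python
import heapq

def huffman_code(probs):
    """Return {symbol: codeword}. More probable symbols get shorter
    codewords, with length close to -log2(P_i)."""
    heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)      # two least probable nodes
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, count, merged))
        count += 1                           # tie-breaker for the heap
    return heap[0][2]

probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(probs)
avg_len = sum(p * len(code[s]) for s, p in probs.items())
```

For these dyadic probabilities the average length equals the entropy, 1.75 bits/symbol.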
Modified Huffman codes: CCITT Group 3 (1-D)
On the data side, a line of FAX data consists of a series of code words of different lengths. Each code word represents a run of all-black or all-white pixels, and each line is composed of alternating black and white runs of varying lengths. To keep the transmitter and receiver in colour synchronization, G3 assumes that the first code word of each line represents white; if the first pixel of a line is actually black, G3 requires that a zero-length white code word be sent first.
Code words are divided into terminating code words and make-up code words; every run length (0-2560) can be represented either by a terminating code word alone or by a make-up code word followed by a terminating code word. The G3 encoding procedure is as follows:
1. If the run length is 0 to 63, it is represented by a code from the terminating code word table. Note that black and white runs of the same length have different terminating code words.
2. If the run length is 64 to 1728, it is represented by a make-up code word of value 64 x floor(run length / 64) and a terminating code word of value (run length - 64 x floor(run length / 64)).
3. The coding procedure of steps 1 and 2 continues until all runs in the line have been coded.
4. An EOL code is appended after each completed data line.
On the control side, one-dimensional run-length coding uses three control codes:
1. End-Of-Line (EOL). Format: 000000000001. Description: an EOL code follows each line of data. This code cannot appear inside a valid data line, so when an error occurs, the transmitter and receiver can resynchronize at the start of the next line. In addition, this signal is sent before the first data line of each page.
2. Fill. Format: a string of 0's of any length. Description: fill guarantees that the transmission time of the data, fill, and EOL codes is not less than the minimum transmission time defined in the pre-message control procedure.
3. Return to Control (RTC). Format: six consecutive EOL codes. Description: one-dimensional coding uses six consecutive EOL codes to indicate the end of a document transmission.
Fig. 5 illustrates the relations among the signals defined in one-dimensional coding: the upper diagram shows the data format at the start of a page, the lower the format of the last data line of a page.
Fig. 5. Transmission protocol.
White Block Skipping
Each scan line is divided into blocks of N pixels. If a block contains all white pixels, it is coded by a 0. Otherwise, the code word has N+1 bits, the first bit being 1, followed by the binary pattern of the block.
Ex (N = 4): an all-white block 1111 is coded as 0; a block 1011 is coded as 11011.
CCITT two-dimensional coding of documents: the Relative Element Address Designate (READ) algorithm. Proc. IEEE, vol. 73, no. 2, Feb. 1985, page 865.
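White block skipping in code (a sketch with N = 4; 1 denotes a white pixel):

```python
def wbs_encode(line, N=4):
    """All-white N-block -> '0'; otherwise '1' + the block's pattern."""
    out = []
    for i in range(0, len(line), N):
        block = line[i:i + N]
        if all(b == 1 for b in block):
            out.append("0")                       # skip the white block
        else:
            out.append("1" + "".join(str(b) for b in block))
    return "".join(out)

# Two all-white blocks cost 1 bit each; the mixed block costs N+1 bits.
bits = wbs_encode([1, 1, 1, 1, 1, 0, 1, 1, 1, 1, 1, 1])
```

The 12-pixel line is coded in 7 bits; mostly-white documents compress well, which is exactly the statistic noted above.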
VI. References
Data compression has been a topic of immense interest in digital image processing. Several special issues and review papers have been devoted to it. For details and extended bibliographies:
1. Special Issues: (a) Proc. IEEE 55, no. 3 (March 1967); (b) IEEE Commun. Tech. COM-19, no. 6, part I (December 1971); (c) IEEE Trans. Commun. COM-25, no. 11 (November 1977); (d) Proc. IEEE 68, no. 7 (July 1980); (e) IEEE Trans. Commun. COM-29 (December 1981); (f) Proc. IEEE 73, no. 2 (February 1985).
2. T. S. Huang and O. J. Tretiak (eds.). Picture Bandwidth Compression. New York: Gordon and Breach, 1972.
3. L. D. Davisson and R. M. Gray (eds.). Data Compression. Benchmark Papers in Electrical Engineering and Computer Science. Stroudsburg, Penn.: Dowden, Hutchinson & Ross, Inc., 1976.
4. W. K. Pratt (ed.). Image Transmission Techniques. New York: Academic Press, 1979.
5. A. N. Netravali and J. O. Limb. "Picture Coding: A Review." Proc. IEEE 68, no. 3 (March 1980): 366-406.
6. A. K. Jain, P. M. Farrelle, and V. R. Algazi. "Image Data Compression: A Review." Proc. IEEE 69, no. 3 (March 1981): 349-389.
7. N. S. Jayant and P. Noll. Digital Coding of Waveforms. Englewood Cliffs, N.J.: Prentice-Hall, 1984.
8. A. K. Jain, P. M. Farrelle, and V. R. Algazi. "Image Data Compression." In Digital Image Processing Techniques, M. P. Ekstrom, ed. New York: Academic Press, 1984.
9. E. Dubois, B. Prasada, and M. S. Sabri. "Image Sequence Coding." In Image Sequence Analysis, T. S. Huang (ed.). New York: Springer-Verlag, 1981, pp. 229-288.
For some early work on predictive coding, delta modulation, and DPCM, see Oliver, Harrison, O'Neal, and others in the Bell System Technical Journal issues of July 1952 and May-June 1972. For more recent work:
15. J. B. O'Neal, Jr. "Differential Pulse-Code Modulation (DPCM) with Entropy Coding." IEEE Trans. Inform. Theory IT-22, no. 2 (March 1976): 169-174. Also see vol. IT-23 (November 1977): 697-707.
16. V. R. Algazi and J. T. DeWitte. "Theoretical Performance of Entropy Coded DPCM." IEEE Trans. Commun. COM-30, no. 5 (May 1982): 1088-1095.
17. J. W. Modestino and V. Bhaskaran. "Robust Two-Dimensional Tree Encoding of Images." IEEE Trans. Commun. COM-29, no. 12 (December 1981): 1786-1798.
18. A. N. Netravali. "On Quantizers for DPCM Coding of Picture Signals." IEEE Trans. Inform. Theory IT-23 (May 1977): 360-370. Also see Proc. IEEE 65 (April 1977): 536-548.
For adaptive DPCM, see Zschunke (pp. 1295-1302) and Habibi (pp. 1275-1284) in [1c].
19. L. H. Zetterberg, S. Ericsson, and C. Couturier. "DPCM Picture Coding with Two-Dimensional Control of Adaptive Quantization." IEEE Trans. Commun. COM-32, no. 4 (April 1984): 457-462.
20. H. M. Hang and J. W. Woods. "Predictive Vector Quantization of Images." IEEE Trans. Commun. (1985).
For early work on transform coding and subsequent developments and examples of different transforms and algorithms, see Pratt and Andrews (pp. 515-554) and Woods and Huang (pp. 555-573) in [2], and:
21. A. Habibi and P. A. Wintz. "Image Coding by Linear Transformation and Block Quantization." IEEE Trans. Commun. Tech. COM-19, no. 1 (February 1971): 50-63.
23. P. A. Wintz. "Transform Picture Coding." Proc. IEEE 60, no. 7 (July 1972): 809-823.
24. W. K. Pratt, W. H. Chen, and L. R. Welch. "Slant Transform Image Coding." IEEE Trans. Commun. COM-22, no. 8 (August 1974): 1075-1093.
25. K. R. Rao, M. A. Narasimhan, and K. Revuluri. "Image Data Processing by Hadamard-Haar Transform." IEEE Trans. Computers C-24, no. 9 (September 1975): 888-896.
The concepts of the fast KL transform and recursive block coding were introduced in [26 and Ref. 17, Ch. 5]. For details and extensions see [26], Meiri et al. (pp. 1728-1735) in [1e], Jain et al. in [8], and:
26. A. K. Jain. "A Fast Karhunen-Loeve Transform for a Class of Random Processes." IEEE Trans. Commun. COM-24 (September 1976): 1023-1029.
[1f] is devoted to the coding of two-tone images; details of the CCITT standards and various algorithms are available there. Some other useful references are Arps (pp. 1-76) in [4], Huang in [1c, 2], Musmann and Preuss in [1c], and:
43. H. Kobayashi and L. R. Bahl. "Image Data Compression by Predictive Coding I: Prediction Algorithms" and "II: Encoding Algorithms." IBM J. Res. Dev. 18, no. 2 (March 1974): 164-179.
44. T. S. Huang and A. B. S. Hussain. "Facsimile Coding by Skipping White." IEEE Trans. Commun. COM-23, no. 12 (December 1975): 1452-1466.