Distributed Source Coding: Theory and Practice


Contact Details

Distributed Source Coding: Theory and Practice. Tutorial at EUSIPCO 2008, Lausanne, Switzerland, August 2008.
Speakers: Dr Vladimir Stankovic, Dr Lina Stankovic, Dr Samuel Cheng
- Vladimir Stanković, Department of Electronic and Electrical Engineering, University of Strathclyde, vladimir.stankovic@eee.strath.ac.uk
- Lina Stanković, Department of Electronic and Electrical Engineering, University of Strathclyde, lina.stankovic@eee.strath.ac.uk
- Samuel Cheng, Department of Electrical and Computer Engineering, University of Oklahoma, samuel.cheng@ou.edu

Distributed Source Coding (DSC)
- Joint compression of two or more physically separated sources
- The sources do not communicate with each other (hence "distributed" coding)
- Noiseless transmission to the decoder
- Decoding is performed jointly
- A compression (source coding) problem of network information theory

Motivation: increased interest in DSC due to many potential applications
- Data gathering in wireless sensor networks
- Distributed (or Wyner-Ziv) video coding
- Multiple description coding
- Compressing encrypted data
- Streaming from multiple servers
- Hyperspectral imaging
- Multiview and 3D video
- Cooperative wireless communications

Talk Roadmap
- Theory (DSC: problem setup and theoretical bounds): Slepian-Wolf (SW) problem; Wyner-Ziv (WZ) problem; multiterminal (MT) problem; DSC and network coding (NC)
- Code designs: SW coding; WZ coding; MT coding; DS-NC
- Practice: distributed video coding; multimedia streaming; wireless sensor networks; cooperative diversity

Slepian-Wolf (SW) Problem
X and Y are discrete, correlated sources, compressed at rates R_X and R_Y.
- Joint encoding: R = R_X + R_Y = H(X) + H(Y|X) = H(X,Y) ≤ H(X) + H(Y)
- Separate encoding (SW): R = R_X + R_Y = H(X) + H(Y|X) = H(X,Y) ≤ H(X) + H(Y)
Separate encoding is as efficient as joint encoding!

SW Problem: The Rate Region
With separate encoding and joint decoding, the achievable rate region is
R_X ≥ H(X|Y), R_Y ≥ H(Y|X), R_X + R_Y ≥ H(X,Y).
The corner point A (R_X = H(X|Y), R_Y = H(Y)) is source coding with side information Y at the decoder (asymmetric SW); corner point B is the mirror case.

Example: Lossless Compression of Correlated Sources
Two sensors report temperatures to a central station. Max temperature = 31 degrees, plus a sign bit, gives 6 bits per reading. Suppose X = 11 and Y = 12, with correlation Y − X ∈ {−1, 0, +1, +2}.
- If X = 11 is available at the central station, Y can be encoded with only 2 bits (the offset): Y = X + 1 = 12.
- But what if Y = 12 is available only at the decoder as side information, and X's encoder does not see Y? Partition the possible values of X into disjoint sets (bins) and send only a 2-bit bin index; given Y = 12, the decoder knows X ∈ {10, 11, 12, 13}, and the bin index resolves X = 11.
Slepian-Wolf theorem: still only two bits are needed for compressing!
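The Slepian-Wolf bounds above can be checked numerically. A minimal sketch, using a toy joint distribution that is an illustrative assumption (not from the tutorial), verifying the chain rule and the gain over separate coding:

```python
import math

# A toy joint pmf for (X, Y) -- an illustrative assumption, not a
# distribution from the tutorial -- used to check the SW bounds.
p = {(0, 0): 0.5, (0, 1): 0.25, (1, 1): 0.25}

def entropy(dist):
    """Shannon entropy in bits of a pmf given as {outcome: probability}."""
    return -sum(q * math.log2(q) for q in dist.values() if q > 0)

def marginal(p, idx):
    m = {}
    for xy, q in p.items():
        m[xy[idx]] = m.get(xy[idx], 0.0) + q
    return m

Hxy = entropy(p)                 # joint entropy H(X,Y)
Hx = entropy(marginal(p, 0))     # H(X)
Hy = entropy(marginal(p, 1))     # H(Y)
Hy_given_x = Hxy - Hx            # chain rule: H(Y|X) = H(X,Y) - H(X)

# Corner point of the SW region: (Rx, Ry) = (H(X), H(Y|X)) sums to H(X,Y),
# strictly below the separate-coding cost H(X) + H(Y) for dependent X, Y.
assert abs(Hx + Hy_given_x - Hxy) < 1e-12
assert Hxy < Hx + Hy
```

For this pmf the SW sum rate is H(X,Y) = 1.5 bits, while coding X and Y separately would cost H(X) + H(Y) ≈ 1.81 bits.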

Networks of correlated sources
- Slepian-Wolf network (Slepian & Wolf 73)
- Slepian-Wolf-Cover network (Wolf 74, Cover 75)
- Lossless multiterminal network (Csiszár & Körner 80, Han & Kobayashi 80)
- Uncorrelated sources over a network - network coding (Ahlswede et al. 00)
- Correlated sources over a network (Song & Yeung 01)

Correlated Sources over a Network: Problem Setup
A network of noiseless channels with rate limits is modeled as a graph G(V, E, c):
- X and Y are discrete, correlated, memoryless sources, available at nodes a1 and a2
- V is the set of nodes, E the set of edges
- each edge e ∈ E is a noiseless channel with bit-rate constraint c(e); c is the vector of rate constraints c(e), e ∈ E
- each destination b_i (b1, b2, b3, ..., br) wants to reconstruct both X and Y perfectly
How do we determine c? Max-flow = min-cut (graph theory): cuts determine the bit-rate constraint of the links between any two nodes by disconnecting edges in the graph. Example: mincut(a1, b2) = 2.

Theoretical Limits (Han 80, Song & Yeung 01)
A rate vector c is achievable if and only if:
- each cut separating a1 from any b has capacity at least H(X|Y)
- each cut separating a2 from any b has capacity at least H(Y|X)
- each cut separating both a1 and a2 from any b has capacity at least H(X,Y)

Extending asymmetric SW, i.e., source coding with decoder side information, to lossy coding gives Wyner-Ziv coding.

Wyner-Ziv (WZ) Problem
Lossy source coding of X with side information Y at the decoder: the extension of the asymmetric SW setup to rate-distortion theory. Distortion constraint at the decoder: E[d(X, X̂)] ≤ D.
Two benchmarks:
- Wyner-Ziv rate-distortion function R_WZ(D): side information Y at the decoder only
- Conditional rate-distortion function R_X|Y(D): side information at both encoder and decoder
In general R_WZ(D) ≥ R_X|Y(D), i.e., there is a rate loss in WZ coding compared to joint encoding. Rate loss (Zamir 96): less than 0.22 bit for binary sources with Hamming distortion; less than 0.5 bit for continuous sources with mean-square error (MSE) distortion.

WZ: Jointly Gaussian Sources with MSE Distortion
Correlation model: X = αY + Z, where Y ~ N(0, σ²) and Z ~ N(0, σ²_Z) are independent and Gaussian. For MSE distortion and jointly Gaussian X and Y, the WZ rate-distortion function is the same as for joint encoding and joint decoding:
R_WZ(D) = R_X|Y(D) = (1/2) log₂(σ²_Z / D).
No rate loss in this case compared to joint encoding.

Non-asymmetric SW has an analogous lossy extension: the non-asymmetric WZ setup extends the full SW setup to rate-distortion theory, giving multiterminal (MT) source coding (Berger & Tung 77, Yamamoto & Itoh 80). Two types: direct and indirect/remote MT source coding.
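The no-rate-loss formula above is easy to evaluate. A small sketch (the σ²_Z and D values in the checks are illustrative assumptions):

```python
import math

# Wyner-Ziv rate-distortion function for the jointly Gaussian case above,
# with sigma_z2 the conditional variance of X given Y and MSE distortion D.
# It equals the conditional rate-distortion function: no rate loss.
def wz_rate_gaussian(sigma_z2, D):
    """R_WZ(D) = max(0, 1/2 * log2(sigma_z2 / D)) bits per sample."""
    return max(0.0, 0.5 * math.log2(sigma_z2 / D))
```

For example, with σ²_Z = 1.0, tightening the distortion from D = 0.5 to D = 0.25 costs exactly half a bit more (0.5 → 1.0 bit), and any D ≥ σ²_Z costs nothing, since the decoder can simply use the side information.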

Quadratic Gaussian MT Source Coding with MSE Distortion
- Direct MT source coding (Wagner 05): Y1 and Y2 are quadratic Gaussian sources with variances σ²_y1 and σ²_y2 and correlation coefficient ρ; distortion constraints D1 and D2. The rate region is known for D1 = D2 and σ²_y1 = σ²_y2.
- Indirect MT source coding (Oohama 05): X, N1, N2 are zero-mean mutually independent Gaussian random variables with variances σ²_x, σ²_n1 and σ²_n2; the encoders see two noisy observations Y1 = X + N1 and Y2 = X + N2; distortion constraint D on X.

Literature overview
Lossy:
- Wyner & Ziv 76, 78
- Direct MT: Berger & Tung 77; Omura & Housewright 77; Oohama 97, 05 (Gaussian case); Wagner 05 (two Gaussian sources); Yamamoto & Itoh 80
- Indirect MT: Flynn & Gray 87; Viswanathan & Berger 97 (CEO); Oohama 98 (Gaussian CEO); Viswanath 02; Chen, Zhao, Berger & Wicker 03; Oohama 05 (jointly Gaussian case)
Lossless:
- Slepian & Wolf 73; Wolf 74 (multiple sources); Cover 75 (ergodic processes); Wyner & Gray 74, 75 (simple network); Ahlswede & Körner 75; Sgarro 77 (two-help-one); Körner & Marton 77 (zig-zag network); Gel'fand & Pinsker 79 (lossless CEO problem); Csiszár & Körner 80; Han & Kobayashi 80 (lossless MT network); Han 80; Song & Yeung 01 (sources over the network)

DSC: Code Design Guidelines and Coding Solutions
Source coding with side information: lossless compression of X at R_X ≥ H(X|Y) with side information Y at the decoder (corner point A of the SW rate region; corner point B is the mirror case).

Compression of Correlated Sources: Disjoint Sets (Bins)
Channel codes for compression: algebraic binning (Wyner 74, Zamir et al. 02).
Distribute all possible realizations of X (of length n) into bins, where each bin is a coset of an (n,k) linear channel code C with parity-check matrix H, indexed by a syndrome s (s_1, s_2, ..., s_{2^{n−k}}).
- Encoding: for an input x, form the syndrome s = xH^T and send it to the decoder.
- Decoding: interpret y as a noisy version of x, i.e., the output of a virtual communication channel (the "correlation channel") with input x. Find the codeword of the coset indexed by s closest to y by conventional channel decoding.

An Example (Pradhan & Ramchandran 99)
Correlation model: X and Y are discrete binary sources of length n = 3 bits that differ in at most one position (d_H(X,Y) ≤ 1). If Y is also given to the encoder (joint encoding), obviously we can compress X with 2 bits.
Take C = {000, 111}, the (3,1) repetition code, with parity-check matrix H = [1 1 0; 1 0 1]. The four cosets (bins) are:
- x ∈ {000, 111}: s = xH^T = 00
- x ∈ {001, 110}: s = xH^T = 01
- x ∈ {010, 101}: s = xH^T = 10
- x ∈ {100, 011}: s = xH^T = 11
The two members of each bin differ in all three positions, so the decoder can always resolve x from the 2-bit syndrome together with the side information y.

Another Example: Hamming Code
Two uniformly distributed sources X and Y, length n = 7 bits, correlation d_H(X,Y) ≤ 1 bit.
Slepian-Wolf bound: R_X + R_Y = H(X,Y) = 10 bits.
Asymmetric SW coding: R_Y = H(Y) = 7 bits, R_X = H(X|Y) = 3 bits.
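The 3-bit binning example can be reproduced in a few lines. A sketch: the parity-check matrix below is one valid choice for the repetition code (picked so the syndromes match the bin labels in the example):

```python
import itertools

# Bins as cosets of the (3,1) repetition code C = {000, 111}.
# H is one valid parity-check matrix of C.
H = [[1, 1, 0],
     [1, 0, 1]]

def syndrome(x):
    """s = H x^T over GF(2)."""
    return tuple(sum(h * xi for h, xi in zip(row, x)) % 2 for row in H)

bins = {}
for x in itertools.product((0, 1), repeat=3):
    bins.setdefault(syndrome(x), []).append(x)

# The two members of each bin differ in all 3 positions, so side
# information y with d_H(x, y) <= 1 always resolves x within a bin.
for a, b in bins.values():
    assert sum(ai != bi for ai, bi in zip(a, b)) == 3
```

The encoder sends only the 2-bit syndrome; the decoder picks the bin member within Hamming distance 1 of its side information y.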

Asymmetric SW Coding with the (7,4) Hamming Code
Setup: a syndrome former s^T = Hx^T at the encoder of X; a conventional channel decoder at the receiver; y acts as the output of the correlation channel with input x.
Systematic (7,4) Hamming code C (can correct one bit error): G_{4×7} = [I_4 P], H_{3×7} = [P^T I_3].
Encoding: Y is sent uncompressed (7 bits); for X, transmit only the syndrome s^T = Hx^T (3 bits instead of 7).
Decoding:
1) Form a 7-length vector t_1 with syndrome s, e.g., t_1 = [0 0 0 0 | s] (valid because H = [P^T I_3]), and let t_2 = y.
2) Compute t = t_1 ⊕ t_2. Since x ⊕ t_1 is a codeword of C, t = (x ⊕ t_1) ⊕ e with correlation noise e = x ⊕ y, where d_H(x,y) ≤ 1.
3) Error-correction decoding: find the codeword c ∈ C closest to t; this recovers c = x ⊕ t_1.
4) Reconstruction: x̂ = c ⊕ t_1 = x.

Asymmetric Syndrome Concept
Take an (n,k) linear channel code to partition the space of n-length source realizations into cosets indexed by different syndromes (of length n−k bits).
- Systematic channel coding adds redundancy (n−k parity bits) to protect k information bits.
- Syndrome-based compression removes redundancy: n source bits are represented by an (n−k)-bit syndrome. Compression rate: R = (n−k)/n.

From Asymmetric to Symmetric Binning
Example: suppose X and Y are i.i.d. uniform sources of length n = 4 bits each.
- Asymmetric binning: code Y at R_Y = H(Y) = 4 bits (2⁴ = 16 "bins" of one element each) and X at R_X = H(X|Y) = 2 bits (2² = 4 bins). Total rate R_X + R_Y = 4 + 2 = 6 bits.
- Symmetric binning: code X and Y at R_X = R_Y = 3 bits each (3-bit bin indices for both). Total rate R_X + R_Y = 3 + 3 = 6 bits. Both X and Y are compressed.
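The four decoding steps above can be sketched end to end. This is an illustrative implementation, assuming one standard choice of the systematic (7,4) Hamming matrix P (the tutorial's exact matrices are garbled in the transcript):

```python
import itertools
import random

# Systematic (7,4) Hamming code: G = [I4 | P], H = [P^T | I3].
# This P is one standard choice, assumed for illustration.
P = [[1, 1, 0],
     [1, 0, 1],
     [0, 1, 1],
     [1, 1, 1]]
G = [[int(i == j) for j in range(4)] + P[i] for i in range(4)]
H = [[P[r][c] for r in range(4)] + [int(j == c) for j in range(3)]
     for c in range(3)]

def mat_vec(M, v):
    return tuple(sum(m * x for m, x in zip(row, v)) % 2 for row in M)

codebook = [tuple(sum(u[i] * G[i][j] for i in range(4)) % 2 for j in range(7))
            for u in itertools.product((0, 1), repeat=4)]

def sw_encode(x):
    return mat_vec(H, x)                     # 3-bit syndrome instead of 7 bits

def sw_decode(s, y):
    t1 = (0, 0, 0, 0) + s                    # a vector with syndrome s, since H = [P^T | I3]
    t = tuple(a ^ b for a, b in zip(t1, y))  # = (x ^ t1) ^ e, with e = x ^ y
    c = min(codebook,                        # conventional nearest-codeword decoding
            key=lambda cw: sum(a != b for a, b in zip(cw, t)))
    return tuple(a ^ b for a, b in zip(c, t1))

random.seed(1)
x = tuple(random.randint(0, 1) for _ in range(7))
y = list(x)
y[3] ^= 1                                    # correlation channel: d_H(x, y) <= 1
assert sw_decode(sw_encode(x), tuple(y)) == x
```

Since the Hamming code corrects any single error, decoding succeeds for every x and every y within Hamming distance 1 of it: 3 transmitted bits instead of 7.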

Implementation (Pradhan & Ramchandran 05)
Generate a channel code C and partition it into non-overlapping subcodes C_1 and C_2, where C_2 is a set of coset representatives of C_1 in C. Assign subcode C_i to encoder i, i = 1, 2.

Hamming Code Example Revisited
Two uniformly distributed sources X and Y; length n = 7 bits; correlation d_H(x,y) ≤ 1 bit; Slepian-Wolf bound R_X + R_Y = H(X,Y) = 10 bits.
- Asymmetric coding: R_Y = 7 bits, R_X = 3 bits
- Non-asymmetric coding: R_Y = 6 bits, R_X = 4 bits
- Symmetric coding: R_X = R_Y = 5 bits!
All are obtained from the same systematic (7,4) Hamming code C with G_{4×7} = [I_4 P], by splitting G into two generator matrices. For the (4,6) rate split: G_1 = [I_3 O_{3×1} P1_{3×3}], G_2 = [O_{1×3} I_1 P2_{1×3}]; for the (5,5) split: G_1 = [I_2 O_{2×2} P1_{2×3}], G_2 = [O_{2×2} I_2 P2_{2×3}]. Matching parity-check matrices H_1 and H_2 are formed so that encoder 1 transmits the syndrome s_x^T = H_1 x^T and encoder 2 transmits s_y^T = H_2 y^T (5 bits each in the symmetric case). The decoder forms padded 7-length vectors t_1 and t_2 from the received syndromes, finds the codeword c ∈ C closest to t = t_1 ⊕ t_2 by error-correction decoding, and reconstructs both sources: x̂ = u_1 G_1 ⊕ t_1 = x and ŷ = u_2 G_2 ⊕ t_2 = y.

General Syndrome Concept
Applicable to all linear channel codes (incl. turbo and LDPC codes). The key lies in correlation modeling: if the correlation can be modeled with a simple communication channel, existing channel codes can be used. The SW code will be good if the employed channel code is good for the correlation channel; if the channel code approaches capacity for the correlation channel, then the SW code approaches the SW limit. Complexity is close to that of conventional channel coding.

Parity-based Binning
- Syndrome approach: to compress an n-bit source, index each bin with a syndrome of a linear (n,k) channel code.
- Parity-based approach: to compress an n-bit source, index each bin with the n−k parity bits p of a codeword of a systematic (2n−k, n) channel code. Compression rate: R = (n−k)/n. The decoder runs a conventional channel decoder on [y p], treating y as the systematic part received over the correlation channel.

Syndrome vs. Parity Binning
The syndrome-based approach works better because:
- the code distance property is preserved
- for the same compression ratio, the minimum codeword size is used
- a good channel code gives a good SW code of the same performance
Parity-based binning has advantages:
- good for the noisy SW coding problem because, in contrast to syndromes, parity bits can protect
- simpler (conventional encoding and decoding)
- a simple puncturing mechanism can be used to realize different coding rates

Practical Wyner-Ziv Coding (WZC)
Practical SW coding uses algebraic binning based on channel codes for discrete sources. In WZ coding we deal with a continuous space, hence the syndrome approach alone will not work! Questions: what is a good choice of binning? How to perform coding efficiently?
Three types of solutions have been proposed:
- nested quantization
- combined quantization and SW coding (DISCUS, IEEE Trans. Inform. Theory, March 2003)
- quantization followed by SW coding (Slepian-Wolf coded quantization, SWCQ), with improved performance
We focus on the first and third methods, and assume the correlation model X = Y + Z with Z ~ N(0, σ²_Z).

Nested Scalar Quantization (NSQ)
A uniform scalar quantizer with step size q can be seen as four nested uniform scalar quantizers (bins 0, 1, 2, 3), each with step size 4q. D is the fine distortion that remains after decoding; d is the coarse distortion that can appear if there are decoding errors. The decoder resolves the bin using the side-information density p(y|x).

Nested Lattice Quantization (NLQ)
A (fine) lattice is partitioned into sublattices (coarse lattices). A bin is the union of the original Voronoi regions of the points of one sublattice.
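Nested scalar quantization can be sketched in a few lines. A minimal illustration, assuming a step size q = 1.0 and nesting ratio N = 4 (illustrative values, not from the tutorial):

```python
# Sketch of nested scalar quantization (NSQ) with nesting ratio N = 4:
# the encoder sends only the fine-quantizer index modulo N (the bin index,
# log2(N) bits); the decoder resolves the bin with side information y.
q, N = 1.0, 4   # illustrative step size and nesting ratio

def nsq_encode(x):
    return round(x / q) % N        # coset (bin) index of the fine lattice point

def nsq_decode(bin_idx, y):
    # pick the fine-lattice point in the bin closest to the side info y
    k0 = round(y / q)
    candidates = [k0 + d for d in range(-N, N + 1) if (k0 + d) % N == bin_idx]
    return q * min(candidates, key=lambda k: abs(q * k - y))
```

Decoding is correct whenever |x − y| is small relative to the coarse step Nq; if the correlation noise exceeds that, the coarse distortion d appears, which is exactly the fine/coarse trade-off described above.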

Nested Lattice Quantization
- Encoding: output the index of the bin containing X: quantize X using the fine lattice, then output the index V of the coarse sublattice containing the quantized lattice point.
- Decoding: find the lattice point of sublattice V that is closest to Y, i.e., quantize Y using sublattice V.

SW Coded Quantization (SWCQ)
Nested lattice quantization is asymptotically optimal as the dimension goes to infinity, but difficult to implement even in low dimensions. Moreover, the bin index V and the side information Y are still highly correlated, i.e., H(V) > H(V|Y). Conventional lossless compression techniques (e.g., Huffman coding) are fruitless since Y is not given to the encoder, so use SW coding to further compress V! Further improvement: use estimation instead of reconstructing to a lattice point. The chain is: nested quantization → SW encoding → SW decoding → estimation of X̂ from V and Y.

Practical SWCQ
Practical realization: (nested) quantization followed by channel coding for SW coding. WZ coding is a source-channel coding problem: there is a quantization loss due to source coding and a binning loss due to channel coding. To approach the WZ limit, one needs strong source codes (e.g., TCVQ and TCQ) and near-capacity channel codes (e.g., turbo and LDPC). Estimation of X based on V and the side information Y helps at low rate; thus rely more on Y at lower rates and on V at higher rates.

WZC vs. Classic Source Coding
Classic entropy-constrained quantization (ECQ) = quantization + entropy coding; Wyner-Ziv coding (SWCQ) = nested quantization + Slepian-Wolf coding. Nested quantization is quantization with side information; Slepian-Wolf coding is entropy coding with side information. Classic source coding is just a special case of WZ coding (since the SI can be assumed to be a constant).
High-rate gaps (assuming ideal entropy coding and ideal SW coding):
- Classic SC, gap to D(R): ECSQ 1.53 dB; ECLQ (2-D) 1.36 dB; ECTCQ 0.2 dB
- WZC, gap to D_WZ(R): SWC-SQ 1.53 dB; SWC-LQ 1.36 dB; SWC-TCQ 0.2 dB
Same performance limits at high rate!
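The 1.53 dB entry in the table is the familiar high-rate gap of (entropy-coded) uniform scalar quantization to the distortion-rate bound, 10·log10(2πe/12). A one-line check:

```python
import math

# High-rate gap of uniform scalar quantization to the distortion-rate
# bound: D_SQ / D(R) -> 2*pi*e/12 at high rate, i.e. about 1.53 dB.
gap_scalar_db = 10 * math.log10(2 * math.pi * math.e / 12)
```

The 1.36 dB (2-D lattice) and 0.2 dB (TCQ) entries are, likewise, the granular gains those quantizers recover toward the 2πe bound.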

Layered WZ Coding
Quantize X to V and split V into bitplanes V_1, V_2, ..., V_n. Each bitplane is compressed with a binary SW encoder and decoded successively: the side information at the k-th level is y_k = (y', V_1, ..., V_{k−1}). The decoded bitplanes are recombined into V̂ and used to estimate X̂.

LDPC Codes for Binary SW Coding
An LDPC code is a linear block code; LDPC stands for low-density parity-check, where low-density means its parity-check matrix is sparse. Decoding is by the message-passing algorithm: suboptimal but effective. Pros: flexible and systematic design techniques exist for arbitrary channels, and the designed codes have excellent performance.

Tanner Graph
Consider a length-3 block code whose parity-check matrix imposes the checks V_1 + V_2 = 0 and V_1 + V_2 + V_3 = 0. A binary vector V = [V_1, V_2, V_3] is a codeword if HV^T = 0. The Tanner graph connects each variable node (V_1, V_2, V_3) to the check nodes it participates in.

Message Passing Decoding
1. Initialization: compute the belief of the actual transmitted bit at each variable node, usually as the log-likelihood ratio log[p(y|V=0)/p(y|V=1)] from the channel output.
2. Iteration: pass beliefs from variable nodes to check nodes and combine them; pass beliefs from check nodes to variable nodes and combine them.
3. Exit condition: estimate the variable node values by thresholding the current beliefs; exit if the estimates form a valid codeword, otherwise go back to 2.
If all messages are assumed independent, the updates are: at a check node, tanh(Φ/2) = Π_i tanh(m_i/2); at a variable node, Ψ = log[p(y|V=0)/p(y|V=1)] + m_1 + m_2 + ... Message-passing decoding performs well for long-blocklength codes with relatively few connections (low density).

SW Encoding with LDPC Codes
Encoding: output the check (syndrome) values S of the uncompressed binary source V. Compression rate: R = (n−k)/n = 4/6 = 2/3 in the pictured example.
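The length-3 Tanner-graph example above corresponds to a concrete parity-check matrix, which can be enumerated directly:

```python
# The length-3 code above: checks V1 + V2 = 0 and V1 + V2 + V3 = 0 over
# GF(2), i.e. parity-check matrix H = [[1,1,0],[1,1,1]].
H = [[1, 1, 0],
     [1, 1, 1]]

def is_codeword(v):
    """v is a codeword iff H v^T = 0, i.e. every check is satisfied."""
    return all(sum(h * x for h, x in zip(row, v)) % 2 == 0 for row in H)

codewords = [v
             for v in ((a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1))
             if is_codeword(v)]
# The checks force V1 = V2 and V3 = 0, leaving (0,0,0) and (1,1,0).
```

With 2 checks on 3 bits, only 2 of the 8 binary vectors survive, matching the two degrees of freedom removed by the Tanner graph's check nodes.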

SW Decoding with LDPC Codes
Decoding: view the side information y as hypothetical outputs of a channel with input v; feed the received syndrome bits S in as check-node values; decode to the code vector with syndrome S instead of to a codeword.
With independent messages, the check-node update becomes tanh(Φ/2) = (1 − 2s) Π_i tanh(m_i/2), where s is that check's syndrome bit, and the variable-node update stays Ψ = log[p(y|V=0)/p(y|V=1)] + m_1 + m_2 + ... When v is a codeword of the LDPC code (s is the all-zero sequence), SW decoding reduces to ordinary LDPC channel decoding.

SW Code Analysis
Major assumption: the SW decoding error probability is independent of the source realization v (the same assumption as in conventional channel coding). If the assumption holds, the performance of SW coding equals the performance of SW coding given v = 0 (hence s = 0) transmitted, which equals the performance of the LDPC code on the channel V → Y. We can therefore design SW codes using conventional LDPC code design techniques on the channel V → Y.

Sign Symmetry
Code performance is independent of the input code vector if p(y|V=0) = p(−y|V=1). Such symmetry commonly holds in conventional channels (e.g., the BSC). The best decision is V̂ = 0 if y ≥ 0 and V̂ = 1 if y < 0; by the symmetry, the error region when V = 0 mirrors the error region when V = 1, so the error probability is independent of the input.

Relaxing Sign Symmetry
Sign symmetry is too restrictive: it will not work for the layered setup, where the side information y = (y', V_1) is not even a scalar (an n-tuple in general!). There are obvious non-sign-symmetric distributions p(y|V=0), p(y|V=1) for which code performance is still independent of the input.
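The syndrome-adjusted tanh rule above is a one-line modification of the standard check-node update. A sketch:

```python
import math

# Check-node update for syndrome-based SW decoding: incoming LLR messages
# m_i combine as tanh(phi/2) = (1 - 2*s) * prod_i tanh(m_i/2), where s is
# that check's received syndrome bit. With s = 0 this is the standard LDPC
# check-node rule; s = 1 simply flips the sign of the outgoing belief.
def check_node(msgs, s):
    prod = 1.0 - 2.0 * s
    for m in msgs:
        prod *= math.tanh(m / 2.0)
    prod = max(min(prod, 1.0 - 1e-12), -1.0 + 1e-12)  # numerical safety for atanh
    return 2.0 * math.atanh(prod)
```

For example, two confident "zero" messages (positive LLRs) reinforce a zero when s = 0, and produce an equally confident "one" when s = 1, which is exactly why decoding with an all-zero syndrome coincides with ordinary LDPC channel decoding.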

Dual Symmetry
If p(g(y)|V=0) = p(y|V=1) for some g(·) with g(g(y)) = y (i.e., g(·) = g⁻¹(·)), then the probability of error is independent of the input codeword.
Why does dual symmetry work? Let l be the log-likelihood ratio of the output, l = log[p(y|V=0)/p(y|V=1)]. There is no loss in taking l as the output instead of y, and it is easy to show that if dual symmetry is satisfied then l is sign-symmetric; hence code performance is independent of the input.

Toy Problem
Let X ~ N(µ, 1) be the source and Y = X + N(0, σ²) the side information. Q: given 1 bit to quantize X, what is a good quantizer for SW coding? Answer: trust your intuition! Let V = 0 if X ≥ µ and V = 1 if X < µ; then V is dual-symmetric (take g to be the reflection about µ), even though p(y|V=0) and p(y|V=1) are not sign-symmetric.

Simulation Results (overview)
Asymmetric SW; non-asymmetric SW; quadratic Gaussian WZ with a 1-D lattice, a 2-D lattice, and trellis-coded quantization (TCQ); MT source coding.

[Simulation plots: BER vs. H(X_i|X_j) for asymmetric binning for SW with regular and irregular LDPC codes and the best non-syndrome turbo codes (serial and parallel), codeword length 20,000 bits, shown against the Slepian-Wolf limit and the irregular LDPC threshold; non-asymmetric binning for SW with LDPC and turbo codes (code rate 1/2 for two sources); Gaussian WZC with NSQ (1-D lattice), a 2-D nested lattice (about 1.53 dB from the limit), and TCVQ, shown against the Wyner-Ziv limit.]

MT Source Code Design
Terminal 1 observes Y_1 = X + N_1 and Terminal 2 observes Y_2 = X + N_2. Each terminal applies conventional quantization (Quantizer 1, Quantizer 2) followed by lossless non-asymmetric Slepian-Wolf coding of the quantization indices V_1 and V_2 at rates R_1, R_2; the central unit SW-decodes V_1, V_2 and estimates the sources (Yang, Stankovic, Xiong, Zhao, IEEE Trans. Inform. Theory, March 2008).

Gaussian MT Coding Results (with TCQ)
Direct MT: D_1 = D_2 = −30 dB, ρ = 0.99. Indirect MT: D = 0 dB, σ_n1 = σ_n2 = 1/99.

Literature on practical code designs
- Slepian-Wolf coding: Pradhan & Ramchandran 99 (DISCUS); Bajcsy & Mitran 01; Aaron & Girod 02; Garcia-Frias et al. 01, 03; Schonberg et al. 02, 04; Liveris et al. 02, 03; Coleman et al. 04; Li et al. 04; Stanković et al. 04; Sartipi & Fekri 07; Eckford & Yu 07 (rateless)
- Wyner-Ziv coding: Chou et al. 03; Liveris et al. 04; Yang et al. 04; Cheng & Xiong 05
- Direct MT coding: Yang, Stanković, Xiong, Zhao 05, 07
- Indirect MT coding: Flynn & Gray 88 (index reuse); Pradhan & Ramchandran 05 (SQ + trellis codes); Yang, Stanković, Xiong, Zhao 04, 07 (asymmetric scheme) and 05, 07 (source-splitting and symmetric scheme)
- MT networks: Stanković et al. 04
- Distributed source-network coding: Xu, Stanković, Xiong, Kung 05

A Step Back: Reality
Despite great recent theoretical achievements, no commercial product yet exploits DSC in any way. Practical limitations:
- real sources are not Gaussian, though they can often be approximated as Gaussian
- correlation statistics vary and are difficult to model and to track/predict
- good performance requires long channel-code block lengths, which brings delay and implementation constraints (e.g., memory)

DSC: Key Applications
Distributed (WZ) video coding; stereo video coding; multimedia streaming over heterogeneous networks; wireless sensor networks.

Conventional Video Coding
Predictive (interframe) video encoding uses the previous frame at the encoder; predictive decoding mirrors it. High-complexity encoding (TV station, strong server), low-complexity decoding (TV, computer, cell phone).

Distributed Video Coding (DVC)
The video encoder compresses the current frame without access to the previous frame; the decoder uses a previous frame as side information. The encoder does not need to know the SI (Witsenhausen & Wyner, 78). This yields low-complexity encoding (cell phone, web-cam) and high-complexity decoding (computer server). Low-complexity network: cell → server (converts to H.264) → cell.

Reported DVC Coders
- Stanford's group: pixel-based DVC, DCT-based DVC
- Berkeley's group: PRISM (DCT-based)
- TAMU's group: scalable DVC
- Many extensions/improvements of the above, e.g., by the DISCOVER partners (UPC Spain, IST Portugal, EPFL Switzerland, UH Germany, INRIA France, UNIBS Italy)

Pixel-Domain DVC (Aaron, Zhang, Girod, 2002; Aaron, Rane, Zhang, Girod, 2003)
Even ("Wyner-Ziv") frames X_2i are scalar-quantized to q_2i and turbo-encoded for SW coding; parity bits p_2i are sent, with a feedback channel from the decoder requesting more parity as needed. Key frames X_2i−1, X_2i+1 are conventionally intraframe-coded (H.26x). The decoder generates side information Y_2i by motion-compensated interpolation of the key frames, turbo-decodes, and estimates the reconstruction X̂_2i.
Example results: side information from motion-compensated interpolation at PSNR 30.3 dB; after Wyner-Ziv decoding with 16-level quantization, PSNR 36.7 dB. Another example: SI at PSNR 24.8 dB; after Wyner-Ziv decoding with 16-level quantization, PSNR 36.5 dB.

DCT-Domain Encoder (Aaron, Rane, Setton, Girod, 2004)
The input video frame W is DCT-transformed (with block interleaving for unequal error protection). For each low-frequency coefficient band, coefficients are quantized with 2^{M_i}-level quantizers, bit-planes 1 through M_i are extracted and turbo-coded, and Wyner-Ziv parity bits p_i are sent (with feedback from the decoder). High-frequency coefficient bands are compared against the quantized high-frequency bands of the previous frame and, where the difference is below a threshold T, skipped, otherwise quantized and entropy-coded into a conventionally compressed bitstream. Key frames K are intraframe-encoded.

DCT-Domain Decoder (Aaron, Rane, Setton, Girod, 2004)
For each low-frequency band, the decoder turbo-decodes the Wyner-Ziv parity bits p_i (feeding back rate requests to the encoder) using side information obtained by motion-compensated extrapolation from the previously decoded frame, reconstructs the coefficients q̂_i, and refines the side information. The high-frequency bits are entropy-decoded and dequantized, the key frames K are intraframe-decoded, and the IDCT yields the decoded frame Ŵ.
Results (100 frames of the Salesman QCIF sequence at 10 fps, every 8th frame a key frame): DCT-based intracoding, 149 kbps at PSNR 30.0 dB; Wyner-Ziv DCT codec, 152 kbps at PSNR 35.6 dB.

PRISM (Puri, Ramchandran, 2002; Puri, Majumdar, Ramchandran, 2007)
The input frame is DCT-transformed and zig-zag scanned. High DCT coefficients are entropy-coded traditionally; low DCT coefficients are scalar-quantized and SW-coded, with a CRC appended. Based on the correlation with previous frames, the encoder chooses the compression mode per block: skip, SW, or intraframe. The decoder runs a motion search over candidate predictors from the previous decoded frame, SW-decodes with each candidate as side information, and uses the CRC to verify correct decoding before dequantization and reconstruction of the output video.
PRISM results: CIF Football and QCIF Stefan sequences.

Latest Developments
- Performance improvement: Stanford's DCT-based architecture (with the DISCOVER improvements) outperforms H.264/AVC intra-coding; PRISM outperforms H.263+
- Improved error resilience
- No need for a feedback channel
- Extensions to multi-view video

DISCOVER Results
QCIF Hall Monitor and QCIF Foreman: much lower encoding complexity than H.264/AVC intra, with comparable decoding complexity.

Robust Scalable DVC (Xu, Stankovic, Xiong, 2005)
Error-resilient Wyner-Ziv coding of a video x:
1. Encode x at very low bitrate with H.26x and send it as a base layer over the erasure channel using strong error protection.
2. The decoder decodes the received base-layer stream to obtain side information y.
3. x is compressed/protected again with a Raptor code (DCT + SQ + IRA encoder + LT encoder), assuming y as SI and an erasure packet transmission channel.
4. The decoder decodes x using y as SI (Raptor decoding + estimation).

Channel Code Design
k source symbols are Raptor-encoded into about kH(X|Y)(1+ε) symbols. This achieves efficient transmission over two different parallel channels: the actual erasure channel and the correlation (Gaussian) channel between x and y. Parity-based binning is called for! Low-complexity encoding and decoding is required, hence Raptor codes rather than Reed-Solomon codes. Joint IRA + LT Raptor design: a bias p towards selecting IRA parity symbols vs. systematic symbols when forming the bipartite graph of the LT code, plus a-priori information from the SI.

Simulation Example (Xu, Stankovic, Xiong, 2005)
Transmission rate 256 kbps; 5% macroblock loss rate in the base layer; 10% packet loss rate for the WZ coding layer; the scalable DVC system is compared against H.264 FGS.

Stereo Video Coding (Yang, Stankovic, Zhao, Xiong, 2007)
The same scene is encoded independently by two cameras; the high correlation among the views can be exploited with MT source coding. Both views are compressed with TCQ + LDPC codes using the MT source coding scheme (Camera 1 → MT encoder 1, Camera 2 → MT encoder 2, joint MT decoding at the base station), evaluated on the Tunnel stereo video sequence against H.264/AVC.

Distributed/Stereo Video Applications
A new attractive video compression paradigm: video surveillance, low-complexity networks, very-low-complexity robust video coding, multiview/3D video coding.

MT Coding-based Streaming
Live broadcast over heterogeneous networks faces scarce wireless bandwidth, error-prone wireless links, and heavy traffic, while demanding economical feasibility and quality of service. Requirements: efficient low-bitrate video coding (e.g., H.264/MPEG-4); strong error protection; extremely high compression and efficient congestion control; fast encoding/decoding; QoS at digital-TV video quality; fit with current technologies (HDTV, best-effort Internet, DVB-S/DVB-T). Setup: two terminals observe noisy versions Y_1 = X + N_1 and Y_2 = X + N_2 of the video X and deliver rates R_1, R_2 over the network to the client (central unit).

Advantages of the System
Correlation model: Y_1 = X + N_1, Y_2 = X + N_2, coded at rates R_1, R_2 and delivered over the Internet to DSC decoding plus video decompression. Significant rate savings due to spatial diversity gain and DSC (without distortion penalty). Downloading from multiple servers: servers evenly loaded, robustness to a server failure, security. Source-independent transmission (not limited to video or multimedia), no need for cross-layer optimization, acceptable system complexity and flexible design.

Results: AWGN Channel (Stankovic, Yang, Xiong, 2007)
DSC with scalar quantization (SQ) + turbo/LDPC codes compared against uncoded BPSK, convolutional codes plus BPSK, and turbo codes without DSC, relative to the BPSK+SQ bound.

Multi-station Wireless Streaming
Two base stations stream the same/correlated content to a terminal: increased quality of the reconstructed video due to path diversity gain, resilience to station failure, and improved traffic control because the servers can be equally loaded. Problems: multipath fading, interference, noise. The conventional solution, spread spectrum, increases the required bandwidth.

Solution (Khirallah, Stankovic et al., IEEE Trans. Wireless Commun., 2008)
Idea: exploit the fact that the two stations stream the same/correlated content, extending the DSC idea.
- Use complete complementary (CC) sequences for spreading at the base stations (BS).
- At the encoders, puncture some of the output chips to reduce the rate.
- At the decoder, recover the punctured chips using the chips received from the other station as SI.
CC sequence sets of size N = 2 with K_CC = 4 chips per code (signs reconstructed from the standard Golay pair; the transcript's signs are garbled):
- Base station 1: w_11 = [+, +, +, −], w_12 = [+, +, −, +]
- Base station 2: w_21 = [+, −, +, +], w_22 = [+, −, −, −]
Auto-correlation: ψ_11 = w_11 ⋆ w_11 + w_12 ⋆ w_12 = [0, 0, 0, 8, 0, 0, 0], and likewise ψ_22 = w_21 ⋆ w_21 + w_22 ⋆ w_22 = [0, 0, 0, 8, 0, 0, 0]. Cross-correlation: ψ_12 = w_11 ⋆ w_21 + w_12 ⋆ w_22 = [0, 0, 0, 0, 0, 0, 0].
Input data is broken into blocks b_1 = (b_{1,1}, ..., b_{1,4}) for user 1 and b_2 = (b_{2,1}, ..., b_{2,4}) for user 2; the transmitted chip streams are c_11 = f(b_1, w_11), c_12 = f(b_1, w_12) at base station 1 and c_21 = f(b_2, w_21), c_22 = f(b_2, w_22) at base station 2.
Problem: puncturing leads to loss of orthogonality, so conventional CC decoding is not feasible.

Iterative Recovery at the Terminal
Received signal at frequency f_1: r_1 = h_1 c_11 + h_2 c_21 + z_1, and at frequency f_2: r_2 = h_1 c_12 + h_2 c_22 + z_2, where h_1, h_2 are fading coefficients, z_1 and z_2 are AWGN, and l denotes the iteration number. The terminal despreads r = {r_1, r_2} for each base station to form rough estimates b̂_1(l), b̂_2(l), re-spreads them with w_11, w_12 (and channel estimates ĥ_1, ĥ_2) to reconstruct the punctured chips ĉ_11(l), ĉ_12(l), and iterates to obtain b̂_1(l+1).

Results (Khirallah, Stankovic et al., IEEE Trans. Wireless Commun., 2008)
Relative speeds of 30 km/h and 120 km/h. CPR is the channel power ratio (the average power ratio between the second and first path): CPR = 10 dB for a frequency-selective channel, CPR = 0 for a flat-fading channel. p = 0 corresponds to streaming two identical sources; p > 0 to streaming two sources correlated by a binary symmetric channel with crossover probability p.

Wireless Sensor Networks (WSN)
Networks of numerous tiny, low-power and low-cost devices. Key requirement: reduce power consumption by reducing communication via efficient distributed compression. In a dense sensor network, measurements of neighbouring sensors are expected to be correlated, hence DSC is the most efficient compression choice. R. Cristescu et al.: networked SW (joint optimization of placement, routing, and compression in WSN). J. Liu et al.: optimal communication cost in WSN.

The Relay Channel
Task: transmit messages m from the source to the destination with the help of a relay; noisy wireless channels are assumed for all three links.
Amplify-and-forward (AF) coding: the relay receives r = X + Z_sr and forwards A·r; the destination receives d_1 = X + Z_sd and d_2 = A·r + Z_rd = AX + AZ_sr + Z_rd.

The Relay Channel: Decode-and-Forward

DECODE-AND-FORWARD (DF) CODING: the relay decodes the message m from r = X + Zsr, re-encodes it, and retransmits, so that the destination receives d1 = X + Zsd and d2 = X + Zrd. Problem: the rate is limited by the capacity of the source-relay link! Solution: avoid decoding at the relay node!

The Relay Channel: Compress-and-Forward (Stankovic et al., SPM 2006)

COMPRESS-AND-FORWARD (CF) CODING: the relay does not decode its observation r = X + Zsr. Instead it applies DSC (Wyner-Ziv coding of r, with the destination's direct observation d1 = X + Zsd acting as decoder side information) followed by error protection, and forwards the resulting index W. The destination performs channel decoding to recover W, WZ decoding to recover r, and finally combines r with d1 (maximal-ratio combining, MRC) to decode m.

Half-Duplex Gaussian Relay Channel: The Rate Bounds (Liu, Uppal, Stankovic, and Xiong, ISIT 2008)

With a good source-relay channel, DF performs better; with a bad source-relay channel, CF performs better and DF does not help.

Practical CF Code Design: The Source and the Relay

The source transmits during the relay-receive period αT and during the relay-transmit period (1-α)T, protected by two LDPC codes (LDPC1 and LDPC2). The relay observes r = s1 + Zsr and performs joint DSC and error protection using a single IRA code.
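The Wyner-Ziv step inside CF can be illustrated with the simplest binning construction mentioned elsewhere in the deck: syndrome coding with a (7,4) Hamming code. This is a toy stand-in for the actual quantization + LDPC/IRA design: the relay transmits only the 3-bit syndrome of each 7-bit block, and the destination recovers the block from side information assumed to differ in at most one position.

```python
# Sketch: syndrome-based binning with a (7,4) Hamming code, the simplest
# instance of the SW/WZ compression used at the CF relay. Illustrative only.

# Parity-check matrix H of the (7,4) Hamming code: column j holds the
# 3-bit binary representation of j+1, most significant bit first.
H = [[(j + 1) >> i & 1 for j in range(7)] for i in range(2, -1, -1)]

def syndrome(bits):
    return tuple(sum(h * b for h, b in zip(row, bits)) % 2 for row in H)

def sw_decode(s, side_info):
    """Recover x from its syndrome s and side info y differing in <= 1 bit."""
    diff = tuple(a ^ b for a, b in zip(s, syndrome(side_info)))
    if diff == (0, 0, 0):
        return list(side_info)                 # y already equals x
    pos = int("".join(map(str, diff)), 2) - 1  # column of H matching diff
    return [b ^ (i == pos) for i, b in enumerate(side_info)]

x = [1, 0, 1, 1, 0, 0, 1]   # relay's 7-bit block
y = [1, 0, 1, 0, 0, 0, 1]   # destination side info: one bit flipped
s = syndrome(x)             # only 3 bits are transmitted instead of 7
x_hat = sw_decode(s, y)
print(s, x_hat)             # x_hat reproduces x exactly
```

The syndrome identifies the bin (coset) containing x; within that bin, the side information singles out the correct codeword, which is precisely how the destination's d1 substitutes for bits the relay never sends.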

Practical CF: The Destination

The destination decodes s1 from d1 = s1 + Zsd using LDPC1, and decodes the multiple-access output of the relay-transmit period, d2 = s2 + xr + Zrd (MAC), using the IRA and LDPC2 decoders, where xr denotes the relay's transmitted signal, to recover the message m.

Results: Half-Duplex AWGN Relay (Liu, Uppal, Stankovic, and Xiong, ISIT 2008)

The comparison includes the scheme's bound with optimal α, the scheme's bound with α = 0.5, the practical BPSK CF design with α = 0.5, and the BPSK upper bound.

Receiver Cooperation: Great Promise of CF

Rate (bit/s/Hz) versus received SNR, for the 2-Rx-antenna upper bound, DF, CF, and the no-cooperation upper/lower bounds: almost 2 dB gain of CF over DF!

Final Remarks

- Concept of distributed source coding: SW, WZ, and MT coding
- Code designs based on channel codes applied over a correlation channel
- SW: Hamming, Turbo, LDPC, Raptor codes; WZ: nested quantization + SW
- Many practical challenges remain; still a long way to go (a lot of information theory, not much practice)
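The "nested quantization + SW" recipe in the final remarks can be sketched in one dimension (the lattice steps below are illustrative assumptions): the encoder quantizes x to the integer lattice but transmits only the coset index modulo the coarse lattice 4Z, i.e. 2 bits per sample, and the decoder picks the point of that coset closest to its side information y. Recovery of the quantized value is exact whenever the side information lies within half the coarse spacing of it.

```python
# Sketch: 1-D nested-lattice (coset) Wyner-Ziv coding. Fine lattice = Z,
# coarse lattice = 4Z, so only 2 bits (the coset index) are sent per sample.
# The step sizes are illustrative assumptions.

M = 4  # nesting ratio: coarse step / fine step

def wz_encode(x):
    """Quantize to the fine lattice, send only the coset index mod M."""
    return round(x) % M

def wz_decode(idx, y):
    """Pick the fine-lattice point in coset `idx` closest to side info y."""
    base = round(y)
    near = base + ((idx - base) % M)
    return min((near, near - M), key=lambda c: abs(c - y))

x, y = 10.3, 9.6            # correlated source sample and side information
idx = wz_encode(x)          # 2-bit index: 10 mod 4 = 2
x_hat = wz_decode(idx, y)   # nearest point to 9.6 in the coset {..., 6, 10, 14}
print(idx, x_hat)           # 2 10
```

The coarse lattice plays the role of the SW bin: many quantizer outputs share one index, and the side information disambiguates among them, just as in the syndrome construction for the lossless case.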

Classic References:

Lossless Distributed Source Coding:
1. D. Slepian and J. K. Wolf, "Noiseless coding of correlated information sources," IEEE Trans. Inform. Theory, vol. IT-19, July 1973.
2. J. K. Wolf, "Data reduction for multiple correlated sources," Proc. 5th Colloquium on Microwave Communication, June 1973.
3. A. D. Wyner, "Recent results in the Shannon theory," IEEE Trans. Inform. Theory, vol. 20, January 1974.
4. R. M. Gray and A. D. Wyner, "Source coding for a simple network," Bell Syst. Tech. J., vol. 53, November 1974.
5. T. Cover, "A proof of the data compression theorem of Slepian and Wolf for ergodic sources," IEEE Trans. Inform. Theory, vol. IT-21, March 1975.
6. A. D. Wyner, "On source coding with side information at the decoder," IEEE Trans. Inform. Theory, vol. IT-21, May 1975.
7. R. F. Ahlswede and J. Korner, "Source coding with side information and a converse for degraded broadcast channels," IEEE Trans. Inform. Theory, vol. IT-21, November 1975.
8. A. Sgarro, "Source coding with side information at several decoders," IEEE Trans. Inform. Theory, vol. IT-23, March 1977.
9. J. Korner and K. Marton, "Images of a set via two channels and their role in multi-user communication," IEEE Trans. Inform. Theory, vol. IT-23, November 1977.
10. S. I. Gel'fand and M. S. Pinsker, "Coding of sources on the basis of observations with incomplete information," Probl. Peredach. Inform., vol. 15, April-June 1979.
11. T. S. Han and K. Kobayashi, "A unified achievable rate region for a general class of multiterminal source coding systems," IEEE Trans. Inform. Theory, vol. IT-26, May 1980.
12. I. Csiszar and J. Korner, "Towards a general theory of source networks," IEEE Trans. Inform. Theory, vol. IT-26, March 1980.
13. T. S. Han, "Slepian-Wolf-Cover theorems for networks of channels," Inform. and Control, vol. 47, 1980.
14. I. Csiszar and J. Korner, Information Theory: Coding Theorems for Discrete Memoryless Systems, Academic Press, 1981.
15. T. Cover and J. Thomas, Elements of Information Theory, New York: Wiley, 1991.
16. L. Song and R. W. Yeung, "Network information flow - multiple sources," in Proc. ISIT-2001 Int'l Symp. Inform. Theory, June 2001.

Lossy Distributed Source Coding:
1. A. Wyner and J. Ziv, "The rate-distortion function for source coding with side information at the decoder," IEEE Trans. Inform. Theory, vol. 22, January 1976.
2. T. Berger, "Multiterminal source coding," in The Information Theory Approach to Communications, G. Longo, Ed., New York: Springer-Verlag, 1977.
3. S. Tung, "Multiterminal Rate-Distortion Theory," Ph.D. dissertation, School of Electrical Engineering, Cornell University, Ithaca, NY, 1978.
4. A. D. Wyner, "The rate-distortion function of source coding with side information at the decoder-II: General sources," Inform. Contr., vol. 38, 1978.
5. H. Yamamoto and K. Itoh, "Source coding theory for multiterminal communication systems with a remote source," Trans. IECE of Japan, vol. E63, October 1980.
6. A. Kaspi and T. Berger, "Rate-distortion for correlated sources with partially separated encoders," IEEE Trans. Inform. Theory, vol. 28, November 1982.
7. C. Heegard and T. Berger, "Rate-distortion when side information may be absent," IEEE Trans. Inform. Theory, vol. IT-31, November 1985.
8. T. Berger and R. Yeung, "Multiterminal source encoding with one distortion criterion," IEEE Trans. Inform. Theory, vol. 35, March 1989.
9. T. Flynn and R. Gray, "Encoding of correlated observations," IEEE Trans. Inform. Theory, vol. 33, November 1987.
10. T. Berger, Z. Zhang, and H. Viswanathan, "The CEO problem," IEEE Trans. Inform. Theory, vol. 42, May 1996.
11. R. Zamir, "The rate loss in the Wyner-Ziv problem," IEEE Trans. Inform. Theory, vol. 42, November 1996.
12. H. Viswanathan and T. Berger, "The quadratic Gaussian CEO problem," IEEE Trans. Inform. Theory, vol. 43, September 1997.
13. Y. Oohama, "Gaussian multiterminal source coding," IEEE Trans. Inform. Theory, vol. 43, November 1997.
14. Y. Oohama, "The rate-distortion function for the quadratic Gaussian CEO problem," IEEE Trans. Inform. Theory, vol. 44, May 1998.
15. R. Zamir and T. Berger, "Multiterminal source coding with high resolution," IEEE Trans. Inform. Theory, vol. 45, January 1999.
16. Y. Oohama, "Multiterminal source coding for correlated memoryless Gaussian sources with several side information at the decoder," Proc. ITW-1999 Information Theory Workshop, Kruger National Park, South Africa, June 1999.
17. S. Servetto, "Lattice quantization with side information," Proc. DCC-2000, Snowbird, UT, March 2000.
18. R. Zamir, S. Shamai, and U. Erez, "Nested linear/lattice codes for structured multiterminal binning," IEEE Trans. Inform. Theory, vol. 48, June 2002.
19. S. Pradhan and K. Ramchandran, "Distributed source coding using syndromes (DISCUS): Design and construction," IEEE Trans. Inform. Theory, vol. 49, March 2003.
20. Y. Oohama, "Rate-distortion theory for Gaussian multiterminal source coding systems with several side informations at the decoder," IEEE Trans. Inform. Theory, vol. 51, July 2005.
21. A. Wagner, S. Tavildar, and P. Viswanath, "The rate region of the quadratic Gaussian two-encoder source-coding problem," IEEE Trans. Inform. Theory, to appear.

Some Research Groups:

DSC Theory: T. Cover, Stanford; Cornell University (T. Berger, A. Wagner, S. Wicker, L. Tong); R. Zamir, Tel Aviv University; Y. Oohama, University of Tokushima; M. Vetterli, EPFL; S. Pradhan, University of Michigan, Ann Arbor; J. Chen, McMaster.

DSC Code Design: K. Ramchandran, Berkeley; Z. Xiong, TAMU; F. Fekri, Georgia Tech; M. Effros, California Institute of Technology; J. Garcia-Frias, University of Delaware; S. Pradhan, University of Michigan, Ann Arbor; E. Magli, Politecnico di Torino; IBM T. J. Watson Research Center; Mitsubishi Electric Research Labs.

DVC and other applications to multimedia: B. Girod, Stanford; K. Ramchandran, Berkeley; Z. Xiong, TAMU; the DISCOVER Project; A. Ortega, University of Southern California; E. Delp, Purdue; E. Tuncel, University of California, Riverside; A. Fernando, University of Surrey; P. Frossard; P. L. Dragotti, Imperial College London; the Stankovics, University of Strathclyde.

Compress-and-forward: Z. Xiong, TAMU; E. Erkip, Polytechnic University; M. Gastpar, Berkeley; J. Li (Tiffany), Lehigh University; Motorola Labs Paris.


More information

Wavelet Scalable Video Codec Part 1: image compression by JPEG2000

Wavelet Scalable Video Codec Part 1: image compression by JPEG2000 1 Wavelet Scalable Video Codec Part 1: image compression by JPEG2000 Aline Roumy aline.roumy@inria.fr May 2011 2 Motivation for Video Compression Digital video studio standard ITU-R Rec. 601 Y luminance

More information

(Structured) Coding for Real-Time Streaming Communication

(Structured) Coding for Real-Time Streaming Communication (Structured) Coding for Real-Time Streaming Communication Ashish Khisti Department of Electrical and Computer Engineering University of Toronto Joint work with: Ahmed Badr (Toronto), Farrokh Etezadi (Toronto),

More information

Multiterminal Source Coding with an Entropy-Based Distortion Measure

Multiterminal Source Coding with an Entropy-Based Distortion Measure Multiterminal Source Coding with an Entropy-Based Distortion Measure Thomas Courtade and Rick Wesel Department of Electrical Engineering University of California, Los Angeles 4 August, 2011 IEEE International

More information

Lecture 12. Block Diagram

Lecture 12. Block Diagram Lecture 12 Goals Be able to encode using a linear block code Be able to decode a linear block code received over a binary symmetric channel or an additive white Gaussian channel XII-1 Block Diagram Data

More information

The Concept of Soft Channel Encoding and its Applications in Wireless Relay Networks

The Concept of Soft Channel Encoding and its Applications in Wireless Relay Networks The Concept of Soft Channel Encoding and its Applications in Wireless Relay Networks Gerald Matz Institute of Telecommunications Vienna University of Technology institute of telecommunications Acknowledgements

More information

Introduction to Low-Density Parity Check Codes. Brian Kurkoski

Introduction to Low-Density Parity Check Codes. Brian Kurkoski Introduction to Low-Density Parity Check Codes Brian Kurkoski kurkoski@ice.uec.ac.jp Outline: Low Density Parity Check Codes Review block codes History Low Density Parity Check Codes Gallager s LDPC code

More information

Sequential Decision Making in Decentralized Systems

Sequential Decision Making in Decentralized Systems Sequential Decision Making in Decentralized Systems by Ashutosh Nayyar A dissertation submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy (Electrical Engineering:

More information

Multimedia Networking ECE 599

Multimedia Networking ECE 599 Multimedia Networking ECE 599 Prof. Thinh Nguyen School of Electrical Engineering and Computer Science Based on lectures from B. Lee, B. Girod, and A. Mukherjee 1 Outline Digital Signal Representation

More information

Estimation-Theoretic Delayed Decoding of Predictively Encoded Video Sequences

Estimation-Theoretic Delayed Decoding of Predictively Encoded Video Sequences Estimation-Theoretic Delayed Decoding of Predictively Encoded Video Sequences Jingning Han, Vinay Melkote, and Kenneth Rose Department of Electrical and Computer Engineering University of California, Santa

More information

Some UEP Concepts in Coding and Physical Transport

Some UEP Concepts in Coding and Physical Transport Some UEP Concepts in Coding and Physical Transport Werner Henkel, Neele von Deetzen, and Khaled Hassan School of Engineering and Science Jacobs University Bremen D-28759 Bremen, Germany Email: {w.henkel,

More information

ProblemsWeCanSolveWithaHelper

ProblemsWeCanSolveWithaHelper ITW 2009, Volos, Greece, June 10-12, 2009 ProblemsWeCanSolveWitha Haim Permuter Ben-Gurion University of the Negev haimp@bgu.ac.il Yossef Steinberg Technion - IIT ysteinbe@ee.technion.ac.il Tsachy Weissman

More information

Encoder Decoder Design for Feedback Control over the Binary Symmetric Channel

Encoder Decoder Design for Feedback Control over the Binary Symmetric Channel Encoder Decoder Design for Feedback Control over the Binary Symmetric Channel Lei Bao, Mikael Skoglund and Karl Henrik Johansson School of Electrical Engineering, Royal Institute of Technology, Stockholm,

More information

CONSIDER a joint stationary and memoryless process

CONSIDER a joint stationary and memoryless process 4006 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL 55, NO 9, SEPTEMBER 2009 On the Duality Between Slepian Wolf Coding and Channel Coding Under Mismatched Decoding Jun Chen, Member, IEEE, Da-ke He, and

More information

Computing sum of sources over an arbitrary multiple access channel

Computing sum of sources over an arbitrary multiple access channel Computing sum of sources over an arbitrary multiple access channel Arun Padakandla University of Michigan Ann Arbor, MI 48109, USA Email: arunpr@umich.edu S. Sandeep Pradhan University of Michigan Ann

More information

An Introduction to Low Density Parity Check (LDPC) Codes

An Introduction to Low Density Parity Check (LDPC) Codes An Introduction to Low Density Parity Check (LDPC) Codes Jian Sun jian@csee.wvu.edu Wireless Communication Research Laboratory Lane Dept. of Comp. Sci. and Elec. Engr. West Virginia University June 3,

More information

AN IMPROVED CONTEXT ADAPTIVE BINARY ARITHMETIC CODER FOR THE H.264/AVC STANDARD

AN IMPROVED CONTEXT ADAPTIVE BINARY ARITHMETIC CODER FOR THE H.264/AVC STANDARD 4th European Signal Processing Conference (EUSIPCO 2006), Florence, Italy, September 4-8, 2006, copyright by EURASIP AN IMPROVED CONTEXT ADAPTIVE BINARY ARITHMETIC CODER FOR THE H.264/AVC STANDARD Simone

More information

The Distributed Karhunen-Loève Transform

The Distributed Karhunen-Loève Transform 1 The Distributed Karhunen-Loève Transform Michael Gastpar, Member, IEEE, Pier Luigi Dragotti, Member, IEEE, and Martin Vetterli, Fellow, IEEE Manuscript received November 2004; revised March 5, 2006;

More information

Chapter 7: Channel coding:convolutional codes

Chapter 7: Channel coding:convolutional codes Chapter 7: : Convolutional codes University of Limoges meghdadi@ensil.unilim.fr Reference : Digital communications by John Proakis; Wireless communication by Andreas Goldsmith Encoder representation Communication

More information

Codes on graphs and iterative decoding

Codes on graphs and iterative decoding Codes on graphs and iterative decoding Bane Vasić Error Correction Coding Laboratory University of Arizona Funded by: National Science Foundation (NSF) Seagate Technology Defense Advanced Research Projects

More information

Interactive Decoding of a Broadcast Message

Interactive Decoding of a Broadcast Message In Proc. Allerton Conf. Commun., Contr., Computing, (Illinois), Oct. 2003 Interactive Decoding of a Broadcast Message Stark C. Draper Brendan J. Frey Frank R. Kschischang University of Toronto Toronto,

More information

Low-density parity-check codes

Low-density parity-check codes Low-density parity-check codes From principles to practice Dr. Steve Weller steven.weller@newcastle.edu.au School of Electrical Engineering and Computer Science The University of Newcastle, Callaghan,

More information

Source-Channel Coding Theorems for the Multiple-Access Relay Channel

Source-Channel Coding Theorems for the Multiple-Access Relay Channel Source-Channel Coding Theorems for the Multiple-Access Relay Channel Yonathan Murin, Ron Dabora, and Deniz Gündüz Abstract We study reliable transmission of arbitrarily correlated sources over multiple-access

More information

Data and error rate bounds for binar gathering wireless sensor networks. He, Xin; Zhou, Xiaobo; Juntti, Markk Author(s) Tad

Data and error rate bounds for binar gathering wireless sensor networks. He, Xin; Zhou, Xiaobo; Juntti, Markk Author(s) Tad JAIST Reposi https://dspace.j Title Data and error rate bounds for binar gathering wireless sensor networks He, Xin; Zhou, Xiaobo; Juntti, Markk Author(s) Tad Citation 205 IEEE 6th International Worksho

More information

Non-binary Distributed Arithmetic Coding

Non-binary Distributed Arithmetic Coding Non-binary Distributed Arithmetic Coding by Ziyang Wang Thesis submitted to the Faculty of Graduate and Postdoctoral Studies In partial fulfillment of the requirements For the Masc degree in Electrical

More information

A Half-Duplex Cooperative Scheme with Partial Decode-Forward Relaying

A Half-Duplex Cooperative Scheme with Partial Decode-Forward Relaying A Half-Duplex Cooperative Scheme with Partial Decode-Forward Relaying Ahmad Abu Al Haija, and Mai Vu, Department of Electrical and Computer Engineering McGill University Montreal, QC H3A A7 Emails: ahmadabualhaija@mailmcgillca,

More information

Predictive Coding. Prediction Prediction in Images

Predictive Coding. Prediction Prediction in Images Prediction Prediction in Images Predictive Coding Principle of Differential Pulse Code Modulation (DPCM) DPCM and entropy-constrained scalar quantization DPCM and transmission errors Adaptive intra-interframe

More information