Universal Incremental Slepian-Wolf Coding
Proceedings of the 43rd annual Allerton Conference, Monticello, IL, September 2004

Universal Incremental Slepian-Wolf Coding
Stark C. Draper, University of California, Berkeley, Berkeley, CA, USA

Abstract: We present a strategy for Slepian-Wolf coding when the joint distribution of the sources is unknown. The encoders use an incremental transmission policy, and the decoder a universal sequential decision test. We show that the decoder is able to decode shortly after the rates of transmission exceed the Slepian-Wolf bounds. The decoder then sends an ACK back to the encoders, terminating transmission. As the timing of this ACK depends on the unknown statistics, the duration of the transmission is unknown in advance. The scheme is therefore variable-rate: it adapts its transmission rates to match the unknown joint entropy of the sources. We show how to apply these ideas in the rate distortion context of Wyner-Ziv coding. We specify an incremental universal coding strategy, and we show that in the case of unknown, but jointly Gaussian, statistics there is no rate loss in comparison with the case of fully known statistics.

1 Introduction

In this paper we present a robust and efficient approach to Slepian-Wolf source coding when the source statistics are unknown. In Slepian-Wolf source coding, a pair of length-n random source sequences x = x^n, y = y^n is jointly distributed as p_{x,y}(x, y) = prod_{i=1}^n Q_{x,y}(x_i, y_i). The two sequences are observed at two separated encoders, which communicate at rates R_x and R_y, respectively, to a central decoder that jointly decodes and estimates each sequence. With high probability the estimates are required to match the sources exactly. In Slepian and Wolf's original formulation [5] the joint distribution Q_{x,y} is known throughout the system. Subsequently, in [2] (exercise 3.1.6), Csiszár and Körner proposed a scheme for the case when Q_{x,y} is unknown.
Their universal coding scheme uses a minimum empirical joint entropy decoder, which they show decodes correctly with arbitrarily high probability (as n gets large) as long as (i) R_x > H(x|y), (ii) R_y > H(y|x), and (iii) R_x + R_y > H(x, y), where the entropies are calculated with respect to Q_{x,y}. In Csiszár and Körner's setup the source coding rates R_x and R_y must be fixed without any knowledge of the source statistics. This makes the scheme both fragile and generally
inefficient. It is fragile in the sense that if the rates picked do not satisfy conditions (i), (ii), and (iii), correct decoding will usually fail. It is inefficient in the sense that it is a worst-case design: if the constraints are satisfied, but loosely, then the source coding rates used are excessive. One way to build robust and efficient universal Slepian-Wolf schemes is to transmit source information incrementally. In an incremental scheme the encoders continue to transmit until the decoder determines it is able to decode reliably. The code is variable-length, and therefore variable-rate, making it possible to match the source coding rates to the underlying entropies on the fly. The trade-off incremental schemes make to gain robustness and efficiency is that they require a low-rate reliable feedback channel. Once the decoder has received enough information to decode, it informs the encoders by sending them an ACK. When the encoders receive an ACK, they terminate their transmissions. In [4] Shulman and Feder present an incremental coding strategy for broadcasting a source losslessly to a number of receivers with differing qualities of side information. In this setting receivers simply tune out once they can decode. In [3], Shulman extends these ideas to unknown statistics by incorporating a sequential thresholded variant of the decoder used in [2], testing empirical mutual informations. He first shows that when Q_x, the marginal distribution of x, is uniform, and y is observed at the decoder, but Q_{x,y} is unknown, a communication rate of roughly H(x|y) is needed. Combining this strategy with other results on the universal encoding of integers, the scheme can be applied to cases when Q_x is not uniform, by using an extended decoding measure and running roughly log|X| decoders in parallel (one for each possible source entropy rate).
In this paper we introduce a simpler robust and efficient coding strategy that adopts the universal and incremental philosophies of these earlier strategies. There is no constraint on Q_{x,y} in our setup, and neither source needs to be observed at the decoder. We further require only a single decoder to operate. And, importantly, our setup allows us to extend our scope of applicability to rate distortion formulations; in particular we discuss Wyner-Ziv coding.

2 Coding Scheme

In this section we describe the operation of the coding scheme. Encoder X (Y) refers to the encoder that observes sequence x (y). The channels connecting each encoder to the joint decoder are assumed noiseless and of fixed rates. In each use of its channel, Encoder X (Y) communicates R_x (R_y) bits to the decoder. Thus, after k uses of the communication channels, the decoder has received a total of k(R_x + R_y) bits. This translates into a source coding rate pair (R_{x,k}, R_{y,k}) = (kR_x/n, kR_y/n). The coding scheme operates in the following steps:

1. The encoders observe their full length-n source sequences, x, y.

2. Encoder X (Y) calculates the type (empirical distribution) of its sequence, P_x (P_y).
3. Each encoder communicates its observed type to the decoder. We term this the prefix transmission. We assume that the cardinalities of the source alphabets X and Y (or an upper bound thereon) are known to both encoders and decoder. Since, respectively, there are at most (n + 1)^|X| and (n + 1)^|Y| types, this takes (|X| + |Y|) log(n + 1) bits in total. Encoder X (Y) and the decoder now both know that the observed sequence must be in the corresponding type class, T_{P_x} (T_{P_y}). For each possible type, a prearranged list of the sequences in that type class is shared by encoder and decoder. The ordering on the list is random.

4. Encoder X (Y) now sends the binary expansion of the position of x (y) on the shared list. This is the incremental transmission idea proposed in [4]. After k transmissions, the decoder has received the first kR_x (kR_y) bits of the position of x (y). Each incomplete binary expansion is equivalent to a subset (bin) of sequences: the elements of the bin are the sequences that share the same first kR_x (kR_y) bits of the binary expansions of their list locations. Since the binary expansions are nested, so are the bins. Let B_{x,k} (B_{y,k}) denote the subset of sequences in T_{P_x} (T_{P_y}) that at time k have the same partial binary expansion as the observed sequence. Thus, B_{x,0} = T_{P_x} and B_{x,0} ⊇ B_{x,1} ⊇ ... ⊇ B_{x,⌈nH(P_x)/R_x⌉}.

5. The decoder runs an empirical statistical test on each pair of sequences (x̃, ỹ) ∈ B_{x,k} × B_{y,k}. As soon as the empirical mutual information of a pair of sequences satisfies I(x̃; ỹ) ≥ θ_k, where θ_k is a time-varying threshold, the decoder sends an ACK to the transmitters. The threshold θ_k decreases with k and will be specified subsequently. If Encoder X (Y) has already transmitted nH(P_x) (nH(P_y)) bits, not counting the prefix, and has not yet received an ACK, it stops transmitting.
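The per-encoder and per-decoder computations in steps 2-5 reduce to computing types and empirical mutual informations. The sketch below is our own illustrative Python rendering (the sequences and alphabets are made-up examples, not the paper's): `empirical_type` produces the content of the prefix transmission in steps 2-3, and `empirical_mutual_information` is the statistic the decoder compares against the threshold θ_k in step 5.

```python
import math
from collections import Counter

def empirical_type(seq, alphabet):
    """Type (empirical distribution) of a sequence over a known alphabet."""
    counts = Counter(seq)
    n = len(seq)
    return {a: counts.get(a, 0) / n for a in alphabet}

def entropy(dist):
    """Entropy in bits of a pmf given as {symbol: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def empirical_mutual_information(x, y, ax, ay):
    """I(x; y) in bits, computed from the joint type of the two sequences.
    This is the quantity the decoder thresholds against theta_k."""
    joint = empirical_type(list(zip(x, y)), [(a, b) for a in ax for b in ay])
    px = empirical_type(x, ax)
    py = empirical_type(y, ay)
    return sum(p * math.log2(p / (px[a] * py[b]))
               for (a, b), p in joint.items() if p > 0)

# Hypothetical binary sequences standing in for the two sources.
x = "aababbaa"
y = "aababbab"
Px = empirical_type(x, "ab")
print(entropy(Px), empirical_mutual_information(x, y, "ab", "ab"))
```

Note that the decoder only ever evaluates this statistic on pairs drawn from the current bins B_{x,k} × B_{y,k}, which shrink by a factor of two per received bit.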
Note that if we had not communicated the marginal types, and so did not know the marginal entropies, the encoder and decoder would not both know to stop at this point. This aspect of the scheme becomes increasingly beneficial as the number of sources gets larger, i.e., when one is more likely to end up operating at a corner point of the achievable region. An example of the operation of the scheme is illustrated in Figure 1. After each transmission, the rate of the scheme in terms of bits sent per source sample is (R_{x,k}, R_{y,k}) = (kR_x/n, kR_y/n). These rates increase with time, along a vector from the origin with slope R_y/R_x. There are three ranges of slopes to consider. If R_y/R_x > H(y)/H(x|y), then y will be decoded while there is still ambiguity in x. (The entropies H(y) and H(x|y) in this discussion are calculated with respect to the empirical joint distribution P_{x,y}; one could explicitly write the slope condition as R_y/R_x > H(P_y)/(H(P_{x,y}) − H(P_y)). For simplicity of notation we suppress this dependence. As one would expect, and as we show in Lemma 1, for large n these entropies are quite close to those calculated with respect to the true underlying distribution Q_{x,y}.) At time k = nH(P_y)/R_y, Encoder
Y will have transmitted nH(y) bits. At this point both Encoder Y and the decoder know that the decoder can decode, and so Encoder Y can stop transmitting. The encoder for x continues until it receives the ACK or has sent nH(x) bits to the decoder. The sequence of source coding rates over time (R_{x,k}, R_{y,k}) traces out a line from the origin with slope R_y/R_x until it intersects the upper dotted line shown in Figure 1. At this point Encoder Y stops transmitting, and so the trace continues horizontally until it intersects the corner of the achievable region. If H(y)/H(x|y) > R_y/R_x > H(y|x)/H(x), the scheme achieves a point on the flat face of the achievable region. If R_y/R_x < H(y|x)/H(x), it achieves the corner point (H(x), H(y|x)).

Figure 1: An example of the rates traced out until decoding when R_y/R_x > H(y)/H(x|y). (Axes: R_{x,k} versus R_{y,k}; the achievable region is bounded by H(x|y) and H(x) on the R_{x,k} axis, and by H(y|x) and H(y) on the R_{y,k} axis.)

3 Probability of Decoding Error

We now analyze the probability that after the first k channel uses the decoder finds an x̃ ∈ B_{x,k} and a ỹ ∈ B_{y,k} such that I(x̃; ỹ) ≥ θ_k, where (x̃, ỹ) ≠ (x, y), the pair actually observed. Call this error event ɛ_k. By the union bound,

  Pr[ɛ_k] ≤ Σ_{x̃ ∈ T_{P_x}, ỹ ∈ T_{P_y} : (x̃,ỹ) ≠ (x,y), I(x̃;ỹ) ≥ θ_k} Pr[x̃ ∈ B_{x,k}] Pr[ỹ ∈ B_{y,k}],

where the probability factors because the lists of x and y sequences are generated independently. The probability that any non-observed x̃ ∈ T_{P_x} lies in the bin B_{x,k} is 2^{−kR_x}, i.e., one over the number of bins.
We suppress floor notation for simplicity of presentation, giving

  Pr[ɛ_k] ≤ Σ_{x̃ ∈ T_{P_x}, ỹ ∈ T_{P_y} : (x̃,ỹ) ≠ (x,y), I(x̃;ỹ) ≥ θ_k} 2^{−k[R_x + R_y]}
        ≤ Σ_{V : P_x V = P_y, I(P_x, V) ≥ θ_k} |T_{P_x}| 2^{nH(V|P_x)} 2^{−k[R_x + R_y]}
        ≤ Σ_{V : P_x V = P_y, I(P_x, V) ≥ θ_k} 2^{n[H(P_x) + H(P_y) − I(P_x, V) − kR_x/n − kR_y/n]}
        ≤ Σ_{V : P_x V = P_y} 2^{n[H(P_x) + H(P_y) − θ_k − kR_x/n − kR_y/n]}
        ≤ (n + 1)^{|X||Y|} 2^{n[H(P_x) + H(P_y) − θ_k − kR_x/n − kR_y/n]}.

We choose the threshold as

  θ_k = H(P_x) + H(P_y) − kR_x/n − kR_y/n + ɛ,    (1)

which we can choose since P_x and P_y are known to the decoder from the prefix transmission. This gives

  Pr[ɛ_k] ≤ (n + 1)^{|X||Y|} 2^{−nɛ}.    (2)

The derivation of (2) assumes ambiguity in both x and y at time k. If one sequence is decoded while there is still ambiguity in the other as, e.g., in Figure 1, we modify the threshold at subsequent times to take this into account. The general form is

  θ_k = H(P_x) + H(P_y) − min{kR_x/n, H(P_x)} − min{kR_y/n, H(P_y)} + ɛ.    (3)

When the system ends up operating along the flat frontier of the achievable region, and not at one of the corner points, (3) and (1) are equivalent. The maximum possible decoding time (over all joint distributions) occurs when P_{x,y} = P_x P_y, i.e., when the empirical joint distribution of x and y is a product of its marginals. Therefore, k ≤ K = n max{H(P_x)/R_x, H(P_y)/R_y}, which gives a bound on the overall error probability

  Pr[∪_{k=1}^K ɛ_k] ≤ Σ_{k=1}^K Pr[ɛ_k] ≤ K (n + 1)^{|X||Y|} 2^{−nɛ} ≤ max{H(P_x)/R_x, H(P_y)/R_y} (n + 1)^{|X||Y|+1} 2^{−nɛ}.

By letting the observation length n be sufficiently large, the probability of decoding error can be bounded as small as desired.
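As a numerical companion to Sections 2 and 3, the sketch below (our own illustration; all parameter values are hypothetical) evaluates the general threshold (3), the per-step error bound (2), and the point on the achievable region that the rate trace of Figure 1 reaches for a given slope R_y/R_x:

```python
def threshold(k, n, Rx, Ry, HPx, HPy, eps):
    """General decoding threshold, eq. (3): starts at H(Px)+H(Py)+eps and
    falls as bits arrive; the min() limiters stop crediting an encoder
    once it has sent n*H(P) bits. Rates are in bits per channel use."""
    return (HPx + HPy
            - min(k * Rx / n, HPx)
            - min(k * Ry / n, HPy)
            + eps)

def per_step_error_bound(n, card_x, card_y, eps):
    """Bound (2) on the probability of a decoding error at any one step."""
    return (n + 1) ** (card_x * card_y) * 2 ** (-n * eps)

def operating_point(Rx, Ry, Hx, Hy, Hxy):
    """Point where the rate trace (k*Rx/n, k*Ry/n) meets the Slepian-Wolf
    boundary, by the slope regimes of Figure 1 (entropies in bits)."""
    Hx_g_y = Hxy - Hy              # H(x|y)
    Hy_g_x = Hxy - Hx              # H(y|x)
    slope = Ry / Rx
    if slope > Hy / Hx_g_y:        # y saturates first
        return (Hx_g_y, Hy)        # corner point (H(x|y), H(y))
    if slope < Hy_g_x / Hx:        # x saturates first
        return (Hx, Hy_g_x)        # corner point (H(x), H(y|x))
    Rxk = Hxy / (1 + slope)        # flat face: sum rate reaches H(x,y)
    return (Rxk, Hxy - Rxk)

# Example: H(Px) = H(Py) = 1 bit, Rx = Ry = 2 bits/use, n = 8.
vals = [threshold(k, 8, 2, 2, 1.0, 1.0, 0.1) for k in range(6)]
print(operating_point(1, 1, 1.0, 1.0, 1.5))   # flat face: (0.75, 0.75)
```

The threshold schedule `vals` decreases linearly until both encoders have exhausted their marginal entropies, after which it floors at ɛ.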
4 Length of Transmission

Given that the observed sequences (x, y) have empirical joint distribution P_{x,y} and marginal empirical distributions P_x and P_y, the decoding time (assuming a decoding error does not occur) is a deterministic quantity. Assuming that the rates are such that H(y)/H(x|y) ≥ R_y/R_x ≥ H(y|x)/H(x), decoding occurs at the first k such that

  I(x; y) = I(P_{x,y}) ≥ H(P_x) + H(P_y) − kR_x/n − kR_y/n + ɛ,

where I(P_{x,y}) is the mutual information between two random variables with joint distribution P_{x,y}. This decoding rule is equivalent to decoding at the first k such that k(R_x + R_y) ≥ n[H(P_{x,y}) + ɛ]. Of course, the joint type of the observations P_{x,y} is a random quantity. In the following lemma we show that the mutual information I(x; y) of the joint type P_{x,y} is close to the mutual information I(Q_{x,y}) of the underlying distribution Q_{x,y}. This implies that the sum source coding rate is close to H(Q_{x,y}), the joint entropy of the underlying distribution.

Lemma 1: Given any constant µ ≤ 1/(8 ln 2), then for any pair of length-n sequences (x, y) generated in an i.i.d. manner, p_{x,y}(x, y) = prod_{i=1}^n Q_{x,y}(x_i, y_i),

  Pr[ |I(x; y) − I(Q_{x,y})| ≥ 3 √(2 ln 2 · µ) log( |X||Y| / √(2 ln 2 · µ) ) ] ≤ (n + 1)^{|X||Y|} 2^{−nµ}.

Proof: We first give a probability bound on the divergence between the empirical distribution and the underlying distribution. Note that the probability is over the random empirical types P_x, P_y, P_{x,y}, while the underlying distributions Q_x, Q_y, Q_{x,y} are deterministic quantities, but unknown. We first bound the probability that the divergence between the realized type and the generating distribution exceeds some constant µ:

  Pr[D(P_{x,y} ‖ Q_{x,y}) ≥ µ] ≤ Σ_{P : D(P ‖ Q_{x,y}) ≥ µ} 2^{−nD(P ‖ Q_{x,y})} ≤ (n + 1)^{|X||Y|} 2^{−nµ}.

We convert the bound on divergence into a bound on variational distance by using the Pinsker inequality (see [1]).
For any pair of sequences (x^n, y^n) of joint type P_{x^n,y^n} such that D(P_{x^n,y^n} ‖ Q_{x,y}) ≤ µ, we get

  √(2 ln 2 · µ) ≥ √(2 ln 2 · D(P_{x^n,y^n} ‖ Q_{x,y})) ≥ Σ_{a,b} |P_{x^n,y^n}(a, b) − Q_{x,y}(a, b)| ≥ Σ_a |Σ_b (P_{x^n,y^n}(a, b) − Q_{x,y}(a, b))| = Σ_a |P_{x^n}(a) − Q_x(a)|.
Then, from the entropy-continuity lemma of [2], we get

  |H(P_{x^n,y^n}) − H(Q_{x,y})| ≤ √(2 ln 2 · µ) log( |X||Y| / √(2 ln 2 · µ) ),
  |H(P_{x^n}) − H(Q_x)| ≤ √(2 ln 2 · µ) log( |X| / √(2 ln 2 · µ) ),
  |H(P_{y^n}) − H(Q_y)| ≤ √(2 ln 2 · µ) log( |Y| / √(2 ln 2 · µ) ),

provided √(2 ln 2 · µ) ≤ 1/2. Putting these together, and writing I(x^n; y^n) = H(P_{x^n}) + H(P_{y^n}) − H(P_{x^n,y^n}), gives

  |I(x^n; y^n) − I(Q_{x,y})| ≤ 3 √(2 ln 2 · µ) log( |X||Y| / √(2 ln 2 · µ) ).

Finally, let δ = 3 √(2 ln 2 · µ) log( |X||Y| / √(2 ln 2 · µ) ), which gives the bound

  Pr[ |I(x; y) − I(Q_{x,y})| ≤ δ ] ≥ Pr[ |I(x; y) − I(Q_{x,y})| ≤ δ | D(P_{x,y} ‖ Q_{x,y}) ≤ µ ] Pr[D(P_{x,y} ‖ Q_{x,y}) ≤ µ] = Pr[D(P_{x,y} ‖ Q_{x,y}) ≤ µ] ≥ 1 − (n + 1)^{|X||Y|} 2^{−nµ},

from which the lemma follows.

5 Application to Wyner-Ziv Coding

We now apply the coding methodology we have developed to Wyner-Ziv coding with unknown statistics. As in the Slepian-Wolf context, we assume the availability of an ACK channel. We show that in some settings there is no rate loss in comparison with the situation of known statistics. The Wyner-Ziv rate distortion function R_WZ(d) for the rate distortion encoding of a source x, where side information y is observed at the decoder, and where the source and side information are distributed as p_{x,y}(x, y) = prod_{i=1}^n Q_{x,y}(x_i, y_i), is [6]

  R_WZ(d) = min_f min_{Q_{u|x}} [ I(x; u) − I(y; u) ],    (4)

where (i) f : U × Y → X̂, and (ii) the Markov chain u → x → y must hold. In comparison to quantization with no side information, Wyner-Ziv coding has two types of gains. First, there is a rate gain from binning: we achieve a given average distortion at a lower rate by binning the codewords. We use the decoder side information to de-bin,
as in Slepian-Wolf coding. The lowered rate is reflected in the second, negative, mutual information term in (4). (Note that in some cases, e.g., along the curved portion of the binary-Hamming R_WZ(d) curve, this is the only type of gain one can make.) Second, there is a distortion gain from estimation. Through the estimation function f(u, y) we fuse together the two kinds of source data, codeword and side information, to lower the average distortion in the estimate. In summary, binning lowers the rate but does not affect the distortion, while estimation lowers the distortion but does not affect the rate. To get the lowest distortion at the lowest rate, we generally must optimize our test channel and estimation function jointly. The best choices, of course, depend on the statistics Q_{x,y}. When the statistics Q_{x,y} are unknown, we must choose our performance target without knowing how much the side information can help us. Say that we have an average distortion level d′ that our system needs to achieve. Then, using a prefix transmission that informs the decoder of the type of x, encoder and decoder can agree on the rate-distortion-achieving test channel to use, and therefore on the codebook. If the side information is really bad, e.g., if it is independent of the source, then the system will simply operate as a regular source code, achieving rate

  R_reg(d′) = I(x; u) = I(P_x, Q_{u|x}), where Q_{u|x} = argmin_{p(u|x) : E[D(x,u)] ≤ d′} I(x; u).

The source estimate is x̂ = u (thus f(u, y) = u), and D is the distortion measure. We can immediately improve on this by using incremental binning and universal decoding to achieve

  R_bin(d′) = I(x; u) − I(y; u),    (5)

where again x̂ = u and I(y; u) = I(P_{y,u}). The derivation of the reliability of incremental universal de-binning in this context is very similar to the derivation of Pr[ɛ_k] in (2).
For simplicity of presentation we assume that the rate distortion codebook agreed upon at the end of the prefix transmission is constant composition with empirical distribution Q_u. Let C denote the codebook. Then

  Pr[ɛ_k] ≤ Σ_{ũ ∈ T_{Q_u} : ũ ≠ u, I(ũ; y) ≥ θ_k} Pr[ũ ∈ C] Pr[ũ ∈ B_{u,k} | ũ ∈ C]
        ≤ Σ_{V : Q_u V = P_y, I(Q_u, V) ≥ θ_k} Σ_{ũ ∈ T_V̄(y)} (|C| / 2^{nH(Q_u)}) 2^{−kR_u}    (6)
        ≤ Σ_{V : Q_u V = P_y, I(Q_u, V) ≥ θ_k} 2^{nH(V̄|P_y)} 2^{nI(x;u) − nH(Q_u) − kR_u}    (7)
        = Σ_{V : Q_u V = P_y, I(Q_u, V) ≥ θ_k} 2^{−nI(Q_u, V)} 2^{nI(x;u) − kR_u}
        ≤ (n + 1)^{|X||Y|} 2^{−n[θ_k − I(x;u) + kR_u/n]}.

In (6), V̄ is the reverse channel defined by Q_u(a) V(b|a) = P_y(b) V̄(a|b). We apply this in (7), where I(Q_u, V) = I(P_y, V̄) = H(Q_u) − H(V̄|P_y), and we also use |C| ≤ 2^{nI(x;u)}. In this case we pick θ_k =
I(x; u) − kR_u/n + ɛ. As with the Slepian-Wolf results, we decode correctly with high probability at a time k such that θ_k ≤ I(u; y). Therefore the source coding rate realized is kR_u/n ≈ I(x; u) − I(y; u), the binning rate cited in (5). Note that if we wanted to encode x losslessly, we would choose C = T_{Q_u} = T_{P_x}, so Pr[ũ ∈ C] = 1, I(x; u) becomes H(x), and we have a Slepian-Wolf scheme with y observed at the decoder. Comparing this choice of θ_k with (3) elucidates the role of the minimum limiters in (3). Beyond the gain from binning, we would like to gain from estimation. One way to do this is to estimate p_{x,y,u} and then pick the best f. Because we know the joint empirical marginals P_x Q_{u|x} and P_{u,y} (the former from the prefix transmission and the choice of test channel, and the latter because we know both u and y at the decoder), and we know that the Markov chain u → x → y must hold, there are |X|(|Y| − 1) unknowns to solve for, in p_{y|x}. While generally we may not be able to solve fully for p_{x,y,u}, we can in some more constrained cases, as discussed next.

5.1 Incremental Universal Gaussian Wyner-Ziv Coding

In this section we show that when we know the source and side information are jointly Gaussian, we can achieve the Wyner-Ziv rate distortion bound in a universal manner. We show that our coding scheme achieves a point on the rate distortion function, though we do not know a priori which one. Note that while the derivations to this point are for finite alphabets, through standard quantization techniques they can be extended to this setting. As we don't know the quality of the Gaussian side information, we start out by using the standard Gaussian quantization test channel. Let u = αx + e, where α = 1 − d′/σ_x², σ_x² is the variance of x, and e is an independent Gaussian random variable, e ~ N(0, αd′). After decoding u via universal de-binning, we solve for the correlation coefficient ρ via E[uy] = αE[xy] = αρσ_xσ_y.
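To make the Gaussian construction concrete, the following sketch (our own illustration, using the example values σ_x² = σ_y² = 4, ρ = 0.8, d′ = 2 that appear in the discussion below) simulates the test channel u = αx + e, recovers ρ at the decoder from the moment E[uy] = αρσ_xσ_y, and evaluates the resulting rates in closed form, confirming that R_WZ(d) = R_bin(d′):

```python
import math, random

sigma_x, sigma_y, rho, d_prime = 2.0, 2.0, 0.8, 2.0
alpha = 1 - d_prime / sigma_x ** 2        # test channel: u = alpha*x + e

# Simulate jointly Gaussian (x, y) with correlation rho, and the channel output u.
random.seed(1)
n = 200_000
samples = []
for _ in range(n):
    x = random.gauss(0, sigma_x)
    y = rho * (sigma_y / sigma_x) * x + random.gauss(0, sigma_y * math.sqrt(1 - rho ** 2))
    u = alpha * x + random.gauss(0, math.sqrt(alpha * d_prime))
    samples.append((y, u))

# Decoder-side moment estimate of rho from decoded u and side information y.
E_uy = sum(u * y for y, u in samples) / n
sigma_y_hat = math.sqrt(sum(y * y for y, u in samples) / n)
rho_hat = E_uy / (alpha * sigma_x * sigma_y_hat)

# Closed-form rate-distortion points for the three modes of operation.
R_reg = 0.5 * math.log2(sigma_x ** 2 / d_prime)               # ignore side info
R_bin = 0.5 * math.log2((sigma_x ** 2 - rho ** 2 * (sigma_x ** 2 - d_prime)) / d_prime)
sigma_xgy2 = (1 - rho ** 2) * sigma_x ** 2                    # MMSE of x given y
d = d_prime * sigma_xgy2 / (sigma_xgy2 + rho ** 2 * d_prime)  # refined distortion
R_WZ = 0.5 * math.log2(sigma_xgy2 / d)                        # equals R_bin
print(rho_hat, R_reg, R_bin, d, R_WZ)
```

With these numbers R_reg = 0.5 bit/sample, while R_bin = R_WZ ≈ 0.22 bit/sample and the refined distortion d ≈ 1.06 < d′ = 2: estimation lowers the distortion at no cost in rate.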
Note that σ_x is known from the prefix transmission, α is chosen, and E[uy] and σ_y can be estimated from u and y. With this knowledge of ρ we refine our estimation function to be f(u, y) = [ρd/((1 − ρ²)σ_xσ_y)] y + [d/d′] u, where d = d′σ_{x|y}² / (σ_{x|y}² + ρ²d′) ≤ d′ is the expected distortion, and σ_{x|y}² = (1 − ρ²)σ_x² is the minimum mean-squared estimation error of x given y. We calculate the rate distortion trade-offs achieved when (i) we ignore side information, (ii) we use side information only for binning, and (iii) we use side information both for binning and estimation. We get the following:

(i) Ignore side information: R_reg(d′) = I(x; u) = 0.5 log[σ_x²/d′].

(ii) Side information for binning: R_bin(d′) = I(x; u) − I(y; u) = 0.5 log[(σ_x² − ρ²(σ_x² − d′))/d′].

(iii) Side information for binning and estimation: R_WZ(d) = R_bin(d′) = 0.5 log[σ_{x|y}²/d].

Recall that d = d′σ_{x|y}²/(σ_{x|y}² + ρ²d′), which can be used to verify that R_WZ(d) = R_bin(d′). In Figure 2 we plot the point achieved for a target mean-squared error d′ = 2 when σ_x² = 4, σ_y² = 4, and ρ = 0.8. As discussed, the binning gain moves us from the regular Gaussian rate distortion function R_reg(d′) to a lower-rate point on R_bin(d′) without changing the distortion. The use of the refined estimation function then moves us to a point on R_WZ(d) which has the
same rate as R_bin(d′), but a lower distortion d ≤ d′. A priori we don't know what distortion d ≤ d′ we will get, nor what rate R_WZ(d) ≤ R_reg(d′). In both dimensions, distortion and rate, we do at least as well as when we ignore the side information. Note that generally there will be a rate loss because the regular rate distortion test channel will not coincide with a test channel for the Wyner-Ziv problem.

Figure 2: The operation of an incremental universal Gaussian Wyner-Ziv scheme, plotted in the rate-versus-distortion plane with the curves R_reg(d), R_bin(d), and R_WZ(d). The rate gain from binning moves us from R_reg(d′) to R_bin(d′), while the distortion gain from estimation moves us from R_bin(d′) to R_WZ(d).

References

[1] T. Cover and J. Thomas. Elements of Information Theory. John Wiley & Sons, 1991.
[2] I. Csiszár and J. Körner. Information Theory: Coding Theorems for Discrete Memoryless Systems. Akadémiai Kiadó, 1981.
[3] N. Shulman. Communication over an Unknown Channel in Common Broadcasting. PhD thesis, Tel Aviv Univ.
[4] N. Shulman and M. Feder. Source broadcasting with an unknown amount of receiver side information. In Proc. Inform. Theory Workshop, Bangalore, India, October 2002.
[5] D. Slepian and J. K. Wolf. Noiseless coding of correlated information sources. IEEE Trans. Inform. Theory, 19:471-480, July 1973.
[6] A. D. Wyner and J. Ziv. The rate-distortion function for source coding with side information at the decoder. IEEE Trans. Inform. Theory, 22:1-10, January 1976.
More informationEE5319R: Problem Set 3 Assigned: 24/08/16, Due: 31/08/16
EE539R: Problem Set 3 Assigned: 24/08/6, Due: 3/08/6. Cover and Thomas: Problem 2.30 (Maimum Entropy): Solution: We are required to maimize H(P X ) over all distributions P X on the non-negative integers
More informationLecture 8: Channel and source-channel coding theorems; BEC & linear codes. 1 Intuitive justification for upper bound on channel capacity
5-859: Information Theory and Applications in TCS CMU: Spring 23 Lecture 8: Channel and source-channel coding theorems; BEC & linear codes February 7, 23 Lecturer: Venkatesan Guruswami Scribe: Dan Stahlke
More informationLossy Distributed Source Coding
Lossy Distributed Source Coding John MacLaren Walsh, Ph.D. Multiterminal Information Theory, Spring Quarter, 202 Lossy Distributed Source Coding Problem X X 2 S {,...,2 R } S 2 {,...,2 R2 } Ẑ Ẑ 2 E d(z,n,
More informationThe Method of Types and Its Application to Information Hiding
The Method of Types and Its Application to Information Hiding Pierre Moulin University of Illinois at Urbana-Champaign www.ifp.uiuc.edu/ moulin/talks/eusipco05-slides.pdf EUSIPCO Antalya, September 7,
More informationInformation Masking and Amplification: The Source Coding Setting
202 IEEE International Symposium on Information Theory Proceedings Information Masking and Amplification: The Source Coding Setting Thomas A. Courtade Department of Electrical Engineering University of
More informationGeneralized Writing on Dirty Paper
Generalized Writing on Dirty Paper Aaron S. Cohen acohen@mit.edu MIT, 36-689 77 Massachusetts Ave. Cambridge, MA 02139-4307 Amos Lapidoth lapidoth@isi.ee.ethz.ch ETF E107 ETH-Zentrum CH-8092 Zürich, Switzerland
More informationDesign of Optimal Quantizers for Distributed Source Coding
Design of Optimal Quantizers for Distributed Source Coding David Rebollo-Monedero, Rui Zhang and Bernd Girod Information Systems Laboratory, Electrical Eng. Dept. Stanford University, Stanford, CA 94305
More informationEE376A: Homework #3 Due by 11:59pm Saturday, February 10th, 2018
Please submit the solutions on Gradescope. EE376A: Homework #3 Due by 11:59pm Saturday, February 10th, 2018 1. Optimal codeword lengths. Although the codeword lengths of an optimal variable length code
More informationNational University of Singapore Department of Electrical & Computer Engineering. Examination for
National University of Singapore Department of Electrical & Computer Engineering Examination for EE5139R Information Theory for Communication Systems (Semester I, 2014/15) November/December 2014 Time Allowed:
More informationInteractive Hypothesis Testing with Communication Constraints
Fiftieth Annual Allerton Conference Allerton House, UIUC, Illinois, USA October - 5, 22 Interactive Hypothesis Testing with Communication Constraints Yu Xiang and Young-Han Kim Department of Electrical
More informationInformation Theory in Intelligent Decision Making
Information Theory in Intelligent Decision Making Adaptive Systems and Algorithms Research Groups School of Computer Science University of Hertfordshire, United Kingdom June 7, 2015 Information Theory
More informationOn Common Information and the Encoding of Sources that are Not Successively Refinable
On Common Information and the Encoding of Sources that are Not Successively Refinable Kumar Viswanatha, Emrah Akyol, Tejaswi Nanjundaswamy and Kenneth Rose ECE Department, University of California - Santa
More informationUniversal Anytime Codes: An approach to uncertain channels in control
Universal Anytime Codes: An approach to uncertain channels in control paper by Stark Draper and Anant Sahai presented by Sekhar Tatikonda Wireless Foundations Department of Electrical Engineering and Computer
More informationMultiuser Successive Refinement and Multiple Description Coding
Multiuser Successive Refinement and Multiple Description Coding Chao Tian Laboratory for Information and Communication Systems (LICOS) School of Computer and Communication Sciences EPFL Lausanne Switzerland
More informationInteractive Communication for Data Exchange
Interactive Communication for Data Exchange Himanshu Tyagi Indian Institute of Science, Bangalore Joint work with Pramod Viswanath and Shun Watanabe The Data Exchange Problem [ElGamal-Orlitsky 84], [Csiszár-Narayan
More informationSource and Channel Coding for Correlated Sources Over Multiuser Channels
Source and Channel Coding for Correlated Sources Over Multiuser Channels Deniz Gündüz, Elza Erkip, Andrea Goldsmith, H. Vincent Poor Abstract Source and channel coding over multiuser channels in which
More informationJoint Source-Channel Coding for the Multiple-Access Relay Channel
Joint Source-Channel Coding for the Multiple-Access Relay Channel Yonathan Murin, Ron Dabora Department of Electrical and Computer Engineering Ben-Gurion University, Israel Email: moriny@bgu.ac.il, ron@ee.bgu.ac.il
More informationTowards control over fading channels
Towards control over fading channels Paolo Minero, Massimo Franceschetti Advanced Network Science University of California San Diego, CA, USA mail: {minero,massimo}@ucsd.edu Invited Paper) Subhrakanti
More informationEECS 750. Hypothesis Testing with Communication Constraints
EECS 750 Hypothesis Testing with Communication Constraints Name: Dinesh Krithivasan Abstract In this report, we study a modification of the classical statistical problem of bivariate hypothesis testing.
More informationKeyless authentication in the presence of a simultaneously transmitting adversary
Keyless authentication in the presence of a simultaneously transmitting adversary Eric Graves Army Research Lab Adelphi MD 20783 U.S.A. ericsgra@ufl.edu Paul Yu Army Research Lab Adelphi MD 20783 U.S.A.
More informationSolutions to Set #2 Data Compression, Huffman code and AEP
Solutions to Set #2 Data Compression, Huffman code and AEP. Huffman coding. Consider the random variable ( ) x x X = 2 x 3 x 4 x 5 x 6 x 7 0.50 0.26 0. 0.04 0.04 0.03 0.02 (a) Find a binary Huffman code
More information18.2 Continuous Alphabet (discrete-time, memoryless) Channel
0-704: Information Processing and Learning Spring 0 Lecture 8: Gaussian channel, Parallel channels and Rate-distortion theory Lecturer: Aarti Singh Scribe: Danai Koutra Disclaimer: These notes have not
More informationSolutions to Homework Set #2 Broadcast channel, degraded message set, Csiszar Sum Equality
1st Semester 2010/11 Solutions to Homework Set #2 Broadcast channel, degraded message set, Csiszar Sum Equality 1. Convexity of capacity region of broadcast channel. Let C R 2 be the capacity region of
More informationOn the Capacity of the Two-Hop Half-Duplex Relay Channel
On the Capacity of the Two-Hop Half-Duplex Relay Channel Nikola Zlatanov, Vahid Jamali, and Robert Schober University of British Columbia, Vancouver, Canada, and Friedrich-Alexander-University Erlangen-Nürnberg,
More informationEE 4TM4: Digital Communications II. Channel Capacity
EE 4TM4: Digital Communications II 1 Channel Capacity I. CHANNEL CODING THEOREM Definition 1: A rater is said to be achievable if there exists a sequence of(2 nr,n) codes such thatlim n P (n) e (C) = 0.
More informationCommon Information. Abbas El Gamal. Stanford University. Viterbi Lecture, USC, April 2014
Common Information Abbas El Gamal Stanford University Viterbi Lecture, USC, April 2014 Andrew Viterbi s Fabulous Formula, IEEE Spectrum, 2010 El Gamal (Stanford University) Disclaimer Viterbi Lecture 2
More informationShannon s noisy-channel theorem
Shannon s noisy-channel theorem Information theory Amon Elders Korteweg de Vries Institute for Mathematics University of Amsterdam. Tuesday, 26th of Januari Amon Elders (Korteweg de Vries Institute for
More informationLecture 4 Noisy Channel Coding
Lecture 4 Noisy Channel Coding I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw October 9, 2015 1 / 56 I-Hsiang Wang IT Lecture 4 The Channel Coding Problem
More informationAn Achievable Rate for the Multiple Level Relay Channel
An Achievable Rate for the Multiple Level Relay Channel Liang-Liang Xie and P. R. Kumar Department of Electrical and Computer Engineering, and Coordinated Science Laboratory University of Illinois, Urbana-Champaign
More informationHypothesis Testing with Communication Constraints
Hypothesis Testing with Communication Constraints Dinesh Krithivasan EECS 750 April 17, 2006 Dinesh Krithivasan (EECS 750) Hyp. testing with comm. constraints April 17, 2006 1 / 21 Presentation Outline
More informationLecture 11: Continuous-valued signals and differential entropy
Lecture 11: Continuous-valued signals and differential entropy Biology 429 Carl Bergstrom September 20, 2008 Sources: Parts of today s lecture follow Chapter 8 from Cover and Thomas (2007). Some components
More informationMARKOV CHAINS A finite state Markov chain is a sequence of discrete cv s from a finite alphabet where is a pmf on and for
MARKOV CHAINS A finite state Markov chain is a sequence S 0,S 1,... of discrete cv s from a finite alphabet S where q 0 (s) is a pmf on S 0 and for n 1, Q(s s ) = Pr(S n =s S n 1 =s ) = Pr(S n =s S n 1
More informationOn the Duality between Multiple-Access Codes and Computation Codes
On the Duality between Multiple-Access Codes and Computation Codes Jingge Zhu University of California, Berkeley jingge.zhu@berkeley.edu Sung Hoon Lim KIOST shlim@kiost.ac.kr Michael Gastpar EPFL michael.gastpar@epfl.ch
More informationInformation Theory. Coding and Information Theory. Information Theory Textbooks. Entropy
Coding and Information Theory Chris Williams, School of Informatics, University of Edinburgh Overview What is information theory? Entropy Coding Information Theory Shannon (1948): Information theory is
More informationFrans M.J. Willems. Authentication Based on Secret-Key Generation. Frans M.J. Willems. (joint work w. Tanya Ignatenko)
Eindhoven University of Technology IEEE EURASIP Spain Seminar on Signal Processing, Communication and Information Theory, Universidad Carlos III de Madrid, December 11, 2014 : Secret-Based Authentication
More informationA Practical and Optimal Symmetric Slepian-Wolf Compression Strategy Using Syndrome Formers and Inverse Syndrome Formers
A Practical and Optimal Symmetric Slepian-Wolf Compression Strategy Using Syndrome Formers and Inverse Syndrome Formers Peiyu Tan and Jing Li (Tiffany) Electrical and Computer Engineering Dept, Lehigh
More informationCS6304 / Analog and Digital Communication UNIT IV - SOURCE AND ERROR CONTROL CODING PART A 1. What is the use of error control coding? The main use of error control coding is to reduce the overall probability
More informationMultiterminal Source Coding with an Entropy-Based Distortion Measure
Multiterminal Source Coding with an Entropy-Based Distortion Measure Thomas Courtade and Rick Wesel Department of Electrical Engineering University of California, Los Angeles 4 August, 2011 IEEE International
More informationCapacity of the Discrete Memoryless Energy Harvesting Channel with Side Information
204 IEEE International Symposium on Information Theory Capacity of the Discrete Memoryless Energy Harvesting Channel with Side Information Omur Ozel, Kaya Tutuncuoglu 2, Sennur Ulukus, and Aylin Yener
More informationElectrical and Information Technology. Information Theory. Problems and Solutions. Contents. Problems... 1 Solutions...7
Electrical and Information Technology Information Theory Problems and Solutions Contents Problems.......... Solutions...........7 Problems 3. In Problem?? the binomial coefficent was estimated with Stirling
More informationLecture 11: Polar codes construction
15-859: Information Theory and Applications in TCS CMU: Spring 2013 Lecturer: Venkatesan Guruswami Lecture 11: Polar codes construction February 26, 2013 Scribe: Dan Stahlke 1 Polar codes: recap of last
More informationInformation Theory CHAPTER. 5.1 Introduction. 5.2 Entropy
Haykin_ch05_pp3.fm Page 207 Monday, November 26, 202 2:44 PM CHAPTER 5 Information Theory 5. Introduction As mentioned in Chapter and reiterated along the way, the purpose of a communication system is
More informationEE5139R: Problem Set 4 Assigned: 31/08/16, Due: 07/09/16
EE539R: Problem Set 4 Assigned: 3/08/6, Due: 07/09/6. Cover and Thomas: Problem 3.5 Sets defined by probabilities: Define the set C n (t = {x n : P X n(x n 2 nt } (a We have = P X n(x n P X n(x n 2 nt
More informationCommunication Cost of Distributed Computing
Communication Cost of Distributed Computing Abbas El Gamal Stanford University Brice Colloquium, 2009 A. El Gamal (Stanford University) Comm. Cost of Dist. Computing Brice Colloquium, 2009 1 / 41 Motivation
More informationIntermittent Communication
Intermittent Communication Mostafa Khoshnevisan, Student Member, IEEE, and J. Nicholas Laneman, Senior Member, IEEE arxiv:32.42v2 [cs.it] 7 Mar 207 Abstract We formulate a model for intermittent communication
More informationX 1 : X Table 1: Y = X X 2
ECE 534: Elements of Information Theory, Fall 200 Homework 3 Solutions (ALL DUE to Kenneth S. Palacio Baus) December, 200. Problem 5.20. Multiple access (a) Find the capacity region for the multiple-access
More informationGraph Coloring and Conditional Graph Entropy
Graph Coloring and Conditional Graph Entropy Vishal Doshi, Devavrat Shah, Muriel Médard, Sidharth Jaggi Laboratory for Information and Decision Systems Massachusetts Institute of Technology Cambridge,
More informationRevision of Lecture 5
Revision of Lecture 5 Information transferring across channels Channel characteristics and binary symmetric channel Average mutual information Average mutual information tells us what happens to information
More informationInformation Theory Meets Game Theory on The Interference Channel
Information Theory Meets Game Theory on The Interference Channel Randall A. Berry Dept. of EECS Northwestern University e-mail: rberry@eecs.northwestern.edu David N. C. Tse Wireless Foundations University
More informationCoding for Noisy Write-Efficient Memories
Coding for oisy Write-Efficient Memories Qing Li Computer Sci. & Eng. Dept. Texas A & M University College Station, TX 77843 qingli@cse.tamu.edu Anxiao (Andrew) Jiang CSE and ECE Departments Texas A &
More informationLecture 15: Conditional and Joint Typicaility
EE376A Information Theory Lecture 1-02/26/2015 Lecture 15: Conditional and Joint Typicaility Lecturer: Kartik Venkat Scribe: Max Zimet, Brian Wai, Sepehr Nezami 1 Notation We always write a sequence of
More informationLecture 8: Shannon s Noise Models
Error Correcting Codes: Combinatorics, Algorithms and Applications (Fall 2007) Lecture 8: Shannon s Noise Models September 14, 2007 Lecturer: Atri Rudra Scribe: Sandipan Kundu& Atri Rudra Till now we have
More informationThe Communication Complexity of Correlation. Prahladh Harsha Rahul Jain David McAllester Jaikumar Radhakrishnan
The Communication Complexity of Correlation Prahladh Harsha Rahul Jain David McAllester Jaikumar Radhakrishnan Transmitting Correlated Variables (X, Y) pair of correlated random variables Transmitting
More informationECE 4400:693 - Information Theory
ECE 4400:693 - Information Theory Dr. Nghi Tran Lecture 8: Differential Entropy Dr. Nghi Tran (ECE-University of Akron) ECE 4400:693 Lecture 1 / 43 Outline 1 Review: Entropy of discrete RVs 2 Differential
More informationSide Information Aware Coding Strategies for Estimation under Communication Constraints
MIT Research Laboratory of Electronics, Tech. Report # 704 Side Information Aware Coding Strategies for Estimation under Communication Constraints Stark C. Draper and Gregory W. Wornell Abstract We develop
More informationChapter 8: Differential entropy. University of Illinois at Chicago ECE 534, Natasha Devroye
Chapter 8: Differential entropy Chapter 8 outline Motivation Definitions Relation to discrete entropy Joint and conditional differential entropy Relative entropy and mutual information Properties AEP for
More informationHow to Compute Modulo Prime-Power Sums?
How to Compute Modulo Prime-Power Sums? Mohsen Heidari, Farhad Shirani, and S. Sandeep Pradhan Department of Electrical Engineering and Computer Science, University of Michigan, Ann Arbor, MI 48109, USA.
More informationMotivation for Arithmetic Coding
Motivation for Arithmetic Coding Motivations for arithmetic coding: 1) Huffman coding algorithm can generate prefix codes with a minimum average codeword length. But this length is usually strictly greater
More informationDistributed Lossy Interactive Function Computation
Distributed Lossy Interactive Function Computation Solmaz Torabi & John MacLaren Walsh Dept. of Electrical and Computer Engineering Drexel University Philadelphia, PA 19104 Email: solmaz.t@drexel.edu &
More information