On the Feedback Capacity of Stationary Gaussian Channels
Young-Han Kim
Information Systems Laboratory, Stanford University, Stanford, CA

Abstract. The capacity of stationary additive Gaussian noise channels with feedback is characterized as the solution to a variational problem. Toward this end, it is proved that the optimal feedback coding scheme is stationary. When specialized to the first-order autoregressive moving-average noise spectrum, this variational characterization yields a closed-form expression for the feedback capacity. In particular, this result shows that the celebrated Schalkwijk–Kailath–Butman coding scheme achieves the feedback capacity of the first-order autoregressive moving-average Gaussian channel, resolving a long-standing open problem studied by Butman, Schalkwijk, Tiernan, Wolfowitz, Ozarow, Ordentlich, Yang, Kavčić, Tatikonda, and others.

1 Introduction and summary

We consider the additive Gaussian noise channel Y_i = X_i + Z_i, i = 1, 2, ..., where the additive Gaussian noise process {Z_i}_{i=1}^∞ is stationary with Z^n = (Z_1, ..., Z_n) ~ N_n(0, K_{Z,n}) for each n = 1, 2, .... We wish to communicate a message W ∈ {1, ..., 2^{nR}} over the channel Y^n = X^n + Z^n. For block length n, we specify a (2^{nR}, n) feedback code with codewords

X^n(W, Y^{n-1}) = (X_1(W), X_2(W, Y_1), ..., X_n(W, Y^{n-1})),  W = 1, ..., 2^{nR},

satisfying the average power constraint (1/n) Σ_{i=1}^n E X_i²(W, Y^{i-1}) ≤ P, and a decoding function Ŵ_n : R^n → {1, ..., 2^{nR}}. The probability of error P_e^{(n)} is defined by P_e^{(n)} = Pr{Ŵ_n(Y^n) ≠ W}, where the message W is uniformly distributed over {1, 2, ..., 2^{nR}} and is independent of Z^n. We say that the rate R is achievable if there exists a sequence of (2^{nR}, n) codes with P_e^{(n)} → 0 as n → ∞. The feedback capacity C_FB is defined as the supremum of all achievable rates. We also consider the case in which there is no feedback, corresponding to codewords X^n(W) = (X_1(W), ..., X_n(W)) independent of the previous channel outputs.
We define the nonfeedback capacity C, or the capacity in short, in a manner similar to the feedback case. Shannon [1] showed that the nonfeedback capacity is achieved by water-filling on the noise spectrum, which is arguably one of the most beautiful results in information theory. (This research is supported in part by NSF Grant CCR.)
More specifically, the capacity C of the additive Gaussian noise channel Y_i = X_i + Z_i, i = 1, 2, ..., under the power constraint P, is given by

C = (1/2π) ∫_{−π}^{π} (1/2) log( max{S_Z(e^{iθ}), λ} / S_Z(e^{iθ}) ) dθ        (1)

where S_Z(e^{iθ}) is the power spectral density of the stationary noise process {Z_i}_{i=1}^∞ and the water level λ is chosen to satisfy

P = (1/2π) ∫_{−π}^{π} max{0, λ − S_Z(e^{iθ})} dθ.        (2)

Although (1) and (2) give only a parametric characterization of the capacity C(λ) under the power constraint P(λ) for each parameter λ ≥ 0, this solution is considered simple and elegant enough to be called closed-form. For the case of feedback, no such elegant solution exists. Most notably, Cover and Pombra [2] characterized the n-block feedback capacity C_{FB,n} of arbitrary time-varying Gaussian channels, via the asymptotic equipartition property (AEP) for arbitrary nonstationary nonergodic Gaussian processes, as

C_{FB,n} = max_{K_{V,n}, B_n} (1/2n) log [ det(K_{V,n} + (B_n + I) K_{Z,n} (B_n + I)') / det(K_{Z,n}) ]        (3)

where the maximum is taken over all positive semidefinite matrices K_{V,n} and all strictly lower triangular matrices B_n of size n × n satisfying tr(K_{V,n} + B_n K_{Z,n} B_n') ≤ nP. Note that we can recover the nonfeedback case by taking B_n ≡ 0. When specialized to stationary noise processes, the Cover–Pombra characterization gives the feedback capacity as the limiting expression

C_FB = lim_{n→∞} C_{FB,n} = lim_{n→∞} max_{K_{V,n}, B_n} (1/2n) log [ det(K_{V,n} + (B_n + I) K_{Z,n} (B_n + I)') / det(K_{Z,n}) ].        (4)

Despite its generality, the Cover–Pombra formulation of the feedback capacity falls short of what we can call a closed-form solution. It is very difficult, if not impossible, to obtain an analytic expression for the optimal (K*_{V,n}, B*_n) in (3) for each n. Furthermore, the sequence of optimal solutions {(K*_{V,n}, B*_n)}_{n=1}^∞ is not necessarily consistent; that is, (K*_{V,n}, B*_n) is not necessarily a subblock of (K*_{V,n+1}, B*_{n+1}).
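As a quick numerical illustration of the water-filling characterization (our own sketch, not from the paper; the grid size and the AR(1) example spectrum are illustrative choices), one can bisect on the water level λ to meet the power budget and then evaluate the capacity integral; for white noise the result collapses to the familiar (1/2) log(1 + P):

```python
import numpy as np

def waterfill_capacity(S, P):
    """Nonfeedback capacity (nats) by water-filling on a sampled spectrum S over [-pi, pi)."""
    # Power spent at water level lam: (1/2pi) * int max{0, lam - S} dtheta ~ grid mean.
    power = lambda lam: np.maximum(0.0, lam - S).mean()
    lo, hi = S.min(), S.max() + P + 1.0        # power(lo) = 0 <= P < power(hi)
    for _ in range(100):                        # bisection: lam -> power(lam) is monotone
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if power(mid) < P else (lo, mid)
    lam = 0.5 * (lo + hi)
    # C = (1/2pi) * int (1/2) log( max{S, lam} / S ) dtheta
    return 0.5 * np.log(np.maximum(S, lam) / S).mean()

theta = np.linspace(-np.pi, np.pi, 4096, endpoint=False)
S_white = np.ones_like(theta)                                 # white noise, unit power
C = waterfill_capacity(S_white, P=1.0)                        # water level lam = 1 + P
S_ar = 1.0 / np.abs(1.0 + 0.5 * np.exp(1j * theta))**2        # AR(1) spectrum, beta = 0.5
C_ar = waterfill_capacity(S_ar, P=1.0)
```

For the flat spectrum the water level is exactly λ = 1 + P, so `C` matches (1/2) log(1 + P) up to bisection tolerance.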
Hence the characterization (3) in itself gives little hint about the structure of the optimal {(K*_{V,n}, B*_n)}_{n=1}^∞ achieving C_{FB,n}, or, more importantly, about its limiting behavior. In this paper, we take one step forward by proving the following.

Theorem 1. The feedback capacity C_FB of the Gaussian channel Y_i = X_i + Z_i, i = 1, 2, ..., under the power constraint P, is given by

C_FB = sup_{S_V(e^{iθ}), B(e^{iθ})} (1/2π) ∫_{−π}^{π} (1/2) log( [S_V(e^{iθ}) + |1 + B(e^{iθ})|² S_Z(e^{iθ})] / S_Z(e^{iθ}) ) dθ

where S_Z(e^{iθ}) is the power spectral density of the noise process {Z_i}_{i=1}^∞ and the supremum is taken over all power spectral densities S_V(e^{iθ}) ≥ 0 and strictly causal filters B(e^{iθ}) = Σ_{j=1}^∞ b_j e^{ijθ} satisfying the power constraint

(1/2π) ∫_{−π}^{π} ( S_V(e^{iθ}) + |B(e^{iθ})|² S_Z(e^{iθ}) ) dθ ≤ P.
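The variational expression in Theorem 1 is easy to probe numerically. As a sanity check (our own sketch, not from the paper), consider white noise S_Z ≡ 1, where feedback is known not to increase capacity, and restrict attention to an illustrative one-parameter family: a flat message spectrum S_V ≡ P − b² together with a one-tap filter B(e^{iθ}) = b e^{iθ}. The objective is then maximized at b = 0, recovering (1/2) log(1 + P):

```python
import numpy as np

theta = np.linspace(-np.pi, np.pi, 8192, endpoint=False)

def objective(b, P=1.0):
    """Theorem-1 objective for white noise S_Z = 1, flat S_V = P - b^2, one-tap B = b e^{i theta}."""
    S_Z = 1.0
    S_V = P - b * b                        # remaining power budget goes to the message
    B = b * np.exp(1j * theta)             # strictly causal one-tap filter
    S_Y = S_V + np.abs(1.0 + B)**2 * S_Z   # output spectrum; here S_Y = 1 + P + 2 b cos(theta)
    return 0.5 * np.log(S_Y / S_Z).mean()  # (1/2pi) int (1/2) log(S_Y/S_Z) dtheta

P = 1.0
rates = {b: objective(b, P) for b in [0.0, 0.2, 0.4, 0.6]}
# Closed form for this family: 0.5*log((1 + P + sqrt((1+P)**2 - 4*b**2))/2), maximized at b = 0.
```

The closed form in the final comment follows from the standard integral (1/2π) ∫ log(a + 2b cos θ) dθ = log((a + sqrt(a² − 4b²))/2) for a > 2|b|.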
Roughly speaking, this characterization shows the asymptotic optimality of a stationary solution (K_{V,n}, B_n) in (3), and hence it can be viewed as a justification for interchanging the order of limit and maximum in (4). Since Theorem 1 gives a variational form of the feedback capacity, it remains to characterize the optimal (S*_V(e^{iθ}), B*(e^{iθ})). In this paper, we provide a sufficient condition for the optimal solution using elementary arguments. This result, when specialized to the first-order autoregressive (AR) noise spectrum S_Z(e^{iθ}) = 1/|1 + βe^{iθ}|², −1 < β < 1, yields a closed-form solution for the feedback capacity:

C_FB = −log x_0,

where x_0 is the unique positive root of the fourth-order polynomial

P x² = (1 − x²) / (1 + |β| x)².

This answers in the positive the long-standing conjecture by Butman [3, 4], Tiernan and Schalkwijk [5, 6], and Wolfowitz [7]. In fact, we will obtain the feedback capacity formula for the first-order autoregressive moving-average (ARMA) noise spectrum, generalizing the result in [8] and confirming a recent conjecture by Yang, Kavčić, and Tatikonda [9].

The rest of the paper is organized as follows. We prove Theorem 1 in the next section. In Section 3, we derive a sufficient condition for the optimal (S*_V(e^{iθ}), B*(e^{iθ})) and apply this result to the first-order ARMA noise spectrum to obtain the closed-form feedback capacity. We also show that the Schalkwijk–Kailath–Butman coding scheme [10, 11, 3] achieves the feedback capacity of the first-order ARMA Gaussian channel.

2 Proof of Theorem 1

We start from the Cover–Pombra formulation of the n-block feedback capacity C_{FB,n} in (3).
Tracing the development of Cover and Pombra backwards, we express C_{FB,n} as

C_{FB,n} = max (1/n) [ h(Y^n) − h(Z^n) ] = max (1/n) I(V^n; Y^n)

where the maximization is over all X^n of the form X^n = V^n + B_n Z^n, resulting in Y^n = V^n + (I + B_n) Z^n, with strictly lower-triangular B_n and multivariate Gaussian V^n, independent of Z^n, satisfying the power constraint E Σ_{i=1}^n X_i² ≤ nP. Define

C̄_FB = sup_{S_V(e^{iθ}), B(e^{iθ})} (1/2π) ∫_{−π}^{π} (1/2) log( [S_V(e^{iθ}) + |1 + B(e^{iθ})|² S_Z(e^{iθ})] / S_Z(e^{iθ}) ) dθ

where S_Z(e^{iθ}) is the power spectral density of the noise process {Z_i}_{i=1}^∞ and the supremum is taken over all power spectral densities S_V(e^{iθ}) ≥ 0 and strictly causal filters B(e^{iθ}) = Σ_{k=1}^∞ b_k e^{ikθ} satisfying the power constraint (1/2π) ∫_{−π}^{π} ( S_V(e^{iθ}) + |B(e^{iθ})|² S_Z(e^{iθ}) ) dθ ≤ P. In the light of the Szegő–Kolmogorov–Krein theorem, we can express C̄_FB also as

C̄_FB = sup_{{X_i}} h(Y) − h(Z)

where h(Y) and h(Z) denote entropy rates and the supremum is taken over all stationary Gaussian processes {X_i}_{i=−∞}^∞ of the form X_i = V_i + Σ_{k=1}^∞ b_k Z_{i−k}, where {V_i}_{i=−∞}^∞ is stationary and independent of {Z_i}_{i=−∞}^∞ and E X_0² ≤ P. We will prove that C_FB = C̄_FB.
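The Szegő–Kolmogorov–Krein theorem identifies the entropy rate of a stationary Gaussian process with its one-step prediction error. A small numerical illustration (our own sketch, using an AR(1) example): for the noise Z_i = −βZ_{i−1} + U_i with unit-variance innovations, the ratio det K_{Z,n} / det K_{Z,n−1} equals Var(Z_n | Z^{n−1}), which is exactly the innovation variance 1 for every n ≥ 2, consistent with (1/2π) ∫ log S_Z dθ = 0 for S_Z = 1/|1 + βe^{iθ}|²:

```python
import numpy as np

beta, n = 0.5, 8
# Autocovariance of the stationary AR(1) process Z_i = -beta Z_{i-1} + U_i, Var(U_i) = 1:
# r_k = (-beta)^|k| / (1 - beta^2)
r = (-beta) ** np.arange(n) / (1.0 - beta**2)
K = np.array([[r[abs(i - j)] for j in range(n)] for i in range(n)])  # Toeplitz covariance K_{Z,n}

dets = [np.linalg.det(K[:m, :m]) for m in range(1, n + 1)]
ratios = [dets[m] / dets[m - 1] for m in range(1, n)]  # = Var(Z_{m+1} | Z^m), the prediction error
# Every ratio equals the innovation variance 1 exactly, since U_{m+1} is independent of Z^m.
```

So (1/n) log det K_{Z,n} → 0, the spectral mean of log S_Z, which is the content of the theorem in this example.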
We first show that C_{FB,n} ≤ C̄_FB for all n. Fix n and let (K*_{V,n}, B*_n) achieve C_{FB,n}. Consider a process {V_i}_{i=−∞}^∞ that is independent of {Z_i}_{i=−∞}^∞ and blockwise i.i.d. with V_{kn+1}^{(k+1)n} ~ N_n(0, K*_{V,n}), k = 0, ±1, ±2, .... Define a process {X_i}_{i=−∞}^∞ by X_{kn+1}^{(k+1)n} = V_{kn+1}^{(k+1)n} + B*_n Z_{kn+1}^{(k+1)n} for all k. Similarly, let Y_i = X_i + Z_i, −∞ < i < ∞, be the corresponding output process through the stationary Gaussian channel. Note that Y_{kn+1}^{(k+1)n} = V_{kn+1}^{(k+1)n} + (I + B*_n) Z_{kn+1}^{(k+1)n} for all k. For each t = 0, 1, ..., n−1, define a process {V_i(t)}_{i=−∞}^∞ by V_i(t) = V_{t+i} for all i, and similarly define {X_i(t)}, {Y_i(t)}, and {Z_i(t)}. Note that Y_i(t) = X_i(t) + Z_i(t) for all i and all t = 0, 1, ..., n−1, but X^n(t) is not equal to V^n(t) + B*_n Z^n(t) in general. From the independence of V^n and V_{n+1}^{2n}, we can easily check that

2n C_{FB,n} = I(V^n; Y^n) + I(V_{n+1}^{2n}; Y_{n+1}^{2n})
  = h(V^n) + h(V_{n+1}^{2n}) − h(V^n | Y^n) − h(V_{n+1}^{2n} | Y_{n+1}^{2n})
  ≤ h(V^{2n}) − h(V^{2n} | Y^{2n})
  = I(V^{2n}; Y^{2n}) = h(Y^{2n}) − h(Z^{2n}).

By repeating the same argument, we get

C_{FB,n} ≤ (1/kn) ( h(Y^{kn}) − h(Z^{kn}) )

for all k. Hence, for all m = 1, 2, ... and each t = 0, ..., n−1, we have

C_{FB,n} ≤ (1/m) ( h(Y^m(t)) − h(Z^m(t)) ) + ε_m = (1/m) ( h(Y^m(t)) − h(Z^m) ) + ε_m

where ε_m absorbs the edge effect and vanishes uniformly in t as m → ∞. Now we introduce a random variable T, uniform on {0, 1, ..., n−1} and independent of {V_i, X_i, Y_i, Z_i}_{i=−∞}^∞. It is easy to check the following:

(I) {V_i(T), X_i(T), Y_i(T), Z_i(T)}_{i=−∞}^∞ is stationary with Y_i(T) = X_i(T) + Z_i(T).
(II) {X_i(T)}_{i=−∞}^∞ satisfies the power constraint E X_0²(T) = E[E(X_0²(T) | T)] = (1/n) tr(K*_{V,n} + B*_n K_{Z,n} (B*_n)') ≤ P.
(III) {V_i(T)} and {Z_i(T)} are orthogonal; i.e., E V_i(T) Z_j(T) = 0 for all i, j.
(IV) Although there is no linear relationship between {X_i(T)} and {Z_i(T)}, {X_i(T)} still depends on {Z_i(T)} in a strictly causal manner.
More precisely, for all i ≤ j,

E( X_i(T) Z_j(T) | Z^{i−1}(T) )
  = E( E( X_i(T) Z_j(T) | Z^{i−1}(T), T ) | Z^{i−1}(T) )
  = E( E( X_i(T) | Z^{i−1}(T), T ) E( Z_j(T) | Z^{i−1}(T), T ) | Z^{i−1}(T) )
  = E( E( X_i(T) | Z^{i−1}(T), T ) E( Z_j(T) | Z^{i−1}(T) ) | Z^{i−1}(T) )
  = E( X_i(T) | Z^{i−1}(T) ) E( Z_j(T) | Z^{i−1}(T) ),

and, for all i,

Var( X_i(T) − V_i(T) | Z^{i−1}(T) ) = E( Var( X_i(T) − V_i(T) | Z^{i−1}(T), T ) | Z^{i−1}(T) ) = 0.
Finally, define {Ṽ_i, X̃_i, Ỹ_i, Z̃_i}_{i=−∞}^∞ to be a jointly Gaussian stationary process with the same mean and autocorrelation as {V_i(T), X_i(T), Y_i(T), Z_i(T)}_{i=−∞}^∞. It is easy to check that {Ṽ_i, X̃_i, Ỹ_i, Z̃_i} also satisfies properties (I)–(IV), and hence that {Ṽ_i} and {Z̃_i} are independent. It follows from these properties and the Gaussianity of {Ṽ_i, X̃_i, Ỹ_i, Z̃_i} that there exists a sequence {b̃_k}_{k=1}^∞ such that X̃_i = Ṽ_i + Σ_{k=1}^∞ b̃_k Z̃_{i−k}. Thus we have

C_{FB,n} ≤ (1/m) ( h(Y^m(T) | T) − h(Z^m) ) + ε_m
  ≤ (1/m) ( h(Y^m(T)) − h(Z^m) ) + ε_m
  ≤ (1/m) ( h(Ỹ^m) − h(Z^m) ) + ε_m.

By letting m → ∞ and using the definition of C̄_FB, we obtain

C_{FB,n} ≤ h(Ỹ) − h(Z̃) ≤ C̄_FB.        (5)

For the other direction of the inequality, we use the notation C̄_FB(P) and C_{FB,n}(P) to stress the dependence on the power constraint P. Given ε > 0, let {X_i = V_i + Σ_{k=1}^∞ b_k Z_{i−k}}_{i=−∞}^∞ achieve C̄_FB(P) − ε under the power constraint P. The corresponding channel output is given as

Y_i = V_i + Z_i + Σ_{k=1}^∞ b_k Z_{i−k}        (6)

for all i = 0, ±1, ±2, .... Now, for each m = 1, 2, ..., we define a single-sided nonstationary process {X_i(m)}_{i=1}^∞ in the following way:

X_i(m) = U_i + V_i + Σ_{k=1}^{i−1} b_k Z_{i−k},  i ≤ m,
X_i(m) = U_i + V_i + Σ_{k=1}^{m} b_k Z_{i−k},  i > m,

where U_1, U_2, ... are i.i.d. N(0, ε). Thus X_i(m) depends causally on Z^{i−1} for all i and m. Let {Y_i(m)}_{i=1}^∞ be the corresponding channel output Y_i(m) = X_i(m) + Z_i, i = 1, 2, ..., for each m = 1, 2, .... We can show that there exists an m* such that

lim_n (1/n) Σ_{i=1}^n E X_i²(m*) ≤ P + 2ε

and

lim_n (1/n) h(Y^n(m*)) ≥ h(Y) − ε        (7)

where h(Y) is the entropy rate of the stationary process defined in (6). Consequently, for n sufficiently large,

Σ_{i=1}^n E X_i²(m*) ≤ n(P + 3ε)  and  (1/n) ( h(Y^n(m*)) − h(Z^n) ) ≥ C̄_FB(P) − 2ε.

Therefore, we can conclude that

C_{FB,n}(P + 3ε) ≥ C̄_FB(P) − 2ε

for n sufficiently large. Finally, using the continuity of C_FB(P) = lim_n C_{FB,n}(P) in P, we let ε → 0 to get C_FB(P) ≥ C̄_FB(P), which, combined with (5), implies that C_FB(P) = C̄_FB(P).
3 Example: first-order ARMA noise spectrum

With the ultimate goal of an explicit characterization of C_FB as a function of S_Z and P, we wish to solve the optimization problem

maximize    (1/2π) ∫_{−π}^{π} (1/2) log( S_V(e^{iθ}) + |1 + B(e^{iθ})|² S_Z(e^{iθ}) ) dθ
subject to  B(e^{iθ}) strictly causal,
            S_V(e^{iθ}) ≥ 0,
            (1/2π) ∫_{−π}^{π} ( S_V(e^{iθ}) + |B(e^{iθ})|² S_Z(e^{iθ}) ) dθ ≤ P.        (8)

Suppose that S_Z(e^{iθ}) is bounded away from zero. Then, under the change of variable

S_Y(e^{iθ}) = S_V(e^{iθ}) + |1 + B(e^{iθ})|² S_Z(e^{iθ}),

we rewrite (8) as

maximize    (1/2π) ∫_{−π}^{π} (1/2) log S_Y(e^{iθ}) dθ
subject to  B(e^{iθ}) strictly causal,
            S_Y(e^{iθ}) ≥ |1 + B(e^{iθ})|² S_Z(e^{iθ}),
            (1/2π) ∫_{−π}^{π} ( S_Y(e^{iθ}) − (B(e^{iθ}) + B̄(e^{iθ}) + 1) S_Z(e^{iθ}) ) dθ ≤ P.        (9)

Take any ν > 0 and φ(e^{iθ}), ψ_1(e^{iθ}), ψ_2(e^{iθ}), ψ_3(e^{iθ}) ∈ L_1 such that φ(e^{iθ}) > 0, log φ(e^{iθ}) ∈ L_1,

ψ_1(e^{iθ}) = ν − 1/φ(e^{iθ}) ≥ 0,

[ ψ_1(e^{iθ})  ψ_2(e^{iθ}) ; ψ̄_2(e^{iθ})  ψ_3(e^{iθ}) ] ⪰ 0,

and A(e^{iθ}) := ψ̄_2(e^{iθ}) + ν S_Z(e^{iθ}) ∈ L_1 is anticausal. Since any feasible B(e^{iθ}) and S_Y(e^{iθ}) satisfy

[ S_Y(e^{iθ})  1 + B(e^{iθ}) ; 1 + B̄(e^{iθ})  1/S_Z(e^{iθ}) ] ⪰ 0,

we have

tr( [ S_Y  1 + B ; 1 + B̄  1/S_Z ] [ ψ_1  ψ_2 ; ψ̄_2  ψ_3 ] ) = ψ_1 S_Y + ψ̄_2 (1 + B) + ψ_2 (1 + B̄) + ψ_3 / S_Z ≥ 0.

From the fact that log x ≤ x − 1 for all x > 0, we get the inequality

log S_Y ≤ log φ − 1 + S_Y / φ = log φ − 1 + ν S_Y − ψ_1 S_Y
  ≤ log φ − 1 + ν S_Y + ψ̄_2 (1 + B) + ψ_2 (1 + B̄) + ψ_3 / S_Z.        (10)

Further, since A ∈ L_1 is anticausal and B is strictly causal, AB ∈ L_1 is strictly anticausal and

∫_{−π}^{π} A(e^{iθ}) B(e^{iθ}) dθ = ∫_{−π}^{π} Ā(e^{iθ}) B̄(e^{iθ}) dθ = 0.        (11)
By integrating both sides of (10), we get

(1/2π) ∫ log S_Y dθ
  ≤ (1/2π) ∫ [ log φ − 1 + ν S_Y + ψ̄_2 (1 + B) + ψ_2 (1 + B̄) + ψ_3 / S_Z ] dθ
  ≤ (1/2π) ∫ [ log φ − 1 + ν( (B + B̄ + 1) S_Z + P ) + ψ̄_2 (1 + B) + ψ_2 (1 + B̄) + ψ_3 / S_Z ] dθ
  = (1/2π) ∫ [ log φ − 1 + ψ_2 + ψ̄_2 + ψ_3 / S_Z + ν( S_Z + P ) + A B + Ā B̄ ] dθ
  = (1/2π) ∫ [ log φ − 1 + ψ_2 + ψ̄_2 + ψ_3 / S_Z + ν( S_Z + P ) ] dθ        (12)

where the second inequality follows from the power constraint in (9) and the last equality follows from (11). Checking the equality conditions in (12), we find the following sufficient condition for the optimality of a specific (S_Y(e^{iθ}), B(e^{iθ})).

Lemma 1. Suppose S_Z(e^{iθ}) is bounded away from zero, and suppose the strictly causal filter B(e^{iθ}) satisfies

(1/2π) ∫_{−π}^{π} |B(e^{iθ})|² S_Z(e^{iθ}) dθ = P.        (13)

If there exists λ > 0 such that

λ ≤ ess inf_{θ ∈ [−π, π)} |1 + B(e^{iθ})|² S_Z(e^{iθ})

and such that

λ / (1 + B(e^{iθ})) − B̄(e^{iθ}) S_Z(e^{iθ}) ∈ L_1 is anticausal,

then B(e^{iθ}), along with S_V(e^{iθ}) ≡ 0, attains the feedback capacity.

Now we turn our attention to the first-order autoregressive moving-average noise spectrum S_Z(z), defined by

S_Z(e^{iθ}) = |1 + αe^{iθ}|² / |1 + βe^{iθ}|²,  α ∈ [−1, 1], β ∈ (−1, 1).        (14)

This spectral density corresponds to the stationary noise process defined by Z_i + βZ_{i−1} = U_i + αU_{i−1}, where {U_i}_{i=−∞}^∞ is a white Gaussian process with zero mean and unit variance. We find the feedback capacity of the first-order ARMA Gaussian channel in the following.

Theorem 2. Suppose the noise process {Z_i}_{i=1}^∞ has the power spectral density S_Z(e^{iθ}) defined in (14). Then the feedback capacity C_FB of the Gaussian channel Y_i = X_i + Z_i, i = 1, 2, ..., under the power constraint P is given by

C_FB = −log x_0

where x_0 is the unique positive root of the fourth-order polynomial

P x² = (1 − x²) (1 + σαx)² / (1 + σβx)²        (15)

and

σ = sgn(β − α) = { 1, β ≥ α; −1, β < α }.
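The quartic in Theorem 2 has no simple radical expression worth writing out, but it is trivial to solve numerically. The sketch below (our own, not from the paper; the parameter values are illustrative) finds the root by bisection; for α = β the spectrum is white and the formula collapses to the classical C = (1/2) log(1 + P):

```python
import numpy as np

def arma1_feedback_capacity(P, alpha, beta):
    """C_FB = -log x0 (nats), x0 the unique root in (0,1) of the quartic in Theorem 2."""
    s = 1.0 if beta >= alpha else -1.0                  # sigma = sgn(beta - alpha)
    f = lambda x: P*x**2*(1 + s*beta*x)**2 - (1 - x**2)*(1 + s*alpha*x)**2
    lo, hi = 0.0, 1.0                                   # f(0) = -1 < 0 <= f(1)
    for _ in range(200):                                # plain bisection; the root is unique
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if f(mid) < 0 else (lo, mid)
    x0 = 0.5 * (lo + hi)
    return -np.log(x0), x0

C_white, _ = arma1_feedback_capacity(1.0, 0.3, 0.3)   # alpha = beta: white noise
C_ar, x0 = arma1_feedback_capacity(1.0, 0.0, 0.5)     # AR(1) noise with beta = 0.5
```

Note that feedback strictly helps in the AR(1) case: `C_ar` exceeds the white-noise value `C_white` at the same power P.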
Proof sketch. Without loss of generality, we assume that |α| < 1. The case |α| = 1 can be handled by a simple perturbation argument. When |α| < 1, S_Z(e^{iθ}) is bounded away from zero, so that we can apply Lemma 1. Here is the bare-bones summary of the proof: we take the feedback filter of the form

B(z) = [ (1 + βz) / (1 + αz) ] · [ yz / (1 − σxz) ]        (16)

with x ∈ (0, 1) an arbitrary parameter corresponding to each power constraint P ∈ (0, ∞), under the choice

y = (1 − x²)(1 + σαx) / ( σx (1 + σβx) ).

Then we can show that B(z) satisfies the sufficient condition in Lemma 1 under the power constraint

P = (1/2π) ∫_{−π}^{π} |B(e^{iθ})|² S_Z(e^{iθ}) dθ = (1/2π) ∫_{−π}^{π} y² / |1 − σx e^{iθ}|² dθ = y² / (1 − x²).

The rest of the proof is the actual implementation of this idea; we skip the details.

Although the variational formulation of the feedback capacity (Theorem 1), along with the sufficient condition for the optimal solution (Lemma 1), leads to the simple closed-form expression for the ARMA(1) feedback capacity (Theorem 2), one might still be left with a somewhat uneasy feeling, due mostly to the algebraic and indirect nature of the proof. We now take a more constructive approach and interpret the properties of the optimal feedback filter B*. Consider the following coding scheme. Let V ~ N(0, 1). Over the channel Y_i = X_i + Z_i, i = 1, 2, ..., the transmitter initially sends X_1 = V and subsequently refines the receiver's knowledge by sending

X_n = (σx)^{−(n−1)} (V − V̂_{n−1})        (17)

where x is the unique positive root of (15) and V̂_{n−1} = E(V | Y_1, ..., Y_{n−1}) is the minimum mean-squared error estimate of V given the channel output up to time n−1. We will show that

lim inf_n (1/n) I(V; V̂_n) ≥ (1/2) log(1/x²)

while lim sup_n (1/n) Σ_{i=1}^n X_i² ≤ P, which proves that the proposed coding scheme achieves the feedback capacity. Define, for n ≥ 2,

Y'_n = d_n V + U_n + (−α)^{n−1} ( αU_0 − βZ_0 )

where

d_n = [ (1 + σβx) / (1 + σαx) ] ( 1 − (−σαx)^n ) (σx)^{−(n−1)}.

Then one can show that Y'_n can be represented as a linear combination of Y_1, ..., Y_n, and hence that

E(V − V̂_n)² ≤ E( V − ( Σ_{k=2}^n d_k Y'_k ) / ( Σ_{k=2}^n d_k² ) )².
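The key algebraic identity in the proof sketch can be verified numerically: with x the positive root of the quartic in Theorem 2 and y as chosen above, the product |B(e^{iθ})|² S_Z(e^{iθ}) collapses to y²/|1 − σx e^{iθ}|², whose average over the circle is y²/(1 − x²) = P. A sketch of this check (our own, with illustrative parameter values):

```python
import numpy as np

P, alpha, beta = 1.0, 0.2, 0.6
sigma = 1.0 if beta >= alpha else -1.0

# x: unique root in (0,1) of P x^2 (1 + s*b*x)^2 = (1 - x^2)(1 + s*a*x)^2 (Theorem 2)
f = lambda x: P*x**2*(1 + sigma*beta*x)**2 - (1 - x**2)*(1 + sigma*alpha*x)**2
lo, hi = 0.0, 1.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if f(mid) < 0 else (lo, mid)
x = 0.5 * (lo + hi)
y = (1 - x**2) * (1 + sigma*alpha*x) / (sigma * x * (1 + sigma*beta*x))

theta = np.linspace(-np.pi, np.pi, 4096, endpoint=False)
z = np.exp(1j * theta)
S_Z = np.abs(1 + alpha*z)**2 / np.abs(1 + beta*z)**2        # ARMA(1) spectrum
B = (1 + beta*z) / (1 + alpha*z) * y * z / (1 - sigma*x*z)  # feedback filter from the sketch
power = (np.abs(B)**2 * S_Z).mean()                         # (1/2pi) int |B|^2 S_Z dtheta
# power should equal y^2/(1 - x^2), which equals P by the choice of x and y.
```

The ARMA factors (1 + βz)/(1 + αz) in the filter exactly cancel the noise shaping, which is why the power integral reduces to the elementary one above.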
Furthermore, we can check that

lim sup_n x^{−2n} E( V − ( Σ_{k=2}^n d_k Y'_k ) / ( Σ_{k=2}^n d_k² ) )² < ∞,

whence

lim inf_n −(1/n) log E(V − V̂_n)² ≥ log(1/x²),

or equivalently,

lim inf_n (1/n) I(V; V̂_n) ≥ (1/2) log(1/x²).

On the other hand, for n ≥ 2,

E X_n² = x^{−2(n−1)} E(V − V̂_{n−1})² ≤ x^{−2(n−1)} E( V − ( Σ_{k=2}^{n−1} d_k Y'_k ) / ( Σ_{k=2}^{n−1} d_k² ) )²,

which converges to

lim_n x^{−2(n−1)} / Σ_{k=2}^{n−1} d_k² = (1 + σαx)² (x^{−2} − 1) / (1 + σβx)² = P.

Hence we have shown that the simple linear coding scheme (17) achieves the ARMA(1) feedback capacity. The coding scheme described above uses minimum mean-squared error decoding of the message V, or equivalently, joint typicality decoding of the Gaussian random codeword V, based on the general asymptotic equipartition property of Gaussian processes shown by Cover and Pombra [2, Theorem 5]. Instead of the Gaussian codebook V, the transmitter can initially send a real number θ chosen from an equally spaced signal constellation Θ, say Θ = {−1, −1 + δ, ..., 1 − δ, 1}, δ = 2/(2^{nR} − 1), and subsequently correct the receiver's estimation error by sending θ − θ̂_n (up to appropriate scaling as before) at time n, where θ̂_n is the minimum-variance unbiased linear estimate of θ given Y^n. Now we can verify that the optimal maximum-likelihood decoding is equivalent to finding the θ ∈ Θ that is closest to θ̂_n, which results in the error probability

P_e^{(n)} ≤ erfc( c_0 x^{−n} / 2^{nR} )

where erfc(x) = (2/√π) ∫_x^∞ e^{−t²} dt is the complementary error function and c_0 is a constant independent of n. This proves that the Schalkwijk–Kailath–Butman coding scheme achieves C_FB = −log x with doubly exponentially decaying error probability.

4 Concluding remarks

Although it still falls short of what we can call a closed-form solution in general, our variational characterization of the Gaussian feedback capacity gives an exact analytic answer for a certain class of channels, as demonstrated by the example of the first-order ARMA Gaussian channel. Our development can be further extended in two directions.
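Because every variable in the scheme (17) is a linear combination of the independent Gaussians (V, Z_0, U_1, U_2, ...), the MMSE estimate V̂_n and the error E(V − V̂_n)² can be computed exactly by covariance algebra, with no Monte Carlo. The sketch below (our own check, AR(1) case α = 0 with illustrative values β = 0.5, P = 1, and x taken from the Theorem 2 quartic) exhibits the two claimed limits: the error decays by a factor approaching x² per step, so the per-symbol rate approaches −log x, and the transmit power E X_n² approaches P:

```python
import numpy as np

beta, P = 0.5, 1.0
sigma = 1.0 if beta >= 0 else -1.0

# x: root in (0,1) of P x^2 (1 + |beta| x)^2 = 1 - x^2 (Theorem 2 quartic with alpha = 0)
f = lambda t: P*t**2*(1 + abs(beta)*t)**2 - (1 - t**2)
lo, hi = 0.0, 1.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if f(mid) < 0 else (lo, mid)
x = 0.5 * (lo + hi)

N = 15
# Basis of independent Gaussians: (V, Z_0, U_1, ..., U_N); Var(Z_0) = 1/(1 - beta^2).
d = np.array([1.0, 1.0/(1.0 - beta**2)] + [1.0]*N)   # basis variances
dim = N + 2
def cov(a, b): return float(np.sum(a * d * b))       # covariance of two linear combinations

Z = np.zeros((N + 1, dim)); Z[0, 1] = 1.0            # Z_0
for i in range(1, N + 1):                            # AR(1): Z_i = -beta Z_{i-1} + U_i
    Z[i] = -beta * Z[i - 1]; Z[i, 1 + i] += 1.0
V = np.zeros(dim); V[0] = 1.0

Ys, mse, power = [], [], []
Vhat = np.zeros(dim)                                 # \hat V_0 = 0
for n in range(1, N + 1):
    X = V.copy() if n == 1 else (sigma * x)**(-(n - 1)) * (V - Vhat)  # scheme (17)
    power.append(cov(X, X))
    Ys.append(X + Z[n])                              # Y_n = X_n + Z_n
    Ymat = np.array(Ys)
    K = Ymat * d @ Ymat.T                            # Cov(Y^n)
    c = Ymat * d @ V                                 # Cov(Y^n, V)
    Vhat = np.linalg.solve(K, c) @ Ymat              # MMSE estimate E(V | Y^n)
    mse.append(cov(V - Vhat, V - Vhat))

ratio = mse[-1] / mse[-2]          # approaches x^2
rate = -np.log(mse[-1]) / (2 * N)  # approaches -log(x) = C_FB
```

The early transmissions overshoot the power budget slightly (an edge effect), but both the error-decay ratio and the per-symbol power settle quickly to their limits.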
First, one can investigate properties of the optimal solution (S*_V, B*). Without much surprise, one can show that feedback increases the capacity if and only if the noise spectrum is not white.
Furthermore, it can be shown that taking S*_V ≡ 0 does not incur any loss in maximizing the output entropy, resulting in a simpler maximin characterization of the feedback capacity:

C_FB = sup_{{b_k}} inf_{{a_k}} (1/2) log( (1/2π) ∫_{−π}^{π} |1 − Σ_{k=1}^∞ a_k e^{ikθ}|² |1 + Σ_{k=1}^∞ b_k e^{ikθ}|² S_Z(e^{iθ}) dθ )

where the supremum is taken over all {b_k} satisfying (1/2π) ∫_{−π}^{π} |Σ_{k=1}^∞ b_k e^{ikθ}|² S_Z(e^{iθ}) dθ ≤ P. Secondly, one can focus on the finite-order ARMA noise spectrum and show that the k-dimensional generalization of the Schalkwijk–Kailath–Butman coding scheme is optimal for the ARMA spectrum of order k. This confirms many conjectures based on numerical evidence, including the recent study by Yang, Kavčić, and Tatikonda [9]. These results will be reported separately in [12].

References

[1] C. E. Shannon, "Communication in the presence of noise," Proc. IRE, vol. 37, pp. 10-21, 1949.
[2] T. M. Cover and S. Pombra, "Gaussian feedback capacity," IEEE Trans. Info. Theory, vol. 35, pp. 37-43, January 1989.
[3] S. A. Butman, "A general formulation of linear feedback communication systems with solutions," IEEE Trans. Info. Theory, vol. 15, May 1969.
[4] S. A. Butman, "Linear feedback rate bounds for regressive channels," IEEE Trans. Info. Theory, vol. 22, May 1976.
[5] J. C. Tiernan and J. P. M. Schalkwijk, "An upper bound to the capacity of the band-limited Gaussian autoregressive channel with noiseless feedback," IEEE Trans. Info. Theory, vol. 20, May 1974.
[6] J. C. Tiernan, "Analysis of the optimum linear system for the autoregressive forward channel with noiseless feedback," IEEE Trans. Info. Theory, vol. 22, May 1976.
[7] J. Wolfowitz, "Signalling over a Gaussian channel with feedback and autoregressive noise," J. Appl. Probab., vol. 12, 1975.
[8] Y.-H. Kim, "Feedback capacity of the first-order moving average Gaussian channel," submitted to IEEE Trans. Info. Theory, November 2004.
[9] S. Yang, A. Kavčić, and S. Tatikonda, "Linear Gaussian channels: feedback capacity under power constraints," in Proc. IEEE Int. Symp. Info. Theory, p. 72, 2004.
[10] J. P. M. Schalkwijk and T. Kailath, "A coding scheme for additive noise channels with feedback I: No bandwidth constraint," IEEE Trans. Info. Theory, vol. 12, April 1966.
[11] J. P. M. Schalkwijk, "A coding scheme for additive noise channels with feedback II: Band-limited signals," IEEE Trans. Info. Theory, vol. 12, April 1966.
[12] Y.-H. Kim, "Feedback capacity of stationary Gaussian channels," in preparation.
More informationEE5139R: Problem Set 7 Assigned: 30/09/15, Due: 07/10/15
EE5139R: Problem Set 7 Assigned: 30/09/15, Due: 07/10/15 1. Cascade of Binary Symmetric Channels The conditional probability distribution py x for each of the BSCs may be expressed by the transition probability
More informationWorst-Case Bounds for Gaussian Process Models
Worst-Case Bounds for Gaussian Process Models Sham M. Kakade University of Pennsylvania Matthias W. Seeger UC Berkeley Abstract Dean P. Foster University of Pennsylvania We present a competitive analysis
More informationLECTURE 10. Last time: Lecture outline
LECTURE 10 Joint AEP Coding Theorem Last time: Error Exponents Lecture outline Strong Coding Theorem Reading: Gallager, Chapter 5. Review Joint AEP A ( ɛ n) (X) A ( ɛ n) (Y ) vs. A ( ɛ n) (X, Y ) 2 nh(x)
More informationLecture 8: Channel and source-channel coding theorems; BEC & linear codes. 1 Intuitive justification for upper bound on channel capacity
5-859: Information Theory and Applications in TCS CMU: Spring 23 Lecture 8: Channel and source-channel coding theorems; BEC & linear codes February 7, 23 Lecturer: Venkatesan Guruswami Scribe: Dan Stahlke
More informationOn the Capacity Region of the Gaussian Z-channel
On the Capacity Region of the Gaussian Z-channel Nan Liu Sennur Ulukus Department of Electrical and Computer Engineering University of Maryland, College Park, MD 74 nkancy@eng.umd.edu ulukus@eng.umd.edu
More informationFeedback Stabilization over a First Order Moving Average Gaussian Noise Channel
Feedback Stabiliation over a First Order Moving Average Gaussian Noise Richard H. Middleton Alejandro J. Rojas James S. Freudenberg Julio H. Braslavsky Abstract Recent developments in information theory
More information(Classical) Information Theory III: Noisy channel coding
(Classical) Information Theory III: Noisy channel coding Sibasish Ghosh The Institute of Mathematical Sciences CIT Campus, Taramani, Chennai 600 113, India. p. 1 Abstract What is the best possible way
More informationTowards control over fading channels
Towards control over fading channels Paolo Minero, Massimo Franceschetti Advanced Network Science University of California San Diego, CA, USA mail: {minero,massimo}@ucsd.edu Invited Paper) Subhrakanti
More informationDegrees of Freedom Region of the Gaussian MIMO Broadcast Channel with Common and Private Messages
Degrees of Freedom Region of the Gaussian MIMO Broadcast hannel with ommon and Private Messages Ersen Ekrem Sennur Ulukus Department of Electrical and omputer Engineering University of Maryland, ollege
More informationParametric Signal Modeling and Linear Prediction Theory 1. Discrete-time Stochastic Processes (cont d)
Parametric Signal Modeling and Linear Prediction Theory 1. Discrete-time Stochastic Processes (cont d) Electrical & Computer Engineering North Carolina State University Acknowledgment: ECE792-41 slides
More informationX 1 : X Table 1: Y = X X 2
ECE 534: Elements of Information Theory, Fall 200 Homework 3 Solutions (ALL DUE to Kenneth S. Palacio Baus) December, 200. Problem 5.20. Multiple access (a) Find the capacity region for the multiple-access
More informationWE study the capacity of peak-power limited, single-antenna,
1158 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 56, NO. 3, MARCH 2010 Gaussian Fading Is the Worst Fading Tobias Koch, Member, IEEE, and Amos Lapidoth, Fellow, IEEE Abstract The capacity of peak-power
More informationWideband Fading Channel Capacity with Training and Partial Feedback
Wideband Fading Channel Capacity with Training and Partial Feedback Manish Agarwal, Michael L. Honig ECE Department, Northwestern University 145 Sheridan Road, Evanston, IL 6008 USA {m-agarwal,mh}@northwestern.edu
More informationRefinement of the outer bound of capacity region in Gaussian multiple access channel with feedback
International Symposium on Information Theory and its Applications, ISITA2004 Parma, Italy, October 0 3, 2004 Refinement of the outer bound of capacity region in Gaussian multiple access channel with feedback
More informationUniversity of Oxford. Statistical Methods Autocorrelation. Identification and Estimation
University of Oxford Statistical Methods Autocorrelation Identification and Estimation Dr. Órlaith Burke Michaelmas Term, 2011 Department of Statistics, 1 South Parks Road, Oxford OX1 3TG Contents 1 Model
More informationThis research was partially supported by the Faculty Research and Development Fund of the University of North Carolina at Wilmington
LARGE SCALE GEOMETRIC PROGRAMMING: AN APPLICATION IN CODING THEORY Yaw O. Chang and John K. Karlof Mathematical Sciences Department The University of North Carolina at Wilmington This research was partially
More informationLECTURE 13. Last time: Lecture outline
LECTURE 13 Last time: Strong coding theorem Revisiting channel and codes Bound on probability of error Error exponent Lecture outline Fano s Lemma revisited Fano s inequality for codewords Converse to
More informationOn the Sufficiency of Power Control for a Class of Channels with Feedback
On the Sufficiency of Power Control for a Class of Channels with Feedback Desmond S. Lun and Muriel Médard Laboratory for Information and Decision Systems Massachusetts Institute of Technology Cambridge,
More informationEE/Stat 376B Handout #5 Network Information Theory October, 14, Homework Set #2 Solutions
EE/Stat 376B Handout #5 Network Information Theory October, 14, 014 1. Problem.4 parts (b) and (c). Homework Set # Solutions (b) Consider h(x + Y ) h(x + Y Y ) = h(x Y ) = h(x). (c) Let ay = Y 1 + Y, where
More informationOptimal Distributed Detection Strategies for Wireless Sensor Networks
Optimal Distributed Detection Strategies for Wireless Sensor Networks Ke Liu and Akbar M. Sayeed University of Wisconsin-Madison kliu@cae.wisc.edu, akbar@engr.wisc.edu Abstract We study optimal distributed
More informationInformation Theory. Lecture 10. Network Information Theory (CT15); a focus on channel capacity results
Information Theory Lecture 10 Network Information Theory (CT15); a focus on channel capacity results The (two-user) multiple access channel (15.3) The (two-user) broadcast channel (15.6) The relay channel
More informationVariable-Rate Universal Slepian-Wolf Coding with Feedback
Variable-Rate Universal Slepian-Wolf Coding with Feedback Shriram Sarvotham, Dror Baron, and Richard G. Baraniuk Dept. of Electrical and Computer Engineering Rice University, Houston, TX 77005 Abstract
More informationLecture 22: Final Review
Lecture 22: Final Review Nuts and bolts Fundamental questions and limits Tools Practical algorithms Future topics Dr Yao Xie, ECE587, Information Theory, Duke University Basics Dr Yao Xie, ECE587, Information
More informationSGN Advanced Signal Processing: Lecture 8 Parameter estimation for AR and MA models. Model order selection
SG 21006 Advanced Signal Processing: Lecture 8 Parameter estimation for AR and MA models. Model order selection Ioan Tabus Department of Signal Processing Tampere University of Technology Finland 1 / 28
More informationProbability and Statistics for Final Year Engineering Students
Probability and Statistics for Final Year Engineering Students By Yoni Nazarathy, Last Updated: May 24, 2011. Lecture 6p: Spectral Density, Passing Random Processes through LTI Systems, Filtering Terms
More informationOn the Shamai-Laroia Approximation for the Information Rate of the ISI Channel
On the Shamai-Laroia Approximation for the Information Rate of the ISI Channel Yair Carmon and Shlomo Shamai (Shitz) Department of Electrical Engineering, Technion - Israel Institute of Technology 2014
More informationA new converse in rate-distortion theory
A new converse in rate-distortion theory Victoria Kostina, Sergio Verdú Dept. of Electrical Engineering, Princeton University, NJ, 08544, USA Abstract This paper shows new finite-blocklength converse bounds
More informationSimplified Composite Coding for Index Coding
Simplified Composite Coding for Index Coding Yucheng Liu, Parastoo Sadeghi Research School of Engineering Australian National University {yucheng.liu, parastoo.sadeghi}@anu.edu.au Fatemeh Arbabjolfaei,
More informationMultiuser Capacity in Block Fading Channel
Multiuser Capacity in Block Fading Channel April 2003 1 Introduction and Model We use a block-fading model, with coherence interval T where M independent users simultaneously transmit to a single receiver
More informationESTIMATION THEORY. Chapter Estimation of Random Variables
Chapter ESTIMATION THEORY. Estimation of Random Variables Suppose X,Y,Y 2,...,Y n are random variables defined on the same probability space (Ω, S,P). We consider Y,...,Y n to be the observed random variables
More information( 1 k "information" I(X;Y) given by Y about X)
SUMMARY OF SHANNON DISTORTION-RATE THEORY Consider a stationary source X with f (x) as its th-order pdf. Recall the following OPTA function definitions: δ(,r) = least dist'n of -dim'l fixed-rate VQ's w.
More informationMidterm Exam Information Theory Fall Midterm Exam. Time: 09:10 12:10 11/23, 2016
Midterm Exam Time: 09:10 12:10 11/23, 2016 Name: Student ID: Policy: (Read before You Start to Work) The exam is closed book. However, you are allowed to bring TWO A4-size cheat sheet (single-sheet, two-sided).
More informationInterference Channel aided by an Infrastructure Relay
Interference Channel aided by an Infrastructure Relay Onur Sahin, Osvaldo Simeone, and Elza Erkip *Department of Electrical and Computer Engineering, Polytechnic Institute of New York University, Department
More informationOn Gaussian Interference Channels with Mixed Gaussian and Discrete Inputs
On Gaussian nterference Channels with Mixed Gaussian and Discrete nputs Alex Dytso Natasha Devroye and Daniela Tuninetti University of llinois at Chicago Chicago L 60607 USA Email: odytso devroye danielat
More informationconditional cdf, conditional pdf, total probability theorem?
6 Multiple Random Variables 6.0 INTRODUCTION scalar vs. random variable cdf, pdf transformation of a random variable conditional cdf, conditional pdf, total probability theorem expectation of a random
More informationEstimation of parametric functions in Downton s bivariate exponential distribution
Estimation of parametric functions in Downton s bivariate exponential distribution George Iliopoulos Department of Mathematics University of the Aegean 83200 Karlovasi, Samos, Greece e-mail: geh@aegean.gr
More informationDispersion of the Gilbert-Elliott Channel
Dispersion of the Gilbert-Elliott Channel Yury Polyanskiy Email: ypolyans@princeton.edu H. Vincent Poor Email: poor@princeton.edu Sergio Verdú Email: verdu@princeton.edu Abstract Channel dispersion plays
More informationNational University of Singapore Department of Electrical & Computer Engineering. Examination for
National University of Singapore Department of Electrical & Computer Engineering Examination for EE5139R Information Theory for Communication Systems (Semester I, 2014/15) November/December 2014 Time Allowed:
More informationUpper Bounds on the Capacity of Binary Intermittent Communication
Upper Bounds on the Capacity of Binary Intermittent Communication Mostafa Khoshnevisan and J. Nicholas Laneman Department of Electrical Engineering University of Notre Dame Notre Dame, Indiana 46556 Email:{mhoshne,
More informationThe Capacity Region for Multi-source Multi-sink Network Coding
The Capacity Region for Multi-source Multi-sink Network Coding Xijin Yan Dept. of Electrical Eng. - Systems University of Southern California Los Angeles, CA, U.S.A. xyan@usc.edu Raymond W. Yeung Dept.
More informationChapter 3 - Temporal processes
STK4150 - Intro 1 Chapter 3 - Temporal processes Odd Kolbjørnsen and Geir Storvik January 23 2017 STK4150 - Intro 2 Temporal processes Data collected over time Past, present, future, change Temporal aspect
More informationPerformance-based Security for Encoding of Information Signals. FA ( ) Paul Cuff (Princeton University)
Performance-based Security for Encoding of Information Signals FA9550-15-1-0180 (2015-2018) Paul Cuff (Princeton University) Contributors Two students finished PhD Tiance Wang (Goldman Sachs) Eva Song
More information19. Channel coding: energy-per-bit, continuous-time channels
9. Channel coding: energy-per-bit, continuous-time channels 9. Energy per bit Consider the additive Gaussian noise channel: Y i = X i + Z i, Z i N ( 0, ). (9.) In the last lecture, we analyzed the maximum
More informationA Tight Upper Bound on the Second-Order Coding Rate of Parallel Gaussian Channels with Feedback
A Tight Upper Bound on the Second-Order Coding Rate of Parallel Gaussian Channels with Feedback Vincent Y. F. Tan (NUS) Joint work with Silas L. Fong (Toronto) 2017 Information Theory Workshop, Kaohsiung,
More informationThe Poisson Channel with Side Information
The Poisson Channel with Side Information Shraga Bross School of Enginerring Bar-Ilan University, Israel brosss@macs.biu.ac.il Amos Lapidoth Ligong Wang Signal and Information Processing Laboratory ETH
More informationHigh-dimensional graphical model selection: Practical and information-theoretic limits
1 High-dimensional graphical model selection: Practical and information-theoretic limits Martin Wainwright Departments of Statistics, and EECS UC Berkeley, California, USA Based on joint work with: John
More informationDiscrete Mathematics and Probability Theory Spring 2016 Rao and Walrand Note 26. Estimation: Regression and Least Squares
CS 70 Discrete Mathematics and Probability Theory Spring 2016 Rao and Walrand Note 26 Estimation: Regression and Least Squares This note explains how to use observations to estimate unobserved random variables.
More informationChapter 9: Forecasting
Chapter 9: Forecasting One of the critical goals of time series analysis is to forecast (predict) the values of the time series at times in the future. When forecasting, we ideally should evaluate the
More informationLecture 4 Channel Coding
Capacity and the Weak Converse Lecture 4 Coding I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw October 15, 2014 1 / 16 I-Hsiang Wang NIT Lecture 4 Capacity
More informationCovariance function estimation in Gaussian process regression
Covariance function estimation in Gaussian process regression François Bachoc Department of Statistics and Operations Research, University of Vienna WU Research Seminar - May 2015 François Bachoc Gaussian
More informationSection 27. The Central Limit Theorem. Po-Ning Chen, Professor. Institute of Communications Engineering. National Chiao Tung University
Section 27 The Central Limit Theorem Po-Ning Chen, Professor Institute of Communications Engineering National Chiao Tung University Hsin Chu, Taiwan 3000, R.O.C. Identically distributed summands 27- Central
More informationAppendix B Information theory from first principles
Appendix B Information theory from first principles This appendix discusses the information theory behind the capacity expressions used in the book. Section 8.3.4 is the only part of the book that supposes
More informationCapacity Bounds for Diamond Networks
Technische Universität München Capacity Bounds for Diamond Networks Gerhard Kramer (TUM) joint work with Shirin Saeedi Bidokhti (TUM & Stanford) DIMACS Workshop on Network Coding Rutgers University, NJ
More informationTesting Algebraic Hypotheses
Testing Algebraic Hypotheses Mathias Drton Department of Statistics University of Chicago 1 / 18 Example: Factor analysis Multivariate normal model based on conditional independence given hidden variable:
More informationStrong Converse Theorems for Classes of Multimessage Multicast Networks: A Rényi Divergence Approach
Strong Converse Theorems for Classes of Multimessage Multicast Networks: A Rényi Divergence Approach Silas Fong (Joint work with Vincent Tan) Department of Electrical & Computer Engineering National University
More informationCan Feedback Increase the Capacity of the Energy Harvesting Channel?
Can Feedback Increase the Capacity of the Energy Harvesting Channel? Dor Shaviv EE Dept., Stanford University shaviv@stanford.edu Ayfer Özgür EE Dept., Stanford University aozgur@stanford.edu Haim Permuter
More informationA time series is called strictly stationary if the joint distribution of every collection (Y t
5 Time series A time series is a set of observations recorded over time. You can think for example at the GDP of a country over the years (or quarters) or the hourly measurements of temperature over a
More information