On Evaluating the Rate-Distortion Function of Sources with Feed-Forward and the Capacity of Channels with Feedback.

Ramji Venkataramanan and S. Sandeep Pradhan
Department of EECS, University of Michigan, Ann Arbor, MI 48105
rvenkata@umich.edu, pradhanv@eecs.umich.edu

THIS PAPER IS ELIGIBLE FOR THE STUDENT PAPER AWARD

Abstract: We study the problem of computing the rate-distortion function for sources with feed-forward and the capacity for channels with feedback. The formulas (involving directed information) for the optimal rate-distortion function with feed-forward and channel capacity with feedback are multi-letter expressions and cannot be computed easily in general. In this work, we derive conditions under which they can be computed for a large class of sources/channels with memory and distortion/cost measures. Illustrative examples are also provided.

I. INTRODUCTION

Feedback is widely used in communication systems to help combat the effect of noisy channels. It is well known that feedback does not increase the capacity of a discrete memoryless channel [1]. However, feedback can increase the capacity of a channel with memory. Recently, directed information has been used to elegantly characterize the capacity of channels with feedback [2], [3], [4]. The source coding counterpart of channel coding with feedback, namely source coding with feed-forward, has recently been studied in [5], [6], [7], [8]. The optimal rate-distortion function with feed-forward was characterized using directed information in [6].

In this work, we study the problem of computing these rate-distortion and capacity expressions. The formulas (involving directed information) for the optimal rate-distortion function with feed-forward [6] and channel capacity with feedback [4] are multi-letter expressions and cannot be computed easily in general. We derive conditions under which they can be computed for a large class of sources (channels) with memory and distortion (cost) measures. We also provide illustrative examples. Throughout, we consider source feed-forward and channel feedback with arbitrary delay.
When the delay goes to $\infty$, we obtain the case of no feed-forward/feedback.

II. SOURCE CODING WITH FEED-FORWARD

A. Problem Formulation

In simple terms, source coding with feed-forward is the source coding problem in which the decoder gets to observe some past source samples to help reconstruct the present sample. (This work was supported by NSF ITR and CAREER (CCF) grants.)

Consider a general discrete source $\mathbf{X}$ with alphabet $\mathcal{X}$ and reconstruction alphabet $\hat{\mathcal{X}}$. The source is characterized by a sequence of distributions denoted by $P_{\mathbf{X}} = \{P_{X^n}\}_{n=1}^{\infty}$. There is an associated sequence of distortion measures $d_n : \mathcal{X}^n \times \hat{\mathcal{X}}^n \to \mathbb{R}^+$. It is assumed that $d_n(x^n, \hat{x}^n)$ is normalized with respect to $n$ and is uniformly bounded in $n$. For example, $d_n(x^n, \hat{x}^n)$ may be the average per-letter distortion, i.e., $\frac{1}{n}\sum_{i=1}^{n} d(x_i, \hat{x}_i)$ for some $d : \mathcal{X} \times \hat{\mathcal{X}} \to \mathbb{R}^+$.

Definition 1: An $(N, 2^{NR})$ source code with delay-$k$ feed-forward of block length $N$ and rate $R$ consists of an encoder mapping $e$ and a sequence of decoder mappings $g_i$, $i = 1, \ldots, N$:
$e : \mathcal{X}^N \to \{1, \ldots, 2^{NR}\}$,
$g_i : \{1, \ldots, 2^{NR}\} \times \mathcal{X}^{i-k} \to \hat{\mathcal{X}}$, $i = 1, \ldots, N$.
The encoder maps each $N$-length source sequence to an index in $\{1, \ldots, 2^{NR}\}$. The decoder receives the index transmitted by the encoder, and to reconstruct the $i$th sample, it has access to the source samples until time $(i-k)$. We want to minimize $R$ for a given distortion constraint.

Definition 2 (probability of error criterion): $R$ is an $\epsilon$-achievable rate at probability-1 distortion $D$ if for all sufficiently large $N$, there exists an $(N, 2^{NR})$ source codebook such that
$P_{X^N}\left(x^N : d_N(x^N, \hat{x}^N) > D\right) < \epsilon$,
where $\hat{x}^N$ denotes the reconstruction of $x^N$. $R$ is an achievable rate at probability-1 distortion $D$ if it is $\epsilon$-achievable for every $\epsilon > 0$.

We now give a brief summary of the rate-distortion results with feed-forward found in [6]. The rate-distortion function with feed-forward (delay 1) is characterized by directed information, a quantity defined in [2]. The directed information flowing from a random sequence $\hat{X}^N$ to a random sequence $X^N$ is defined as
$I(\hat{X}^N \to X^N) = \sum_{n=1}^{N} I(\hat{X}^n; X_n \mid X^{n-1})$.   (1)
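For short blocks, the sum in (1) can be evaluated directly by enumerating a joint pmf. The sketch below is not from the paper: it uses a hypothetical joint distribution over binary $(X_1, X_2, \hat{X}_1, \hat{X}_2)$ and computes $I(\hat{X}^2 \to X^2) = I(\hat{X}^1; X_1) + I(\hat{X}^2; X_2 \mid X_1)$, along with the ordinary mutual information for comparison (the directed information never exceeds it).

```python
import itertools
import math

# Hypothetical toy joint pmf over (X1, X2, Xh1, Xh2), all binary; the numbers
# below are illustrative only and do not come from the paper.
def p_joint(x1, x2, xh1, xh2):
    px1 = 0.5                                # X1 ~ Bernoulli(1/2)
    px2 = 0.8 if x2 == x1 else 0.2           # first-order Markov source
    ph1 = 0.9 if xh1 == x1 else 0.1          # Xh1: noisy reconstruction of X1
    ph2 = 0.9 if xh2 == x2 else 0.1          # Xh2: noisy reconstruction of X2
    return px1 * px2 * ph1 * ph2

SUPPORT = list(itertools.product([0, 1], repeat=4))
P = {v: p_joint(*v) for v in SUPPORT}

def marg(keep):
    """Marginal pmf over the coordinates in `keep` (0:X1, 1:X2, 2:Xh1, 3:Xh2)."""
    out = {}
    for v, p in P.items():
        key = tuple(v[i] for i in keep)
        out[key] = out.get(key, 0.0) + p
    return out

def cond_mi(a, b, c):
    """I(A; B | C) in bits, by direct enumeration of the joint pmf."""
    pabc, pac, pbc, pc = marg(a + b + c), marg(a + c), marg(b + c), marg(c)
    s = 0.0
    for v, p in P.items():
        ka, kb, kc = (tuple(v[i] for i in idx) for idx in (a, b, c))
        s += p * math.log2(pabc[ka + kb + kc] * pc[kc] / (pac[ka + kc] * pbc[kb + kc]))
    return s

# Directed information with N = 2 and delay 1, per (1):
#   I(Xh^2 -> X^2) = I(Xh^1; X_1) + I(Xh^2; X_2 | X_1)
di = cond_mi((2,), (0,), ()) + cond_mi((2, 3), (1,), (0,))
mi = cond_mi((2, 3), (0, 1), ())   # ordinary mutual information I(Xh^2; X^2)
print(di, mi)
```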
When the feed-forward delay is $k$, the rate-distortion function is characterized by the $k$-delay version of the directed

information:
$I_k(\hat{X}^N \to X^N) = \sum_{n=1}^{N} I(\hat{X}^{n+k-1}; X_n \mid X^{n-1})$.   (2)

When we do not make any assumption on the nature of the joint process $\{X_n, \hat{X}_n\}$, we need to use the information spectrum version of (2). In particular, we will need the quantity
$I_k(\hat{\mathbf{X}} \to \mathbf{X}) \triangleq \limsup\text{ in prob. } \frac{1}{n}\log \frac{P_{X^n,\hat{X}^n}(X^n,\hat{X}^n)}{P^k_{\hat{X}^n \| X^n}(\hat{X}^n \| X^n)\, P_{X^n}(X^n)}$,   (3)
where $P^k_{\hat{X}^n \| X^n} = \prod_{i=1}^{n} P_{\hat{X}_i \mid X^{i-k}, \hat{X}^{i-1}}$. It should be noted that (2) and (3) are the same when the joint process $\{X_n, \hat{X}_n\}$ is stationary and ergodic.

Theorem 1 ([6]): For an arbitrary source $\mathbf{X}$ characterized by a distribution $P_{\mathbf{X}}$, the rate-distortion function with feed-forward, the infimum of all achievable rates at probability-1 distortion $D$, is given by
$R_{ff}(D) = \inf_{P_{\hat{\mathbf{X}}|\mathbf{X}} :\, \rho(P_{\hat{\mathbf{X}}|\mathbf{X}}) \le D} I_k(\hat{\mathbf{X}} \to \mathbf{X})$,   (4)
where
$\rho(P_{\hat{\mathbf{X}}|\mathbf{X}}) \triangleq \limsup\text{ in prob. } d_n(X^n, \hat{X}^n) = \inf\{h : \lim_{n\to\infty} P_{X^n,\hat{X}^n}((x^n,\hat{x}^n) : d_n(x^n,\hat{x}^n) > h) = 0\}$.   (5)

B. Evaluating the Rate-Distortion Function with Feed-forward

The rate-distortion formula in Theorem 1 is an optimization of a multi-letter expression:
$I_k(\hat{\mathbf{X}} \to \mathbf{X}) \triangleq \limsup\text{ in prob. } \frac{1}{n}\log \frac{P_{X^n,\hat{X}^n}}{P^k_{\hat{X}^n \| X^n}\, P_{X^n}}$.
This is an optimization over an infinite-dimensional space of conditional distributions $P_{\hat{\mathbf{X}}|\mathbf{X}}$. Since this is a potentially difficult optimization, we turn the problem on its head and pose the following question: Given a source $\mathbf{X}$ with distribution $P_{\mathbf{X}}$ and a conditional distribution $P_{\hat{\mathbf{X}}|\mathbf{X}}$, for what sequence of distortion measures does $P_{\hat{\mathbf{X}}|\mathbf{X}}$ achieve the infimum in the rate-distortion formula?

A similar approach is used in [9] (Problems 2 and 3, p. 47) to find optimizing distributions for discrete memoryless channels and sources without feedback/feed-forward. It is also used in [10] to study the optimality of transmitting uncoded source data over channels and in [11] to study the duality between source and channel coding. Given a source $\mathbf{X}$, suppose we have a hunch about the structure of the optimal conditional distribution. The following theorem (proof omitted) provides the distortion measures for which our hunch is correct.

Footnote 1: The lim sup in probability of a random sequence $\{A_n\}$ is defined as the smallest number $\alpha$ such that $\lim_{n\to\infty} P(A_n \ge \alpha + \epsilon) = 0$ for all $\epsilon > 0$, and is denoted $\overline{A}$.
Theorem 2: Suppose we are given a stationary, ergodic source $\mathbf{X}$ characterized by $P_{\mathbf{X}} = \{P_{X^n}\}_{n=1}^{\infty}$, with feed-forward delay $k$. Let $P_{\hat{\mathbf{X}}|\mathbf{X}} = \{P_{\hat{X}^n|X^n}\}_{n=1}^{\infty}$ be a conditional distribution such that the joint distribution is stationary and ergodic. Then $P_{\hat{\mathbf{X}}|\mathbf{X}}$ achieves the rate-distortion function if for all sufficiently large $n$, the distortion measure satisfies
$d_n(x^n, \hat{x}^n) = -\frac{c}{n}\log \frac{P_{X^n,\hat{X}^n}(x^n,\hat{x}^n)}{P^k_{\hat{X}^n\|X^n}(\hat{x}^n \| x^n)} + d_0(x^n)$,   (6)
where
$P^k_{\hat{X}^n\|X^n}(\hat{x}^n \| x^n) = \prod_{i=1}^{n} P_{\hat{X}_i \mid X^{i-k}, \hat{X}^{i-1}}(\hat{x}_i \mid x^{i-k}, \hat{x}^{i-1})$,
$c$ is any positive number and $d_0(\cdot)$ is an arbitrary function.

C. Markov Sources with Feed-forward

A stationary, ergodic, $m$th-order Markov source $\mathbf{X}$ is characterized by a distribution $P_{\mathbf{X}} = \{P_{X^n}\}_{n=1}^{\infty}$ with
$P_{X^n} = \prod_{i=1}^{n} P_{X_i \mid X_{i-m}^{i-1}}$.   (7)
Let the source have feed-forward with delay $k$. We first ask: When is the optimal joint distribution also $m$th-order Markov in the following sense:
$P_{X^n, \hat{X}^n} = \prod_{i=1}^{n} P_{X_i, \hat{X}_i \mid X_{i-m}^{i-1}}$?   (8)
In other words, when does the optimizing conditional distribution have the form
$P_{\hat{X}^n \mid X^n} = \prod_{i=1}^{n} P_{\hat{X}_i \mid X_{i-m}^{i}}$?   (9)
The answer, provided by Theorem 2, is stated below. In the sequel, we drop the subscripts on the probabilities to keep the notation clean.

Corollary 1: For an $m$th-order Markov source (described in (7)) with feed-forward delay $k$, an $m$th-order conditional distribution (described in (9)) achieves the optimum in the rate-distortion function for a sequence of distortion measures $\{d_n\}$ given by
$d_n(x^n, \hat{x}^n) = -\frac{c}{n}\sum_{i=1}^{n} \log \frac{P(x_i, \hat{x}_i \mid x_{i-m}^{i-1})}{P(\hat{x}_i \mid \hat{x}_{i-k+1}^{i-1}, x_{i-k+1-m}^{i-k})} + d_0(x^n)$,   (10)
where $c$ is any positive number and $d_0(\cdot)$ is an arbitrary function.

Proof: The proof involves substituting (7) and (9) in (6) and performing a few manipulations.

In the following section, we provide two examples to illustrate how Theorem 2 can be used to determine the rate-distortion function of sources with feed-forward.
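Before the paper's examples, a small sanity check of (6) (not one of the paper's examples): for an i.i.d. source, a memoryless test channel and $k = 1$, the causal conditioning collapses to $P(\hat{x}_i)$ per letter, and the prescribed distortion reduces to the classical per-letter form $-c \log P(x_i \mid \hat{x}_i) + d_0(x_i)$ of [9]. The sketch below, with hypothetical numbers, shows that for a binary symmetric test channel this is exactly a scaled Hamming distortion:

```python
import math

# Hypothetical i.i.d. binary example: X ~ Bern(1/2), and the test channel makes
# Xh a BSC(eps) copy of X. Then P(x | xh) is also BSC(eps), and the distortion
# that (6) prescribes, d(x, xh) = -c * log P(x | xh) + d0(x), is affine in the
# Hamming distortion. The values eps, c are illustrative, not from the paper.
eps, c = 0.1, 1.0

def p_x_given_xh(x, xh):
    return 1 - eps if x == xh else eps

# choose d0 (constant here) so that d(x, xh) = 0 whenever x == xh
d0 = c * math.log(1 - eps)

def d(x, xh):
    return -c * math.log(p_x_given_xh(x, xh)) + d0

scale = c * math.log((1 - eps) / eps)
for x in (0, 1):
    for xh in (0, 1):
        hamming = 0 if x == xh else 1
        assert abs(d(x, xh) - scale * hamming) < 1e-12
print("d(x, xh) = %.4f * Hamming(x, xh)" % scale)
```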

Fig. 1. Markov chain representing the stock value.

TABLE I
DISTORTION $e(\hat{x}_i, x_{i-1} = j, x_i)$

  $(x_{i-1}, x_i)$:   $(j, j+1)$   $(j, j)$   $(j, j-1)$
  $\hat{x}_i = 1$:        1            1           0
  $\hat{x}_i = 0$:        0            0           1

III. EXAMPLES

A. Stock-market example

Suppose that we wish to observe the behavior of a particular stock in the stock market over an $N$-day period. Assume that the value of the stock can take $k+1$ different values and is modeled as a $(k+1)$-state Markov chain, as shown in Fig. 1. If on a particular day the stock is in state $i$, $0 < i < k$, then on the next day one of the following can happen: the value increases to state $i+1$ with probability $p_i$; the value drops to state $i-1$ with probability $q_i$; the value remains the same with probability $1 - p_i - q_i$. When the stock value is in state 0, the value cannot decrease. Similarly, when in state $k$, the value cannot increase.

Suppose an investor invests in this stock over an $N$-day period and desires to be forewarned whenever the value drops. Assume that there is an insider (with a priori information about the behavior of the stock) who can send information to the investor at a finite rate. The value of the stock is modeled as a Markov source $\mathbf{X} = \{X_n\}$. The decision $\hat{X}_n$ of the investor is binary: $\hat{X}_n = 1$ indicates that the price is going to drop from day $n-1$ to day $n$; $\hat{X}_n = 0$ means otherwise. Before day $n$, the investor knows all the previous values of the stock $X^{n-1}$ and has to make the decision $\hat{X}_n$. Thus feed-forward is automatically built into the problem.

The investor makes an error either when he fails to predict a drop or when he falsely predicts a drop. The distortion is modeled using a Hamming distortion criterion as follows:
$d_n(x^n, \hat{x}^n) = \frac{1}{n}\sum_{i=1}^{n} e(\hat{x}_i, x_{i-1}, x_i)$,   (11)
where $e(\cdot,\cdot,\cdot)$ is the per-letter distortion given in Table I.

The minimum amount of information (in bits/sample) the insider needs to convey to the investor so that he can predict drops in value with distortion $D$ is denoted $R_{ff}(D)$.
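The stationary distribution $[\pi_0, \ldots, \pi_k]$ of the birth-death chain in Fig. 1 satisfies the detailed-balance equations $\pi_i p_i = \pi_{i+1} q_{i+1}$, so it can be computed in closed form. A numerical sketch (the $p_i$, $q_i$ values are hypothetical placeholders; the paper keeps them general):

```python
# Stationary distribution of the (k+1)-state birth-death chain of Fig. 1 via
# detailed balance: pi_{i+1} = pi_i * p_i / q_{i+1}. The p_i, q_i below are
# hypothetical placeholders, not values from the paper.
k = 3
p = [0.3, 0.2, 0.2]        # up-probabilities p_0 .. p_{k-1}
q = [None, 0.1, 0.2, 0.3]  # down-probabilities q_1 .. q_k (state 0 cannot drop)

w = [1.0]
for i in range(k):
    w.append(w[i] * p[i] / q[i + 1])
Z = sum(w)
pi = [v / Z for v in w]
print("stationary distribution:", [round(v, 4) for v in pi])

# sanity check: pi is invariant under one step of the chain
def step(dist):
    out = [0.0] * (k + 1)
    for i, m in enumerate(dist):
        up = p[i] if i < k else 0.0
        dn = q[i] if i > 0 else 0.0
        out[i] += m * (1 - up - dn)
        if i < k:
            out[i + 1] += m * up
        if i > 0:
            out[i - 1] += m * dn
    return out

assert all(abs(a - b) < 1e-12 for a, b in zip(pi, step(pi)))
```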
Proposition 1: For the stock-market problem described above,
$R_{ff}(D) = \sum_{i=1}^{k-1} \pi_i\left(h(p_i, q_i, 1-p_i-q_i) - h(\epsilon, 1-\epsilon)\right) + \pi_k\left(h(q_k, 1-q_k) - h(\epsilon, 1-\epsilon)\right)$,
where $h(\cdot)$ is the entropy function, $[\pi_0, \pi_1, \ldots, \pi_k]$ is the stationary distribution of the Markov chain, and $\epsilon = \frac{D}{1-\pi_0}$.

Proof: We will use Corollary 1 to verify that a first-order Markov conditional distribution of the form
$P_{\hat{X}_n \mid \hat{X}^{n-1}, X^n} = P_{\hat{X}_n \mid X_{n-1}, X_n}$   (12)
achieves the optimum. Due to the structure of the distortion function in Table I, we can guess the structure of $P(x_i \mid \hat{x}_i, x_{i-1})$ as follows. When $X_{i-1} = 0$, the decoder can always declare $\hat{X}_i = 0$: there is no error irrespective of the value of $X_i$. So we assign
$P(\hat{X}_i = 0 \mid x_{i-1} = 0, x_i = 0) = P(\hat{X}_i = 0 \mid x_{i-1} = 0, x_i = 1) = 1$,
which gives $P(X_i = 0 \mid x_{i-1} = 0, \hat{x}_i = 0) = 1 - p_0$. The event $(X_{i-1} = 0, \hat{X}_i = 1)$ has zero probability. Thus we obtain the first two columns of Table II. When $(X_{i-1} = j, \hat{X}_i = 0)$, $0 < j < k$, an error occurs when $X_i = j-1$. This is assigned probability $\epsilon$. The remaining probability $1-\epsilon$ is split between $P(X_i = j \mid x_{i-1}=j, \hat{x}_i=0)$ and $P(X_i = j+1 \mid x_{i-1}=j, \hat{x}_i=0)$ according to their transition probabilities. In a similar fashion, we obtain all the columns of Table II.

We now show that the distortion criterion (11) can be cast in the form
$d_n(x^n, \hat{x}^n) = \frac{1}{n}\sum_{i=1}^{n}\left(-c \log_2 P(x_i \mid \hat{x}_i, x_{i-1}) + d_0(x_{i-1}, x_i)\right)$,   (13)
or equivalently
$e(\hat{x}_i, x_{i-1}, x_i) = -c \log_2 P(x_i \mid \hat{x}_i, x_{i-1}) + d_0(x_{i-1}, x_i)$,   (14)
thereby proving that the distribution in Table II is optimal. This is done by substituting values from Tables I and II into (14) to determine $c$ and $d_0(\cdot,\cdot)$.

Since the process $\{X_n, \hat{X}_n\}$ is jointly stationary and ergodic, the distortion constraint is equivalent to $E[e(\hat{X}_2, X_1, X_2)] \le D$. To calculate the expected distortion
$E[e(\hat{X}_2, X_1, X_2)] = \sum_{x_1, x_2, \hat{x}_2} P(x_1, x_2)\, P(\hat{x}_2 \mid x_1, x_2)\, e(\hat{x}_2, x_1, x_2)$,   (15)
we need the (optimum-achieving) conditional distribution $P(\hat{X}_2 \mid x_1, x_2)$. This is found by substituting the values from Tables I and II in the relation
$P(x_2 \mid x_1, \hat{x}_2) = \frac{P(x_2 \mid x_1)\, P(\hat{x}_2 \mid x_2, x_1)}{\sum_{x_2} P(x_2 \mid x_1)\, P(\hat{x}_2 \mid x_2, x_1)}$.   (16)

TABLE II
THE DISTRIBUTION $P(X_i \mid x_{i-1}, \hat{x}_i)$ (the column $(x_{i-1}, \hat{x}_i) = (0, 1)$ corresponds to a zero-probability event; $0 < j < k$):

- $(x_{i-1}, \hat{x}_i) = (0, 0)$:  $P(x_i = 0) = 1 - p_0$,  $P(x_i = 1) = p_0$.
- $(x_{i-1}, \hat{x}_i) = (j, 0)$:  $P(x_i = j-1) = \epsilon$,  $P(x_i = j) = (1-\epsilon)\frac{1 - p_j - q_j}{1 - q_j}$,  $P(x_i = j+1) = (1-\epsilon)\frac{p_j}{1 - q_j}$.
- $(x_{i-1}, \hat{x}_i) = (j, 1)$:  $P(x_i = j-1) = 1 - \epsilon$,  $P(x_i = j) = \epsilon\frac{1 - p_j - q_j}{1 - q_j}$,  $P(x_i = j+1) = \epsilon\frac{p_j}{1 - q_j}$.
- $(x_{i-1}, \hat{x}_i) = (k, 0)$:  $P(x_i = k-1) = \epsilon$,  $P(x_i = k) = 1 - \epsilon$.
- $(x_{i-1}, \hat{x}_i) = (k, 1)$:  $P(x_i = k-1) = 1 - \epsilon$,  $P(x_i = k) = \epsilon$.

TABLE III
THE CONDITIONAL DISTRIBUTION $P(\hat{X}_i \mid x_{i-1}, x_i)$ (for $0 < j \le k$; the columns for $x_{i-1} = k$ are obtained by setting $j = k$):

- $(x_{i-1}, x_i) = (0, 0)$ or $(0, 1)$:  $P(\hat{x}_i = 1) = 0$,  $P(\hat{x}_i = 0) = 1$.
- $(x_{i-1}, x_i) = (j, j-1)$:  $P(\hat{x}_i = 1) = \frac{(1-\epsilon)(q_j - \epsilon)}{q_j(1 - 2\epsilon)}$,  $P(\hat{x}_i = 0) = \frac{\epsilon(1 - q_j - \epsilon)}{q_j(1 - 2\epsilon)}$.
- $(x_{i-1}, x_i) = (j, j)$ or $(j, j+1)$:  $P(\hat{x}_i = 1) = \frac{\epsilon(q_j - \epsilon)}{(1 - q_j)(1 - 2\epsilon)}$,  $P(\hat{x}_i = 0) = \frac{(1-\epsilon)(1 - q_j - \epsilon)}{(1 - q_j)(1 - 2\epsilon)}$.

Thus we obtain the conditional distribution $P(\hat{X}_2 \mid x_1, x_2)$ shown in Table III. Using this in (15), we get
$E[e(\hat{X}_2, X_1, X_2)] = (1 - \pi_0)\epsilon \le D$.   (17)
We can now calculate the rate-distortion function as
$R_{ff}(D) = \sum_{x_1, x_2, \hat{x}_2} P(x_1, x_2, \hat{x}_2) \log_2 \frac{P(x_2 \mid x_1, \hat{x}_2)}{P(x_2 \mid x_1)} = H(X_2 \mid X_1) - H(X_2 \mid \hat{X}_2, X_1)$   (18)
to obtain the expression in Proposition 1.

B. Gauss-Markov Source

Consider a stationary, ergodic, first-order Gauss-Markov source $\mathbf{X}$ with mean 0, correlation $\rho$ and variance $\sigma^2$:
$X_n = \rho X_{n-1} + N_n$,   (19)
where $\{N_n\}$ are independent, identically distributed Gaussian random variables with mean 0 and variance $(1-\rho^2)\sigma^2$. Assume the source has feed-forward with delay 1 and we want to reconstruct at every time instant the linear combination $aX_n + bX_{n-1}$, for any constants $a, b$. We use the mean-squared error distortion criterion:
$d_n(x^n, \hat{x}^n) = \frac{1}{n}\sum_{i=1}^{n}\left(\hat{x}_i - (a x_i + b x_{i-1})\right)^2$.   (20)
The feed-forward distortion-rate function for this source with average mean-squared error distortion was given in [5]. The feed-forward rate-distortion function can also be obtained using Theorem 2 (proof omitted):
$R_{ff}(D) = \frac{1}{2}\log \frac{\sigma^2(1-\rho^2)}{D/a^2}$.   (21)
We must mention here that the rate-distortion function in the first example cannot be computed using the techniques in [5].

IV. CHANNEL CODING WITH FEEDBACK

In this section, we consider channels with feedback and the problem of evaluating their capacity. A channel is defined as a sequence of probability distributions:
$P^{ch}_{\mathbf{Y}|\mathbf{X}} = \{P_{Y_n \mid X^n, Y^{n-1}}\}_{n=1}^{\infty}$.
(22) In the above, $X_n$ and $Y_n$ are the channel input and output symbols at time $n$, respectively. The channel is assumed to have delay-$k$ feedback ($1 \le k < \infty$). This means that at time instant $n$, the encoder has perfect knowledge of the channel outputs until time $n-k$ to produce the input $X_n$. The input distribution to the channel is denoted by $P^k_{\mathbf{X}|\mathbf{Y}} = \{P_{X_n \mid X^{n-1}, Y^{n-k}}\}_{n=1}^{\infty}$. In the sequel, we will need the following product quantities corresponding to the channel and the input:
$P^{ch}_{Y^n \| X^n} \triangleq \prod_{i=1}^{n} P_{Y_i \mid X^i, Y^{i-1}}, \qquad P^k_{X^n \| Y^n} \triangleq \prod_{i=1}^{n} P_{X_i \mid X^{i-1}, Y^{i-k}}$.   (23)
The joint distribution of the system is given by $P_{\mathbf{X},\mathbf{Y}} = \{P_{X^n, Y^n}\}_{n=1}^{\infty}$, where
$P_{X^n, Y^n} = P^k_{X^n \| Y^n}\, P^{ch}_{Y^n \| X^n}$.   (24)

Definition 3: An $(N, 2^{NR})$ channel code with delay-$k$ feedback of block length $N$ and rate $R$ consists of a sequence of encoder mappings $e_i$, $i = 1, \ldots, N$, and a decoder $g$:
$e_i : \{1, \ldots, 2^{NR}\} \times \mathcal{Y}^{i-k} \to \mathcal{X}$, $i = 1, \ldots, N$,
$g : \mathcal{Y}^N \to \{1, \ldots, 2^{NR}\}$.
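The timing constraint in Definition 3 (the $i$th input may depend only on the message and the outputs up to time $i-k$) can be sketched as a simple loop. Everything below (the stand-in BSC and the toy encoder rule) is hypothetical and only illustrates the interface, not a coding scheme from the paper:

```python
import random

# Structural sketch of Definition 3: with delay-k feedback, the encoder mapping
# e_i sees the message w and only the outputs y^(i-k). The BSC stand-in channel
# and the toy encoder rule are hypothetical placeholders.
random.seed(0)
N, k = 8, 2
R = 0.25
M = 2 ** int(N * R)              # message set {0, ..., M-1}

def channel(x):                  # BSC(0.1) stand-in channel
    return x ^ int(random.random() < 0.1)

def e_i(w, y_past):              # encoder at time i: uses w and y^(i-k) only
    bit = (w >> (len(y_past) % 2)) & 1   # toy rule: alternate message bits
    return bit ^ (y_past[-1] if y_past else 0)

w = 3                            # message to be sent, w in {0, ..., M-1}
x, y = [], []
for i in range(1, N + 1):
    y_past = y[:max(0, i - k)]   # feedback available to e_i at time i
    x.append(e_i(w, y_past))
    y.append(channel(x[-1]))
print("inputs: ", x)
print("outputs:", y)
```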

Thus it is desired to transmit one of $2^{NR}$ messages over the channel in $N$ units of time. There is an associated cost function for using the channel, given by $c_N(X^N, Y^N)$. For example, this could be the average power of the input symbols. If $W$ is the message that was transmitted, then the probability of error is $P_e = \Pr(g(Y^N) \ne W)$.

Definition 4: $R$ is an $(\epsilon, \delta)$-achievable rate at probability-1 cost $C$ if for all sufficiently large $N$, there exists an $(N, 2^{NR})$ channel code such that
$P_e < \epsilon, \qquad \Pr\left(c_N(X^N, Y^N) > C\right) < \delta$.
$R$ is an achievable rate at probability-1 cost $C$ if it is $(\epsilon, \delta)$-achievable for every $\epsilon, \delta > 0$.

Theorem 3 ([6], [4]): For an arbitrary channel $P^{ch}_{\mathbf{Y}|\mathbf{X}}$, the capacity with delay-$k$ feedback, the supremum of all achievable rates at probability-1 cost $C$, is given by
$C_{fb}(C) = \sup_{P^k_{\mathbf{X}|\mathbf{Y}} :\, \rho(P^k_{\mathbf{X}|\mathbf{Y}}) \le C} I(\mathbf{X} \to \mathbf{Y})$,   (25)
where
$I(\mathbf{X} \to \mathbf{Y}) \triangleq \liminf\text{ in prob. } \frac{1}{n}\log \frac{P^{ch}_{Y^n \| X^n}(Y^n \| X^n)}{P_{Y^n}(Y^n)}$
and
$\rho(P^k_{\mathbf{X}|\mathbf{Y}}) \triangleq \limsup\text{ in prob. } c_n(X^n, Y^n) = \inf\{h : \lim_{n\to\infty} P_{X^n,Y^n}((x^n, y^n) : c_n(x^n, y^n) > h) = 0\}$.
In the above, we note that
$P_{Y^n} = \sum_{x^n} P_{X^n, Y^n} = \sum_{x^n} P^k_{X^n \| Y^n}\, P^{ch}_{Y^n \| X^n}$.

A. Evaluating the Channel Capacity with Feedback

The capacity formula in Theorem 3 is a multi-letter expression involving optimizing the function $I(\mathbf{X} \to \mathbf{Y})$ over an infinite-dimensional space of input distributions $P^k_{\mathbf{X}|\mathbf{Y}}$. Just as we did with sources, we can pose the following question: Given a channel $P^{ch}_{\mathbf{Y}|\mathbf{X}}$ and an input distribution $P^k_{\mathbf{X}|\mathbf{Y}}$, for what sequence of cost measures does $P^k_{\mathbf{X}|\mathbf{Y}}$ achieve the supremum in the capacity formula? The following theorem (proof omitted) provides an answer.

Theorem 4: Suppose we are given a channel $P^{ch}_{\mathbf{Y}|\mathbf{X}}$ with delay-$k$ feedback and an input distribution $P^k_{\mathbf{X}|\mathbf{Y}}$ such that the joint process $P_{\mathbf{X},\mathbf{Y}}$ given by (24) is stationary and ergodic. Then the input distribution $P^k_{\mathbf{X}|\mathbf{Y}}$ achieves the delay-$k$ feedback capacity of the channel if for all sufficiently large $n$, the cost measure satisfies
$c_n(x^n, y^n) = \frac{\lambda}{n}\log \frac{P^{ch}_{Y^n \| X^n}(y^n \| x^n)}{P_{Y^n}(y^n)} + d_0$,   (26)
where $\lambda$ is any positive number and $d_0$ is an arbitrary constant.
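In the memoryless, no-feedback special case, (26) reduces to a per-letter cost $c(x, y) = \lambda \log \frac{W(y \mid x)}{P_Y(y)} + d_0$, and the condition becomes the familiar one: under the optimal input, the conditional expectation of the information density is constant in $x$. A sketch for a binary symmetric channel with uniform input (illustrative values, not from the paper):

```python
import math

# Memoryless, no-feedback special case of Theorem 4 (illustrative): per-letter
# cost c(x, y) = lam * log2(W(y|x) / P_Y(y)) + d0. For a BSC(delta) with the
# uniform input, E[c(x, Y) | X = x] is the same for both inputs and equals
# lam * (1 - h(delta)) + d0, i.e. lam * capacity + d0.
delta, lam, d0 = 0.11, 1.0, 0.0

def W(y, x):                        # BSC transition probability
    return 1 - delta if y == x else delta

P_X = {0: 0.5, 1: 0.5}              # capacity-achieving input for the BSC
P_Y = {y: sum(P_X[x] * W(y, x) for x in P_X) for y in (0, 1)}

def cost(x, y):                     # the cost measure of (26), per letter
    return lam * math.log2(W(y, x) / P_Y[y]) + d0

def avg_cost(x):                    # E[c(x, Y) | X = x]
    return sum(W(y, x) * cost(x, y) for y in (0, 1))

hd = -(delta * math.log2(delta) + (1 - delta) * math.log2(1 - delta))
capacity = 1 - hd                   # BSC capacity in bits/use
print(avg_cost(0), avg_cost(1), capacity)
```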
Footnote 2: The lim inf in probability of a random sequence $\{A_n\}$ is defined as the largest number $\alpha$ such that $\lim_{n\to\infty} P(A_n \le \alpha - \epsilon) = 0$ for all $\epsilon > 0$, and is denoted $\underline{A}$.

B. Markov channels with feedback

The problem of evaluating the capacity of finite-state machine channels was studied recently in [12] and [13]. In [12], it was shown that the capacity of such a channel is achieved by a feedback-dependent Markov source, i.e., the optimal input distribution is of the form $\{P_{X_n \mid X_{n-1}, Y^{n-1}}\}$: Markov in $X$, but dependent on all the past $Y$ symbols.

We consider a simple Markov channel with feedback delay 1 and the problem of evaluating its capacity. The channel we study is characterized by
$P^{ch}_{Y_n \mid X^n, Y^{n-1}} = P^{ch}_{Y_n \mid X_n, Y_{n-1}}$.   (27)
Let the channel have feedback with delay 1. We are interested in finding cost measures for which the capacity of the channel in (27) is easily evaluated. We first ask: When is the optimal joint distribution first-order Markov in the following sense:
$P_{X^n, Y^n} = \prod_{i=1}^{n} P_{X_i, Y_i \mid Y_{i-1}}$?   (28)
In other words, when does the optimizing input distribution have the form
$P_{X^n \| Y^n} = \prod_{i=1}^{n} P_{X_i \mid Y_{i-1}}$?   (29)
From Theorem 4, it is seen that this happens when the cost function has the form
$c_n(x^n, y^n) = \frac{\lambda}{n}\sum_{i=1}^{n} \log \frac{P^{ch}_{Y_i \mid X_i, Y_{i-1}}(y_i \mid x_i, y_{i-1})}{P_{Y_i \mid Y_{i-1}}(y_i \mid y_{i-1})} + d_0$.   (30)

REFERENCES

[1] C. E. Shannon, "The zero-error capacity of a noisy channel," IRE Trans. Inf. Theory, vol. IT-2, pp. 8-19, 1956.
[2] J. Massey, "Causality, feedback and directed information," Proc. 1990 Symp. on Inf. Theory and its Applications (ISITA-90), pp. 303-305, 1990.
[3] G. Kramer, Directed Information for Channels with Feedback. PhD thesis, Swiss Federal Institute of Technology, Zurich, 1998.
[4] S. Tatikonda, Control Under Communications Constraints. PhD thesis, Massachusetts Inst. of Technology, Cambridge, MA, September 2000.
[5] T. Weissman and N. Merhav, "On competitive prediction and its relation to rate-distortion theory," IEEE Trans. Inf. Theory, vol. 49, pp. 3185-3194, December 2003.
[6] R. Venkataramanan and S. S. Pradhan, "Directed information for communication problems with side-information and feedback/feed-forward," Proc. 43rd Annual Allerton Conf. (Monticello, IL), 2005.
[7] E. Martinian and G. W.
Wornell, "Source coding with fixed lag side information," Proc. 42nd Annual Allerton Conf. (Monticello, IL), 2004.
[8] S. S. Pradhan, "Source coding with feedforward: Gaussian sources," in Proc. IEEE Int. Symp. on Inf. Theory, p. 212, June 2004.
[9] I. Csiszár and J. Körner, Information Theory: Coding Theorems for Discrete Memoryless Systems. New York: Academic Press, 1981.
[10] M. Gastpar, B. Rimoldi, and M. Vetterli, "To code, or not to code: lossy source-channel communication revisited," IEEE Trans. Inf. Theory, vol. 49, no. 5, pp. 1147-1158, May 2003.
[11] S. S. Pradhan, J. Chou, and K. Ramchandran, "Duality between source coding and channel coding and its extension to the side-information case," IEEE Trans. Inf. Theory, vol. 49, pp. 1181-1203, May 2003.
[12] S. Yang, A. Kavcic, and S. Tatikonda, "Feedback capacity of finite-state machine channels," IEEE Trans. Inf. Theory, vol. 51, no. 3, pp. 799-810, 2005.
[13] J. Chen and T. Berger, "The capacity of finite-state Markov channels with feedback," IEEE Trans. Inf. Theory, vol. 51, no. 3, pp. 780-798, 2005.


Lecture 5: April 17, 2013

Lecture 5: April 17, 2013 TTIC/CMSC 350 Mathematical Toolkit Sprig 203 Madhur Tulsiai Lecture 5: April 7, 203 Scribe: Somaye Hashemifar Cheroff bouds recap We recall the Cheroff/Hoeffdig bouds we derived i the last lecture idepedet

More information

Convergence of random variables. (telegram style notes) P.J.C. Spreij

Convergence of random variables. (telegram style notes) P.J.C. Spreij Covergece of radom variables (telegram style otes).j.c. Spreij this versio: September 6, 2005 Itroductio As we kow, radom variables are by defiitio measurable fuctios o some uderlyig measurable space

More information

Chapter 6 Infinite Series

Chapter 6 Infinite Series Chapter 6 Ifiite Series I the previous chapter we cosidered itegrals which were improper i the sese that the iterval of itegratio was ubouded. I this chapter we are goig to discuss a topic which is somewhat

More information

Distribution of Random Samples & Limit theorems

Distribution of Random Samples & Limit theorems STAT/MATH 395 A - PROBABILITY II UW Witer Quarter 2017 Néhémy Lim Distributio of Radom Samples & Limit theorems 1 Distributio of i.i.d. Samples Motivatig example. Assume that the goal of a study is to

More information

Vector Permutation Code Design Algorithm. Danilo SILVA and Weiler A. FINAMORE

Vector Permutation Code Design Algorithm. Danilo SILVA and Weiler A. FINAMORE Iteratioal Symposium o Iformatio Theory ad its Applicatios, ISITA2004 Parma, Italy, October 10 13, 2004 Vector Permutatio Code Desig Algorithm Dailo SILVA ad Weiler A. FINAMORE Cetro de Estudos em Telecomuicações

More information

Properties and Hypothesis Testing

Properties and Hypothesis Testing Chapter 3 Properties ad Hypothesis Testig 3.1 Types of data The regressio techiques developed i previous chapters ca be applied to three differet kids of data. 1. Cross-sectioal data. 2. Time series data.

More information

Non-Asymptotic Achievable Rates for Gaussian Energy-Harvesting Channels: Best-Effort and Save-and-Transmit

Non-Asymptotic Achievable Rates for Gaussian Energy-Harvesting Channels: Best-Effort and Save-and-Transmit 08 IEEE Iteratioal Symposium o Iformatio Theory ISIT No-Asymptotic Achievable Rates for Gaussia Eergy-Harvestig Chaels: Best-Effort ad Save-ad-Trasmit Silas L Fog Departmet of Electrical ad Computer Egieerig

More information

c. Explain the basic Newsvendor model. Why is it useful for SC models? e. What additional research do you believe will be helpful in this area?

c. Explain the basic Newsvendor model. Why is it useful for SC models? e. What additional research do you believe will be helpful in this area? 1. Research Methodology a. What is meat by the supply chai (SC) coordiatio problem ad does it apply to all types of SC s? Does the Bullwhip effect relate to all types of SC s? Also does it relate to SC

More information

Lecture Chapter 6: Convergence of Random Sequences

Lecture Chapter 6: Convergence of Random Sequences ECE5: Aalysis of Radom Sigals Fall 6 Lecture Chapter 6: Covergece of Radom Sequeces Dr Salim El Rouayheb Scribe: Abhay Ashutosh Doel, Qibo Zhag, Peiwe Tia, Pegzhe Wag, Lu Liu Radom sequece Defiitio A ifiite

More information

62. Power series Definition 16. (Power series) Given a sequence {c n }, the series. c n x n = c 0 + c 1 x + c 2 x 2 + c 3 x 3 +

62. Power series Definition 16. (Power series) Given a sequence {c n }, the series. c n x n = c 0 + c 1 x + c 2 x 2 + c 3 x 3 + 62. Power series Defiitio 16. (Power series) Give a sequece {c }, the series c x = c 0 + c 1 x + c 2 x 2 + c 3 x 3 + is called a power series i the variable x. The umbers c are called the coefficiets of

More information

Finite Block-Length Gains in Distributed Source Coding

Finite Block-Length Gains in Distributed Source Coding Decoder Fiite Block-Legth Gais i Distributed Source Codig Farhad Shirai EECS Departmet Uiversity of Michiga A Arbor,USA Email: fshirai@umichedu S Sadeep Pradha EECS Departmet Uiversity of Michiga A Arbor,USA

More information

(b) What is the probability that a particle reaches the upper boundary n before the lower boundary m?

(b) What is the probability that a particle reaches the upper boundary n before the lower boundary m? MATH 529 The Boudary Problem The drukard s walk (or boudary problem) is oe of the most famous problems i the theory of radom walks. Oe versio of the problem is described as follows: Suppose a particle

More information

ECE 564/645 - Digital Communication Systems (Spring 2014) Final Exam Friday, May 2nd, 8:00-10:00am, Marston 220

ECE 564/645 - Digital Communication Systems (Spring 2014) Final Exam Friday, May 2nd, 8:00-10:00am, Marston 220 ECE 564/645 - Digital Commuicatio Systems (Sprig 014) Fial Exam Friday, May d, 8:00-10:00am, Marsto 0 Overview The exam cosists of four (or five) problems for 100 (or 10) poits. The poits for each part

More information

CEE 522 Autumn Uncertainty Concepts for Geotechnical Engineering

CEE 522 Autumn Uncertainty Concepts for Geotechnical Engineering CEE 5 Autum 005 Ucertaity Cocepts for Geotechical Egieerig Basic Termiology Set A set is a collectio of (mutually exclusive) objects or evets. The sample space is the (collectively exhaustive) collectio

More information

Basics of Probability Theory (for Theory of Computation courses)

Basics of Probability Theory (for Theory of Computation courses) Basics of Probability Theory (for Theory of Computatio courses) Oded Goldreich Departmet of Computer Sciece Weizma Istitute of Sciece Rehovot, Israel. oded.goldreich@weizma.ac.il November 24, 2008 Preface.

More information

Sequences A sequence of numbers is a function whose domain is the positive integers. We can see that the sequence

Sequences A sequence of numbers is a function whose domain is the positive integers. We can see that the sequence Sequeces A sequece of umbers is a fuctio whose domai is the positive itegers. We ca see that the sequece 1, 1, 2, 2, 3, 3,... is a fuctio from the positive itegers whe we write the first sequece elemet

More information

Lecture 8: Convergence of transformations and law of large numbers

Lecture 8: Convergence of transformations and law of large numbers Lecture 8: Covergece of trasformatios ad law of large umbers Trasformatio ad covergece Trasformatio is a importat tool i statistics. If X coverges to X i some sese, we ofte eed to check whether g(x ) coverges

More information

Linear Regression Demystified

Linear Regression Demystified Liear Regressio Demystified Liear regressio is a importat subject i statistics. I elemetary statistics courses, formulae related to liear regressio are ofte stated without derivatio. This ote iteds to

More information

Chapter 3. Strong convergence. 3.1 Definition of almost sure convergence

Chapter 3. Strong convergence. 3.1 Definition of almost sure convergence Chapter 3 Strog covergece As poited out i the Chapter 2, there are multiple ways to defie the otio of covergece of a sequece of radom variables. That chapter defied covergece i probability, covergece i

More information

CS284A: Representations and Algorithms in Molecular Biology

CS284A: Representations and Algorithms in Molecular Biology CS284A: Represetatios ad Algorithms i Molecular Biology Scribe Notes o Lectures 3 & 4: Motif Discovery via Eumeratio & Motif Represetatio Usig Positio Weight Matrix Joshua Gervi Based o presetatios by

More information

18.657: Mathematics of Machine Learning

18.657: Mathematics of Machine Learning 8.657: Mathematics of Machie Learig Lecturer: Philippe Rigollet Lecture 0 Scribe: Ade Forrow Oct. 3, 05 Recall the followig defiitios from last time: Defiitio: A fuctio K : X X R is called a positive symmetric

More information

Increasing timing capacity using packet coloring

Increasing timing capacity using packet coloring 003 Coferece o Iformatio Scieces ad Systems, The Johs Hopkis Uiversity, March 4, 003 Icreasig timig capacity usig packet colorig Xi Liu ad R Srikat[] Coordiated Sciece Laboratory Uiversity of Illiois e-mail:

More information

6.3 Testing Series With Positive Terms

6.3 Testing Series With Positive Terms 6.3. TESTING SERIES WITH POSITIVE TERMS 307 6.3 Testig Series With Positive Terms 6.3. Review of what is kow up to ow I theory, testig a series a i for covergece amouts to fidig the i= sequece of partial

More information

Econ 325/327 Notes on Sample Mean, Sample Proportion, Central Limit Theorem, Chi-square Distribution, Student s t distribution 1.

Econ 325/327 Notes on Sample Mean, Sample Proportion, Central Limit Theorem, Chi-square Distribution, Student s t distribution 1. Eco 325/327 Notes o Sample Mea, Sample Proportio, Cetral Limit Theorem, Chi-square Distributio, Studet s t distributio 1 Sample Mea By Hiro Kasahara We cosider a radom sample from a populatio. Defiitio

More information

Empirical Process Theory and Oracle Inequalities

Empirical Process Theory and Oracle Inequalities Stat 928: Statistical Learig Theory Lecture: 10 Empirical Process Theory ad Oracle Iequalities Istructor: Sham Kakade 1 Risk vs Risk See Lecture 0 for a discussio o termiology. 2 The Uio Boud / Boferoi

More information

On Successive Refinement for the Wyner-Ziv Problem with Partially Cooperating Decoders

On Successive Refinement for the Wyner-Ziv Problem with Partially Cooperating Decoders ISIT 2008, Toroto, Caada, July 6-11, 2008 O Successive Refiemet for the Wyer-Ziv Problem with Partially Cooperatig Decoders Shraga I. Bross ad Tsachy Weissma, School of Egieerig, Bar-Ila Uiversity, Ramat-Ga

More information

The Growth of Functions. Theoretical Supplement

The Growth of Functions. Theoretical Supplement The Growth of Fuctios Theoretical Supplemet The Triagle Iequality The triagle iequality is a algebraic tool that is ofte useful i maipulatig absolute values of fuctios. The triagle iequality says that

More information

Channel coding, linear block codes, Hamming and cyclic codes Lecture - 8

Channel coding, linear block codes, Hamming and cyclic codes Lecture - 8 Digital Commuicatio Chael codig, liear block codes, Hammig ad cyclic codes Lecture - 8 Ir. Muhamad Asial, MSc., PhD Ceter for Iformatio ad Commuicatio Egieerig Research (CICER) Electrical Egieerig Departmet

More information

Random Matrices with Blocks of Intermediate Scale Strongly Correlated Band Matrices

Random Matrices with Blocks of Intermediate Scale Strongly Correlated Band Matrices Radom Matrices with Blocks of Itermediate Scale Strogly Correlated Bad Matrices Jiayi Tog Advisor: Dr. Todd Kemp May 30, 07 Departmet of Mathematics Uiversity of Califoria, Sa Diego Cotets Itroductio Notatio

More information

Definition 4.2. (a) A sequence {x n } in a Banach space X is a basis for X if. unique scalars a n (x) such that x = n. a n (x) x n. (4.

Definition 4.2. (a) A sequence {x n } in a Banach space X is a basis for X if. unique scalars a n (x) such that x = n. a n (x) x n. (4. 4. BASES I BAACH SPACES 39 4. BASES I BAACH SPACES Sice a Baach space X is a vector space, it must possess a Hamel, or vector space, basis, i.e., a subset {x γ } γ Γ whose fiite liear spa is all of X ad

More information

Introduction to Machine Learning DIS10

Introduction to Machine Learning DIS10 CS 189 Fall 017 Itroductio to Machie Learig DIS10 1 Fu with Lagrage Multipliers (a) Miimize the fuctio such that f (x,y) = x + y x + y = 3. Solutio: The Lagragia is: L(x,y,λ) = x + y + λ(x + y 3) Takig

More information

Time-Domain Representations of LTI Systems

Time-Domain Representations of LTI Systems 2.1 Itroductio Objectives: 1. Impulse resposes of LTI systems 2. Liear costat-coefficiets differetial or differece equatios of LTI systems 3. Bloc diagram represetatios of LTI systems 4. State-variable

More information

A Partial Decode-Forward Scheme For A Network with N relays

A Partial Decode-Forward Scheme For A Network with N relays A Partial Decode-Forward Scheme For A etwork with relays Yao Tag ECE Departmet, McGill Uiversity Motreal, QC, Caada Email: yaotag2@mailmcgillca Mai Vu ECE Departmet, Tufts Uiversity Medford, MA, USA Email:

More information

Multiterminal Source Coding with an Entropy-Based Distortion Measure

Multiterminal Source Coding with an Entropy-Based Distortion Measure 20 IEEE Iteratioal Symposium o Iformatio Theory Proceedigs Multitermial Source Codig with a Etropy-Based Distortio Measure Thomas A. Courtade ad Richard D. Wesel Departmet of Electrical Egieerig Uiversity

More information

4. Partial Sums and the Central Limit Theorem

4. Partial Sums and the Central Limit Theorem 1 of 10 7/16/2009 6:05 AM Virtual Laboratories > 6. Radom Samples > 1 2 3 4 5 6 7 4. Partial Sums ad the Cetral Limit Theorem The cetral limit theorem ad the law of large umbers are the two fudametal theorems

More information

Fall 2013 MTH431/531 Real analysis Section Notes

Fall 2013 MTH431/531 Real analysis Section Notes Fall 013 MTH431/531 Real aalysis Sectio 8.1-8. Notes Yi Su 013.11.1 1. Defiitio of uiform covergece. We look at a sequece of fuctios f (x) ad study the coverget property. Notice we have two parameters

More information

It is always the case that unions, intersections, complements, and set differences are preserved by the inverse image of a function.

It is always the case that unions, intersections, complements, and set differences are preserved by the inverse image of a function. MATH 532 Measurable Fuctios Dr. Neal, WKU Throughout, let ( X, F, µ) be a measure space ad let (!, F, P ) deote the special case of a probability space. We shall ow begi to study real-valued fuctios defied

More information

Output Analysis and Run-Length Control

Output Analysis and Run-Length Control IEOR E4703: Mote Carlo Simulatio Columbia Uiversity c 2017 by Marti Haugh Output Aalysis ad Ru-Legth Cotrol I these otes we describe how the Cetral Limit Theorem ca be used to costruct approximate (1 α%

More information

New Bounds on the Rate-Distortion Function of a

New Bounds on the Rate-Distortion Function of a ISIT2007, ice, Frace, Jue 24 - Jue 29, 2007 ew Bouds o the Rate-Distortio Fuctio of a Biary Markov Source Shiri Jalali Departmet of Electrical Egieerig Staford Uiversity Staford, CA, 94305, USA shjalali@

More information

Feedback in Iterative Algorithms

Feedback in Iterative Algorithms Feedback i Iterative Algorithms Charles Byre (Charles Byre@uml.edu), Departmet of Mathematical Scieces, Uiversity of Massachusetts Lowell, Lowell, MA 01854 October 17, 2005 Abstract Whe the oegative system

More information

INTRODUCTION ABSTRACT

INTRODUCTION ABSTRACT ABSTRACT mm INTRODUCTION Let us cosider the decisio theory problem of classifyig observatio X as comig from oe of the m possible classes (hypothesis) 0 = {01,02,..., 0 }. Let P; = Pr {0 = 0;}, i = 1,2,...,

More information

Math 155 (Lecture 3)

Math 155 (Lecture 3) Math 55 (Lecture 3) September 8, I this lecture, we ll cosider the aswer to oe of the most basic coutig problems i combiatorics Questio How may ways are there to choose a -elemet subset of the set {,,,

More information

Precise Rates in Complete Moment Convergence for Negatively Associated Sequences

Precise Rates in Complete Moment Convergence for Negatively Associated Sequences Commuicatios of the Korea Statistical Society 29, Vol. 16, No. 5, 841 849 Precise Rates i Complete Momet Covergece for Negatively Associated Sequeces Dae-Hee Ryu 1,a a Departmet of Computer Sciece, ChugWoo

More information

Definitions and Theorems. where x are the decision variables. c, b, and a are constant coefficients.

Definitions and Theorems. where x are the decision variables. c, b, and a are constant coefficients. Defiitios ad Theorems Remember the scalar form of the liear programmig problem, Miimize, Subject to, f(x) = c i x i a 1i x i = b 1 a mi x i = b m x i 0 i = 1,2,, where x are the decisio variables. c, b,

More information

1 Review and Overview

1 Review and Overview DRAFT a fial versio will be posted shortly CS229T/STATS231: Statistical Learig Theory Lecturer: Tegyu Ma Lecture #3 Scribe: Migda Qiao October 1, 2013 1 Review ad Overview I the first half of this course,

More information

Optimization Methods MIT 2.098/6.255/ Final exam

Optimization Methods MIT 2.098/6.255/ Final exam Optimizatio Methods MIT 2.098/6.255/15.093 Fial exam Date Give: December 19th, 2006 P1. [30 pts] Classify the followig statemets as true or false. All aswers must be well-justified, either through a short

More information

Are Slepian-Wolf Rates Necessary for Distributed Parameter Estimation?

Are Slepian-Wolf Rates Necessary for Distributed Parameter Estimation? Are Slepia-Wolf Rates Necessary for Distributed Parameter Estimatio? Mostafa El Gamal ad Lifeg Lai Departmet of Electrical ad Computer Egieerig Worcester Polytechic Istitute {melgamal, llai}@wpi.edu arxiv:1508.02765v2

More information

(A sequence also can be thought of as the list of function values attained for a function f :ℵ X, where f (n) = x n for n 1.) x 1 x N +k x N +4 x 3

(A sequence also can be thought of as the list of function values attained for a function f :ℵ X, where f (n) = x n for n 1.) x 1 x N +k x N +4 x 3 MATH 337 Sequeces Dr. Neal, WKU Let X be a metric space with distace fuctio d. We shall defie the geeral cocept of sequece ad limit i a metric space, the apply the results i particular to some special

More information

LINEAR REGRESSION ANALYSIS. MODULE IX Lecture Multicollinearity

LINEAR REGRESSION ANALYSIS. MODULE IX Lecture Multicollinearity LINEAR REGRESSION ANALYSIS MODULE IX Lecture - 9 Multicolliearity Dr Shalabh Departmet of Mathematics ad Statistics Idia Istitute of Techology Kapur Multicolliearity diagostics A importat questio that

More information

Non-Asymptotic Achievable Rates for Gaussian Energy-Harvesting Channels: Best-Effort and Save-and-Transmit

Non-Asymptotic Achievable Rates for Gaussian Energy-Harvesting Channels: Best-Effort and Save-and-Transmit No-Asymptotic Achievable Rates for Gaussia Eergy-Harvestig Chaels: Best-Effort ad Save-ad-Trasmit Silas L. Fog, Jig Yag, ad Ayli Yeer arxiv:805.089v [cs.it] 30 May 08 Abstract A additive white Gaussia

More information

4.3 Growth Rates of Solutions to Recurrences

4.3 Growth Rates of Solutions to Recurrences 4.3. GROWTH RATES OF SOLUTIONS TO RECURRENCES 81 4.3 Growth Rates of Solutios to Recurreces 4.3.1 Divide ad Coquer Algorithms Oe of the most basic ad powerful algorithmic techiques is divide ad coquer.

More information

Lecture 2: April 3, 2013

Lecture 2: April 3, 2013 TTIC/CMSC 350 Mathematical Toolkit Sprig 203 Madhur Tulsiai Lecture 2: April 3, 203 Scribe: Shubhedu Trivedi Coi tosses cotiued We retur to the coi tossig example from the last lecture agai: Example. Give,

More information

MAT1026 Calculus II Basic Convergence Tests for Series

MAT1026 Calculus II Basic Convergence Tests for Series MAT026 Calculus II Basic Covergece Tests for Series Egi MERMUT 202.03.08 Dokuz Eylül Uiversity Faculty of Sciece Departmet of Mathematics İzmir/TURKEY Cotets Mootoe Covergece Theorem 2 2 Series of Real

More information

Sieve Estimators: Consistency and Rates of Convergence

Sieve Estimators: Consistency and Rates of Convergence EECS 598: Statistical Learig Theory, Witer 2014 Topic 6 Sieve Estimators: Cosistecy ad Rates of Covergece Lecturer: Clayto Scott Scribe: Julia Katz-Samuels, Brado Oselio, Pi-Yu Che Disclaimer: These otes

More information

Discrete Mathematics for CS Spring 2007 Luca Trevisan Lecture 22

Discrete Mathematics for CS Spring 2007 Luca Trevisan Lecture 22 CS 70 Discrete Mathematics for CS Sprig 2007 Luca Trevisa Lecture 22 Aother Importat Distributio The Geometric Distributio Questio: A biased coi with Heads probability p is tossed repeatedly util the first

More information

SRC Technical Note June 17, Tight Thresholds for The Pure Literal Rule. Michael Mitzenmacher. d i g i t a l

SRC Technical Note June 17, Tight Thresholds for The Pure Literal Rule. Michael Mitzenmacher. d i g i t a l SRC Techical Note 1997-011 Jue 17, 1997 Tight Thresholds for The Pure Literal Rule Michael Mitzemacher d i g i t a l Systems Research Ceter 130 Lytto Aveue Palo Alto, Califoria 94301 http://www.research.digital.com/src/

More information

Hybrid Coding for Gaussian Broadcast Channels with Gaussian Sources

Hybrid Coding for Gaussian Broadcast Channels with Gaussian Sources Hybrid Codig for Gaussia Broadcast Chaels with Gaussia Sources Rajiv Soudararaja Departmet of Electrical & Computer Egieerig Uiversity of Texas at Austi Austi, TX 7871, USA Email: rajivs@mailutexasedu

More information