Matching Dyadic Distributions to Channels

G. Böcherer and R. Mathar
Institute for Theoretical Information Technology, RWTH Aachen University, 52056 Aachen, Germany

arXiv:1009.3751v4 [cs.IT] 13 Dec 2010

This work has been supported by the UMIC Research Centre, RWTH Aachen University.

Abstract

Many communication channels with discrete input have non-uniform capacity achieving probability mass functions (PMF). By parsing a stream of independent and equiprobable bits according to a full prefix-free code, a modulator can generate dyadic PMFs at the channel input. In this work, we show that for discrete memoryless channels and for memoryless discrete noiseless channels, searching for good dyadic input PMFs is equivalent to minimizing the Kullback-Leibler distance between a dyadic PMF and a weighted version of the capacity achieving PMF. We define a new algorithm called Geometric Huffman Coding (GHC) and prove that GHC finds the optimal dyadic PMF in O(m log m) steps, where m is the number of input symbols of the considered channel. Furthermore, we prove that by generating dyadic PMFs of blocks of consecutive input symbols, GHC achieves capacity when the block length goes to infinity.

I. INTRODUCTION

For many communication channels, the ultimate rate for reliable data transmission is given by the maximum information per cost. For discrete memoryless channels (DMC) and for additive noise channels with finite input alphabet, the ultimate rate is the maximum mutual information between input and output per channel use. For memoryless discrete noiseless channels (DNC), the ultimate rate is the maximum entropy of the input per average weight. In both cases, the maximum is achieved by an input that is distributed according to a capacity achieving probability mass function (PMF). To use non-uniform input PMFs in a digital communication system, a modulator has to generate this PMF by mapping independent equiprobable data bits to the channel input symbols. One way to do this is to parse the data bits by a full prefix-free code and to map each codeword to an input symbol [1, Sec. VII]. PMFs that can be generated in this way are dyadic, i.e., the probability of each point is of the form 2^{-l}, l ∈ N. The capacity achieving PMFs are in general not dyadic, which raises two questions. First, what is an optimal dyadic PMF that maximizes information per cost, and second, if we jointly generate blocks of consecutive input symbols by a dyadic PMF, can we asymptotically achieve capacity by letting the block length go to infinity?

For noiseless channels, an efficient algorithm to find the optimal dyadic PMF that maximizes entropy per average weight was found in [2]. In general, a common approach in the literature is to use the dyadic PMF that results from the optimal source code of the capacity achieving PMF. Dyadic PMFs resulting from source codes are in general not optimal. For the (d, k) constrained noiseless channel, it was claimed in [3] that a source code asymptotically achieves capacity. To the best of our knowledge, for DMCs, there exist no results in the literature on optimality and asymptotic behavior of dyadic PMFs. In [1], [4], the authors use source codes for additive noise channels. While good numerical results are observed, optimality and asymptotic behavior are not assessed. In [5], input entropy per average weight is maximized for additive noise channels. This is in general not equivalent to the maximization of mutual information per channel use.

Denote the capacity achieving PMF of a channel by p*. In this work, we show for DMCs that minimizing the Kullback-Leibler distance (KL) D(p‖p*) over all dyadic PMFs p maximizes a lower bound on the achieved mutual information per channel use. For DNCs, we show that searching for the optimal dyadic input PMF is equivalent to minimizing the weighted KL-distance D(p‖p*^R) = Σ_i p_i log(p_i / p*_i^R) over all dyadic PMFs. The value of R is given by the fraction of the channel capacity that is achievable by dyadic PMFs. We introduce an algorithm called Geometric Huffman Coding (GHC) and prove that GHC minimizes D(p‖x) over all dyadic PMFs p, for any given vector x with non-negative entries. In particular, for x = p*, GHC minimizes D(p‖p*), and for x = p*^R, GHC minimizes D(p‖p*^R). The complexity of GHC is O(m log m), where m is the number of input symbols of the considered channel. Furthermore, we show that, to asymptotically achieve capacity for DMCs and DNCs, the normalized KL-distance D(p^(k)‖p*^(k))/k has to vanish for block length k → ∞. This is achieved by GHC. Based on the present work, we show in [6] that for finite signal constellations with average power constraint, GHC achieves capacity. GHC is as handy as Huffman coding and an implementation of GHC in MATLAB is readily available at our website [7].

The remainder of this work is organized as follows. In Section II, we define GHC. In Section III, we show optimality and asymptotic optimality of GHC for DMCs. We show optimality and asymptotic optimality of GHC for DNCs in Section IV.

II. GEOMETRIC HUFFMAN CODING

For a PMF p and a vector x with non-negative entries, the KL-distance is given by

D(p‖x) = Σ_i p_i log(p_i / x_i).   (1)

Note that D(p‖x) can be equal to infinity. The dyadic PMF p that minimizes the KL-distance is directly given by the full prefix-free code that is constructed by the algorithm of the following proposition. A prefix-free code is full if it fulfills the Kraft inequality [8, Theorem 5.2.2] with equality.

Proposition 1. Without loss of generality, we assume x_1 ≥ x_2 ≥ ... ≥ x_m. The dyadic PMF p that minimizes D(p‖x) is obtained by constructing a Huffman tree with the updating rule

x' = x_{m-1}            if x_{m-1} ≥ 4 x_m,
x' = 2 √(x_{m-1} x_m)   if x_{m-1} < 4 x_m.   (2)

Since it involves a geometric mean, we call this method Geometric Huffman Coding. We write p = GHC(x).

Proof: The proof is given in the appendix.
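The updating rule (2) translates directly into a short program. The following Python sketch is only an illustration of Proposition 1, not the authors' MATLAB implementation from [7]; the function name ghc and the list-based bookkeeping are choices made here, and re-sorting in every step costs O(m^2 log m) rather than the O(m log m) of a heap-based variant.

import math

def ghc(x):
    """Geometric Huffman Coding sketch: return the dyadic PMF p minimizing D(p||x).

    x is a vector of non-negative entries (it need not sum to one).  In every
    step the two smallest entries v1 >= v2 are replaced according to (2):
    by v1 alone if v1 >= 4*v2 (the symbols under v2 get probability zero),
    and by 2*sqrt(v1*v2) otherwise (both subtrees move one level deeper).
    """
    # Each node carries its value and the (symbol index, depth) pairs of its leaves.
    nodes = [(xi, [(i, 0)]) for i, xi in enumerate(x)]
    while len(nodes) > 1:
        nodes.sort(key=lambda n: n[0], reverse=True)       # largest value first
        v1, leaves1 = nodes.pop(-2)                        # second smallest, x_{m-1}
        v2, leaves2 = nodes.pop(-1)                        # smallest, x_m
        if v1 >= 4 * v2:
            nodes.append((v1, leaves1))                    # x' = x_{m-1}; leaves under v2 are dropped
        else:
            merged = [(i, d + 1) for i, d in leaves1 + leaves2]   # one level deeper
            nodes.append((2 * math.sqrt(v1 * v2), merged))        # x' = 2*sqrt(x_{m-1} x_m)
    p = [0.0] * len(x)
    for i, depth in nodes[0][1]:
        p[i] = 2.0 ** (-depth)                             # dyadic probability 2^{-l_i}
    return p

Applied to the PMF q of the example in Figure 1 below, ghc([0.328, 0.32, 0.22, 0.11, 0.022]) returns [0.5, 0.25, 0.125, 0.125, 0.0], i.e., the codeword lengths (1, 2, 3, 3) with probability zero assigned to the last symbol.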

Fig. 1: For q = (0.328, 0.32, 0.22, 0.11, 0.022)^T, the left figure displays the code tree of GHC. The figure in the middle shows the code tree of Huffman coding. The right figure displays the code tree of Huffman coding applied to (q_1, ..., q_4)^T.

An implementation of GHC in MATLAB can be found at our website [7].

In comparison to GHC, Huffman coding uses the updating rule x' = x_{m-1} + x_m. Furthermore, it can be shown that Huffman coding minimizes the KL-distance D(x‖p) over all dyadic PMFs p. Note that this is not equivalent to minimizing (1) because the KL-distance is not symmetric in its arguments. GHC has the same complexity as Huffman coding, which is O(m log m) [9, Chap. 16.3].

For illustration purposes, we apply GHC and Huffman coding to the PMF

q = (0.328, 0.32, 0.22, 0.11, 0.022)^T   (3)

where (·)^T denotes the transpose. The resulting code trees are displayed in Figure 1. By reading off the codeword lengths, the corresponding dyadic PMFs are

p_GHC = (2^{-1}, 2^{-2}, 2^{-3}, 2^{-3}, 0)^T and p_HC = (2^{-2}, 2^{-2}, 2^{-2}, 2^{-3}, 2^{-3})^T   (4)

and the KL-distances to q are

D(p_GHC‖q) = 0.13619 and D(p_HC‖q) = 0.19548   (5)

where we used the dual logarithm (base 2). As expected, the KL-distance resulting from GHC is smaller than the one that results from Huffman coding. Since GHC assigns zero to q_5, one may want to manually assign probability zero to q_5 and then apply Huffman coding to (q_1, ..., q_4)^T. The corresponding code tree is displayed in Figure 1. The corresponding PMF and the resulting KL-distance to q are respectively

p_HC' = (2^{-2}, 2^{-2}, 2^{-2}, 2^{-2}, 0)^T,   D(p_HC'‖q) = 0.15523.   (6)

While p_HC' slightly improves upon p_HC, the KL-distance is still larger than the one resulting from GHC.

Let q now denote some arbitrary PMF. We consider k subsequent symbols that are independent and identically distributed according to q. We denote the joint PMF of these symbols by q^(k). Our aim is to show that for p^(k) = GHC(q^(k)), the normalized KL-distance D(p^(k)‖q^(k))/k vanishes for k → ∞. To show this, we will need the following lemma, which shows the existence of dyadic PMFs with a bounded KL-distance for any PMF q.

Lemma 1. Without loss of generality, q_1 ≥ q_2 ≥ ... ≥ q_m. Assign then p_i = 2^{⌈log_2 q_i⌉} for i ≤ k, and p_i = 0 for i > k, where k ≤ m is chosen such that Σ_{i=1}^{m} p_i = 1. (It can actually be shown that such a k always exists, so GCC is well-defined.) Then p is a dyadic PMF and D(p‖q) ≤ log 2. We call this method Greedy Channel Coding (GCC) and write p = GCC(q).

Proof:

D(p‖q) = Σ_{i=1}^{k} p_i log(p_i / q_i) = Σ_{i=1}^{k} p_i log(2^{⌈log_2 q_i⌉} / q_i)   (7)
≤ Σ_{i=1}^{k} p_i log(2^{1 + log_2 q_i} / q_i)   (8)
= Σ_{i=1}^{k} p_i log(2 q_i / q_i) = log 2   (9)

where the inequality in (8) follows from the values that GCC assigns to the p_i.

An implementation of GCC in MATLAB is available at [7]. It is now easy to show the asymptotic behavior of GHC.

Proposition 2. For p^(k) = GHC(q^(k)) it holds that

D(p^(k)‖q^(k)) / k → 0   for k → ∞.   (10)

Proof: Define p̂^(k) = GCC(q^(k)). Then

D(p^(k)‖q^(k)) / k ≤ D(p̂^(k)‖q^(k)) / k ≤ (log 2) / k   (11)

where the first inequality follows via Proposition 1 from the optimality of GHC and where the second inequality follows from Lemma 1. Since (log 2)/k goes to zero for k → ∞, the statement of the proposition follows.
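For completeness, here is a Python sketch of the assignment used by GCC in Lemma 1 (again an illustration, not the MATLAB code from [7]). It assumes that q is sorted in decreasing order with strictly positive entries and relies on the footnote to Lemma 1, namely that the partial sum of the rounded-up probabilities reaches exactly one for some k.

import math

def gcc(q):
    """Greedy Channel Coding sketch (Lemma 1): round each q_i up to the next
    power of two until the rounded values sum to one; remaining symbols get
    probability zero.  The resulting dyadic PMF satisfies D(p||q) <= log(2)."""
    p = [0.0] * len(q)
    total = 0.0
    for i, qi in enumerate(q):
        if abs(total - 1.0) < 1e-12:                 # sum already equals one: p_i = 0 from here on
            break
        p[i] = 2.0 ** math.ceil(math.log2(qi))       # smallest power of two >= q_i
        total += p[i]
    return p

For example, gcc([0.4, 0.35, 0.25]) returns [0.5, 0.5, 0.0], whose KL-distance to the input PMF is about 0.42 bit, well below the bound of Lemma 1 (log 2, i.e., one bit).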

III. DISCRETE MEMORYLESS CHANNEL

We now show how GHC can be used to find dyadic PMFs that approximate the capacity of DMCs well. A DMC is specified by a set of m input symbols, a set of n output symbols, and a matrix of transition probabilities (h_{ji}). An input PMF p relates to its corresponding output PMF r as

(r_1, ..., r_n)^T = (h_{ji}) (p_1, ..., p_m)^T,   i.e.,   r_j = Σ_{i=1}^{m} h_{ji} p_i,   j = 1, ..., n.   (12)

The mutual information between input and output is given by [10, Eq. (8.73)]

I(p) = Σ_i p_i Σ_j h_{ji} log(h_{ji} / r_j).   (13)

The capacity of a DMC is the maximum mutual information between input and output, where the maximum is taken over all input PMFs. To find the best dyadic input PMF, we need to solve the optimization problem

maximize    I(p)
subject to  p is a PMF,
            p_i = 2^{-l_i}, l_i ∈ N, i = 1, ..., m.   (14)

This is a nonlinear optimization problem with integer constraints and therefore intractable for practical purposes. In order to overcome this difficulty, we proceed as follows. First, we will drop the restriction to dyadic PMFs and characterize the capacity achieving PMF p*. Then, we will derive the penalty that results from using a PMF p different from p*. Finally, we will minimize this penalty over all dyadic PMFs.

Capacity and capacity achieving PMF are respectively defined as

C = max_p I(p),   p* = argmax_p I(p).   (15)

Denote by r and r* the output PMFs that result from using the input PMFs p and p*, respectively. According to [11, Eq. (4.5.1)], the output PMF r* resulting from the capacity achieving PMF p* has the important property that

Σ_j h_{ji} log(h_{ji} / r*_j) = C   whenever p*_i > 0.   (16)

We now use this property to express the mutual information I(p) achieved by some PMF p in terms of the capacity C and the capacity achieving PMF p*. The only assumption that we make about p is that

p_i = 0   whenever p*_i = 0.   (17)

Under this assumption, we have for I(p)

I(p) = Σ_i p_i Σ_j h_{ji} log(h_{ji} / r_j) = Σ_i p_i Σ_j h_{ji} log( (h_{ji} r*_j) / (r*_j r_j) )   (18)
= Σ_i p_i Σ_j h_{ji} log(h_{ji} / r*_j) + Σ_i p_i Σ_j h_{ji} log(r*_j / r_j)   (19)
= C - Σ_j ( Σ_i h_{ji} p_i ) log(r_j / r*_j)   (20)
= C - Σ_j r_j log(r_j / r*_j)   (21)
= C - D(r‖r*)   (22)

where the equality in (20) follows from (16) and (17). From the last line, we see that the penalty of using p instead of p* is exactly the KL-distance between the corresponding output PMFs r and r*. To get a simple expression that directly depends on p and p*, we lower bound the last line. According to [8, Eq. (4.45)], the KL-distance between the output PMFs is upper-bounded by the KL-distance between the input PMFs, i.e., D(r‖r*) ≤ D(p‖p*). Thus,

I(p) ≥ C - D(p‖p*).   (23)

We conclude that for DMCs, the penalty that results from using p instead of p* is upper bounded by D(p‖p*). According to Proposition 1, we can now efficiently minimize the penalty bound over all dyadic input PMFs by using p = GHC(p*). Note that GHC guarantees (17): assume p* is ordered and p*_m = 0, p*_{m-1} > 0. Then p*_{m-1} > 4 p*_m and GHC assigns p_m = 0.
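As a small numerical companion to (12), (13), and (23), the following Python sketch evaluates the mutual information, the KL-distance, and the resulting rate guarantee. The representation of the transition matrix as a list of rows h[j][i], the base-2 logarithm, and the assumption that a capacity achieving p_star is already available are choices made here; the paper takes p* as given and does not prescribe how to compute it.

import math

def mutual_information(h, p):
    """I(p) as in (13); h[j][i] is the probability of output j given input i."""
    n, m = len(h), len(p)
    r = [sum(h[j][i] * p[i] for i in range(m)) for j in range(n)]   # output PMF, (12)
    return sum(p[i] * h[j][i] * math.log2(h[j][i] / r[j])
               for i in range(m) for j in range(n)
               if p[i] > 0 and h[j][i] > 0)

def kl(p, x):
    """D(p || x) as in (1), in bits; terms with p_i = 0 contribute zero."""
    return sum(pi * math.log2(pi / xi) for pi, xi in zip(p, x) if pi > 0)

# With a capacity achieving PMF p_star at hand, the dyadic input recommended in
# this section and its guaranteed rate from (23) would be obtained as
#   p_dyadic   = ghc(p_star)
#   rate_bound = mutual_information(h, p_star) - kl(p_dyadic, p_star)
# and mutual_information(h, p_dyadic) >= rate_bound then holds.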

We now jointly consider the PMF of k consecutive channel inputs. Denote by p*^(k) the capacity achieving joint PMF. Since the channel is memoryless, p*^(k) is the product of k marginal PMFs p* and we have I(p*^(k)) = k I(p*) = kC. Thus, for a joint PMF p^(k) we have

I(p^(k)) ≥ kC - D(p^(k)‖p*^(k)).   (24)

The mutual information per channel use Ī(p^(k)) = I(p^(k))/k thus satisfies

Ī(p^(k)) ≥ C - D(p^(k)‖p*^(k)) / k.   (25)

By using p^(k) = GHC(p*^(k)), according to Proposition 2, Ī(p^(k)) → C for k → ∞ and we conclude that GHC is asymptotically capacity achieving.

IV. MEMORYLESS DISCRETE NOISELESS CHANNEL

Following [12], a memoryless DNC is given by a finite alphabet A = (a_1, ..., a_m) of atomic symbols and an associated weight function w: A → R_{>0}, a_i ↦ w_i > 0. The information rate H̄ that is transmitted over the channel is given by the entropy of the input PMF divided by the average weight, i.e.,

H̄(p) = H(p) / (Σ_i p_i w_i),   with H(p) = -Σ_i p_i log p_i.   (26)

To find the dyadic PMF that maximizes H̄(p), we need to solve the optimization problem

maximize    H̄(p)
subject to  p is a PMF,
            p_i = 2^{-l_i}, l_i ∈ N, i = 1, ..., m.   (27)

As in the case of DMCs, this is an intractable nonlinear optimization problem with integer constraints. We will therefore proceed in the same way as we did for the DMC in Section III. We start by calculating the capacity and the capacity achieving PMF. This can be done by Lagrange multipliers, see, e.g., [13]. Denote by b the base of the logarithm log. The capacity is achieved by the input PMF

p*_i = b^{-C w_i},   i = 1, ..., m   (28)

where C denotes the capacity and is given by the greatest positive real solution of the equation

Σ_i b^{-s w_i} = 1.   (29)

From (28), we have the relation w_i = -(log p*_i) / C. We can thus write

w_i = (1/C) log(1/p*_i).   (30)
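Equation (29) determines C as the root of a function that is strictly decreasing in s, so it can be found numerically. The Python sketch below uses bracketing and bisection, which is a choice made here (the paper does not prescribe a root-finding method), and then returns p* via (28).

import math

def dnc_capacity(w, b=2.0):
    """Sketch for (28)-(29): capacity C of a memoryless DNC with weights w and
    logarithm base b, obtained by bisection on f(s) = sum_i b**(-s*w_i) - 1.
    Returns (C, p_star)."""
    f = lambda s: sum(b ** (-s * wi) for wi in w) - 1.0
    lo, hi = 0.0, 1.0
    while f(hi) > 0.0:                     # grow the bracket until f changes sign
        hi *= 2.0
    for _ in range(100):                   # bisection: f(lo) >= 0 >= f(hi)
        mid = 0.5 * (lo + hi)
        if f(mid) > 0.0:
            lo = mid
        else:
            hi = mid
    C = 0.5 * (lo + hi)
    p_star = [b ** (-C * wi) for wi in w]  # capacity achieving PMF, (28)
    return C, p_star

For unit weights w_i = 1 this reduces to C = log_b m, and dnc_capacity([1, 2]) returns C ≈ 0.694, the well-known capacity of a noiseless channel whose two symbols have durations 1 and 2.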

Denote by R the fraction of C that can be achieved by the best dyadic PMF, i.e.,

p_opt = argmax_{p dyadic} H̄(p),   R = H̄(p_opt) / C.   (31)

In general, R is not known beforehand, but we will show in Subsection IV-A how it can be found. Suppose for now that we know R. Assume further that

p_i = 0   whenever p*_i = 0.   (32)

Furthermore, we use the convention 0 log 0 = 0. With these assumptions, we can now write H̄(p) as

H̄(p) = ( -R Σ_i p_i log p*_i + R Σ_i p_i log p*_i + H(p) ) / ( Σ_i p_i w_i )   (33)
= ( RC Σ_i p_i w_i - Σ_i p_i log(p_i / p*_i^R) ) / ( Σ_i p_i w_i )   (34)
= RC - D(p‖p*^R) / ( Σ_i p_i w_i )   (35)

where in (34) we used (30) and the definition of entropy. By (31), for the best dyadic PMF p_opt we have H̄(p_opt) = RC. It follows that for any dyadic PMF p we have D(p‖p*^R) ≥ 0, and for the best dyadic PMF we have D(p_opt‖p*^R) = 0. We conclude that for DNCs, the best dyadic PMF is found by minimizing D(p‖p*^R) over all dyadic PMFs, and by Proposition 1, this PMF is given by p_opt = GHC(p*^R). Recall that, as we argued in Section III, GHC guarantees (32).
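Since p*, w, and C are tied together by (28), the chain (33)-(35) is an algebraic identity that holds for every PMF p satisfying (32) and every value of R. The short Python check below makes this concrete; it is a sketch with base-2 logarithms, the function name is a choice made here, and it presumes that p_star was generated from w and C via (28).

import math

def check_identity_35(p, p_star, w, R, C):
    """Verify (35) numerically: Hbar(p) == R*C - D(p || p_star**R) / sum_i p_i*w_i.
    Assumes p_i = 0 wherever p_star_i = 0, as required by (32)."""
    H = -sum(pi * math.log2(pi) for pi in p if pi > 0)                      # H(p), (26)
    avg_w = sum(pi * wi for pi, wi in zip(p, w))                            # average weight
    D = sum(pi * math.log2(pi / ps ** R) for pi, ps in zip(p, p_star) if pi > 0)
    return math.isclose(H / avg_w, R * C - D / avg_w, rel_tol=1e-9, abs_tol=1e-12)

Setting R = 1 recovers H̄(p) = C - D(p‖p*) / Σ_i p_i w_i, which is the form used for the block-length argument that follows.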

We now consider the PMF of k consecutive symbols. We denote the corresponding weights by w^(k). The capacity achieving joint PMF is the product of k copies of p* and we denote it by p*^(k). Clearly, w_i^(k) ≥ k w_min for i = 1, ..., m^k, where w_min = min{w_1, ..., w_m}. Using this, we get for H̄(p^(k)) the lower bound

H̄(p^(k)) = H(p^(k)) / ( Σ_i p_i^(k) w_i^(k) )   (36)
= C - D(p^(k)‖p*^(k)) / ( Σ_i p_i^(k) w_i^(k) )   (37)
≥ C - D(p^(k)‖p*^(k)) / (k w_min)   (38)

where (37) follows from (35) with R = 1, since p*^(k) and w^(k) again satisfy (28). For p^(k) = GHC(p*^(k)), according to Proposition 2, the last term in the last line vanishes for k → ∞ and we have H̄(p^(k)) → C, thus GHC is asymptotically capacity achieving.

A. Finding R

The exact value of R is in general not known beforehand. However, R and the best dyadic PMF can be found iteratively by the Lempel-Even-Cohn (LEC) algorithm [2]. The idea of the algorithm is to start with some value of R, then find the best dyadic PMF for this R, and then update the value of R. The best PMF for a given R is in the original formulation of the LEC algorithm found as follows. A subset of l nonzero entries of p is chosen. A Huffman-like procedure of complexity O(m log m) then finds the best dyadic PMF with l nonzero entries. There are m values of l that have to be evaluated; the complexity of the overall procedure is thus roughly O(m^2 log m).

Algorithm 1 Finding R and the optimal dyadic PMF for DNCs
1: procedure LEC(p*)
2:     R ← 1
3:     while D(p‖p*^R) ≠ 0 do
4:         p ← GHC(p*^R)
5:         R ← H̄(p) / C
6:     end while
7: end procedure

From (35) and a careful study of the original formulation in [2, Sec. III, V], it can be shown that the iteration step is equivalent to minimizing the weighted KL-distance D(p‖p*^R). This can be done with complexity O(m log m) by GHC. A formulation of the complete LEC algorithm with GHC as the iteration step is provided in Algorithm 1. Besides improving the complexity of the iteration step from O(m^2 log m) to O(m log m), our formulation answers a question that was raised in [14], namely how the LEC algorithm could be used to find the dyadic PMF that minimizes the KL-distance D(p‖p*). The simple answer is to perform the iteration step once with R = 1. An implementation in MATLAB of our formulation of the LEC algorithm can be found at [7].
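A Python rendering of Algorithm 1 based on the ghc sketch from Section II is given below. It is an illustration rather than the authors' MATLAB code from [7]: base-2 logarithms are assumed, and the loop stops when R no longer changes, which by (35) is the same as stopping when D(p‖p*^R) = 0.

import math

def lec(p_star, w, C, tol=1e-12, max_iter=100):
    """Sketch of Algorithm 1: find R and the optimal dyadic PMF for a DNC.

    p_star : capacity achieving PMF from (28)
    w      : symbol weights
    C      : capacity from (29), in bits (base-2 logarithms throughout)
    """
    def hbar(p):                                    # entropy per average weight, (26)
        H = -sum(pi * math.log2(pi) for pi in p if pi > 0)
        return H / sum(pi * wi for pi, wi in zip(p, w))
    R = 1.0
    for _ in range(max_iter):
        p = ghc([ps ** R for ps in p_star])         # minimize D(p || p_star^R), Proposition 1
        R_new = hbar(p) / C                         # update R as in Algorithm 1
        converged = abs(R_new - R) < tol
        R = R_new
        if converged:
            break
    return R, p

Performing the loop body a single time with R = 1 returns GHC(p*), the dyadic PMF that minimizes D(p‖p*), as noted above.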

APPENDIX

Denote by x some non-negative vector with m entries. Assume x is ordered, i.e., x_1 ≥ x_2 ≥ ... ≥ x_m. We now show that GHC minimizes D(p‖x) over all dyadic PMFs p. The PMF p is dyadic if and only if there exist numbers l_i ∈ N, i = 1, ..., m, such that p_i = 2^{-l_i} and Σ_i 2^{-l_i} = 1. This is equivalent to l_1, ..., l_m being the codeword lengths of a full prefix-free code [10]. Using this, we can write

D(p‖x) = Σ_i p_i log(p_i / x_i) = log(2) Σ_i p_i log_2(p_i / x_i)   (39)
= log(2) Σ_i 2^{-l_i} ( -log_2 x_i - l_i ).   (40)

We define u by u_i = -log_2 x_i, i = 1, ..., m. Omitting the constant factor log 2, our aim is thus to minimize

Σ_i 2^{-l_i} (u_i - l_i)   (41)

subject to l_1, ..., l_m being the codeword lengths of a full prefix-free code. Based on (41), we now prove the optimality of GHC in a way similar to the proof given in [10] for the optimality of Huffman coding. Assume for now that an optimal algorithm assigns finite values to the codeword lengths l_m and l_{m-1} of the two least likely symbols, which correspond to the greatest entries u_m and u_{m-1} of u. We now show that in this case, there is an optimal algorithm for which l_m = l_{m-1}.

Lemma 2. For an optimal algorithm, u_j > u_i implies l_j ≥ l_i.

Proof: Assume the contrary, i.e., u_j > u_i and l_j < l_i. Consider the term

2^{-l_i} (u_i - l_i) + 2^{-l_j} (u_j - l_j).   (42)

By interchanging l_i and l_j, the term decreases:

[ 2^{-l_j} (u_i - l_j) + 2^{-l_i} (u_j - l_i) ] - [ 2^{-l_i} (u_i - l_i) + 2^{-l_j} (u_j - l_j) ]   (43)
= 2^{-l_i} (u_j - u_i) + 2^{-l_j} (u_i - u_j)   (44)
= (2^{-l_i} - 2^{-l_j}) (u_j - u_i) < 0   (45)

where the first factor is negative because l_j < l_i and the second factor is positive because u_j > u_i. Hence any code with u_j > u_i and l_j < l_i is not optimal.

Lemma 3. There is an optimal algorithm for which the codewords of the two greatest entries u_m and u_{m-1} are siblings, i.e., l_m = l_{m-1}, and in addition, no other codeword is longer than l_m and l_{m-1}.

Proof: In a full prefix-free code, the sibling of the longest codeword is also a longest codeword. According to Lemma 2, if u_m > u_{m-1} > u_{m-2}, an optimal algorithm assigns the two longest codewords to u_m and u_{m-1}. If only u_m ≥ u_{m-1} ≥ u_{m-2}, assigning the two longest codewords to u_m and u_{m-1} does not change optimality.

We can now use l_m = l_{m-1} to rewrite (41):

Σ_{i=1}^{m} 2^{-l_i} (u_i - l_i) = Σ_{i=1}^{m-2} 2^{-l_i} (u_i - l_i) + 2^{-l_{m-1}} (u_{m-1} - l_{m-1}) + 2^{-l_m} (u_m - l_m)   (46)
= Σ_{i=1}^{m-2} 2^{-l_i} (u_i - l_i) + 2^{-l_m} (u_{m-1} + u_m - 2 l_m)   (47)
= Σ_{i=1}^{m-2} 2^{-l_i} (u_i - l_i) + 2^{-(l_m - 1)} [ ( (u_{m-1} + u_m)/2 - 1 ) - (l_m - 1) ]   (48)
= Σ_{i=1}^{m-2} 2^{-l_i} (u_i - l_i) + 2^{-l'} (u' - l').   (49)

Thus, by combining u_{m-1} and u_m through

u' = (u_{m-1} + u_m)/2 - 1   (50)

the size m problem is reduced to a size m-1 problem.

The optimal algorithm may assign probability zero to the greatest entry u_m, which corresponds to l_m = ∞. We thus have

Σ_{i=1}^{m} 2^{-l_i} (u_i - l_i) = Σ_{i=1}^{m-1} 2^{-l_i} (u_i - l_i) + 2^{-∞} (u_m - ∞) = Σ_{i=1}^{m-1} 2^{-l_i} (u_i - l_i)   (51)

where we used the convention 0 log 0 = 0 and equivalently 2^{-∞} · ∞ = 0. Thus, if l_m = ∞, the size m problem reduces to a size m-1 problem.

It remains to check if it is better to assign probability zero to u_m or to combine u_m and u_{m-1}. First, assume the algorithm combines u_m and u_{m-1}. Then the contribution to the sum (41) is 2^{-l'} (u' - l'). We can now instead assign probability zero to u_m and use the codeword of u' for u_{m-1}. The contribution of u_m to (41) is then zero and the contribution of u_{m-1} is 2^{-l'} (u_{m-1} - l'). Thus, since our aim is to minimize (41), doing the former is better if and only if

2^{-l'} (u_{m-1} - l') > 2^{-l'} (u' - l')   (52)
⟺ u_{m-1} - l' > (u_{m-1} + u_m)/2 - 1 - l'   (53)
⟺ u_{m-1} > (u_{m-1} + u_m)/2 - 1   (54)
⟺ u_{m-1} > u_m - 2.   (55)

Recalling x_i = 2^{-u_i}, the updating rule (50) and the condition (55) can be expressed in terms of x as

x' = x_{m-1}            if x_{m-1} ≥ 4 x_m,
x' = 2 √(x_{m-1} x_m)   if x_{m-1} < 4 x_m.   (56)

REFERENCES

[1] F. R. Kschischang and S. Pasupathy, "Optimal nonuniform signaling for Gaussian channels," IEEE Trans. Inf. Theory, vol. 39, no. 3, 1993.
[2] A. Lempel, S. Even, and M. Cohn, "An algorithm for optimal prefix parsing of a noiseless and memoryless channel," IEEE Trans. Inf. Theory, vol. 19, no. 2, 1973.
[3] K. J. Kerpez, "Runlength codes from source codes," IEEE Trans. Inf. Theory, vol. 37, no. 3, 1991.
[4] G. Ungerboeck, "Huffman shaping," in Codes, Graphs, and Systems, R. Blahut and R. Koetter, Eds. Springer, 2002, ch. 7.
[5] J. Abrahams, "Variable-length unequal cost parsing and coding for shaping," IEEE Trans. Inf. Theory, vol. 44, no. 4, 1998.
[6] G. Böcherer, F. Altenbach, and R. Mathar, "Capacity achieving modulation for fixed constellations with average power constraint," 2010, submitted to ICC 2011.
[7] G. Böcherer, "Geometric Huffman coding," http://, Dec. 2010.
[8] T. M. Cover and J. A. Thomas, Elements of Information Theory, 2nd ed. John Wiley & Sons, Inc., 2006.
[9] T. H. Cormen, C. E. Leiserson, R. L. Rivest, and C. Stein, Introduction to Algorithms, 2nd ed. The MIT Press, 2001.
[10] R. G. Gallager, Principles of Digital Communication. Cambridge University Press, 2008.
[11] R. G. Gallager, Information Theory and Reliable Communication. John Wiley & Sons, Inc., 1968.
[12] R. M. Krause, "Channels which transmit letters of unequal duration," Inf. Contr., vol. 5, pp. 13-24, 1962.
[13] R. S. Marcus, "Discrete noiseless coding," Master's thesis, MIT, 1957.
[14] J. Abrahams, "Correspondence between variable length parsing and coding," in The Mathematics of Information Coding, Extraction and Distribution, G. Cybenko, D. P. O'Leary, and J. Rissanen, Eds. Springer, 1999, ch. 1, pp. 1-7.
