Poissonization and universal compression of envelope classes

Jayadev Acharya, Ashkan Jafarpour, Alon Orlitsky, and Ananda Theertha Suresh
ECE, UCSD, {jacharya, ashkan, alon, asuresh}@ucsd.edu

Abstract: Poisson sampling is a method for eliminating dependence among symbols in a random sequence. It helps improve algorithm design, strengthen bounds, and simplify proofs. We relate the redundancy of fixed-length and Poisson-sampled sequences, use this result to derive a simple formula for the redundancy of general envelope classes, and apply this formula to obtain simple and tight bounds on the redundancy of power-law and exponential envelope classes, in particular answering a question posed in [1].

I. INTRODUCTION

Universal compression is the design of a single scheme to encode an unknown source from a known class of distributions. Let P be a distribution over A. From the source coding theorem we know that H(P) bits are necessary and sufficient to encode P. Any distribution Q over A implies a code that uses -log Q(a) bits to encode a ∈ A. The code implied by the underlying distribution uses -log P(a) bits, and its expected code length is the entropy H(P). The (worst-case) redundancy of a distribution Q with respect to P is sup_{a ∈ A} log(P(a)/Q(a)), the largest difference between the number of bits used and the optimum. We assume the distributions belong to a known class P. Then

    \hat R(P) = \inf_Q \sup_{P \in P} \sup_{a \in A} \log \frac{P(a)}{Q(a)}

is the worst-case redundancy, or minimax regret, of P. Let \hat P(a) := \sup_{P \in P} P(a) be the largest probability assigned to symbol a by a distribution in P. Then

    S(P) := \sum_{a \in A} \hat P(a)

is the Shtarkov sum of P. Shtarkov [2] showed that \hat R(P) = \log S(P), and that the optimal distribution assigns probability \hat P(a)/S(P) to a ∈ A.

In this work we study coding of i.i.d. samples from unknown distributions. For a distribution P on an underlying alphabet X, let P^n be the product distribution P × P × ··· × P over X^n. Let P belong to a known class P, and let P^n be the class of all distributions P^n with P ∈ P. Coding length-n sequences over X is equivalent to encoding symbols from A = X^n; in such block encoding we treat each x_1^n ∈ X^n as a symbol a. As before, we consider the class of all distributions (not only i.i.d.) Q^n over X^n to define

    \hat R(P^n) := \inf_{Q^n} \sup_{P^n \in P^n} \sup_{x_1^n \in X^n} \log \frac{P^n(x_1^n)}{Q^n(x_1^n)}

as the worst-case redundancy, or minimax regret, of P^n. The class P is said to be universal if \hat R(P^n) = o(n), i.e., if the redundancy is sublinear in the block length.

The most extensively studied [3-12] class of distributions is I_k^n, the collection of all i.i.d. distributions over length-n sequences from an underlying alphabet of size k, e.g., over [k]. It is now well established that:

1) For k = o(n),
    \hat R(I_k^n) = \frac{k-1}{2} \log \frac{n}{k} + \frac{k}{2} + o(k).
2) For n = o(k),
    \hat R(I_k^n) = n \log \frac{k}{n} + o(n).

In particular, [3, 4] showed that the class of all distributions over N has infinite redundancy. While the problem was traditionally studied in the setting of finite underlying alphabet size and large block lengths, in numerous applications the natural alphabet best describing the data can be large. For example, a text document contains only a small subset of the entire dictionary, and a natural image has many possible pixel values. The class of all i.i.d. distributions over N is therefore too large to provide meaningful compression schemes for all distributions. These impossibility results led researchers either to consider natural subclasses of all i.i.d. distributions and compress sequences generated from distributions in these subclasses, or to consider all i.i.d. distributions but decompose sequences into natural components and compress each part separately.
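As a quick numerical illustration of the finite-k asymptotics displayed above, the following sketch (illustrative Python, not part of the paper; the choice k = 2 and the values of n are assumptions for the example) computes the Shtarkov sum of the binary i.i.d. class I_2^n exactly by summing over types, and compares the resulting minimax regret log S(I_2^n) with the first-order expression ((k-1)/2) log(n/k) + k/2.

    import math

    def shtarkov_iid_binary(n):
        """S(I_2^n): sum over the number of ones j of C(n, j) times the
        maximum-likelihood probability (j/n)^j ((n-j)/n)^(n-j) of that type."""
        total = 0.0
        for j in range(n + 1):
            ml_prob = (j / n) ** j * ((n - j) / n) ** (n - j)
            total += math.comb(n, j) * ml_prob
        return total

    for n in (100, 1000, 10000):
        exact = math.log2(shtarkov_iid_binary(n))
        first_order = 0.5 * math.log2(n / 2) + 1   # (k-1)/2 log(n/k) + k/2 with k = 2
        print(f"n={n}: minimax regret = {exact:.3f} bits, first-order = {first_order:.3f} bits")

For these values of n the two quantities differ only by a small constant number of bits, which is the kind of lower-order behavior the o(k) term absorbs.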
We now describe three such approaches that have been proposed in recent years to characterize large-alphabet compression. [13] propose to encode the order of symbols (the pattern) in the sequence and the dictionary separately. [14, 15] studied the compression of patterns and in particular show that patterns can be compressed with sublinear worst-case redundancy, i.e., they are universally compressible. [16] consider the class M of all monotone distributions over N. They first show that even this class is not universally compressible. Despite this, they designed universal codes for

monotone distributions with finite entropy. [17] study the redundancy of M_k, the class of monotone distributions over an alphabet of size k, as a function of k and the block length n.

In this paper, we consider a third, natural and elegant approach, proposed by [1]. They study a fairly general class of distributions, called envelope classes, which as the name suggests are distributions that are bounded by an envelope. They provide general bounds on the redundancy of envelope classes. The upper bounds on the worst-case redundancy are obtained by bounding the Shtarkov sum. They also provide bounds on the more stringent (for lower bounds) average-case redundancy, where instead of the supremum over all possible sequences one considers the expected log ratio, i.e., the KL divergence. For this quantity, they provide a more complex bound employing the redundancy-capacity theorem. We now introduce the Poisson sampling model, and later use it to provide redundancy bounds on envelope classes.

II. POISSON MODEL

Poisson sampling provides simplified analysis in a number of problems in machine learning and statistics [18]. To the best of our knowledge, the first applications of Poisson sampling in universal coding were in [19] and [20], to prove near-optimal bounds on the pattern redundancy. More recently, [21] also apply Poisson sampling to provide simpler coding schemes for compressing i.i.d. distributions. In this work we are interested in coding and prediction over countably infinite alphabets, with distributions bounded by an envelope.

Let po(λ) be a Poisson distribution with parameter λ, and let

    po(λ, µ) := e^{-λ} \frac{λ^µ}{µ!}

be the probability of µ under a po(λ) distribution. For a distribution P over X, recall that P^n is a distribution over X^n. Similarly, we define a distribution P^{po(n)} over X^* := X ∪ X² ∪ X³ ∪ ··· as follows. First generate n' ~ po(n), then sample the distribution i.i.d. n' times. Therefore the probability of any sequence x_1^{n'} ∈ X^* is

    P^{po(n)}(x_1^{n'}) = po(n, n') P^{n'}(x_1^{n'}) = po(n, n') \prod_{i=1}^{n'} P(x_i).

Similar to P^n, let P^{po(n)} := {P^{po(n)} : P ∈ P} be the class of distributions over X^* obtained by sampling a distribution in P i.i.d. po(n) times. The advantage of this model is that the numbers of appearances of the symbols (their multiplicities) are independent of each other. Note that even though the length of the sequence is random, it is a Poisson random variable and, for large n, the length n' of a Poisson-sampled sequence is concentrated around n. Using this, we show in Theorem 2 that it suffices to consider P^{po(n)} instead of P^n. We first describe some properties of Poisson-sampled distributions.

A. Properties of Poisson sampling

The multiplicity of a symbol in a sequence is the number of times it appears in it. The type of a sequence x_1^n over X = {a_1, a_2, ...} is τ(x_1^n) := (µ(a_1), µ(a_2), ...), the tuple of multiplicities of the symbols in the sequence x_1^n [22, 23]. For example, let a_i = i, for i = 1, ..., 6, denote the possible outcomes of a die. Then the sequence of outcomes 2, 3, 1, 6, 1, 3, 3, 4, 6 has type τ = (2, 1, 3, 1, 0, 2). Any product distribution assigns the same probability to sequences with the same type. Therefore, for most statistical problems under independent sampling, it suffices to consider only the type as the random variable. When |X| = k, this corresponds to the balls-and-bins model with n balls and k bins. The main difficulty in analyzing such models is the dependence that arises among the multiplicities, e.g., they add up to n [18]. However, if we consider po(n) sampling, the distribution P^{po(n)} has the following properties.

Lemma 1 ([18]): For any distribution P^{po(n)},
1) conditioned on n', the distribution on X^{n'} is P^{n'} for discrete distributions;
2) a symbol a ∈ X with P(a) = p appears po(np) times, independently of all other symbols.

This shows that the multiplicities are independent under Poisson sampling.
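A small simulation makes Lemma 1 concrete. The sketch below (illustrative Python; the three-symbol distribution and the sample sizes are assumptions for the example, not taken from the paper) draws Poisson-sampled sequences and checks that each multiplicity behaves like an independent po(np_i) variable: mean and variance both close to np_i, and negligible cross-correlation.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 1000                          # expected sequence length
    p = np.array([0.5, 0.3, 0.2])     # an illustrative distribution over three symbols

    def poisson_sampled_types(trials):
        """Return the types (multiplicity vectors) of `trials` Poisson-sampled sequences."""
        types = np.empty((trials, len(p)), dtype=int)
        for t in range(trials):
            n_prime = rng.poisson(n)                       # random length n' ~ po(n)
            seq = rng.choice(len(p), size=n_prime, p=p)    # i.i.d. sample of n' symbols
            types[t] = np.bincount(seq, minlength=len(p))
        return types

    types = poisson_sampled_types(20000)
    print("means     :", types.mean(axis=0))    # ~ n * p = [500, 300, 200]
    print("variances :", types.var(axis=0))     # also ~ n * p, as for Poisson variables
    print("correlations:\n", np.corrcoef(types.T).round(2))
    # Under fixed-length sampling the multiplicities would sum to n and be negatively
    # correlated; under Poisson sampling the off-diagonal correlations are ~ 0.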
In the next result we relate the redundancy of Poisson sampling to that of fixed-length sampling. In particular, we show that it suffices to consider P^{po(n)}.

Theorem 2: For any ε > 0 and n > n_0(ε, P),

    \hat R(P^{po(n(1-ε))}) - 1 \le \hat R(P^n) \le \hat R(P^{po(n)}) + 1.

III. RELATED WORK AND NEW RESULTS

Definition 3: The envelope class of a function f is the class

    P_f := \{ (p_1, p_2, ...) : p_i \le f(i) \ \forall i, \ \text{and} \ p_1 + p_2 + ··· = 1 \}

of all distributions in which each symbol probability is bounded by the value of the function at that point. Analogous to P^n, we can define P_f^n and P_f^{po(n)}.

The study of universal coding for envelope classes was initiated in [1], where they provide upper and lower bounds on the redundancy. They show that if Σ_i f(i) = ∞ the redundancy is infinite for any n, and therefore consider only absolutely integrable envelopes. Their upper bound states that

    \hat R(P_f^n) \le \min_{u \le n} \Big[ n \bar F(u) \log e + \frac{u-1}{2} \log n \Big] + O(1),

where \bar F(u) := Σ_{j>u} f(j). They prove a more complex lower bound on the average redundancy. More recently, [21] provide bounds of a similar order for envelope classes using Poisson sampling.

We provide simple bounds on \hat R(P_f^{po(n)}) in terms of the redundancy of a simple one-dimensional class of distributions characterized by a single parameter λ_max, defined as

    POI(λ_max) := \{ po(λ) : λ \le λ_max \}.

This is the class of all Poisson distributions with parameter bounded above. The class is simple enough that we can bound its redundancy tightly. For an envelope characterized by f, let λ_i^max := n f(i), and let l be the smallest integer such that

    \sum_{j \ge l} f(j) \le 1 - ε.

This also implies that Σ_{j≥l} λ_j^max ≤ n(1-ε). We now state our main result on envelope classes.

Theorem 4: For the envelope class P_f,

    \sum_{i=l}^{\infty} \hat R(POI(λ_i^max)) \le \hat R(P_f^{po(n)}) \le \sum_{i=1}^{\infty} \hat R(POI(λ_i^max)).

Since we use \hat R(POI(λ)) as a primitive, we now show simple bounds on it that will be used to compute the redundancies of specific classes later.

Lemma 5: For λ ≤ 1,

    \hat R(POI(λ)) = \log(2 - e^{-λ}),

and for λ ≥ 1,

    \frac{1}{2} \log \frac{2(λ+1)}{π} \le \hat R(POI(λ)) \le \log\Big(2 + \sqrt{2λ/π}\Big).

We now consider power-law and exponential envelopes and apply Theorem 4 to obtain sharp bounds on the redundancies of these classes, improving the previous results.

Definition 6: The power-law envelope class with parameters α > 1 and c is the class Λ_{c·^{-α}} of distributions over N such that for all i ∈ N, p_i ≤ c/i^α.

Definition 7: The exponential envelope class with parameters α and c is the class Λ_{ce^{-α·}} of distributions over N such that for all i ∈ N, p_i ≤ c e^{-αi}.

[1] obtain bounds on the redundancy of these classes. For power-law envelopes they show that for large n,

    C_0 n^{1/α} \le \hat R(Λ_{c·^{-α}}^n) \le \Big(\frac{2cn}{α-1}\Big)^{1/α} (\log n)^{1-1/α} + O(1),    (1)

where C_0 is a constant (a function of α and c). For exponential envelopes they prove that

    \frac{\log^2 n}{8α}(1 + o(1)) \le \hat R(Λ_{ce^{-α·}}^n) \le \frac{\log^2 n}{2α} + O(1).

[24] improve the bounds for Λ_{ce^{-α·}}^n and show that

    \hat R(Λ_{ce^{-α·}}^n) = \frac{\log^2 n}{4α} + O(\log n \log\log n).

More recently, [25] extend the arguments of [24] to find tight universal codes for the larger class of sub-exponential envelopes, which have strictly faster decay than power-law classes, but slower than exponential. However, these results do not find the optimal redundancy of power-law envelopes. Their techniques seem to rely on the fact that exponential envelopes restrict the support sizes to be poly-logarithmic, and they break down for more heavy-tailed distributions, such as the power-law envelopes. We show that simply applying Theorem 4 to these classes and bounding the resulting expressions gives tight redundancy bounds. In particular, for Λ_{c·^{-α}}^n, we show:

Theorem 8: For large n,

    (cn)^{1/α}\Big[\frac{α}{2} + \frac{1}{α-1}\Big]\log e - 3\log n - 1 \le \hat R(Λ_{c·^{-α}}^n) \le (cn)^{1/α}\Big[\Big(\frac{α}{2} + \frac{1}{α-1}\Big)\log e + \log 3\Big] + 1.

For Λ_{ce^{-α·}}^n, we prove:

Theorem 9: For large n,

    \hat R(Λ_{ce^{-α·}}^n) = \frac{\log^2 n}{4α} + O(\log c \cdot \log n).

Remark: This result can also be generalized to provide tight bounds for the sub-exponential envelope classes considered in [25].

In the next three sections we prove these results.

IV. PROOFS

A. Proof of Theorem 2

For x_1^n ∈ X^n, let \hat P^n(x_1^n) := \sup_{P^n ∈ P^n} P^n(x_1^n) be the maximum-likelihood (ML) probability of x_1^n. Then,

    S(P^n) = \sum_{x_1^n \in X^n} \hat P^n(x_1^n).

By the first part of Lemma 1, it follows that

    S(P^{po(n)}) = \sum_{n'} po(n, n') S(P^{n'}).    (2)

By conditioning on x_1^n in the expression for S(P^{n+1}), it is easy to see that S(P^n) ≤ S(P^{n+1}). Therefore,

    S(P^{po(n)}) \overset{(a)}{=} \sum_{n' \ge 0} po(n, n') S(P^{n'}) \overset{(b)}{\ge} S(P^n) \sum_{n' \ge n} po(n, n') \overset{(c)}{\ge} \frac{1}{2} S(P^n),

where (a) follows from Equation (2), (b) from the monotonicity of S(P^n), and (c) from the fact that the median of a Poisson distribution is close to its mean. Taking logarithms gives the second part of the theorem. The first part of the theorem uses (2) again, along with tail bounds on Poisson random variables; it essentially uses the fact that a po(n(1-ε)) random variable exceeds n only with exponentially small probability. We omit the proof due to lack of space.
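Equation (2) and the bound S(P^{po(n)}) ≥ S(P^n)/2 can be checked numerically on a toy class. In the sketch below (illustrative Python; the class of Bernoulli distributions on an 11-point grid and the value n = 8 are assumptions for the example), S(P^m) is computed exactly by summing over types, and S(P^{po(n)}) is formed as the Poisson mixture of Equation (2).

    import math

    ps = [i / 10 for i in range(11)]      # toy class P: Bernoulli(p) for p on a grid

    def shtarkov_fixed(m):
        """S(P^m) over binary length-m sequences, summing over the count of ones k."""
        return sum(math.comb(m, k) * max(p ** k * (1 - p) ** (m - k) for p in ps)
                   for k in range(m + 1))

    def shtarkov_poisson(n, cutoff=60):
        """S(P^po(n)) = sum_m po(n, m) S(P^m), truncated at length `cutoff`."""
        return sum(math.exp(-n) * n ** m / math.factorial(m) * shtarkov_fixed(m)
                   for m in range(cutoff))

    n = 8
    s_fixed, s_poisson = shtarkov_fixed(n), shtarkov_poisson(n)
    print("S(P^n)     =", s_fixed)
    print("S(P^po(n)) =", s_poisson)
    print("S(P^po(n)) >= S(P^n)/2 :", s_poisson >= s_fixed / 2)
    # The second inequality of Theorem 2 says this difference is at least -1 bit:
    print("log2 S(P^po(n)) - log2 S(P^n) =", math.log2(s_poisson) - math.log2(s_fixed))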

B. Proof of Theorem 4

For i.i.d. sampling, types are a sufficient statistic of the sequence; namely, all sequences with the same type have the same probability. Let τ(P^n) be the distribution induced by P^n over types, and let τ(P^n) = {τ(P^n) : P ∈ P} be the class of all distributions over types arising from distributions of the form P^n. By the second item in Lemma 1, for a distribution P = (p_1, p_2, ...) over X = {1, 2, ...},

    τ(P^{po(n)}) = (po(np_1), po(np_2), ...),

where each coordinate is an independent Poisson distribution. We first show that the redundancy of sequences is the same as the redundancy of types.

Lemma 10: \hat R(τ(P^n)) = \hat R(P^n) and \hat R(τ(P^{po(n)})) = \hat R(P^{po(n)}).

Proof: Any i.i.d. distribution assigns the same probability to all sequences with the same type. Therefore, such sequences have the same maximum-likelihood probability. We show that the Shtarkov sums of the two classes are the same and hence they have the same redundancy:

    S(P^n) = \sum_{x_1^n \in X^n} \hat P^n(x_1^n) = \sum_{τ} \sum_{x_1^n : τ(x_1^n) = τ} \hat P^n(x_1^n) = \sum_{τ} \hat P^n(τ) = S(τ(P^n)).

The proof also applies to Poisson sampling.

Under Poisson sampling, the type of a distribution is a tuple of independent random variables, namely the multiplicities. In general, suppose P is a collection of product (independent) distributions over A × B, i.e., each element of P is a distribution of the form P_1 × P_2, where P_1 and P_2 are distributions over A and B respectively. Let P_A and P_B be the classes of marginals over A and B respectively. It can be shown, e.g., [20], that the redundancy of P is at most the sum of the marginal redundancies.

Lemma 11 (Redundancy of products): For a collection P of product distributions over A × B,

    \hat R(P) \le \hat R(P_A) + \hat R(P_B).

Furthermore, if P = P_A × P_B then equality holds.

Proof: For any (a, b) ∈ A × B,

    \sup_{(P_1, P_2) \in P} P_1(a) P_2(b) \le \sup_{P_1 \in P_A} P_1(a) \cdot \sup_{P_2 \in P_B} P_2(b).

Now,

    S(P) = \sum_{(a,b) \in A \times B} \sup_{(P_1, P_2) \in P} P_1(a) P_2(b) \le \sum_{a \in A} \sup_{P_1 \in P_A} P_1(a) \cdot \sum_{b \in B} \sup_{P_2 \in P_B} P_2(b) = S(P_A) S(P_B),

where the inequality follows from the equation above, and the lemma follows by taking logarithms. When all combinations of marginals are possible, the inequality becomes an equality.

The lemma generalizes to more than two product classes. Similar to the characterization of τ(P^{po(n)}), it follows that

    τ(P_f^{po(n)}) = \Big\{ (po(λ_1), po(λ_2), ...) : λ_i \le n f(i), \ \sum_i λ_i = n \Big\}.

Notice that each λ_i ≤ λ_i^max := n f(i), and therefore the marginal of coordinate i is a Poisson random variable with parameter at most λ_i^max; hence the class of marginals of coordinate i is a subset of POI(λ_i^max). Therefore, Lemma 11 yields

    \hat R(τ(P_f^{po(n)})) \le \hat R(POI(λ_1^max)) + \hat R(POI(λ_2^max)) + ··· .

For the lower bound, note that any choice of λ_i ≤ λ_i^max for i ≥ l corresponds to a distribution in P_f^{po(n)}. In other words, all product distributions in POI(λ_l^max) × POI(λ_{l+1}^max) × ··· are valid projections of a distribution in P_f^{po(n)} along the coordinates i ≥ l. Therefore, for the elements along these coordinates the redundancy of types under Poisson sampling is determined exactly by Lemma 11:

    \hat R(τ(P_f^{po(n)})) \ge \hat R(POI(λ_l^max)) + \hat R(POI(λ_{l+1}^max)) + ··· .

Combining these with Lemma 10 proves the theorem.

C. Proof of Lemma 5

The Poisson distribution assigning the highest probability to i ∈ N is po(i). Therefore the ML distribution of i in POI(λ) is

    \arg\max_{P \in POI(λ)} P(i) = \begin{cases} po(i) & \text{if } i \le λ, \\ po(λ) & \text{otherwise.} \end{cases}

Using this, the Shtarkov sum of the class is

    S(POI(λ)) = \sum_{i=0}^{\lfloor λ \rfloor} \frac{e^{-i} i^i}{i!} + \sum_{i=\lfloor λ \rfloor + 1}^{\infty} \frac{e^{-λ} λ^i}{i!} \overset{(a)}{=} 1 + \sum_{i=0}^{\lfloor λ \rfloor} \Big( \frac{e^{-i} i^i}{i!} - \frac{e^{-λ} λ^i}{i!} \Big),

where (a) uses that Σ_µ po(λ, µ) = 1. Using the second expression shows the case λ ≤ 1. Using the first, along with the following Stirling approximation, proves the case λ > 1.

Lemma 12 (Stirling's approximation): For any n ≥ 1, there is a θ_n ∈ (1/(12n+1), 1/(12n)) such that

    n! = \Big(\frac{n}{e}\Big)^n \sqrt{2πn}\, e^{θ_n}.
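The Shtarkov sum derived in the proof of Lemma 5 is easy to evaluate numerically. The sketch below (illustrative Python; the particular values of λ are assumptions for the example) sums the per-count maxima, recovers the closed form 2 - e^{-λ} for λ ≤ 1, and shows the roughly (1/2) log λ growth of \hat R(POI(λ)) for large λ.

    import math

    def log_po(lam, i):
        """log of po(lam, i) = e^{-lam} lam^i / i!, computed stably."""
        if lam == 0:
            return 0.0 if i == 0 else -math.inf
        return i * math.log(lam) - lam - math.lgamma(i + 1)

    def shtarkov_poi(lam, tail_terms=2000):
        """S(POI(lam)): for count i the maximizing parameter is min(i, lam)."""
        b = math.floor(lam)
        head = sum(math.exp(log_po(i, i)) for i in range(b + 1))          # i <= lam
        tail = sum(math.exp(log_po(lam, i))
                   for i in range(b + 1, b + 1 + tail_terms))             # i > lam
        return head + tail

    # For lam <= 1 the sum collapses to the closed form 2 - exp(-lam).
    for lam in (0.2, 0.7, 1.0):
        print(f"lam={lam}: S={shtarkov_poi(lam):.6f}  2-e^-lam={2 - math.exp(-lam):.6f}")

    # For large lam the redundancy log2 S(POI(lam)) grows like (1/2) log2(lam) + O(1).
    for lam in (10, 100, 1000):
        print(f"lam={lam}: redundancy={math.log2(shtarkov_poi(lam)):.3f}  "
              f"0.5*log2(lam)={0.5 * math.log2(lam):.3f}")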

D. Power-law: Proof of Theorem 8

Proof: By Definition 6, for the power-law class Λ_{c·^{-α}}, λ_i^max = cn/i^α. Let b := (cn)^{1/α}; then λ_i^max ≥ 1 for i ≤ b and λ_i^max < 1 otherwise. Then,

    \hat R(Λ_{c·^{-α}}^n) \overset{(a)}{\le} \sum_{i \le b} \hat R(POI(λ_i^max)) + \sum_{i > b} \hat R(POI(λ_i^max)) + 1 \overset{(b)}{\le} \sum_{i=1}^{b} \log\Big(2 + \sqrt{2λ_i^max/π}\Big) + \sum_{i=b+1}^{\infty} \log\big(2 - e^{-λ_i^max}\big) + 1,

where (a) follows from Theorems 2 and 4, and (b) from Lemma 5. We consider the two summations separately. For the first term, we note that for λ ≥ 1, 2 + √(2λ/π) < 3√λ, and use it with the following simplification:

    B \log B - \log B! = \log \frac{B^B}{B!} \le B \log e,

which follows from B! > (B/e)^B. Therefore,

    \sum_{i=1}^{b} \log\Big(2 + \sqrt{2λ_i^max/π}\Big) < \sum_{i=1}^{b} \log\Big(3 \sqrt{\frac{cn}{i^α}}\Big) \overset{(a)}{\le} b \Big(\log 3 + \frac{α}{2}\log e\Big),

where (a) follows since b = (cn)^{1/α}. Taking the second term,

    \sum_{i=b+1}^{\infty} \log\big(2 - e^{-λ_i^max}\big) \overset{(a)}{\le} \sum_{i=b+1}^{\infty} λ_i^max \log e = cn \log e \sum_{i=b+1}^{\infty} \frac{1}{i^α} \overset{(b)}{\le} \frac{c^{1/α} n^{1/α}}{α-1} \log e,

where (a) uses 2 - e^{-x} ≤ e^{x}, and (b) follows by using s = b + 1 = (cn)^{1/α} + 1 and

    \sum_{i=s}^{n} \frac{1}{i^r} \le \int_{s-1}^{\infty} \frac{dx}{x^r} = \frac{(s-1)^{1-r}}{r-1}.

Combining these results proves the upper bound. For the lower bound we only use the terms λ_i^max for i ≥ l. Using the lower bound on S(POI(λ)) from Lemma 5 and again applying Stirling's approximation (the lower bound instead of the upper) proves the result. Since the ideas are the same, we omit the proof.

E. Exponential class: Proof of Theorem 9

Proof: By Definition 7, for Λ_{ce^{-α·}}^n, λ_i^max = cn e^{-αi} ≥ 1 if and only if i ≤ log(cn)/α. Let b := log(cn)/α. Then, similarly to the proof for the power-law class, we split the bound into two summations and bound them individually. Once again, for λ ≥ 1, 2 + √(2λ/π) < 3√λ. Therefore

    \sum_{i=1}^{b} \log\Big(2 + \sqrt{2λ_i^max/π}\Big) < b \log 3 + \frac{1}{2} \sum_{i=1}^{b} \log\big[cn e^{-αi}\big] < b \log 3 + \frac{α b^2}{4} \log e.

For the second term, since e^{αb} = cn,

    \sum_{i > b} \log\big(2 - e^{-λ_i^max}\big) \le \log e \sum_{i=b+1}^{\infty} λ_i^max = \log e \sum_{i=b+1}^{\infty} cn e^{-αi} \le \frac{\log e}{1 - e^{-α}}.

Substituting b = log(cn)/α proves the upper bound. Once again, the lower bound is very similar and is omitted.

REFERENCES

[1] S. Boucheron, A. Garivier, and E. Gassiat, "Coding on countably infinite alphabets," IEEE Transactions on Information Theory, vol. 55, no. 1.
[2] Y. M. Shtarkov, "Universal sequential coding of single messages," Problems of Information Transmission, vol. 23, no. 3, pp. 3-17.
[3] J. Kieffer, "A unified approach to weak universal source coding," IEEE Transactions on Information Theory, vol. 24, no. 6, Nov.
[4] L. Davisson, "Universal noiseless coding," IEEE Transactions on Information Theory, vol. 19, no. 6, Nov.
[5] L. D. Davisson, R. J. McEliece, M. B. Pursley, and M. S. Wallace, "Efficient universal noiseless source codes," IEEE Transactions on Information Theory, vol. 27, no. 3.
[6] F. M. J. Willems, Y. M. Shtarkov, and T. J. Tjalkens, "The context-tree weighting method: basic properties," IEEE Transactions on Information Theory, vol. 41, no. 3.
[7] T. Cover, "Universal portfolios," Mathematical Finance, vol. 1, no. 1, pp. 1-29, January.
[8] J. Rissanen, "Fisher information and stochastic complexity," IEEE Transactions on Information Theory, vol. 42, no. 1, January.
[9] W. Szpankowski, "On asymptotics of certain recurrences arising in universal coding," Problems of Information Transmission, vol. 34, no. 2.
[10] Q. Xie and A. R. Barron, "Asymptotic minimax regret for data compression, gambling and prediction," IEEE Transactions on Information Theory, vol. 46, no. 2.
[11] A. Orlitsky and N. Santhanam, "Speaking of infinity [i.i.d. strings]," IEEE Transactions on Information Theory, vol. 50, no. 10, Oct.
[12] W. Szpankowski and M. J. Weinberger, "Minimax redundancy for large alphabets," in ISIT, 2010.
[13] J. Åberg, Y. M. Shtarkov, and B. J. M. Smeets, "Multialphabet coding with separate alphabet description," in Proceedings of Compression and Complexity of Sequences.
[14] A. Orlitsky, N. Santhanam, and J. Zhang, "Universal compression of memoryless sources over unknown alphabets," IEEE Transactions on Information Theory, vol. 50, no. 7, July.
[15] A. Orlitsky, N. Santhanam, and J. Zhang, "Always Good Turing: Asymptotically optimal probability estimation," Science, vol. 302, no. 5644, October; see also Proceedings of the 44th Annual Symposium on Foundations of Computer Science, October.
[16] D. Foster, R. Stine, and A. Wyner, "Universal codes for finite sequences of integers drawn from a monotone distribution," IEEE Transactions on Information Theory, vol. 48, no. 6, June.
[17] G. I. Shamir, "Universal source coding for monotonic and fast decaying monotonic distributions," IEEE Transactions on Information Theory, vol. 59, no. 11.
[18] M. Mitzenmacher and E. Upfal, Probability and Computing: Randomized Algorithms and Probabilistic Analysis. Cambridge Univ. Press.
[19] J. Acharya, H. Das, and A. Orlitsky, "Tight bounds on profile redundancy and distinguishability," in NIPS, 2012.

[20] J. Acharya, H. Das, A. Jafarpour, A. Orlitsky, and A. Suresh, "Tight bounds for universal compression of large alphabets," in Information Theory Proceedings (ISIT), 2013 IEEE International Symposium on, 2013.
[21] X. Yang and A. R. Barron, "Large alphabet coding and prediction through poissonization and tilting," in Workshop on Information Theoretic Methods in Science and Engineering.
[22] H. Das, "Competitive tests and estimators for properties of distributions," Ph.D. dissertation, UCSD.
[23] T. Cover and J. Thomas, Elements of Information Theory, 2nd Ed. Wiley Interscience.
[24] D. Bontemps, "Universal coding on infinite alphabets: Exponentially decreasing envelopes," IEEE Transactions on Information Theory, vol. 57, no. 3.
[25] D. Bontemps, S. Boucheron, and E. Gassiat, "About adaptive coding on countable alphabets," IEEE Transactions on Information Theory, vol. 60, no. 2, 2012.
