Randomness and Computation

Randomness and Computation (or, Randomized Algorithms)
Mary Cryan
School of Informatics, University of Edinburgh
RC (2018/19) Lecture 10, slide 1

Balls in Bins
m balls, n bins, and balls thrown uniformly at random into the bins (usually one at a time). Magic bins with no upper limit on capacity.
A common model of random allocations and their effect on overall load and load balance, and on the typical distribution in the system.
"Classic" question - what does the distribution look like for m = n? Max load? ("with high probability" results are what we want.)
We have already shown that when m = n (same number of balls as bins) and n is sufficiently large, the maximum load is at most 3 ln(n)/ln ln(n) with probability at least 1 - 1/n. We will show an Ω(ln(n)/ln ln(n)) lower bound today.
RC (2018/19) Lecture 10, slide 2
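The max-load behaviour is easy to explore empirically. Below is a minimal Python simulation sketch (not part of the original slides; the function name max_load, the value of n and the number of trials are illustrative assumptions), comparing observed maximum loads with the 3 ln(n)/ln ln(n) bound quoted above.

import math
import random
from collections import Counter

def max_load(n_balls, n_bins):
    # Throw n_balls into n_bins independently and uniformly at random;
    # return the largest bin load.
    loads = Counter(random.randrange(n_bins) for _ in range(n_balls))
    return max(loads.values())

if __name__ == "__main__":
    n = 100_000          # the m = n case from the slide
    trials = 20
    observed = [max_load(n, n) for _ in range(trials)]
    bound = 3 * math.log(n) / math.log(math.log(n))
    print("observed max loads:", sorted(observed))
    print("3 ln(n)/ln ln(n) =", round(bound, 2))
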

Some preliminary observations, definitions
The probability of a specific bin (bin i, say) being empty is (1 - 1/n)^m ≈ e^{-m/n}.
Expected number of empty bins: ≈ n e^{-m/n}.
Probability p_r of a specific bin having exactly r balls:
    p_r = (m choose r) (1/n)^r (1 - 1/n)^{m-r}.
Note p_r ≈ e^{-m/n} (m/n)^r / r!.
Definition 5.1: A discrete Poisson random variable X with parameter µ is given by the following probability distribution on j = 0, 1, 2, ...:
    Pr[X = j] = e^{-µ} µ^j / j!.
RC (2018/19) Lecture 10, slide 3
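As a quick numerical sanity check of these approximations (not from the slides; the parameters m = n = 10,000 and the range of r are arbitrary choices), one can compare the exact binomial expression for p_r with the Poisson-style approximation e^{-m/n}(m/n)^r/r!.

import math

def p_r_exact(m, n, r):
    # Exact probability that a fixed bin receives exactly r of the m balls.
    return math.comb(m, r) * (1 / n) ** r * (1 - 1 / n) ** (m - r)

def p_r_approx(m, n, r):
    # Poisson-style approximation with mean m/n.
    mu = m / n
    return math.exp(-mu) * mu ** r / math.factorial(r)

if __name__ == "__main__":
    m = n = 10_000
    print("P(bin empty): exact", (1 - 1 / n) ** m, " approx", math.exp(-m / n))
    for r in range(5):
        print(f"r={r}: exact {p_r_exact(m, n, r):.6f}  approx {p_r_approx(m, n, r):.6f}")
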

Poisson as the limit of the Binomial Distribution
Theorem 5.5: If X_n is a binomial random variable with parameters n and p = p(n), such that lim_{n→∞} n p(n) = λ is a constant (independent of n), then for any fixed k ∈ N_0,
    lim_{n→∞} Pr[X_n = k] = e^{-λ} λ^k / k!.
RC (2018/19) Lecture 10, slide 4
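Theorem 5.5 can be illustrated numerically (a sketch, not part of the slides; the choices λ = 2, k = 3 and the grid of n values are arbitrary): as n grows with p = λ/n, the binomial probability approaches the Poisson limit.

import math

def binom_pmf(n, p, k):
    return math.comb(n, k) * p ** k * (1 - p) ** (n - k)

def poisson_pmf(lam, k):
    return math.exp(-lam) * lam ** k / math.factorial(k)

if __name__ == "__main__":
    lam, k = 2.0, 3
    limit = poisson_pmf(lam, k)
    for n in (10, 100, 1_000, 10_000):
        b = binom_pmf(n, lam / n, k)   # X_n ~ Bin(n, lambda/n)
        print(f"n={n:6d}: Pr[X_n={k}] = {b:.6f}   Poisson limit = {limit:.6f}")
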

Poisson modelling of balls-in-bins
Our balls-in-bins model has n bins, m (for variable m) balls, and the balls are thrown into the bins independently and uniformly at random.
Each bin load X_i^{(m)} behaves like a binomial r.v. B(m, 1/n). Write (X_1^{(m)}, ..., X_n^{(m)}) for the joint distribution (note the various X_i^{(m)} are not independent).
For the Poisson approximation we take λ = m/n, and write Y_i^{(m)} to denote a Poisson r.v. with parameter λ = m/n. We write (Y_1^{(m)}, ..., Y_n^{(m)}) to denote the joint distribution of n such Poisson r.v.s, which are all independent.
RC (2018/19) Lecture 10, slide 5
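One concrete difference between the two joint distributions: the exact loads always sum to m, while the independent Poisson loads only sum to m in expectation. A small sampling sketch (not from the slides; the parameter choices and the inversion-based Poisson sampler are illustrative):

import math
import random
from collections import Counter

def exact_loads(m, n):
    # Joint loads (X_1, ..., X_n): throw m balls u.a.r. into n bins.
    counts = Counter(random.randrange(n) for _ in range(m))
    return [counts[i] for i in range(n)]

def poisson_sample(lam):
    # Sample a Poisson(lam) value by inverting the CDF.
    u, k, p = random.random(), 0, math.exp(-lam)
    cdf = p
    while u > cdf:
        k += 1
        p *= lam / k
        cdf += p
    return k

def poisson_loads(m, n):
    # Independent Poisson(m/n) loads (Y_1, ..., Y_n).
    return [poisson_sample(m / n) for _ in range(n)]

if __name__ == "__main__":
    m = n = 1_000
    print("sum of exact loads:  ", sum(exact_loads(m, n)))    # always exactly m
    print("sum of Poisson loads:", sum(poisson_loads(m, n)))  # close to m, but random
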

Some preliminaries
Theorem 5.7: Let f(x_1, ..., x_n) be a non-negative function. Then
    E[f(X_1^{(m)}, ..., X_n^{(m)})] ≤ e√m · E[f(Y_1^{(m)}, ..., Y_n^{(m)})].
Corollary 5.9: Any event that takes place with probability p in the Poisson case takes place with probability at most p e√m in the exact balls-in-bins case.
RC (2018/19) Lecture 10, slide 6
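A rough Monte Carlo illustration of Corollary 5.9 (a sketch under assumed parameters, not from the slides): estimate the probability of the event "some bin has load ≥ 6" in both models, and observe that the exact-case estimate stays below e√m times the Poisson-case estimate.

import math
import random
from collections import Counter

def max_load_exact(m, n):
    return max(Counter(random.randrange(n) for _ in range(m)).values())

def max_load_poisson(m, n):
    lam = m / n
    def sample():
        u, k, p = random.random(), 0, math.exp(-lam)
        cdf = p
        while u > cdf:
            k += 1
            p *= lam / k
            cdf += p
        return k
    return max(sample() for _ in range(n))

if __name__ == "__main__":
    m = n = 500
    threshold, trials = 6, 2_000
    p_exact = sum(max_load_exact(m, n) >= threshold for _ in range(trials)) / trials
    p_poiss = sum(max_load_poisson(m, n) >= threshold for _ in range(trials)) / trials
    print(f"exact estimate {p_exact:.3f}  Poisson estimate {p_poiss:.3f}")
    print(f"Corollary 5.9 bound e*sqrt(m)*p_Poisson = {math.e * math.sqrt(m) * p_poiss:.3f}")
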

Lower bound for n balls in n bins
Lemma: Let n balls be thrown independently and uniformly at random into n bins. Then (for n sufficiently large) the maximum load is at least ln(n)/ln ln(n) with probability at least 1 - 1/n.
Proof. For the Poisson variables we have λ = n/n = 1. Let M = ln(n)/ln ln(n). For any bin (bin 1, say),
    Pr_Poiss[bin 1 has load ≥ M] ≥ Pr_Poiss[bin 1 has load = M] = 1^M e^{-1}/M! = 1/(e M!).
In our Poisson model the bins are independent, so the probability that no bin has load ≥ M (our "bad" event) is at most
    (1 - 1/(e M!))^n ≤ e^{-n/(e M!)}.
RC (2018/19) Lecture 10, slide 7
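The quantities in this step are easy to evaluate directly (a sketch, not from the slides; rounding M up to an integer and the particular values of n are illustrative choices): compute M, the per-bin probability 1/(e·M!), and the resulting bound e^{-n/(e·M!)} on the bad event in the Poisson model.

import math

def poisson_bad_event_bound(n):
    # M is rounded up here so that it is an integer.
    M = math.ceil(math.log(n) / math.log(math.log(n)))
    per_bin = 1 / (math.e * math.factorial(M))     # Pr_Poiss[a fixed bin has load >= M] is at least this
    return M, per_bin, math.exp(-n * per_bin)      # bound on Pr_Poiss[no bin has load >= M]

if __name__ == "__main__":
    for n in (10**4, 10**5, 10**6):
        M, per_bin, bound = poisson_bad_event_bound(n)
        print(f"n={n:>8}: M={M}  1/(e*M!)={per_bin:.3e}  e^(-n/(e*M!))={bound:.3e}")
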

Lower bound for n balls in n bins
Proof of the Lemma, cont'd.
We now relate Pr_Poiss[no bin has load ≥ M] to the probability of the same event in the balls-in-bins model.
Corollary 5.9 tells us that when we consider the exact balls-in-bins distribution (X_1^{(n)}, ..., X_n^{(n)}), the probability of the event "no bin has ≥ M balls" is at most e√n · e^{-n/(e M!)}.
We want this to be less than 1/n, i.e. we want e^{-n/(e M!)} ≤ 1/(e n^{3/2}). Taking ln(·) of both sides, this happens if n/(e M!) ≥ 1 + (3/2) ln(n).
Now M! ≤ e M (M/e)^M (Lemma 5.8), so e M! ≤ e^2 M^{M+1}/e^M, hence it suffices that
    1 + (3/2) ln(n) ≤ n e^M / (e^2 M^{M+1}).
RC (2018/19) Lecture 10, slide 8
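Continuing the numeric sketch (again not part of the slides; the values of n are arbitrary), one can fold in the e√n factor from Corollary 5.9 and check how the resulting bound on the bad event compares with the target 1/n:

import math

def exact_bad_event_bound(n):
    # Bound e*sqrt(n)*e^{-n/(e*M!)} on Pr[no bin has load >= M] in the exact model.
    M = math.ceil(math.log(n) / math.log(math.log(n)))
    return math.e * math.sqrt(n) * math.exp(-n / (math.e * math.factorial(M)))

if __name__ == "__main__":
    for n in (10**4, 10**5, 10**6):
        print(f"n={n:>8}: bound {exact_bad_event_bound(n):.3e}   target 1/n = {1/n:.3e}")
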

Lower bound for n balls in n bins
Proof of the Lemma, cont'd.
Therefore it will suffice to show that 1 + (3/2) ln(n) ≤ n e^M / (e^2 M^{M+1}), or (for sufficiently large n), that
    2 ln(n) ≤ n e^M / (e^2 M^{M+1}).
Taking the ln of both sides, this happens (using M = ln(n)/ln ln(n)) when
    ln(2) + ln ln(n) ≤ ln(n) + M - 2 - (M+1) ln(M),
i.e., exactly when
    ln(2) + ln ln(n) ≤ ln(n)/ln ln(n) - 2 + ln(n) ln ln ln(n)/ln ln(n) - ln ln(n) + ln ln ln(n),
i.e., exactly when
    2 + ln(2) + 2 ln ln(n) ≤ ln(n)/ln ln(n) + ln(n) ln ln ln(n)/ln ln(n) + ln ln ln(n).
RC (2018/19) Lecture 10, slide 9

Lower bound for n balls in n bins
Proof of the Lemma, cont'd.
To show that
    2 + ln(2) + 2 ln ln(n) ≤ ln(n)/ln ln(n) + ln(n) ln ln ln(n)/ln ln(n) + ln ln ln(n),
we will multiply across by ln ln(n), to verify the equivalent inequality
    (2 + ln(2)) ln ln(n) + 2 (ln ln(n))^2 ≤ ln(n) + ln(n) ln ln ln(n) + ln ln(n) ln ln ln(n).
At this point we notice that we have two terms on the right (ln(n) and ln(n) ln ln ln(n)) which are exponentially larger than the two terms on the lhs - both lhs terms only grow with ln ln(n). We do not need to check the numbers - as n grows the rhs will certainly be greater than the lhs. Hence our claim holds.
RC (2018/19) Lecture 10, slide 10
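Although the slides note that no numeric check is needed, tabulating both sides of the final inequality (as reconstructed above) shows how quickly the right-hand side overtakes the left. This sketch is not part of the slides, and the grid of values of ln(n) is arbitrary.

import math

def lhs_rhs(n):
    # Both sides of: 2 + ln 2 + 2 ln ln n  <=  ln n/ln ln n + ln n * ln ln ln n/ln ln n + ln ln ln n
    L2 = math.log(math.log(n))
    L3 = math.log(L2)
    lhs = 2 + math.log(2) + 2 * L2
    rhs = math.log(n) / L2 + math.log(n) * L3 / L2 + L3
    return lhs, rhs

if __name__ == "__main__":
    for k in (10, 20, 50, 100, 200):
        n = math.exp(k)            # so ln(n) = k
        lhs, rhs = lhs_rhs(n)
        print(f"ln(n)={k:4d}: lhs={lhs:8.2f}  rhs={rhs:9.2f}")
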

References and Exercises
Sections 5.1, 5.2 of "Probability and Computing". Sections 5.3 and 5.4 have all the precise details of our Ω(ln(n)/ln ln(n)) result.
Section 5.5 on Hashing is worth a read and has none of the Poisson stuff (I'm skipping it because of time limitations).
Exercises: I will release a tutorial sheet.
RC (2018/19) Lecture 10, slide 11
