6.4. RANDOM VARIABLES 233


Random Variables

What are Random Variables?

A random variable for an experiment with a sample space S is a function that assigns a number to each element of S. Typically, instead of using f to stand for such a function, we use X (at first, a random variable was conceived of as a variable related to an experiment, explaining the use of X, but it is very helpful in understanding the mathematics to realize it actually is a function on the sample space). For example, if we consider the process of flipping a coin n times, we have the set of all sequences of n H's and T's as our sample space. The "number of heads" random variable takes a sequence and tells us how many heads are in that sequence. Somebody might say "Let X be the number of heads in 5 flips of a coin." In that case X(HTHHT) = 3 while X(THTHT) = 2. It may be rather jarring to see X used to stand for a function, but it is the notation most people use.

For a sequence of hashes of n keys into a table with k locations, we might have a random variable X_i which is the number of keys that are hashed to location i of the table, or a random variable X that counts the number of collisions (hashes to a location that already has at least one key). For an n-question test on which each answer is either right or wrong (a short-answer, True-False, or multiple-choice test, for example) we could have a random variable that gives the number of right answers in a particular sequence of answers to the test. For a meal at a restaurant we might have a random variable that gives the price of any particular sequence of choices of menu items.

Exercise 6.4-1 Give several random variables that might be of interest to a doctor whose sample space is her patients.

Exercise 6.4-2 If you flip a coin six times, how many heads do you expect?

A doctor might be interested in patients' ages, weights, temperatures, blood pressures, cholesterol levels, etc. For Exercise 6.4-2, in six flips of a coin, it is natural to expect three heads. We might argue that if we average the number of heads over all possible outcomes, the average should be half the number of flips.
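Since a random variable is literally a function on the sample space, it is easy to model one directly in code. The following sketch (names are ours, not the text's) builds the sample space of five coin flips and defines the number-of-heads random variable as an ordinary function on it:

```python
from itertools import product

# Sample space: all sequences of 5 flips, e.g. "HTHHT".
S = ["".join(seq) for seq in product("HT", repeat=5)]

def X(outcome):
    """The 'number of heads' random variable: a function on S."""
    return outcome.count("H")

print(len(S))      # 32 outcomes
print(X("HTHHT"))  # 3
print(X("THTHT"))  # 2
```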
Since the probability of any given sequence equals that of any other, it is reasonable to say that this average is what we expect. Thus we would say we expect the number of heads to be half the number of flips. We will explore this more formally later.

Binomial Probabilities

When we study an independent trials process with two outcomes at each stage, it is traditional to refer to those outcomes as successes and failures. When we are flipping a coin, we are often interested in the number of heads. When we are analyzing student performance on a test, we are interested in the number of correct answers. When we are analyzing the outcomes in drug trials, we are interested in the number of trials where the drug was successful in treating the disease. This suggests a natural random variable associated with an independent trials process with two outcomes at each stage, namely the number of successes in n trials. We will analyze in general

the probability of exactly k successes in n independent trials with probability p of success (and thus probability 1 - p of failure) on each trial. It is standard to call such an independent trials process a Bernoulli trials process.

Exercise 6.4-3 Suppose we have 5 Bernoulli trials with probability p of success on each trial. What is the probability of success on the first three trials and failure on the last two? Failure on the first two trials and success on the last three? Success on trials 1, 3, and 5, and failure on the other two? Success on any particular three trials, and failure on the other two?

Since the probability of a sequence of outcomes is the product of the probabilities of the individual outcomes, the probability of any sequence of 3 successes and 2 failures is $p^3(1-p)^2$. More generally, in n Bernoulli trials, the probability of a given sequence of k successes and n - k failures is $p^k(1-p)^{n-k}$. However this is not the probability of having k successes, because many different sequences could have k successes. How many sequences of n successes and failures have exactly k successes? The number of ways to choose the k places out of n where the successes occur is $\binom{n}{k}$, so the number of sequences with k successes is $\binom{n}{k}$. This paragraph and the last together give us Theorem 6.7.

Theorem 6.7 The probability of having exactly k successes in a sequence of n independent trials with two outcomes and probability p of success on each trial is
$$P(\text{exactly } k \text{ successes}) = \binom{n}{k} p^k (1-p)^{n-k}.$$

Proof: The proof follows from the two paragraphs preceding the theorem.

Because of the connection between these probabilities and the binomial coefficients, the probabilities of Theorem 6.7 are called binomial probabilities, or the binomial probability distribution.

Exercise 6.4-4 A student takes a ten question objective test. Suppose that a student who knows 80% of the course material has probability .8 of success on any question, independently of how the student did on any other problem. What is the probability that this student earns a grade of 80 or better?
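Theorem 6.7 translates directly into a one-line computation. A minimal sketch (the function name is ours):

```python
from math import comb

def binomial_prob(n, k, p):
    """P(exactly k successes) = C(n,k) * p^k * (1-p)^(n-k)  (Theorem 6.7)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Sanity check: the probabilities over all k from 0 to n must sum to 1.
total = sum(binomial_prob(5, k, 0.3) for k in range(6))
print(round(total, 10))  # 1.0
```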
Exercise 6.4-5 Recall the primality testing algorithm from Section 2.4. Here we said that we could, by choosing a random number less than or equal to n, perform a test on n that, if n was not prime, would certify this fact with probability 1/2. Suppose we perform 20 of these tests. It is reasonable to assume that each of these tests is independent of the rest of them. What is the probability that a non-prime number is certified to be non-prime?

Since a grade of 80 or better on a ten question test corresponds to 8, 9, or 10 successes in ten trials, in Exercise 6.4-4 we have
$$P(80 \text{ or better}) = \binom{10}{8}(.8)^8(.2)^2 + \binom{10}{9}(.8)^9(.2)^1 + \binom{10}{10}(.8)^{10}.$$

Some work with a calculator gives us that this sum is approximately .678.

In Exercise 6.4-5, we will first compute the probability that a non-prime number is not certified to be non-prime. If we think of success as when the number is certified non-prime and failure when it isn't, then we see that the only way to fail to certify a number is to have 20 failures. Using our formula we see that the probability that a non-prime number is not certified non-prime is just $\binom{20}{20}(.5)^{20} = 1/2^{20}$. Thus the chance of this happening is less than one in a million, and the chance of certifying the non-prime as non-prime is 1 minus this. Therefore the probability that a non-prime number will be certified non-prime is $1 - 1/2^{20}$, which is more than .999999, so a non-prime number is almost sure to be certified non-prime.

A Taste of Generating Functions

We note a nice connection between the probability of having exactly k successes and the binomial theorem. Consider, as an example, the polynomial $(H+T)^3$. Using the binomial theorem, we get that this is
$$(H+T)^3 = \binom{3}{3}H^3 + \binom{3}{2}H^2T + \binom{3}{1}HT^2 + \binom{3}{0}T^3.$$
We can interpret this as telling us that if we flip a coin three times, with outcomes heads or tails each time, then there is $\binom{3}{3} = 1$ way of getting three heads, there are $\binom{3}{2}$ ways of getting two heads and one tail, $\binom{3}{1}$ ways of getting one head and two tails, and $\binom{3}{0} = 1$ way of getting three tails. Similarly, if we replace H and T by px and (1-p)y we would get the following:
$$(px + (1-p)y)^3 = p^3x^3 + 3p^2(1-p)x^2y + 3p(1-p)^2xy^2 + (1-p)^3y^3.$$
Generalizing this to n repeated trials where in each trial the probability of success is p, we see that by taking $(px + (1-p)y)^n$ we get
$$(px + (1-p)y)^n = \sum_{k=0}^{n} \binom{n}{k} p^k(1-p)^{n-k} x^k y^{n-k}.$$
Taking the coefficient of $x^k y^{n-k}$ from this sum, we get exactly the result of Theorem 6.7. This connection is a simple case of a very powerful tool known as generating functions. We say that the polynomial $(px + (1-p)y)^n$ generates the binomial probabilities. In fact, we don't even need the y, because
$$(px + 1 - p)^n = \sum_{k=0}^{n} \binom{n}{k} p^k (1-p)^{n-k} x^k.$$
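We can check numerically that expanding $(px + 1 - p)^n$ really does reproduce the binomial probabilities. The sketch below (helper name is ours) computes the coefficients by repeated polynomial multiplication and compares them with Theorem 6.7:

```python
from math import comb

def pgf_coefficients(n, p):
    """Coefficients of (p*x + (1 - p))**n, lowest degree first."""
    poly = [1.0]                       # the constant polynomial 1
    for _ in range(n):                 # multiply by (1-p) + p*x, n times
        nxt = [0.0] * (len(poly) + 1)
        for k, c in enumerate(poly):
            nxt[k] += c * (1 - p)      # constant-term contribution
            nxt[k + 1] += c * p        # x-term contribution
        poly = nxt
    return poly

n, p = 10, 0.8
coeffs = pgf_coefficients(n, p)
for k in range(n + 1):
    # Coefficient of x^k equals C(n,k) p^k (1-p)^(n-k).
    assert abs(coeffs[k] - comb(n, k) * p**k * (1 - p)**(n - k)) < 1e-12
```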
In general, the generating function for the sequence $a_0, a_1, a_2, \ldots, a_n$ is $\sum_{i=0}^{n} a_i x^i$, and the generating function for an infinite sequence $a_0, a_1, a_2, \ldots, a_n, \ldots$ is the infinite series $\sum_{i=0}^{\infty} a_i x^i$.

Expected Value

In Exercise 6.4-2 and Exercise 6.4-4 we asked about the value you would expect a random variable (in these cases, a test score and the number of heads in six flips of a coin) to have. We haven't yet defined what we mean by the value we expect, and yet it seems to make sense in

the places we asked about it. If we say we expect 1 head if we flip a coin twice, we can explain our reasoning by taking an average. There are four outcomes, one with no heads, two with one head, and one with two heads, giving us an average of
$$\frac{0 + 1 + 1 + 2}{4} = 1.$$
Notice that using averages compels us to have some expected values that are impossible to achieve. For example in three flips of a coin the eight possibilities for the number of heads are 0, 1, 1, 1, 2, 2, 2, 3, giving us for our average
$$\frac{0 + 1 + 1 + 1 + 2 + 2 + 2 + 3}{8} = 1.5.$$

Exercise 6.4-6 An interpretation in games and gambling makes it clear that it makes sense to expect a random variable to have a value that is not one of the possible outcomes. Suppose that I proposed the following game. You pay me some money, and then you flip three coins. I will pay you one dollar for every head that comes up. Would you play this game if you had to pay me $2.00? How about if you had to pay me $1? How much do you think it should cost, in order for this game to be fair?

Since you expect to get 1.5 heads, you expect to make $1.50. Therefore, it is reasonable to play this game as long as the cost is at most $1.50.

Certainly averaging our variable over all elements of our sample space by adding up one result for each element of the sample space as we have done above is impractical even when we are talking about something as simple as ten flips of a coin. However we can ask how many times each possible number of heads arises, and then multiply the number of heads by the number of times it arises to get an average number of heads of
$$\frac{0\binom{10}{0} + 1\binom{10}{1} + 2\binom{10}{2} + \cdots + 9\binom{10}{9} + 10\binom{10}{10}}{2^{10}}. \quad (6.22)$$
Thus we wonder whether we have seen a formula for $\sum_{i=1}^{n} i\binom{n}{i}$. Perhaps we have, but in any case the binomial theorem and a bit of calculus or a proof by induction show that
$$\sum_{i=1}^{n} i\binom{n}{i} = 2^{n-1}n,$$
giving us $2^9 \cdot 10/2^{10} = 5$ for the fraction in Equation 6.22. If you are asking "Does it have to be that hard?" then good for you. Once we know a bit about the theory of expected values of random variables, computations like this will be replaced by far simpler ones.
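The identity $\sum_{i=1}^{n} i\binom{n}{i} = 2^{n-1}n$, and the resulting average of 5 heads in ten flips, can be checked in a few lines of Python (this is just a check of the arithmetic above):

```python
from math import comb

n = 10
total_heads = sum(i * comb(n, i) for i in range(n + 1))
assert total_heads == n * 2**(n - 1)   # the identity: 10 * 2^9 = 5120

average = total_heads / 2**n           # the fraction in Equation 6.22
print(average)  # 5.0
```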
Besides the nasty computations that a simple question led us to, the average value of a random variable on a sample space need not have anything to do with the result we expect. For instance if we replace heads and tails with right and wrong, we get the sample space of possible results that a student will get when taking a ten question test with probability .9 of getting the right answer on any one question. Thus if we compute the average number of right answers in all the possible patterns of test results we get an average of 5 right answers. This is not the number of right answers we expect, because averaging has nothing to do with the underlying process that gave us our probability! If we analyze the ten coin flips a bit more carefully, we can resolve this disconnection. We can rewrite Equation 6.22 as

$$0\frac{\binom{10}{0}}{2^{10}} + 1\frac{\binom{10}{1}}{2^{10}} + 2\frac{\binom{10}{2}}{2^{10}} + \cdots + 9\frac{\binom{10}{9}}{2^{10}} + 10\frac{\binom{10}{10}}{2^{10}}. \quad (6.23)$$

In Equation 6.23 we see we can compute the average number of heads by multiplying each value of our "number of heads" random variable by the probability that we have that value for our random variable, and then adding the results. This gives us a weighted average of the values of our random variable, each value weighted by its probability. Because the idea of weighting a random variable by its probability comes up so much in Probability Theory, there is a special notation that has developed to use this weight in equations. We use $P(X = x_i)$ to stand for the probability that the random variable X equals the value $x_i$. We call the function that assigns $P(X = x_i)$ to the event that $X = x_i$ the distribution function of the random variable X. Thus, for example, the binomial probability distribution is the distribution function for the "number of successes" random variable in Bernoulli trials.

We define the expected value or expectation of a random variable X whose values are the set $\{x_1, x_2, \ldots, x_k\}$ to be
$$E(X) = \sum_{i=1}^{k} x_i P(X = x_i).$$
Then for someone taking a ten-question test with probability .9 of getting the correct answer on each question, the expected number of right answers is
$$\sum_{i=0}^{10} i\binom{10}{i}(.9)^i(.1)^{10-i}.$$
In the end-of-section exercises we will show a technique (that could be considered an application of generating functions) that allows us to compute this sum directly by using the binomial theorem and calculus. We now proceed to develop a less direct but easier way to compute this and many other expected values.

Exercise 6.4-7 Show that if a random variable X is defined on a sample space S (you may assume X has values $x_1, x_2, \ldots, x_k$ as above) then the expected value of X is given by
$$E(X) = \sum_{s \in S} X(s)P(s).$$
(In words, we take each member of the sample space, compute its probability, multiply the probability by the value of the random variable, and add the results.)

In Exercise 6.4-7 we asked for a proof of a fundamental lemma.

Lemma 6.8 If a random variable X is defined on a (finite) sample space S, then its expected value is given by
$$E(X) = \sum_{s \in S} X(s)P(s).$$

Proof: Assume that the values of the random variable are $x_1, x_2, \ldots, x_k$. Let $F_i$ stand for the event that the value of X is $x_i$, so that $P(F_i) = P(X = x_i)$. Then, in the sum on the right-hand side of the equation in the statement of the lemma, we can take the items in the sample space, group them together into the events $F_i$, and rework the sum into the definition of expectation, as follows:
$$\sum_{s \in S} X(s)P(s) = \sum_{i=1}^{k} \sum_{s: s \in F_i} X(s)P(s) = \sum_{i=1}^{k} \sum_{s: s \in F_i} x_i P(s) = \sum_{i=1}^{k} x_i \sum_{s: s \in F_i} P(s) = \sum_{i=1}^{k} x_i P(F_i) = \sum_{i=1}^{k} x_i P(X = x_i) = E(X).$$

The proof of the lemma need not be so formal and symbolic as what we wrote; in English, it simply says that when we compute the sum in the Lemma, we can group together all elements of the sample space that have X-value $x_i$ and add up their probabilities; this gives us $x_i P(X = x_i)$, which leads us to the definition of the expected value of X.

Expected Values of Sums and Numerical Multiples

Another important point about expected value follows naturally from what we think about when we use the word expect in English. If a paper grader expects to earn ten dollars grading papers today and expects to earn twenty dollars grading papers tomorrow, then she expects to earn thirty dollars grading papers in these two days. We could use $X_1$ to stand for the amount of money she makes grading papers today and $X_2$ to stand for the amount of money she makes grading papers tomorrow, so we are saying
$$E(X_1 + X_2) = E(X_1) + E(X_2).$$
This formula holds for any sum of a pair of random variables, and more generally for any sum of random variables on the same sample space.

Theorem 6.9 Suppose X and Y are random variables on the (finite) sample space S. Then
$$E(X + Y) = E(X) + E(Y).$$

Proof: From Lemma 6.8 we may write
$$E(X + Y) = \sum_{s \in S} (X(s) + Y(s))P(s) = \sum_{s \in S} X(s)P(s) + \sum_{s \in S} Y(s)P(s) = E(X) + E(Y).$$
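Lemma 6.8 and Theorem 6.9 can both be checked on a small concrete sample space. The sketch below (variable names are ours) uses the 36 equally likely outcomes of rolling two dice: it computes E(X) both as a sum over the sample space and as a weighted sum over values, and then confirms additivity.

```python
from itertools import product
from collections import Counter

# Sample space: ordered pairs of dice, each outcome with probability 1/36.
S = list(product(range(1, 7), repeat=2))
P = 1 / len(S)

X = lambda s: s[0]   # value of the first die
Y = lambda s: s[1]   # value of the second die

# E(X) as a sum over the sample space (Lemma 6.8)...
E_X_sample = sum(X(s) * P for s in S)

# ...and as a weighted sum over values (the definition of expectation).
counts = Counter(X(s) for s in S)
E_X_values = sum(x * c * P for x, c in counts.items())
assert abs(E_X_sample - E_X_values) < 1e-12

# Additivity of expectation (Theorem 6.9): E(X + Y) = E(X) + E(Y).
E_sum = sum((X(s) + Y(s)) * P for s in S)
assert abs(E_sum - 2 * E_X_sample) < 1e-12
print(round(E_X_sample, 10), round(E_sum, 10))  # 3.5 7.0
```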

If we double the credit we give for each question on a test, we would expect students' scores to double. Thus our next theorem should be no surprise. In it we use the notation cX for the random variable we get from X by multiplying all its values by the number c.

Theorem 6.10 Suppose X is a random variable on a sample space S. Then for any number c,
$$E(cX) = cE(X).$$

Proof: Left as a problem.

Theorems 6.9 and 6.10 are very useful in proving facts about random variables. Taken together, they are typically called linearity of expectation. (The idea that the expectation of a sum is the same as the sum of expectations is called the additivity of expectation.) The idea of linearity will often allow us to work with expectations much more easily than if we had to work with the underlying probabilities.

For example, on one flip of a coin, our expected number of heads is .5. Suppose we flip a coin n times and let $X_i$ be the number of heads we see on flip i, so that $X_i$ is either 0 or 1. (For example in five flips of a coin, $X_2(HTHHT) = 0$ while $X_3(HTHHT) = 1$.) Then X, the total number of heads in n flips, is given by
$$X = X_1 + X_2 + \cdots + X_n, \quad (6.24)$$
the sum of the number of heads on the first flip, the number on the second, and so on through the number of heads on the last flip. But the expected value of each $X_i$ is .5. We can take the expectation of both sides of Equation 6.24 and apply Theorem 6.9 repeatedly (or use induction) to get that
$$E(X) = E(X_1 + X_2 + \cdots + X_n) = E(X_1) + E(X_2) + \cdots + E(X_n) = .5n.$$
Thus in n flips of a coin, the expected number of heads is .5n. Compare the ease of this method with the effort needed earlier to deal with the expected number of heads in ten flips! Dealing with probability .9 or, in general, with probability p poses no problem.

Exercise 6.4-8 Use the additivity of expectation to determine the expected number of correct answers a student will get on an n-question "fill in the blanks" test if he or she knows 90% of the material in the course and the questions on the test are an accurate and uniform sampling of the material in the course.
In Exercise 6.4-8, since the questions sample the material in the course accurately, the most natural probability for us to assign to the event that the student gets a correct answer on a given question is .9. We can let $X_i$ be the number of correct answers on question i (that is, either 1 or 0 depending on whether or not the student gets the correct answer). Then the expected number of right answers is the expected value of the sum of the variables $X_i$. From Theorem 6.9 we see that in n trials with probability .9 of success, we expect to have .9n successes. This gives that the expected number of right answers on a ten question test with probability .9 of getting each question right is 9, as we expected. This is a special case of our next theorem, which is proved by the same kind of computation.
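The indicator-variable argument is easy to mirror in a simulation (a sketch of ours, not part of the text): each question is an indicator that is 1 with probability .9, and by linearity the average score over many simulated tests should be close to .9n.

```python
import random

random.seed(1)  # fixed seed so the run is reproducible

def simulate_test(n, p):
    """Score = sum of indicator variables X_i, each 1 with probability p."""
    return sum(1 if random.random() < p else 0 for _ in range(n))

n, p, trials = 10, 0.9, 100_000
average = sum(simulate_test(n, p) for _ in range(trials)) / trials
print(average)  # close to 9.0, the expected value n*p
```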

Theorem 6.11 In a Bernoulli trials process, in which each experiment has two outcomes and probability p of success, the expected number of successes is np.

Proof: Let $X_i$ be the number of successes in the ith of n independent trials. The expected number of successes on the ith trial (i.e. the expected value of $X_i$) is, by definition,
$$p \cdot 1 + (1-p) \cdot 0 = p.$$
The number of successes X in all n trials is the sum of the random variables $X_i$. Then by Theorem 6.9 the expected number of successes in n independent trials is the sum of the expected values of the n random variables $X_i$, and this sum is np.

The Number of Trials until the First Success

Exercise 6.4-9 How many times do you expect to have to flip a coin until you first see a head? Why? How many times do you expect to have to roll two dice until you see a sum of seven? Why?

Our intuition suggests that we should have to flip a coin twice to see a head. However we could conceivably flip a coin forever without seeing a head, so should we really expect to see a head in two flips? The probability of getting a seven on two dice is 1/6. Does that mean we should expect to have to roll the dice six times before we see a seven? In order to analyze this kind of question we have to realize that we are stepping out of the realm of independent trials processes on finite sample spaces. We will consider the process of repeating independent trials with probability p of success until we have a success and then stopping. Now the possible outcomes of our multistage process are the infinite set
$$\{S, FS, FFS, \ldots, F^iS, \ldots\},$$
in which we have used the notation $F^iS$ to stand for the sequence of i failures followed by a success. Since we have an infinite sequence of outcomes, it makes sense to think about whether we can assign an infinite sequence of probability weights to its members so that the resulting sequence of probabilities adds to one. If so, then all our definitions make sense, and in fact the proofs of all our theorems remain valid.
There is only one way to assign weights that is consistent with our knowledge of (finite) independent trials processes, namely
$$P(S) = p, \quad P(FS) = (1-p)p, \quad \ldots, \quad P(F^iS) = (1-p)^i p, \ldots.$$
Thus we have to hope these weights add to one;⁵ in fact their sum is
$$\sum_{i=0}^{\infty} (1-p)^i p = p \sum_{i=0}^{\infty} (1-p)^i = p \cdot \frac{1}{1-(1-p)} = \frac{p}{p} = 1.$$

⁵ For those who are familiar with the concept of convergence for infinite sums (i.e. infinite series), it is worth noting that it is the fact that probability weights cannot be negative and must add to one that makes all the sums we need to deal with for all the theorems we have proved so far converge. That doesn't mean all sums we might want to deal with will converge; some random variables defined on the sample space we have described will have infinite expected value. However those we need to deal with for the expected number of trials until success do converge.
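Both facts, that the weights $(1-p)^i p$ sum to 1 and that the expected number of trials until the first success works out to 1/p, can be checked numerically by truncating the infinite sums (a sketch; the truncation point 10,000 is an arbitrary choice of ours, far past where the terms become negligible):

```python
p = 1 / 6     # e.g. rolling two dice until a sum of seven
N = 10_000    # truncation point for the infinite sums

# The weights P(F^i S) = (1-p)^i * p should sum to 1.
total_weight = sum((1 - p)**i * p for i in range(N))
print(round(total_weight, 6))     # 1.0

# Expected number of trials: sum of i * (1-p)^(i-1) * p, which equals 1/p = 6.
expected_trials = sum(i * (1 - p)**(i - 1) * p for i in range(1, N + 1))
print(round(expected_trials, 6))  # 6.0
```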

Therefore we have a legitimate assignment of probabilities, and the set of sequences $\{S, FS, FFS, FFFS, \ldots, F^{i-1}S, \ldots\}$ is a sample space with these probability weights. This probability distribution, $P(F^{i-1}S) = (1-p)^{i-1}p$, is called a geometric distribution because of the geometric series we used in proving the probabilities sum to 1.

Theorem 6.12 Suppose we have a sequence of trials in which each trial has two outcomes, success and failure, and where at each step the probability of success is p. Then the expected number of trials until the first success is 1/p.

Proof: We consider the random variable X which is i if the first success is on trial i. (In other words, $X(F^{i-1}S) = i$.) The probability that the first success is on trial i is $(1-p)^{i-1}p$, since in order for this to happen there must be i - 1 failures followed by 1 success. The expected number of trials is the expected value of X, which is, by the definition of expected value and the previous two sentences,
$$E[\text{number of trials}] = \sum_{i=1}^{\infty} i(1-p)^{i-1}p$$
$$= p \sum_{i=1}^{\infty} i(1-p)^{i-1}$$
$$= \frac{p}{1-p} \sum_{i=1}^{\infty} i(1-p)^{i}$$
$$= \frac{p}{1-p} \cdot \frac{1-p}{p^2}$$
$$= \frac{1}{p}.$$
To go from the third to the fourth line we used the fact that
$$\sum_{j=0}^{\infty} jx^j = \frac{x}{(1-x)^2}, \quad (6.25)$$
true for x with absolute value less than one. We proved a finite version of this equation as Theorem 4.6; the infinite version is even easier to prove.

Applying this theorem, we see that the expected number of times you need to flip a coin until you get heads is 2, and the expected number of times you need to roll two dice until you get a seven is 6.

Important Concepts, Formulas, and Theorems

1. Random Variable. A random variable for an experiment with a sample space S is a function that assigns a number to each element of S.

2. Bernoulli Trials Process. An independent trials process with two outcomes, success and failure, at each stage and probability p of success and 1 - p of failure at each stage is called a Bernoulli trials process.

3. Probability of a Sequence of Bernoulli Trials. In n Bernoulli trials with probability p of success, the probability of a given sequence of k successes and n - k failures is $p^k(1-p)^{n-k}$.

4. The Probability of k Successes in n Bernoulli Trials. The probability of having exactly k successes in a sequence of n independent trials with two outcomes and probability p of success on each trial is
$$P(\text{exactly } k \text{ successes}) = \binom{n}{k} p^k (1-p)^{n-k}.$$

5. Binomial Probability Distribution. The probabilities of k successes in n Bernoulli trials, $\binom{n}{k} p^k (1-p)^{n-k}$, are called binomial probabilities, or the binomial probability distribution.

6. Generating Function. The generating function for the sequence $a_0, a_1, a_2, \ldots, a_n$ is $\sum_{i=0}^{n} a_i x^i$, and the generating function for an infinite sequence $a_0, a_1, a_2, \ldots, a_n, \ldots$ is the infinite series $\sum_{i=0}^{\infty} a_i x^i$. The polynomial $(px + 1 - p)^n$ is the generating function for the binomial probabilities for n Bernoulli trials with probability p of success.

7. Distribution Function. We call the function that assigns $P(X = x_i)$ to the event that $X = x_i$ the distribution function of the random variable X.

8. Expected Value. We define the expected value or expectation of a random variable X whose values are the set $\{x_1, x_2, \ldots, x_k\}$ to be $E(X) = \sum_{i=1}^{k} x_i P(X = x_i)$.

9. Another Formula for Expected Values. If a random variable X is defined on a (finite) sample space S, then its expected value is given by $E(X) = \sum_{s \in S} X(s)P(s)$.

10. Expected Value of a Sum. Suppose X and Y are random variables on the (finite) sample space S. Then $E(X + Y) = E(X) + E(Y)$. This is called the additivity of expectation.

11. Expected Value of a Numerical Multiple. Suppose X is a random variable on a sample space S. Then for any number c, $E(cX) = cE(X)$. This result and the additivity of expectation together are called the linearity of expectation.

12. Expected Number of Successes in Bernoulli Trials.
In a Bernoulli trials process, in which each experiment has two outcomes and probability p of success, the expected number of successes is np.

13. Expected Number of Trials Until Success. Suppose we have a sequence of trials in which each trial has two outcomes, success and failure, and where at each step the probability of success is p. Then the expected number of trials until the first success is 1/p.

Problems

1. Give several random variables that might be of interest to someone rolling five dice (as one does, for example, in the game Yahtzee).

2. Suppose I offer to play the following game with you if you will pay me some money. You roll a die, and I give you a dollar for each dot that is on top. What is the maximum amount of money a rational person might be willing to pay me in order to play this game?

3. How many sixes do we expect to see on top if we roll 24 dice?

4. What is the expected sum of the tops of n dice when we roll them?

5. In an independent trials process consisting of six trials with probability p of success, what is the probability that the first three trials are successes and the last three are failures? The probability that the last three trials are successes and the first three are failures? The probability that trials 1, 3, and 5 are successes and trials 2, 4, and 6 are failures? What is the probability of three successes and three failures?

6. What is the probability of exactly eight heads in ten flips of a coin? Of eight or more heads?

7. How many times do you expect to have to roll a die until you see a six on the top face?

8. Assuming that the process of answering the questions on a five-question quiz is an independent trials process and that a student has a probability of .8 of answering any given question correctly, what is the probability of a sequence of four correct answers and one incorrect answer? What is the probability that a student answers exactly four questions correctly?

9. What is the expected value of the constant random variable X that has X(s) = c for every member s of the sample space? We frequently just use c to stand for this random variable, and thus this question is asking for E(c).

10.
Someone is taking a true-false test and guessing when they don't know the answer. We are going to compute a score by subtracting a percentage of the number of incorrect answers from the number of correct answers. When we convert this corrected score to a percentage score we want its expected value to be the percentage of the material being tested that the test-taker knows. How can we do this?

11. Do Problem 10 of this section for the case that someone is taking a multiple choice test with five choices for each answer and guesses randomly when they don't know the answer.

12. Suppose we have ten independent trials with three outcomes called good, bad, and indifferent, with probabilities p, q, and r, respectively. What is the probability of three goods, two bads, and five indifferents? In n independent trials with three outcomes A, B, and C, with probabilities p, q, and r, what is the probability of i A's, j B's, and k C's? (In this problem we assume p + q + r = 1 and i + j + k = n.)

13. In as many ways as you can, prove that $\sum_{i=1}^{n} i\binom{n}{i} = n2^{n-1}$.

14. Prove Theorem 6.10.

15. Two nickels, two dimes, and two quarters are in a cup. We draw three coins, one after the other, without replacement. What is the expected amount of money we draw on the first draw? On the second draw? What is the expected value of the total amount of money we draw? Does this expected value change if we draw the three coins all together?

16. In this exercise we will evaluate the sum
$$\sum_{i=0}^{10} i\binom{10}{i}(.9)^i(.1)^{10-i}$$
that arose in computing the expected number of right answers a person would have on a ten question test with probability .9 of answering each question correctly. First, use the binomial theorem and calculus to show that
$$10(.1 + x)^9 = \sum_{i=1}^{10} i\binom{10}{i}(.1)^{10-i}x^{i-1}.$$
Substituting in x = .9 gives us almost the sum we want on the right-hand side of the equation, except that in every term of the sum the power on .9 is one too small. Use some simple algebra to fix this and then explain why the expected number of right answers is 9.

17. Give an example of two random variables X and Y such that $E(XY) \neq E(X)E(Y)$. Here XY is the random variable with $(XY)(s) = X(s)Y(s)$.

18. Prove that if X and Y are independent in the sense that the event that X = x and the event that Y = y are independent for each pair of values x of X and y of Y, then $E(XY) = E(X)E(Y)$. See Problem 17 for a definition of XY.

19. Use calculus and the sum of a geometric series to show that
$$\sum_{j=0}^{\infty} jx^j = \frac{x}{(1-x)^2}$$
as in Equation 6.25.

20. Give an example of a random variable on the sample space $\{S, FS, FFS, \ldots, F^{i-1}S, \ldots\}$ with an infinite expected value.

Expected Value and Variance

Expected Value and Variance MATH 38 Expected Value and Varance Dr. Neal, WKU We now shall dscuss how to fnd the average and standard devaton of a random varable X. Expected Value Defnton. The expected value (or average value, or

More information

= z 20 z n. (k 20) + 4 z k = 4

= z 20 z n. (k 20) + 4 z k = 4 Problem Set #7 solutons 7.2.. (a Fnd the coeffcent of z k n (z + z 5 + z 6 + z 7 + 5, k 20. We use the known seres expanson ( n+l ( z l l z n below: (z + z 5 + z 6 + z 7 + 5 (z 5 ( + z + z 2 + z + 5 5

More information

Lecture 3: Probability Distributions

Lecture 3: Probability Distributions Lecture 3: Probablty Dstrbutons Random Varables Let us begn by defnng a sample space as a set of outcomes from an experment. We denote ths by S. A random varable s a functon whch maps outcomes nto the

More information

Section 8.3 Polar Form of Complex Numbers

Section 8.3 Polar Form of Complex Numbers 80 Chapter 8 Secton 8 Polar Form of Complex Numbers From prevous classes, you may have encountered magnary numbers the square roots of negatve numbers and, more generally, complex numbers whch are the

More information

For example, if the drawing pin was tossed 200 times and it landed point up on 140 of these trials,

For example, if the drawing pin was tossed 200 times and it landed point up on 140 of these trials, Probablty In ths actvty you wll use some real data to estmate the probablty of an event happenng. You wll also use a varety of methods to work out theoretcal probabltes. heoretcal and expermental probabltes

More information

princeton univ. F 13 cos 521: Advanced Algorithm Design Lecture 3: Large deviations bounds and applications Lecturer: Sanjeev Arora

princeton univ. F 13 cos 521: Advanced Algorithm Design Lecture 3: Large deviations bounds and applications Lecturer: Sanjeev Arora prnceton unv. F 13 cos 521: Advanced Algorthm Desgn Lecture 3: Large devatons bounds and applcatons Lecturer: Sanjeev Arora Scrbe: Today s topc s devaton bounds: what s the probablty that a random varable

More information

C/CS/Phy191 Problem Set 3 Solutions Out: Oct 1, 2008., where ( 00. ), so the overall state of the system is ) ( ( ( ( 00 ± 11 ), Φ ± = 1

C/CS/Phy191 Problem Set 3 Solutions Out: Oct 1, 2008., where ( 00. ), so the overall state of the system is ) ( ( ( ( 00 ± 11 ), Φ ± = 1 C/CS/Phy9 Problem Set 3 Solutons Out: Oct, 8 Suppose you have two qubts n some arbtrary entangled state ψ You apply the teleportaton protocol to each of the qubts separately What s the resultng state obtaned

More information

Note on EM-training of IBM-model 1

Note on EM-training of IBM-model 1 Note on EM-tranng of IBM-model INF58 Language Technologcal Applcatons, Fall The sldes on ths subject (nf58 6.pdf) ncludng the example seem nsuffcent to gve a good grasp of what s gong on. Hence here are

More information

8.6 The Complex Number System

8.6 The Complex Number System 8.6 The Complex Number System Earler n the chapter, we mentoned that we cannot have a negatve under a square root, snce the square of any postve or negatve number s always postve. In ths secton we want

More information

THE CHINESE REMAINDER THEOREM. We should thank the Chinese for their wonderful remainder theorem. Glenn Stevens

THE CHINESE REMAINDER THEOREM. We should thank the Chinese for their wonderful remainder theorem. Glenn Stevens THE CHINESE REMAINDER THEOREM KEITH CONRAD We should thank the Chnese for ther wonderful remander theorem. Glenn Stevens 1. Introducton The Chnese remander theorem says we can unquely solve any par of

More information

j) = 1 (note sigma notation) ii. Continuous random variable (e.g. Normal distribution) 1. density function: f ( x) 0 and f ( x) dx = 1

j) = 1 (note sigma notation) ii. Continuous random variable (e.g. Normal distribution) 1. density function: f ( x) 0 and f ( x) dx = 1 Random varables Measure of central tendences and varablty (means and varances) Jont densty functons and ndependence Measures of assocaton (covarance and correlaton) Interestng result Condtonal dstrbutons

More information

Exercises of Chapter 2

Chuang-Cheh Lin, Department of Computer Science and Information Engineering, National Chung Cheng University, Ming-Hsiung, Chiayi 61, Taiwan. Exercise 2.6. Suppose that we independently roll two standard

Randomness and Computation

or, Randomized Algorithms. Mary Cryan, School of Informatics, University of Edinburgh. (RC 208/9) Lecture 0, slide. Balls in Bins: m balls, n bins, and balls thrown uniformly at random into bins; usually

Often, when learning, we deal with uncertainty:

Uncertainty and Learning. Often, when learning, we deal with uncertainty: incomplete data sets, with missing information; noisy data sets, with unreliable information; stochasticity: causes and effects related non-deterministically

Difference Equations

c Jan Vrbík. 1 Basics. Suppose a sequence of numbers, say a_0, a_1, a_2, a_3, ..., is defined by a certain general relationship between, say, three consecutive values of the sequence, e.g. a_{i+2} + 3a_{i+1}

Foundations of Arithmetic

Notation. We shall denote the sum and product of numbers in the usual notation as a_1 + a_2 + a_3 + ... + a_n = Σᵢ aᵢ, and a_1 a_2 a_3 ... a_n = Πᵢ aᵢ. The notation a | b means a divides b, i.e. ac = b where c is an

MA 323 Geometric Modelling Course Notes: Day 13 Bezier Curves & Bernstein Polynomials

David L. Finn. Over the past few days, we have looked at de Casteljau's algorithm for generating a polynomial curve, and we have

Linear Regression Analysis: Terminology and Notation

ECON 35* -- Section: Basic Concepts of Regression Analysis. Linear Regression Analysis: Terminology and Notation. Consider the generic version of the simple (two-variable) linear regression model. It is represented

3.1 Expectation of Functions of Several Random Variables. Let X = (X_1, ..., X_k)' be a k-dimensional discrete or continuous random vector, with joint PMF p(·); E(X) = (E X_1, ..., E X_k).

Statistics 1: Probability Theory II, 37. 3 EXPECTATION OF SEVERAL RANDOM VARIABLES. As in Probability Theory I, the interest in most situations lies not on the actual distribution of a random vector, but rather on a number

HMMT February 2016 February 20, 2016

Combinatorics 1. For positive integers n, let S_n be the set of integers x such that n distinct lines, no three concurrent, can divide a plane into x regions (for example, S_2 = {3,

Limited Dependent Variables

1. What if the left-hand-side variable is not a continuous thing spread from minus infinity to plus infinity? That is, given a model y_i = f(x_i, β, ε_i), where a. y_i is bounded below at zero, such as wages

Lectures - Week 4 Matrix norms, Conditioning, Vector Spaces, Linear Independence, Spanning sets and Basis, Null space and Range of a Matrix

Matrix Norms. Now we turn to associating a number to each matrix. We could

Homework Assignment 3 Due in class, Thursday October 15

SDS 383C Statistical Modeling I. 1 Ridge regression and Lasso. 1. Get the Prostate cancer data from http://statweb.stanford.edu/~tibs/elemstatlearn/datasets/prostate.data.

and problem sheet 2

-8 and 5-5 problem sheet. Solutions to the following seven exercises and optional bonus problem are to be submitted through Gradescope by :0PM on Wednesday th September 08. There are also some practice problems,

Probability and Random Variable Primer

B. Maddah, ENMG 622 Simulation, 2/22/. Sample space and events. Suppose that an experiment with an uncertain outcome is performed (e.g., rolling a die). While the outcome of the experiment

Chapter 1. Probability

Microscopic properties of matter: quantum mechanics, atomic and molecular properties. Macroscopic properties of matter: thermodynamics, E, H, C_V, C_p, S, A, G. How do we relate these two properties?

arxiv: v1 [math.ho] 18 May 2008

Recurrence Formulas for Fibonacci Sums. Adilson J. V. Brandão, João L. Martins. arXiv:0805.2707v1 [math.HO] 18 May 2008. Abstract. In this article we present a new recurrence formula for a finite sum involving the Fibonacci

Lecture 4: Universal Hash Functions/Streaming Cont d

CSE 5: Design and Analysis of Algorithms I, Spring 06. Lecturer: Shayan Oveis Gharan. April 6th. Scribe: Jacob Schreiber. Disclaimer: These notes have not been subjected

PhysicsAndMathsTutor.com

physicsandmathstutor.com. June 2005. 5. The random variable X has probability function P(X = x) = kx, x = 1, 2, 3; k(x + 1), x = 4, 5, where k is a constant. (a) Find the value of k. (b) Find the exact

Bernoulli Numbers and Polynomials

T. Muthukumar, tmk@iitk.ac.in, 17 Jun 2014. The sum of the first n natural numbers 1, 2, 3, ..., n is S_1(n) := Σ_{m=1}^{n} m = n(n+1)/2 = n²/2 + n/2. This formula can be derived by noting that

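The closed form S_1(n) = n(n+1)/2 quoted in the excerpt above is easy to sanity-check against the brute-force sum; a minimal sketch (the function name `s1` is illustrative):

```python
def s1(n):
    """Closed form for 1 + 2 + ... + n."""
    return n * (n + 1) // 2

# compare against the brute-force sum for a few values of n
for n in (1, 10, 100):
    assert s1(n) == sum(range(1, n + 1))

print(s1(100))  # prints 5050
```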
First Year Examination Department of Statistics, University of Florida

May 7, 2010, 8:00 am - 12:00 noon. Instructions: 1. You have four hours to answer questions in this examination. 2. You must show your work to receive

2E Pattern Recognition Solutions to Introduction to Pattern Recognition, Chapter 2: Bayesian pattern classification

E395 - Pattern Recognition. Solutions to Introduction to Pattern Recognition, Chapter 2: Bayesian pattern classification. Preface. This document is a solution manual for selected exercises from Introduction to Pattern Recognition

Chapter Twelve. Integration. We now turn our attention to the idea of an integral in dimensions higher than one. Consider a real-valued function f : D

12.1 Introduction. We now turn our attention to the idea of an integral in dimensions higher than one. Consider a real-valued function f : D → R, where the domain D is a nice closed subset of Euclidean

a | b. In case b ≠ 0, a being divisible by b is the same as to say that

Section 6.2 Divisibility among the integers. An integer a ∈ Z is divisible by b ∈ Z if there is an integer c ∈ Z such that a = bc. Note that 0 is divisible by any integer b, since 0 = b·0. On the other hand, a is divisible by 0 only if a = 0:

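The definition in the excerpt above (a = bc for some integer c), including the two edge cases involving 0, can be sketched directly; the function name `divides` is illustrative:

```python
def divides(b, a):
    """True if a is divisible by b, i.e. a = b*c for some integer c."""
    if b == 0:
        return a == 0  # only 0 is divisible by 0, matching the excerpt
    return a % b == 0

# 3 | 12; 0 does not divide 5; every integer divides 0
print(divides(3, 12), divides(0, 5), divides(7, 0))  # prints True False True
```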
Lecture 12: Discrete Laplacian

Scribe: Tianye Lu. Our goal is to come up with a discrete version of the Laplacian operator for triangulated surfaces, so that we can use it in practice to solve related problems. We are mostly

Module 9. Lecture 6. Duality in Assignment Problems

In this lecture we attempt to answer a few other important questions posed in an earlier lecture for (AP) and see how some of them can be explained through the concept

Linear Feature Engineering 11

2 Least-Squares. 2.1 Simple least-squares. Consider the following dataset. We have a bunch of inputs x and corresponding outputs y. The particular values in this dataset are (x, y) = (0.23, 0.19),

Module 14: THE INTEGRAL Exploring Calculus

Part I: Approximations and the Definite Integral. It was known in the 1600s, before the calculus was developed, that the area of an irregularly shaped region could be approximated

THE SUMMATION NOTATION Ʃ

Single Subscript Notation. Most of the calculations we perform in statistics are repetitive operations on lists of numbers. For example, we compute the sum of a set of numbers, or the sum of the

1 Generating functions, continued

Generating functions and partitions. We can make use of generating functions to answer some questions a bit more restrictive than we've done so far. Question: Find a generating function

Complex Numbers Alpha, Round 1 Test #123

1. Write your 6-digit ID# in the I.D. NUMBER grid, left-justified, and bubble. Check that each column has only one number darkened. 2. In the EXAM NO. grid, write the 3-digit Test

Problem Solving in Math (Math 43900) Fall 2013

Week four (September 17) solutions. Instructor: David Galvin. 1. Let a and b be two integers for which a − b is divisible by 3. Prove that a³ − b³ is divisible by 9. Solution:

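The claim in the problem above follows from a³ − b³ = (a − b)(a² + ab + b²) and the fact that a ≡ b (mod 3) forces a² + ab + b² ≡ 3a² ≡ 0 (mod 3). A quick numerical check of the statement (the helper name `check` is illustrative, not from the solution set):

```python
def check(a, b):
    """If 3 | (a - b), verify 9 | (a^3 - b^3),
    via the factorization a^3 - b^3 = (a - b)(a^2 + ab + b^2)."""
    if (a - b) % 3 == 0:
        assert (a**3 - b**3) % 9 == 0

# exhaustively test a small range of integer pairs
for a in range(-20, 21):
    for b in range(-20, 21):
        check(a, b)
print("verified")  # prints verified
```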
MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.265/15.070J Fall 2013 Lecture 12 10/21/2013. Martingale Concentration Inequalities and Applications

Martingale Concentration Inequalities and Applications. Content: 1. Exponential concentration for martingales with bounded increments.

Inner Product. Euclidean Space. Orthonormal Basis. Orthogonal

Definition 1 (Euclidean Space). A Euclidean space is a finite-dimensional vector space over the reals R, with an inner product ⟨·, ·⟩. Definition 2 (Inner Product). An inner product ⟨·, ·⟩ on a real vector space X is a symmetric, bilinear,

CS-433: Simulation and Modeling Modeling and Probability Review

Exercise 1. (Probability of Simple Events) Exercise 1.1. The owner of a camera shop receives a shipment of five cameras from a camera manufacturer. Unknown

Σₓ x·p(x|µ) = 0·p(x = 0|µ) + 1·p(x = 1|µ) = µ

CSE 455/555 Spring 2013 Homework 7: Parametric Techniques. Jason J. Corso, Computer Science and Engineering, SUNY at Buffalo, jcorso@buffalo.edu. Solutions by Yingbo Zhou. This assignment does not need to be submitted and

Introduction to Vapor/Liquid Equilibrium, part 2. Raoult s Law:

CE304, Spring 2004, Lecture 4. Raoult's Law: The simplest model that allows us to do VLE calculations is obtained when we assume that the vapor phase is an ideal gas, and

CS 2750 Machine Learning. Lecture 5. Density estimation. CS 2750 Machine Learning. Announcements

Milos Hauskrecht, milos@cs.pitt.edu, 539 Sennott Square. Announcements: Homework due on Wednesday before the class. Reports: hand in before

Lecture 10: May 6, 2013

TTIC/CMSC 31150 Mathematical Toolkit, Spring 2013. Madhur Tulsiani. Scribe: Wenjie Luo. In today's lecture, we mainly talked about random walks on graphs and introduced the concept of a graph expander,

COS 521: Advanced Algorithms Game Theory and Linear Programming

Moses Charikar, February 27, 2013. In these notes, we introduce some basic concepts in game theory and linear programming (LP). We show a connection

Introduction to Random Variables

Definition of random variable. Discrete and continuous random variables. Probability function. Distribution function. Density function. Sometimes, it is not enough to describe

Module 3 LOSSY IMAGE COMPRESSION SYSTEMS. Version 2 ECE IIT, Kharagpur

Version 2, ECE IIT, Kharagpur. Lesson 6: Theory of Quantization. Instructional Objectives: At the end of this lesson, the students should be able to:

12. The Hamilton-Jacobi Equation Michael Fowler

Michael Fowler. Back to Configuration Space. We've established that the action, regarded as a function of its coordinate endpoints and time, satisfies ∂S(q, t)/∂t + H(q, p, t) = 0, and

Bayesian epistemology II: Arguments for Probabilism

Richard Pettigrew. May 9, 2012. 1 The model. Represent an agent's credal state at a given time t by a credence function c_t : F → [0, 1], where F is the algebra of

More metrics on cartesian products

If (X_i, d_i) are metric spaces for 1 ≤ i ≤ n, then in Section II.4 of the lecture notes we defined three metrics on X_1 × ... × X_n whose underlying topologies are the product topology. The purpose of

E Tail Inequalities. E.1 Markov s Inequality. Non-Lecture E: Tail Inequalities

Algorithms, Non-Lecture E: Tail Inequalities. "If you hold a cat by the tail you learn things you cannot learn any other way." Mark Twain. The simple recursive structure of skip lists made it relatively easy

MATH 5707 HOMEWORK 4 SOLUTIONS

CİHAN BAHRAN. 1. Let v_1, ..., v_n ∈ R^m, with all lengths ‖v_i‖ at most 1. Let p_1, ..., p_n ∈ [0, 1] be arbitrary and set w = p_1 v_1 + ... + p_n v_n. Then there exist ε_1, ..., ε

Math 261 Exercise sheet 2

http://staff.aub.edu.lb/~nm116/teaching/2017/math261/index.html. Version: September 25, 2017. Answers are due for Monday 25 September, 11AM. The use of calculators is allowed. Exercise

Introduction to Algorithms

Introduction to Algorithms 6.046J/18.410J. Lecture 7. Prof. Piotr Indyk. Data Structures. Role of data structures: encapsulate data; support certain operations (e.g., INSERT, DELETE, SEARCH). Our focus: efficiency of

Analytical Chemistry Calibration Curve Handout

I. Quick-and-Dirty Excel Tutorial. For those of you with little experience with Excel, I've provided some key techniques that should help you use the program both for problem

11 Tail Inequalities Markov s Inequality. Lecture 11: Tail Inequalities [Fa 13]

Algorithms, Lecture 11: Tail Inequalities [Fa 13]. "If you hold a cat by the tail you learn things you cannot learn any other way." Mark Twain. 11 Tail Inequalities. The simple recursive structure of skip lists made it relatively

Problem Set 9 Solutions

Design and Analysis of Algorithms, May 4, 2015. Massachusetts Institute of Technology 6.046J/18.410J. Profs. Erik Demaine, Srini Devadas, and Nancy Lynch. Problem Set 9 Solutions. This problem

Stanford University CS254: Computational Complexity Notes 7 Luca Trevisan January 29, Notes for Lecture 7

Luca Trevisan, January 29, 2014. Notes for Lecture 7. 1 Approximate Counting with an NP oracle. We complete the proof of the following result: Theorem 1. For every

A random variable is a function which associates a real number to each element of the sample space

Definition of random variable. Discrete and continuous random variables. Probability function. Distribution function. Density function. Sometimes, it is not enough

Chapter 9: Statistical Inference and the Relationship between Two Variables

Key Words: The Regression Model; The Sample Regression Equation; The Pearson Correlation Coefficient. Learning Outcomes: After studying this chapter,

1 Matrix representations of canonical matrices

2-d rotation around the origin: R_0 = [cos θ, −sin θ; sin θ, cos θ]. 3-d rotation around the x-axis: R_x = [1, 0, 0; 0, cos θ, −sin θ; 0, sin θ, cos θ]. 3-d rotation around the y-axis:

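The rotation matrices in the excerpt above (standard 2-d and 3-d forms, with the minus sign on the upper sine term) can be sketched directly; the function names `rot2d` and `rot3d_x` are illustrative:

```python
import math

def rot2d(theta):
    """2-d rotation around the origin by angle theta."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s], [s, c]]

def rot3d_x(theta):
    """3-d rotation around the x-axis by angle theta."""
    c, s = math.cos(theta), math.sin(theta)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

# a quarter turn in the plane sends the point (1, 0) to (0, 1)
R = rot2d(math.pi / 2)
x = R[0][0] * 1 + R[0][1] * 0
y = R[1][0] * 1 + R[1][1] * 0
print(round(x, 9), round(y, 9))  # prints 0.0 1.0
```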
Lecture Randomized Load Balancing strategies and their analysis. Probability concepts include, counting, the union bound, and Chernoff bounds.

U.C. Berkeley CS273: Parallel and Distributed Theory, Lecture 1. Professor Satish Rao, August 26, 2010. Last revised September 2, 2010. 1 Course Outline. We will cover a sampling of the

Midterm Examination. Regression and Forecasting Models

IOMS Department. Professor William Greene. Phone: 212.998.0876. Office: KMC 7-90. Home page: people.stern.nyu.edu/wgreene. Email: wgreene@stern.nyu.edu. Course web page: people.stern.nyu.edu/wgreene/regression/outline.htm

College of Computer & Information Science Fall 2009 Northeastern University 20 October 2009

CS7880: Algorithmic Power Tools. Scribe: Jian Wen and Laura Poplawski. Lecture Outline: Primal-dual schema. Network Design:

COS 511: Theoretical Machine Learning. Lecturer: Rob Schapire Lecture # 15 Scribe: Jieming Mao April 1, 2013

Lecturer: Rob Schapire. Scribe: Jieming Mao. April 1, 2013. 1 Brief review. 1.1 Learning with expert advice. Last time, we started to talk about learning with expert advice.

Let (Ω, A, P) be a probability space. A random vector

Statistics 1: Probability Theory II, 8. 1 JOINT AND MARGINAL DISTRIBUTIONS. In Probability Theory I we formulate the concept of a (real) random variable and describe the probabilistic behavior of this random variable by

Min Cut, Fast Cut, Polynomial Identities

Randomized Algorithms, Summer 2016. Instructors: Thomas Kesselheim and Kurt Mehlhorn. 1 Min Cuts in Graphs. Lecture (5 pages). Throughout this section, G = (V, E) is a multi-graph.

Formulas for the Determinant

page 224. CHAPTER 3 Determinants. 38. A = [e^t, te^t, e^{2t}; e^t, 2te^t, e^{2t}; e^t, te^t, 2e^{2t}]. 39. If A = [1, 2, 3; 3, 4, 5; 4, 5, 6], compute the matrix product A·adj(A). What can you conclude about det(A)? For Problems 40-43, use

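Problem 39 above turns on the identity A·adj(A) = det(A)·I, so the product reveals the determinant on the diagonal. A minimal 3×3 sketch (the helper names `det3` and `adj3` are illustrative, and the matrix entries follow the reconstruction above):

```python
def det3(A):
    """Determinant of a 3x3 matrix by cofactor expansion along the first row."""
    a, b, c = A[0]
    d, e, f = A[1]
    g, h, i = A[2]
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def adj3(A):
    """Adjugate of a 3x3 matrix: transpose of the cofactor matrix."""
    def minor(r, c):
        rows = [A[i] for i in range(3) if i != r]
        m = [[row[j] for j in range(3) if j != c] for row in rows]
        return m[0][0] * m[1][1] - m[0][1] * m[1][0]
    C = [[(-1) ** (r + c) * minor(r, c) for c in range(3)] for r in range(3)]
    return [[C[c][r] for c in range(3)] for r in range(3)]  # transpose

A = [[1, 2, 3], [3, 4, 5], [4, 5, 6]]
adjA = adj3(A)
prod = [[sum(A[r][k] * adjA[k][c] for k in range(3)) for c in range(3)]
        for r in range(3)]
# A·adj(A) = det(A)·I; here det(A) = 0, so the product is the zero matrix
print(det3(A), prod)
```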
20. Mon, Oct. 13 What we have done so far corresponds roughly to Chapters 2 & 3 of Lee. Now we turn to Chapter 4. The first idea is connectedness.

Essentially, we want to say that a space cannot be decomposed

Finding Dense Subgraphs in G(n, 1/2)

Atish Das Sarma¹, Amit Deshpande², and Ravi Kannan². ¹Georgia Institute of Technology, atish@cc.gatech.edu. ²Microsoft Research-Bangalore, {amitdesh, kannan}@microsoft.com. Abstract. Finding

CHAPTER IV RESEARCH FINDING AND DISCUSSIONS

A. Description of Research Finding. 1. The Implementation of Learning. Having gained the whole needed data, the researcher then did analysis which refers to the statistical data

Gaussian Mixture Models

Lab Objective: Understand the formulation of Gaussian Mixture Models (GMMs) and how to estimate GMM parameters. You've already seen GMMs as the observation distribution in certain continuous

CSCE 790S Background Results

Stephen A. Fenner. September 8, 2011. Abstract. These results are background to the course CSCE 790S/CSCE 790B, Quantum Computation and Information (Spring 2007 and Fall 2011). Each

Lecture 17 : Stochastic Processes II

1 Continuous-time stochastic processes. So far we have studied discrete-time stochastic processes. We studied the concept of Markov chains and martingales, time series analysis, and regression analysis

Global Sensitivity. Tuesday 20 th February, 2018

1) Local Sensitivity. Most sensitivity analyses [] are based on local estimates of sensitivity, typically by expanding the response in a Taylor series about some specific values

Linear Approximation with Regularization and Moving Least Squares

Igor Grešovnik, May 2007. Revision 4.6 (Revision 1: March 2004). Contents: 1 Linear Fitting ... 4. 1.1 Weighted Least Squares in Function Approximation ...

The Fundamental Theorem of Algebra. Objective To use the Fundamental Theorem of Algebra to solve polynomial equations with complex solutions

Content Standards: N.CN.7 Solve quadratic equations with real coefficients that have complex solutions. N.CN.8 Extend polynomial identities to the complex numbers. Also N.CN.9,

Credit Card Pricing and Impact of Adverse Selection

Bo Huang and Lyn C. Thomas, University of Southampton. Contents: Background. Auction model of credit card solicitation. Errors in probability of being Good. Errors in

Section 3.6 Complex Zeros

Chapter 3, Section 3.6 Complex Zeros. When finding the zeros of polynomials, at some point you're faced with the problem x² = −1. While there are clearly no real numbers that are solutions to this equation, leaving things there

APPENDIX A Some Linear Algebra

A.1 Matrices. The collection of m × n matrices A = [a_{1,1}, ..., a_{1,n}; ...; a_{m,1}, ..., a_{m,n}] with real elements a_{i,j} is denoted by R^{m,n}. If n = 1 then A is called a column vector. Similarly,

COS 511: Theoretical Machine Learning

Lecturer: Rob Schapire. Scribe: José Simões Ferreira. March 6, 2013. In the last lecture the concept of Rademacher complexity was introduced, with the goal of showing that

A combinatorial problem associated with nonograms

Jessica Benton, Rion Snow, Nolan Wallach. March 21, 2005. 1 Introduction. This work was motivated by a question posed by the second named author to the first named author

Math 426: Probability MWF 1pm, Gasson 310 Homework 4 Selected Solutions

Exercises from Ross, Chapter 3: p. 05, Problems 76, 86; p. 06, Theoretical exercises 3, 6; p. 63, Problems 5, 0, 20; p. 69, Theoretical exercises 2,

Statistics Chapter 4

"There are three kinds of lies: lies, damned lies, and statistics." Benjamin Disraeli, 1895 (British statesman). Gaussian Distribution, 4-1. If a measurement is repeated many times a statistical treatment

Econ107 Applied Econometrics Topic 3: Classical Model (Studenmund, Chapter 4)

I. Classical Assumptions. We have defined OLS and studied some algebraic properties of OLS. In this topic we will study statistical properties

MLE and Bayesian Estimation. Jie Tang Department of Computer Science & Technology Tsinghua University 2012

1 Linear Regression? As the first step, we need to decide how we're going to represent the function f. One example:

Stanford University CS359G: Graph Partitioning and Expanders Handout 4 Luca Trevisan January 13, 2011

Luca Trevisan, January 13, 2011. Lecture 4. In which we prove the difficult direction of Cheeger's inequality. As in the past lectures, consider an undirected

Course 395: Machine Learning - Lectures

Lecture 1-2: Concept Learning (M. Pantic). Lecture 3-4: Decision Trees & CBC Intro (M. Pantic). Lecture 5-6: Artificial Neural Networks (S. Zafeiriou). Lecture 7-8: Instance Based Learning

18.1 Introduction and Recap

CS787: Advanced Algorithms. Scribe: Priyananda Shenoy and Shijin Kong. Lecturer: Shuchi Chawla. Topic: Streaming Algorithms (continued). Date: 10/26/2007. We continue talking about streaming algorithms in this lecture, including

Kernel Methods and SVMs Extension

The purpose of this document is to review material covered in Machine Learning 1: Supervised Learning regarding support vector machines (SVMs). This document also provides a general

MAE140 - Linear Circuits - Winter 16 Final, March 16, 2016

Instructions: (i) The exam is open book. You may use your class notes and textbook. You may use a hand calculator with no communication capabilities. (ii) You have

Lecture 10 Support Vector Machines II

22 February 2016. Taylor B. Arnold, Yale Statistics, STAT 365/665. Notes: Problem 3 is posted and due this upcoming Friday. There was an early bug in the fake-test data; fixed

Notes on Frequency Estimation in Data Streams

In (one of) the data streaming model(s), the data is a sequence of arrivals a_1, a_2, ..., a_m of the form a_j = (i, v), where i is the identity of the item and belongs to

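In the streaming model described above, each arrival a_j = (i, v) updates the running frequency of item i by v. The exact (non-streaming) version of that bookkeeping is a one-pass dictionary update; a minimal sketch (the function name `frequencies` is illustrative, and real streaming algorithms approximate this under memory limits):

```python
from collections import defaultdict

def frequencies(stream):
    """Exact item frequencies for a stream of arrivals a_j = (i, v),
    where i is the item identity and v an update amount."""
    f = defaultdict(int)
    for i, v in stream:
        f[i] += v
    return dict(f)

print(frequencies([("x", 1), ("y", 2), ("x", 3)]))  # prints {'x': 4, 'y': 2}
```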
Week 2. This week, we covered operations on sets and cardinality.

This week, we covered operations on sets and cardinality. Definition 0.1 (Correspondence). A correspondence between two sets A and B is a set S contained in A × B = {(a, b) : a ∈ A, b ∈ B}. A correspondence from

1.4. Experiments, Outcome, Sample Space, Events, and Random Variables

1.4. Experiments, Outcome, Sample Space, Events, and Random Variables 1.4. Experments, Outcome, Sample Space, Events, and Random Varables In Secton 1.2.5, we dscuss how to fnd probabltes based on countng. Whle the probablty of any complex event s bult on countng, brute force

More information