1.4. Experiments, Outcome, Sample Space, Events, and Random Variables

In Section 1.2.5, we discussed how to find probabilities based on counting. While the probability of any complex event is built on counting, brute-force counting is inefficient when the event is complex. In fact, there is no way to count if an outcome can take uncountably many values (e.g., the useful life of a car can be any real number from 0 to 14 years, say). Fortunately, there is a rigorous, formalized approach to guide us in determining the probabilities of events. The approach is built by defining an experiment, an outcome, a sample space, an event, and a random variable step by step, with the later concepts built on the earlier ones.

An experiment is what we do to get raw data (samples, observations) in order to study a given (random, stochastic) phenomenon.

Example 1.4.1 (Different Types of Experiments)

(a) Rolling a Die
Suppose that you want to study the statistical characteristics of the number that you get in rolling a particular die. Then the rolling of the die is the experiment that you carry out to get sample data, which may be a sequence such as {4, 1, 1, 6, 3, 2, ...}. In setting up your experiment, you certainly need to decide the number of rolls (the amount of data) that you want to make. Usually, this is determined by the amount of resources that you have.

(b) Quality Control
To check whether a process is in control or not, you sample output from the process and measure the sampled output according to some chosen criteria. Based on these sampled data (the raw data), you derive statistics from which you make inference on whether the process is in control or not. In this case, the experiment is what you do to get the sampled data.

(c) Input Parameters of a Model
To help the university cafeteria ease the congestion problem at lunchtime, you need to collect data that reflect the arrival and service patterns of the cafeteria at lunchtimes. In this case, the experiment is the design and the process to get the data; the data collected may look like: batch size, n; time arrived, t_a; time to get a table, t_t; time to order, t_o; time to get the first dish, t_f; time to finish the last dish, t_l; time to get the bill, t_b; time to leave, t_d. The exact type and amount of data that you collect depend on your model.

An outcome of an experiment is a sample data point that you get from your experiment. An outcome can be a vector, not just a single element; also, it can be qualitative rather than quantitative.
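The die-rolling experiment of Example 1.4.1(a) can be sketched in a few lines of code. This is an informal illustration only (the notes themselves are language-agnostic); `roll_die_experiment` and the seed are our own choices, not part of the notes.

```python
import random

def roll_die_experiment(num_rolls, seed=None):
    """Carry out the die-rolling experiment: each roll yields one
    outcome, an integer from 1 to 6."""
    rng = random.Random(seed)
    return [rng.randint(1, 6) for _ in range(num_rolls)]

# Six rolls give a sample sequence of outcomes,
# e.g. something like [4, 1, 1, 6, 3, 2].
sample = roll_die_experiment(6, seed=42)
print(sample)
```

Note that the number of rolls is a design parameter of the experiment, exactly as discussed above.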

Example 1.4.2 (Outcomes of Experiments)

(a) Rolling a Die
4 is a possible outcome in rolling a die once. Actually, any number from 1 to 6 is a possible outcome for a standard die. If we are talking about rolling a die twice, the outcome will be a vector (m, n) such that 1 ≤ m, n ≤ 6.

(b) Quality Control
The outcome can be either quantitative or qualitative. Suppose that the process is a drilling machine that drills holes with a diameter of 4.00 cm. If you measure the actual diameter of the holes in your experiment, an outcome could be a number such as 4.01. However, if in our experiment we only measure against a standard template which shows whether a hole is good (within the limits specified by the standard) or not (outside the limits), then the outcome is just one of two possibilities (good, defective) [or (within limits, outside limits), etc.].

(c) Input Parameters of a Model
An outcome of the experiment may be a vector (n, t_a, t_t, t_o, t_f, t_l, t_b, t_d), or its equivalent. Clearly outcomes vary with the design of an experiment.

The set of all possible outcomes of an experiment, denoted by Ω, is the sample space of the experiment. The elements of Ω are completely dictated by how an experiment is conducted. Later we will see that the outcomes of a sample space (of a given experiment) may or may not be equally likely.

Example 1.4.3 (Sample Space)

(a) Rolling a die
Ω = {1, 2, 3, 4, 5, 6}.

(b) Quality Control
Ω = {x : 3.98 ≤ x ≤ 4.02}.

Note that there are two ways to define Ω: (i) listing out all elements, as in part (a); (ii) specifying the necessary and sufficient conditions for an outcome to be an element of Ω, as in part (b). Usually (i) is used for a simple, finite Ω, while (ii) is used for a complex or infinite Ω.

(c) Input Parameters of a Model
In this case, we may not know the real Ω, even after we have finished our experiment. Who can know for sure the pattern of students and staff arriving at the cafeteria? In practice, we may define Ω = {(n, t_a, t_t, t_o, t_f, t_l, t_b, t_d) : n ∈ ℕ ∪ {0}, a ≤ t_a < t_t < t_o < t_f < t_l < t_b < t_d ≤ b}, where ℕ is the set of natural numbers, and a (say, 11:30 am) and b (say, 2:30 pm) are the two parameter values that we choose to stand for the beginning and the end of lunchtime.

(d) Rolling two dice
Ω = {(i_1, i_2) : i_j ∈ ℕ and 1 ≤ i_j ≤ 6, j = 1, 2}.

(e) Flipping coins indefinitely
Ω = {(i_1, i_2, ...) : i_j ∈ {head, tail}}.

A sample space tells us what outcomes are possible. Frequently, we are interested in a collection of outcomes, rather than in individual ones. For example, when we bet in a casino, we may be interested in the occurrence of any of {10, 11, ..., 18}, rather than in any particular number (outcome). Such a collection of outcomes is called an event. Mathematically, an event is a subset of the sample space Ω. Later, we will see that probabilities are defined on events rather than on outcomes.

There are events of a single outcome in a finite Ω. Then what is the difference between an outcome and an event in such cases? Basically, an outcome is an element of Ω, while an event is a subset of Ω. Of course there are subsets of single elements, but an element and the corresponding single-element subset are two different objects.

Example 1.4.4 (Events)
In the following, A, B, and C are events defined in their own sample spaces.

(a) Rolling a die
A = the event of getting a 5 in a roll = {5}.
B = the event of getting an even number in a roll = {2, 4, 6}.
C = the event of getting a number less than 3 or no less than 5 in a roll = {1, 2, 5, 6}.

(b) Quality Control
A = the event that the diameter of a hole is less than 4.00 cm = {x : x < 4}.
B = the event that a hole is defective = {x : x < 3.98 or x > 4.02}.
C = the event that the diameter of a hole lies within the 1st to 3rd quartiles of the tolerance range = {x : 3.99 ≤ x ≤ 4.01}.

(c) Input Parameters of a Model
A = the event that a batch of customers comes after 1:00 pm = {ω : t_a > 13}.
B = the event that a batch of customers waits more than 15 minutes before getting a table = {ω : t_t − t_a > 0.25}.
C = the event that a batch is of three or more customers = {ω : n ≥ 3}.

(d) Rolling two dice
A = the event that the first die is even and the second die is less than the first = {(i_1, i_2) : i_1 ∈ {2, 4, 6} and i_1 > i_2}.
B = the event that the sum of the two dice is greater than or equal to 5 = {(i_1, i_2) : i_1 + i_2 ≥ 5}.
C = the event that the difference between the two dice is greater than 7 = {(i_1, i_2) : max(i_1, i_2) − min(i_1, i_2) > 7} = ∅. Note that C is an impossible event.

(e) Flipping coins indefinitely
A = the event that the number of heads is no less than the number of tails = {(i_1, i_2, ...) : the number of j such that i_j is a head ≥ the number of j such that i_j is a tail}.
B = the event that the first two flips are heads and the fourth flip is a tail = {(i_1, i_2, ...) : i_1 = i_2 = head, i_4 = tail}.

We can generate events by the set operations union (∪), intersection (∩), and complement (·)^c. Let A and B be events of Ω. Then A ∪ B, A ∩ B, A^c, and any sets formed by countable set operations on A and B are subsets, and hence events, of Ω.

We said in the last part of Section 1.1 that OR II is a course studying (sequences of) random variables. Here we will give more details. Remember that variables are nothing more than the x, y, or z that we use to denote unknowns in the problems of our high-school algebra. Random variables X, Y, or Z are of a similar nature. The difference is that a real variable x can only take one value, while before we know the exact value of a random variable X, it may take different values (with possibly different probabilities). For example, there is only one result in rolling a fair die; however, before we observe the outcome, face i shows up with probability 1/6, i = 1 to 6.

How do we define a function f when we work with real variables? One common definition is that f is a mapping from ℝ to ℝ. The definition of a random variable is an analogue of real-valued functions: a random variable is a function mapped from the sample space Ω to the real line ℝ. It implies that to define random variables, we must first have Ω (an experiment and its sample space). A random variable X is defined by the function that takes the real value X(ω) for every ω ∈ Ω.
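This definition can be made concrete in code: a random variable is literally a real-valued function defined on the sample space. The sketch below (our own informal illustration, in Python) uses the one-die sample space.

```python
# A random variable is a real-valued function on the sample space.
# Omega is the sample space for one roll of a die.
Omega = {1, 2, 3, 4, 5, 6}

def N(omega):
    """The number shown in a roll: N maps Omega to the real line."""
    return float(omega)

def Y(omega):
    """Indicator random variable: 1 if the roll is odd, 0 otherwise."""
    return 1.0 if omega % 2 == 1 else 0.0

# Tabulate the value of each random variable at every outcome.
table_N = {omega: N(omega) for omega in sorted(Omega)}
table_Y = {omega: Y(omega) for omega in sorted(Omega)}
print(table_N)
print(table_Y)
```

The tables list X(ω) for every ω ∈ Ω, which is exactly what the definition asks for.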

For notational simplicity, we usually do not spell out Ω when its content is clear or when explicitly spelling it out is not necessary for solving a problem. Similarly, we hide ω as long as the actual content of Ω is clear in the problem.

Example 1.4.5 (Random Variables; Cont'd of Example 1.4.3)
In the following, N, X, Y, and Z are random variables defined on their own sample spaces.

(a) Rolling a die
Let N be the number shown in a roll. Remember that Ω = {1, 2, 3, 4, 5, 6}. Note that N(1) = 1, N(2) = 2, N(3) = 3, N(4) = 4, N(5) = 5, and N(6) = 6. Clearly, N: Ω → ℝ is a function mapped from Ω to ℝ.

Let X = 2N − 3.5. X is a function mapped from Ω to ℝ, since it is defined through N, which is a function mapped from Ω to ℝ. Check that X: Ω → ℝ is such that X(1) = −1.5, X(2) = 0.5, X(3) = 2.5, X(4) = 4.5, X(5) = 6.5, X(6) = 8.5. It is clear that X is a function mapped from Ω to the real line.

Let Y = 1 if the roll gives an odd number, and 0 otherwise. Then Y: Ω → ℝ is such that Y(1) = Y(3) = Y(5) = 1 and Y(2) = Y(4) = Y(6) = 0. Y is an indicator variable (function) that indicates whether an event occurs or not.

Let Z = the square root of the number shown in a roll. Then Z: Ω → ℝ is such that Z(ω) = √ω, for ω ∈ Ω.

(b) Quality Control
Let X be the diameter of a drilled hole. Then X: Ω → ℝ is such that X(ω) = ω, for ω ∈ Ω.

Let Y be the cost spent in reworking a defective hole, given that each rework costs $3. Then Y: Ω → ℝ is such that Y(ω) = 3 if ω < 3.98 or ω > 4.02, and Y(ω) = 0 otherwise, for ω ∈ Ω. Note that ω is a dummy variable that can be replaced by any symbol, as long as that symbol is defined to be an element of Ω.

(c) Input Parameters of a Model
Let X be the number of customers in a batch if the batch spends no more than an hour in the cafeteria, and be equal to 489 otherwise. Then X: Ω → ℝ is such that X(ω) = n(ω) if t_d(ω) − t_a(ω) ≤ 1, and X(ω) = 489 otherwise, for ω ∈ Ω.

(d) Rolling two dice
Let X be the sum of the two throws. Then X: Ω → ℝ is such that X((i_1, i_2)) = i_1 + i_2, for (i_1, i_2) ∈ Ω.

Let Y be the difference between the larger and the smaller numbers. Then Y: Ω → ℝ is such that Y((i_1, i_2)) = max(i_1, i_2) − min(i_1, i_2), for (i_1, i_2) ∈ Ω.

(e) Flipping coins indefinitely
Let X be the total number of heads in the second through the fifth flips, and let I_j be the indicator function that is equal to 1 if the jth flip is a head, and 0 otherwise. (Could you write out the functional form of I_j?) Then X: Ω → ℝ is such that X(ω) = Σ_{j=2}^{5} I_j(ω), for ω ∈ Ω.

As we said above, when the underlying sample space is clear, we can save effort by not explicitly listing out Ω and ω. For example, the random variables X in (c) and (e) of Example 1.4.5 can be written, respectively, as X = n if t_d − t_a ≤ 1 (and 489 otherwise), and X = Σ_{j=2}^{5} I_j. In fact, most applied probability books prefer this form to the lengthy form with explicit listing of Ω and ω.

Events are generated by random variables in their corresponding Ω through the operators >, ≥, <, ≤, and =. For example, in part (c) of Example 1.4.5, {ω : X(ω) > 0} corresponds to {ω : t_d(ω) − t_a(ω) ≤ 1}; in part (e) of the same example, {X = 1} is the event {there is 1 head in the second to the fifth flips}.

Since random variables and events are so intimately related, they will be studied in parallel. You will see in later chapters that the discussion of random variables and events goes hand in hand: the (conditional) expectation of a random variable comes together with the (conditional) probability of an event. Again, for notational simplicity, we usually hide the dependence of events on ω. For example, we write {X > 0} for {ω : X(ω) > 0} and {t_d − t_a ≤ 1} for {ω : t_d(ω) − t_a(ω) ≤ 1}.

Don't you feel uncomfortable that up to now we have not used nor discussed probability?

1.5. Probability of Events

As we mentioned above, it is impossible to compute probabilities purely by counting. For example, there may be uncountably many outcomes in Ω. The following is a standard, mathematically consistent way to define probabilities.

The probability of events is a (set) function [i.e., a function defined on sets] P that has the following properties:

(a) P(A) ≥ 0 for any event A ⊆ Ω.
(b) If the A_i's are mutually exclusive subsets of Ω, i.e., A_i ⊆ Ω and A_i ∩ A_j = ∅ for i ≠ j, then P(A_1 ∪ A_2 ∪ ...) = P(A_1) + P(A_2) + ...
(c) P(Ω) = 1.

Exercise (Further Properties of Probability)
Let A and B be subsets of Ω. Show that
(a) P(A^c) = 1 − P(A).
(b) P(∅) = 0.
(c) if A ⊆ B, then P(A) ≤ P(B).
(d) 0 ≤ P(A) ≤ 1.
(e) P(A ∪ B) = P(A) + P(B) − P(A ∩ B).

The probabilities of events on a sample space depend on how the events are generated by the random mechanism of the sample space. For simple problems, we just use the intuitive idea of fairness among different outcomes.

Example (Probabilities from Fairness)

(a) Rolling a die
There are six faces on a (standard) die. Provided that we roll randomly, there is no reason to believe that one face has more chance than another. So it suffices to determine the probability of an event by

P(event) = (total # of outcomes relevant to the event) / (total # of possible outcomes).

For example, P(an even number shows up) = 3/6 = 0.5.

(b) Quality Control
Suppose that a drilled hole is equally likely to be anywhere in (3.97, 4.02), while the acceptable range is (3.98, 4.02). Since the length of a line segment reflects the number of possible outcomes, we may intuitively put the probability of an event as

P(event) = (length of the range of outcomes relevant to the event) / (length of the possible range).

For example, P(a hole is defective) = (3.98 − 3.97)/(4.02 − 3.97) = 0.2.

Remark. We will soon see that in complex problems the probabilities of events are not found from the intuitive fairness idea. In fact, there may be no fairness at all; similar events can have substantially different probabilities.

We work with random variables. How can we compute the probability of a random variable? To do so, we find the probability of the event that is generated by the random variable (through the equality and inequality operators). For example, to find the probability that X is equal to 3, we determine the probability of the set {ω : X(ω) = 3}; similarly, to find the probability that X is greater than t, we determine the probability of the set {ω : X(ω) > t}. To simplify the exposition, we usually hide ω in the above expressions and write P(X = 3) and P(X > t).

How do we find the probabilities of complex events? Obviously not by counting as in the example above. Real-life problems have so many variables that you could not finish counting a small portion of a problem before you graduate. We may count with the help of computers, which is what simulation is for. To prepare you for simulation, you need basic knowledge of stochastic models, which is one motivation of OR II. The process of finding probabilities of complex events can be simplified by noting the structure of the underlying problem and by manipulating the rules of probability theory and analysis. A lot of the calculation of probabilities (and expected values) will be simplified by conditioning. See Chapter 3.

1.6. Discrete and Continuous Random Variables, Probability Mass Functions, Density Functions, and Cumulative Distribution Functions

Random variables that you encounter in undergraduate textbooks can generally be classified as either discrete or continuous. A discrete random variable has non-zero probability on a countable number of points, while a continuous random variable has non-zero probability on an uncountable number of points. (Strictly speaking, a continuous random variable has non-zero density, rather than probability, on an uncountable number of points.) As a result, the sample space of a discrete random variable has a countable number of elements [for example, Ω in parts (a), (d), and (e) of Example 1.4.3], and the sample space of a continuous random variable is uncountable [for example, Ω in parts (b) and (c) of Example 1.4.3].

How do we find the probabilities of events such as {X = 3} or {X ≤ 4.69}? They are determined by a probability mass function for a discrete random variable and by a density function for a continuous random variable.

The probability mass function (p.m.f.) p of a discrete random variable X is the function satisfying
(a) p(i) = P(X = i), for all i;
(b) Σ_i p(i) = 1.
By definition, p(i) ≥ 0. For simplicity, we may use p_i to denote p(i).

Example (Discrete Random Variables)
Let Ω = {1, 2, 3, 4, 5, 6}. The following p's are p.m.f.'s on Ω.

(a) p_1 = p_2 = p_3 = p_4 = p_5 = p_6 = 1/6, and all other p_i's are equal to 0. This is a (discrete) uniform distribution on Ω.
(b) p_2 = 0.5, p_3 = 0.05, p_4 = 0.3, p_5 = 0.05, p_6 = 0.1, and all other p_i's are equal to 0.
(c) p_4 = 1 and all other p_i's are equal to 0. The random variable has a point mass at 4.

This example shows that for the same Ω, there can be many (in fact, uncountably many) different probability mass functions defined on it.

The density function f of a continuous random variable X is the function satisfying
(a) f(x) ≥ 0, for all x;
(b) ∫ f(x) dx = 1, the integral taken over the whole real line;
(c) P(X ≤ t) = ∫_{x ≤ t} f(x) dx, for all t.

Example (Continuous Random Variables)
Let Ω = (0, 1), the line segment from 0 to 1. The following f's are density functions on Ω.
(a) f(x) = 1 for 0 < x < 1, and f(x) = 0 otherwise. This is a uniform distribution on Ω.
(b) f(x) = 4x for 0 < x < 0.5, and f(x) = 4 − 4x for 0.5 ≤ x < 1. This is a triangular distribution on Ω.

The cumulative distribution function F of a random variable, whether discrete or continuous, is defined as

F(t) = P(X ≤ t) = Σ_{i ≤ t} p_i for a discrete random variable, and F(t) = P(X ≤ t) = ∫_{−∞}^{t} f(x) dx for a continuous random variable.

Exercise (Cumulative Distribution Functions)
Find the cumulative distribution functions of the p.m.f.'s and density functions in the two examples above.

It can be shown that given the cumulative distribution function F (of a random variable X), the corresponding p.m.f. (for a discrete X) or density function (for a continuous X) is completely determined. Similarly, given a p.m.f. or a density function, the corresponding F is also completely determined. From here onwards, when we refer to the distribution of a random variable, we may talk about its p.m.f., density function, or cumulative distribution function, whichever exists and whichever is discussed in the given context.

1.7. Mean, Moments, and Variance of Random Variables

The mean (expectation) of a random variable X is

μ = E(X) = Σ_i i p_i for a discrete random variable, and μ = E(X) = ∫ x f(x) dx for a continuous random variable.

It is a measure of the central tendency of the random variable.

The rth moment, r > 0, of a random variable X is

E(X^r) = Σ_i i^r p_i for a discrete random variable, and E(X^r) = ∫ x^r f(x) dx for a continuous random variable.

The first moment is the mean of the random variable.

The variance σ² of a random variable X is

σ² = V(X) = E[(X − μ)²] = Σ_i (i − μ)² p_i for a discrete random variable, and σ² = ∫ (x − μ)² f(x) dx for a continuous random variable.

It is a measure of the dispersion of the random variable.

Expectation is a linear operation: let μ_x and μ_y be the means of X and Y, respectively, and let a and b be two real numbers; then E(aX + bY) = a μ_x + b μ_y.

The expected value of a function of a random variable can be found easily. Let h(x) be a real-valued function. Then E[h(X)] = Σ_i h(i) p_i if X is discrete, and E[h(X)] = ∫ h(x) f(x) dx if X is continuous.

Exercise (Properties of Mean, Moments, Variance, and Covariance)
(a) Linear Transform of Variances
Let Y = aX + b, where a and b are two real numbers. Show that V(Y) = a² V(X).
(b) Relationship between the Second Moment and the Variance
Show that E(X²) = V(X) + E²(X).
(c) Covariance
Show that V(X + Y) = V(X) + V(Y) + 2 Cov(X, Y), where Cov(X, Y) = E[(X − μ_x)(Y − μ_y)] is the covariance of X and Y.

1.8. Conditional Probability, Bayes' Formula, and Independence of Events

Let A and B be two events defined on the same Ω such that P(B) > 0. Then the probability of A conditioning on (the occurrence of) B is

P(A|B) = P(A ∩ B) / P(B).

Example (Rolling Two Dice)
Two dice are rolled. Let A = {the sum is less than 5} and B = {the first number is 2}. We can find P(A|B) either by counting or by conditional probability.

(a) Counting
Ω = {(m, n) : 1 ≤ m, n ≤ 6} has 36 elements.
A = {(1, 1), (1, 2), (1, 3), (2, 1), (2, 2), (3, 1)};
B = {(2, 1), (2, 2), (2, 3), (2, 4), (2, 5), (2, 6)};
A ∩ B = {(2, 1), (2, 2)} has two elements (outcomes).
Given the occurrence of B, there is no preference for any of its elements over the others. So the outcome belongs to A with probability 2/6 = 1/3, which is P(A|B).

(b) Mathematical Definition
Clearly P(B) = 1/6 (why?). A ∩ B = {(2, 1), (2, 2)}, which means that P(A ∩ B) = 2/36 = 1/18. Thus P(A|B) = P(A ∩ B)/P(B) = (1/18)/(1/6) = 1/3.

In this simple example, there is not much difference between counting and conditional probability. In counting, P(A|B) is found from 2/6; in conditional probability, P(A|B) is found from (1/18)/(1/6) = (2/36)/(6/36), i.e., the conditional probability is simply obtained by dividing the numerator and the denominator in counting by the total number of elements in Ω. In a way this is true for all conditional probabilities. However, the power of conditioning to simplify the calculation of probabilities and expectations is tremendous. See Chapter 3.

Let the B_i's be mutually exclusive and exhaustive sets of Ω, i.e., ∪_i B_i = Ω and B_i ∩ B_j = ∅ for i ≠ j. Assume that P(B_i) > 0. Bayes' formula shows how to get P(B_i|A) from the P(A|B_j)'s:

P(B_i|A) = P(B_i ∩ A)/P(A) = P(B_i ∩ A) / Σ_j P(B_j ∩ A) = P(B_i) P(A|B_i) / Σ_j P(B_j) P(A|B_j).

Example

A gadget is produced on two production lines L_1 and L_2. The production rates of L_1 and L_2 are 200 pieces/hour and 300 pieces/hour, respectively. The defective rates of L_1 and L_2 are 0.01 and 0.02, respectively. Given that a gadget is defective, find the probability that it was produced by L_1.

Sol. Let A = {the gadget is defective} and B_i = {the gadget is produced by L_i}, i = 1, 2. Then P(A|B_1) = 0.01, P(A|B_2) = 0.02, P(B_1) = 0.4, and P(B_2) = 0.6. The probability that the gadget was produced by L_1 is

P(B_1|A) = P(B_1 ∩ A)/P(A) = P(B_1) P(A|B_1) / [P(B_1) P(A|B_1) + P(B_2) P(A|B_2)] = (0.4)(0.01) / [(0.4)(0.01) + (0.6)(0.02)] = 0.25.

Think twice about this deceptively simple example. What is the big deal about it? The calculation is so simple that even a 12-year-old can plug numbers into the formula to get 0.25. Getting a correct number is not the main point of such a problem (in fact, not the main point of studying). Rather, the essence is to build up the ability to formulate a problem, which, in this example, means defining A and the B_i's correctly so as to make the calculation trivial. Bear in mind that modeling is the process of defining variables and representing their interactions logically and/or analytically. Just knowing how to use CPLEX (ARENA) to solve a linear program (to simulate) does not necessarily mean that you know linear programming (simulation).

Now we are going to introduce another important concept in probability: independence. Vaguely speaking, two random objects are independent of each other if gaining knowledge about one object does not affect your assessment of any probabilistic evaluation of the other object. This idea can be formalized. We first study the independence of events; later, we will generalize it to the independence of random variables.

Two events A and B are independent of each other if P(A ∩ B) = P(A)P(B). Such a relation among events (generalized later to random variables) simplifies calculation.

Example (Independent Events)
Six dice are rolled independently. Let A_i = {face i shows up in rolling the ith die}, i = 1 to 6. Find P(∩_{i=1}^{6} A_i).

Sol.
Since the rolls are independent, P(∩_{i=1}^{6} A_i) = Π_{i=1}^{6} P(A_i) = (1/6)^6.

See how independence simplifies the calculation. We do not need to construct Ω for rolling the six dice and count the number of elements. We get the correct probability of the event by following the standard rules of independence.

An equivalent definition of the independence of two events A and B is that P(A|B) = P(A) (when P(B) > 0). Intuitively, this definition says that events A and B are independent if the occurrence of B does not affect the probability of occurrence of A. Note that the above relation is symmetric with respect to A and B, i.e., P(A|B) = P(A) is equivalent to P(B|A) = P(B) (when P(A) > 0).

1.9. Joint Distributions, Marginal Distributions, and Independence of Random Variables

The distribution of a discrete (resp. continuous) random variable can be specified by the distribution function and the p.m.f. (resp. density function) of the random variable. In real-life stochastic models, there are usually many random variables and we need to consider a joint distribution. Let X and Y be two random variables defined on the same Ω. Their joint distribution function is F(x, y) = P(X ≤ x, Y ≤ y), for all real x and y. If X and Y are discrete random variables, they have a joint p.m.f. p_{a,b} = P(X = a, Y = b). If X and Y are continuous random variables, they have a joint density function f(x, y) = ∂²F(x, y)/∂x∂y. By definition, for all x and y,

F(x, y) = Σ_{i ≤ x} Σ_{j ≤ y} p_{i,j} for discrete X and Y, and F(x, y) = ∫_{t ≤ y} ∫_{s ≤ x} f(s, t) ds dt for continuous X and Y.

The marginal distribution of X can be found from the joint distribution by setting the limit for Y to ∞: P(X ≤ x) = P(X ≤ x, Y < ∞) = F(x, ∞). Similarly, the marginal distribution of Y is found from P(Y ≤ y) = P(X < ∞, Y ≤ y) = F(∞, y). The marginal distributions can also be obtained from the joint p.m.f. or density function. Check that if f(x, y) is the joint density of X and Y and f_X(x) is the marginal density of X, then f_X(x) = ∫ f(x, t) dt. Similarly, the marginal density of Y is f_Y(y) = ∫ f(s, y) ds.

Example (Joint and Marginal Distributions of Discrete Random Variables)
A bin contains three white and two black balls. The white balls are numbered 1, 2, and 3, while the black balls are numbered 4 and 5. Let X be the number of a ball randomly picked from the bin; Y = 1 if the ball is black, and Y = 0 otherwise.
(a) Find the joint distribution of (X, Y).
(b) Find the marginal distribution of X.
(c) Find the marginal distribution of Y.

Sol.
(a) In terms of the outcomes, the probability distribution is:

outcome ω:   W1      W2      W3      B4      B5
(X, Y)(ω):   (1, 0)  (2, 0)  (3, 0)  (4, 1)  (5, 1)
P({ω}):      0.2     0.2     0.2     0.2     0.2

In terms of the random variables, the joint p.m.f. P(X = x, Y = y) is:

         x = 1   x = 2   x = 3   x = 4   x = 5
y = 0    0.2     0.2     0.2     0       0
y = 1    0       0       0       0.2     0.2

(b) P(X = 1) = Σ_y P(X = 1, Y = y) = 0.2. Similarly, P(X = 2) = P(X = 3) = P(X = 4) = P(X = 5) = 0.2.
(c) P(Y = 0) = Σ_x P(X = x, Y = 0) = 0.6, and P(Y = 1) = Σ_x P(X = x, Y = 1) = 0.4.

Example (Joint and Marginal Distributions of Continuous Random Variables)
The joint density function of X and Y is f(x, y) = (6/5)(x + y²), 0 < x, y < 1.
(a) Find the marginal distribution of X.
(b) Find the marginal distribution of Y.

Sol.
(a) f_X(x) = ∫_0^1 (6/5)(x + y²) dy = (6x + 2)/5, 0 < x < 1.
(b) f_Y(y) = ∫_0^1 (6/5)(x + y²) dx = (6y² + 3)/5, 0 < y < 1.

We have seen in Section 1.8 how the independence of events simplifies calculating their probabilities. This can be generalized to random variables, since events are generated by random variables through the operators ≤, ≥, <, >, and =. Two random variables X and Y are independent if their joint distribution satisfies F(x, y) = P(X ≤ x, Y ≤ y) = P(X ≤ x)P(Y ≤ y) for all x and y. If X and Y are discrete, independence is equivalent to P(X = x, Y = y) = P(X = x)P(Y = y) for all x and y. If X and Y are continuous, independence is equivalent to the joint density function satisfying f(x, y) = f_X(x) f_Y(y) for all x and y.

This is a direct generalization of the independence of events. A term such as {X ≤ x, Y ≤ y} ({X = x, Y = y}, etc.) is indeed the intersection {X ≤ x} ∩ {Y ≤ y} ({X = x} ∩ {Y = y}, etc.). The definition basically says that for random variables X and Y to be independent, any pair of events {X ≤ x} and {Y ≤ y} ({X = x} and {Y = y}, etc.) generated by X and Y must be independent (events).

The independence of random variables can also be expressed by the more intuitively appealing idea that information about one random variable does not affect the distribution of the other variable. For example, when both X and Y are discrete, they are independent if P(X = x | Y = y) = P(X = x) for all x, y. Again, the above expression is symmetric: it is equivalent to P(Y = y | X = x) = P(Y = y).

Example (Independent Random Variables)
Two coins are flipped (independently). Let T be the number of tails and H be the number of heads in the two flips; let H_i be the number of heads in the ith flip, i = 1, 2. We are going to check which pairs of these random variables are independent of each other.

(a) T and H
Since H + T = 2, they cannot be independent of each other. For example, P(T = 1, H = 0) = 0 ≠ P(T = 1) P(H = 0).

(b) H and H_1
They cannot be independent of each other. Given H = 0, H_1 must be 0, which means that P(H = 0, H_1 = 1) = 0 ≠ P(H = 0) P(H_1 = 1). By similar reasoning, H and H_2 are dependent; T and H_1 are dependent; T and H_2 are dependent.

(c) H_1 and H_2
Since the flips are independent, H_1 and H_2 must be independent random variables. To be rigorous, we go through the mathematics. Ω = {(head, head), (head, tail), (tail, head), (tail, tail)}, and each element has an equal chance of occurrence.
P(H_1 = 1, H_2 = 1) = 0.25 = P(H_1 = 1) P(H_2 = 1),
P(H_1 = 1, H_2 = 0) = 0.25 = P(H_1 = 1) P(H_2 = 0),
P(H_1 = 0, H_2 = 1) = 0.25 = P(H_1 = 0) P(H_2 = 1),
P(H_1 = 0, H_2 = 0) = 0.25 = P(H_1 = 0) P(H_2 = 0),
which confirms that H_1 and H_2 are independent.

Note that in this example the dependence and independence of the random variables are more or less argued by noting the interactions among the random variables. Mathematics is only used to confirm our intuition. For simple problems such as this one, our qualitative argument is sufficient. Mathematics is reserved for more complex problems. In fact, noting the interactions among random variables is so essential for modeling that we had better sharpen our ability in it.

One may ask where the boundary between simple and complex problems lies. The answer is that there is no well-defined boundary at all. Experts can use intuition to shorten a five-page mathematical proof to half a page [though sometimes, some experts may force a (wrong) result by (wrong) intuition]. In any case, one should equip oneself with both intuition and mathematics.

Example (Cont'd of the Continuous Joint-Distribution Example: Independence of Random Variables)
The two random variables X and Y in the continuous example above are dependent, because f(x, y) = (6/5)(x + y²) ≠ [(6x + 2)/5][(6y² + 3)/5] = f_X(x) f_Y(y) for at least some x and y.

We say that X and Y are independent and identically distributed (i.i.d.) if they are independent and have the same distribution.

Example (A Property of Independent Random Variables)
Let X and Y be independent random variables. Show that E(XY) = E(X)E(Y).

Sol. We give a partial proof that is sufficient for our purpose. Consider the case where X and Y are discrete random variables. By independence, P(X = x, Y = y) = P(X = x)P(Y = y), so

E(XY) = Σ_y Σ_x xy P(X = x, Y = y) = Σ_y y P(Y = y) Σ_x x P(X = x) = Σ_y y P(Y = y) E(X) = E(X) Σ_y y P(Y = y) = E(X)E(Y).

In fact, if X and Y are independent, then for any real-valued functions g and h, E[g(X)h(Y)] = E[g(X)]E[h(Y)].

Example (Covariance of Independent Random Variables)
If X and Y are independent, show that their covariance is equal to zero.

Sol. Cov(X, Y) = E[(X − μ_x)(Y − μ_y)] = E(X − μ_x) E(Y − μ_y) = 0. The second equality makes use of the independence of X and Y, as stated in the previous example.

Exercise
Let X and Y be two independent random variables, and let G and H be two real-valued functions. Show that G(X) and H(Y) are independent.

1.10. Convolution

Let X and Y be two independent random variables and Z = X + Y. Then the distribution of Z is called the convolution of the distributions of X and Y. It can be expressed in terms of p.m.f.'s, density functions, or cumulative distribution functions, whichever is appropriate. We will discuss how to find the distribution of Z in later chapters.
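Although the details are deferred to later chapters, for discrete random variables the idea can be previewed in a short sketch. The helper `convolve_pmf` below is our own illustration (not from the notes): for independent X and Y, P(Z = z) = Σ_x P(X = x) P(Y = z − x).

```python
def convolve_pmf(p, q):
    """Convolution of two p.m.f.'s (dicts mapping value -> probability):
    returns the p.m.f. of Z = X + Y for independent X and Y."""
    r = {}
    for x, px in p.items():
        for y, qy in q.items():
            # Independence: P(X = x, Y = y) = P(X = x) P(Y = y).
            r[x + y] = r.get(x + y, 0.0) + px * qy
    return r

die = {i: 1 / 6 for i in range(1, 7)}    # p.m.f. of one fair die
sum_two_dice = convolve_pmf(die, die)    # p.m.f. of the sum of two dice
print(sum_two_dice[7])                   # 6/36, the largest mass
```

Compare with Example 1.4.5(d): the sum of two dice puts mass 6/36 on 7, since six of the 36 equally likely outcomes (i_1, i_2) sum to 7.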

Final Remark of Chapter 1
The material in Sections 1.4 to 1.10 can be found in Ross [2000] and in Walpole and Myers [1989].


Module 3 LOSSY IMAGE COMPRESSION SYSTEMS. Version 2 ECE IIT, Kharagpur Module 3 LOSSY IMAGE COMPRESSION SYSTEMS Verson ECE IIT, Kharagpur Lesson 6 Theory of Quantzaton Verson ECE IIT, Kharagpur Instructonal Objectves At the end of ths lesson, the students should be able to:

More information

The Multiple Classical Linear Regression Model (CLRM): Specification and Assumptions. 1. Introduction

The Multiple Classical Linear Regression Model (CLRM): Specification and Assumptions. 1. Introduction ECONOMICS 5* -- NOTE (Summary) ECON 5* -- NOTE The Multple Classcal Lnear Regresson Model (CLRM): Specfcaton and Assumptons. Introducton CLRM stands for the Classcal Lnear Regresson Model. The CLRM s also

More information

First Year Examination Department of Statistics, University of Florida

First Year Examination Department of Statistics, University of Florida Frst Year Examnaton Department of Statstcs, Unversty of Florda May 7, 010, 8:00 am - 1:00 noon Instructons: 1. You have four hours to answer questons n ths examnaton.. You must show your work to receve

More information

princeton univ. F 13 cos 521: Advanced Algorithm Design Lecture 3: Large deviations bounds and applications Lecturer: Sanjeev Arora

princeton univ. F 13 cos 521: Advanced Algorithm Design Lecture 3: Large deviations bounds and applications Lecturer: Sanjeev Arora prnceton unv. F 13 cos 521: Advanced Algorthm Desgn Lecture 3: Large devatons bounds and applcatons Lecturer: Sanjeev Arora Scrbe: Today s topc s devaton bounds: what s the probablty that a random varable

More information

A be a probability space. A random vector

A be a probability space. A random vector Statstcs 1: Probablty Theory II 8 1 JOINT AND MARGINAL DISTRIBUTIONS In Probablty Theory I we formulate the concept of a (real) random varable and descrbe the probablstc behavor of ths random varable by

More information

= z 20 z n. (k 20) + 4 z k = 4

= z 20 z n. (k 20) + 4 z k = 4 Problem Set #7 solutons 7.2.. (a Fnd the coeffcent of z k n (z + z 5 + z 6 + z 7 + 5, k 20. We use the known seres expanson ( n+l ( z l l z n below: (z + z 5 + z 6 + z 7 + 5 (z 5 ( + z + z 2 + z + 5 5

More information

2E Pattern Recognition Solutions to Introduction to Pattern Recognition, Chapter 2: Bayesian pattern classification

2E Pattern Recognition Solutions to Introduction to Pattern Recognition, Chapter 2: Bayesian pattern classification E395 - Pattern Recognton Solutons to Introducton to Pattern Recognton, Chapter : Bayesan pattern classfcaton Preface Ths document s a soluton manual for selected exercses from Introducton to Pattern Recognton

More information

CS 798: Homework Assignment 2 (Probability)

CS 798: Homework Assignment 2 (Probability) 0 Sample space Assgned: September 30, 2009 In the IEEE 802 protocol, the congeston wndow (CW) parameter s used as follows: ntally, a termnal wats for a random tme perod (called backoff) chosen n the range

More information

Introduction to Random Variables

Introduction to Random Variables Introducton to Random Varables Defnton of random varable Defnton of random varable Dscrete and contnuous random varable Probablty functon Dstrbuton functon Densty functon Sometmes, t s not enough to descrbe

More information

Econ107 Applied Econometrics Topic 3: Classical Model (Studenmund, Chapter 4)

Econ107 Applied Econometrics Topic 3: Classical Model (Studenmund, Chapter 4) I. Classcal Assumptons Econ7 Appled Econometrcs Topc 3: Classcal Model (Studenmund, Chapter 4) We have defned OLS and studed some algebrac propertes of OLS. In ths topc we wll study statstcal propertes

More information

Statistics and Probability Theory in Civil, Surveying and Environmental Engineering

Statistics and Probability Theory in Civil, Surveying and Environmental Engineering Statstcs and Probablty Theory n Cvl, Surveyng and Envronmental Engneerng Pro. Dr. Mchael Havbro Faber ETH Zurch, Swtzerland Contents o Todays Lecture Overvew o Uncertanty Modelng Random Varables - propertes

More information

Linear, affine, and convex sets and hulls In the sequel, unless otherwise specified, X will denote a real vector space.

Linear, affine, and convex sets and hulls In the sequel, unless otherwise specified, X will denote a real vector space. Lnear, affne, and convex sets and hulls In the sequel, unless otherwse specfed, X wll denote a real vector space. Lnes and segments. Gven two ponts x, y X, we defne xy = {x + t(y x) : t R} = {(1 t)x +

More information

Lectures - Week 4 Matrix norms, Conditioning, Vector Spaces, Linear Independence, Spanning sets and Basis, Null space and Range of a Matrix

Lectures - Week 4 Matrix norms, Conditioning, Vector Spaces, Linear Independence, Spanning sets and Basis, Null space and Range of a Matrix Lectures - Week 4 Matrx norms, Condtonng, Vector Spaces, Lnear Independence, Spannng sets and Bass, Null space and Range of a Matrx Matrx Norms Now we turn to assocatng a number to each matrx. We could

More information

Limited Dependent Variables

Limited Dependent Variables Lmted Dependent Varables. What f the left-hand sde varable s not a contnuous thng spread from mnus nfnty to plus nfnty? That s, gven a model = f (, β, ε, where a. s bounded below at zero, such as wages

More information

THE SUMMATION NOTATION Ʃ

THE SUMMATION NOTATION Ʃ Sngle Subscrpt otaton THE SUMMATIO OTATIO Ʃ Most of the calculatons we perform n statstcs are repettve operatons on lsts of numbers. For example, we compute the sum of a set of numbers, or the sum of the

More information

Introduction to Vapor/Liquid Equilibrium, part 2. Raoult s Law:

Introduction to Vapor/Liquid Equilibrium, part 2. Raoult s Law: CE304, Sprng 2004 Lecture 4 Introducton to Vapor/Lqud Equlbrum, part 2 Raoult s Law: The smplest model that allows us do VLE calculatons s obtaned when we assume that the vapor phase s an deal gas, and

More information

Chapter 8 Indicator Variables

Chapter 8 Indicator Variables Chapter 8 Indcator Varables In general, e explanatory varables n any regresson analyss are assumed to be quanttatve n nature. For example, e varables lke temperature, dstance, age etc. are quanttatve n

More information

7. Multivariate Probability

7. Multivariate Probability 7. Multvarate Probablty Chrs Pech and Mehran Saham May 2017 Often you wll work on problems where there are several random varables (often nteractng wth one another). We are gong to start to formally look

More information

Linear Regression Analysis: Terminology and Notation

Linear Regression Analysis: Terminology and Notation ECON 35* -- Secton : Basc Concepts of Regresson Analyss (Page ) Lnear Regresson Analyss: Termnology and Notaton Consder the generc verson of the smple (two-varable) lnear regresson model. It s represented

More information

THE ROYAL STATISTICAL SOCIETY 2006 EXAMINATIONS SOLUTIONS HIGHER CERTIFICATE

THE ROYAL STATISTICAL SOCIETY 2006 EXAMINATIONS SOLUTIONS HIGHER CERTIFICATE THE ROYAL STATISTICAL SOCIETY 6 EXAMINATIONS SOLUTIONS HIGHER CERTIFICATE PAPER I STATISTICAL THEORY The Socety provdes these solutons to assst canddates preparng for the eamnatons n future years and for

More information

x = , so that calculated

x = , so that calculated Stat 4, secton Sngle Factor ANOVA notes by Tm Plachowsk n chapter 8 we conducted hypothess tests n whch we compared a sngle sample s mean or proporton to some hypotheszed value Chapter 9 expanded ths to

More information

Using T.O.M to Estimate Parameter of distributions that have not Single Exponential Family

Using T.O.M to Estimate Parameter of distributions that have not Single Exponential Family IOSR Journal of Mathematcs IOSR-JM) ISSN: 2278-5728. Volume 3, Issue 3 Sep-Oct. 202), PP 44-48 www.osrjournals.org Usng T.O.M to Estmate Parameter of dstrbutons that have not Sngle Exponental Famly Jubran

More information

Linear Approximation with Regularization and Moving Least Squares

Linear Approximation with Regularization and Moving Least Squares Lnear Approxmaton wth Regularzaton and Movng Least Squares Igor Grešovn May 007 Revson 4.6 (Revson : March 004). 5 4 3 0.5 3 3.5 4 Contents: Lnear Fttng...4. Weghted Least Squares n Functon Approxmaton...

More information

Foundations of Arithmetic

Foundations of Arithmetic Foundatons of Arthmetc Notaton We shall denote the sum and product of numbers n the usual notaton as a 2 + a 2 + a 3 + + a = a, a 1 a 2 a 3 a = a The notaton a b means a dvdes b,.e. ac = b where c s an

More information

STAT 3008 Applied Regression Analysis

STAT 3008 Applied Regression Analysis STAT 3008 Appled Regresson Analyss Tutoral : Smple Lnear Regresson LAI Chun He Department of Statstcs, The Chnese Unversty of Hong Kong 1 Model Assumpton To quantfy the relatonshp between two factors,

More information

APPENDIX A Some Linear Algebra

APPENDIX A Some Linear Algebra APPENDIX A Some Lnear Algebra The collecton of m, n matrces A.1 Matrces a 1,1,..., a 1,n A = a m,1,..., a m,n wth real elements a,j s denoted by R m,n. If n = 1 then A s called a column vector. Smlarly,

More information

FREQUENCY DISTRIBUTIONS Page 1 of The idea of a frequency distribution for sets of observations will be introduced,

FREQUENCY DISTRIBUTIONS Page 1 of The idea of a frequency distribution for sets of observations will be introduced, FREQUENCY DISTRIBUTIONS Page 1 of 6 I. Introducton 1. The dea of a frequency dstrbuton for sets of observatons wll be ntroduced, together wth some of the mechancs for constructng dstrbutons of data. Then

More information

Stanford University CS359G: Graph Partitioning and Expanders Handout 4 Luca Trevisan January 13, 2011

Stanford University CS359G: Graph Partitioning and Expanders Handout 4 Luca Trevisan January 13, 2011 Stanford Unversty CS359G: Graph Parttonng and Expanders Handout 4 Luca Trevsan January 3, 0 Lecture 4 In whch we prove the dffcult drecton of Cheeger s nequalty. As n the past lectures, consder an undrected

More information

Lecture 12: Discrete Laplacian

Lecture 12: Discrete Laplacian Lecture 12: Dscrete Laplacan Scrbe: Tanye Lu Our goal s to come up wth a dscrete verson of Laplacan operator for trangulated surfaces, so that we can use t n practce to solve related problems We are mostly

More information

ANSWERS. Problem 1. and the moment generating function (mgf) by. defined for any real t. Use this to show that E( U) var( U)

ANSWERS. Problem 1. and the moment generating function (mgf) by. defined for any real t. Use this to show that E( U) var( U) Econ 413 Exam 13 H ANSWERS Settet er nndelt 9 deloppgaver, A,B,C, som alle anbefales å telle lkt for å gøre det ltt lettere å stå. Svar er gtt . Unfortunately, there s a prntng error n the hnt of

More information

A random variable is a function which associates a real number to each element of the sample space

A random variable is a function which associates a real number to each element of the sample space Introducton to Random Varables Defnton of random varable Defnton of of random varable Dscrete and contnuous random varable Probablty blt functon Dstrbuton functon Densty functon Sometmes, t s not enough

More information

Composite Hypotheses testing

Composite Hypotheses testing Composte ypotheses testng In many hypothess testng problems there are many possble dstrbutons that can occur under each of the hypotheses. The output of the source s a set of parameters (ponts n a parameter

More information

1. Inference on Regression Parameters a. Finding Mean, s.d and covariance amongst estimates. 2. Confidence Intervals and Working Hotelling Bands

1. Inference on Regression Parameters a. Finding Mean, s.d and covariance amongst estimates. 2. Confidence Intervals and Working Hotelling Bands Content. Inference on Regresson Parameters a. Fndng Mean, s.d and covarance amongst estmates.. Confdence Intervals and Workng Hotellng Bands 3. Cochran s Theorem 4. General Lnear Testng 5. Measures of

More information

Statistical Inference. 2.3 Summary Statistics Measures of Center and Spread. parameters ( population characteristics )

Statistical Inference. 2.3 Summary Statistics Measures of Center and Spread. parameters ( population characteristics ) Ismor Fscher, 8//008 Stat 54 / -8.3 Summary Statstcs Measures of Center and Spread Dstrbuton of dscrete contnuous POPULATION Random Varable, numercal True center =??? True spread =???? parameters ( populaton

More information

Case A. P k = Ni ( 2L i k 1 ) + (# big cells) 10d 2 P k.

Case A. P k = Ni ( 2L i k 1 ) + (# big cells) 10d 2 P k. THE CELLULAR METHOD In ths lecture, we ntroduce the cellular method as an approach to ncdence geometry theorems lke the Szemeréd-Trotter theorem. The method was ntroduced n the paper Combnatoral complexty

More information

Chapter 13: Multiple Regression

Chapter 13: Multiple Regression Chapter 13: Multple Regresson 13.1 Developng the multple-regresson Model The general model can be descrbed as: It smplfes for two ndependent varables: The sample ft parameter b 0, b 1, and b are used to

More information

Probability Theory (revisited)

Probability Theory (revisited) Probablty Theory (revsted) Summary Probablty v.s. plausblty Random varables Smulaton of Random Experments Challenge The alarm of a shop rang. Soon afterwards, a man was seen runnng n the street, persecuted

More information

Lecture 4: Universal Hash Functions/Streaming Cont d

Lecture 4: Universal Hash Functions/Streaming Cont d CSE 5: Desgn and Analyss of Algorthms I Sprng 06 Lecture 4: Unversal Hash Functons/Streamng Cont d Lecturer: Shayan Oves Gharan Aprl 6th Scrbe: Jacob Schreber Dsclamer: These notes have not been subjected

More information

CS 2750 Machine Learning. Lecture 5. Density estimation. CS 2750 Machine Learning. Announcements

CS 2750 Machine Learning. Lecture 5. Density estimation. CS 2750 Machine Learning. Announcements CS 750 Machne Learnng Lecture 5 Densty estmaton Mlos Hauskrecht mlos@cs.ptt.edu 539 Sennott Square CS 750 Machne Learnng Announcements Homework Due on Wednesday before the class Reports: hand n before

More information

Stochastic Structural Dynamics

Stochastic Structural Dynamics Stochastc Structural Dynamcs Lecture-1 Defnton of probablty measure and condtonal probablty Dr C S Manohar Department of Cvl Engneerng Professor of Structural Engneerng Indan Insttute of Scence angalore

More information

Math 426: Probability MWF 1pm, Gasson 310 Homework 4 Selected Solutions

Math 426: Probability MWF 1pm, Gasson 310 Homework 4 Selected Solutions Exercses from Ross, 3, : Math 26: Probablty MWF pm, Gasson 30 Homework Selected Solutons 3, p. 05 Problems 76, 86 3, p. 06 Theoretcal exercses 3, 6, p. 63 Problems 5, 0, 20, p. 69 Theoretcal exercses 2,

More information

} Often, when learning, we deal with uncertainty:

} Often, when learning, we deal with uncertainty: Uncertanty and Learnng } Often, when learnng, we deal wth uncertanty: } Incomplete data sets, wth mssng nformaton } Nosy data sets, wth unrelable nformaton } Stochastcty: causes and effects related non-determnstcally

More information

Computation of Higher Order Moments from Two Multinomial Overdispersion Likelihood Models

Computation of Higher Order Moments from Two Multinomial Overdispersion Likelihood Models Computaton of Hgher Order Moments from Two Multnomal Overdsperson Lkelhood Models BY J. T. NEWCOMER, N. K. NEERCHAL Department of Mathematcs and Statstcs, Unversty of Maryland, Baltmore County, Baltmore,

More information

Modelli Clamfim Equazione del Calore Lezione ottobre 2014

Modelli Clamfim Equazione del Calore Lezione ottobre 2014 CLAMFIM Bologna Modell 1 @ Clamfm Equazone del Calore Lezone 17 15 ottobre 2014 professor Danele Rtell danele.rtell@unbo.t 1/24? Convoluton The convoluton of two functons g(t) and f(t) s the functon (g

More information

/ n ) are compared. The logic is: if the two

/ n ) are compared. The logic is: if the two STAT C141, Sprng 2005 Lecture 13 Two sample tests One sample tests: examples of goodness of ft tests, where we are testng whether our data supports predctons. Two sample tests: called as tests of ndependence

More information

20. Mon, Oct. 13 What we have done so far corresponds roughly to Chapters 2 & 3 of Lee. Now we turn to Chapter 4. The first idea is connectedness.

20. Mon, Oct. 13 What we have done so far corresponds roughly to Chapters 2 & 3 of Lee. Now we turn to Chapter 4. The first idea is connectedness. 20. Mon, Oct. 13 What we have done so far corresponds roughly to Chapters 2 & 3 of Lee. Now we turn to Chapter 4. The frst dea s connectedness. Essentally, we want to say that a space cannot be decomposed

More information

Notes on Frequency Estimation in Data Streams

Notes on Frequency Estimation in Data Streams Notes on Frequency Estmaton n Data Streams In (one of) the data streamng model(s), the data s a sequence of arrvals a 1, a 2,..., a m of the form a j = (, v) where s the dentty of the tem and belongs to

More information

Lecture 17 : Stochastic Processes II

Lecture 17 : Stochastic Processes II : Stochastc Processes II 1 Contnuous-tme stochastc process So far we have studed dscrete-tme stochastc processes. We studed the concept of Makov chans and martngales, tme seres analyss, and regresson analyss

More information

Kernel Methods and SVMs Extension

Kernel Methods and SVMs Extension Kernel Methods and SVMs Extenson The purpose of ths document s to revew materal covered n Machne Learnng 1 Supervsed Learnng regardng support vector machnes (SVMs). Ths document also provdes a general

More information

Notes prepared by Prof Mrs) M.J. Gholba Class M.Sc Part(I) Information Technology

Notes prepared by Prof Mrs) M.J. Gholba Class M.Sc Part(I) Information Technology Inverse transformatons Generaton of random observatons from gven dstrbutons Assume that random numbers,,, are readly avalable, where each tself s a random varable whch s unformly dstrbuted over the range(,).

More information

NUMERICAL DIFFERENTIATION

NUMERICAL DIFFERENTIATION NUMERICAL DIFFERENTIATION 1 Introducton Dfferentaton s a method to compute the rate at whch a dependent output y changes wth respect to the change n the ndependent nput x. Ths rate of change s called the

More information

PhysicsAndMathsTutor.com

PhysicsAndMathsTutor.com PhscsAndMathsTutor.com phscsandmathstutor.com June 005 5. The random varable X has probablt functon k, = 1,, 3, P( X = ) = k ( + 1), = 4, 5, where k s a constant. (a) Fnd the value of k. (b) Fnd the eact

More information

THE CHINESE REMAINDER THEOREM. We should thank the Chinese for their wonderful remainder theorem. Glenn Stevens

THE CHINESE REMAINDER THEOREM. We should thank the Chinese for their wonderful remainder theorem. Glenn Stevens THE CHINESE REMAINDER THEOREM KEITH CONRAD We should thank the Chnese for ther wonderful remander theorem. Glenn Stevens 1. Introducton The Chnese remander theorem says we can unquely solve any par of

More information

Complex Numbers. x = B B 2 4AC 2A. or x = x = 2 ± 4 4 (1) (5) 2 (1)

Complex Numbers. x = B B 2 4AC 2A. or x = x = 2 ± 4 4 (1) (5) 2 (1) Complex Numbers If you have not yet encountered complex numbers, you wll soon do so n the process of solvng quadratc equatons. The general quadratc equaton Ax + Bx + C 0 has solutons x B + B 4AC A For

More information

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.265/15.070J Fall 2013 Lecture 12 10/21/2013. Martingale Concentration Inequalities and Applications

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.265/15.070J Fall 2013 Lecture 12 10/21/2013. Martingale Concentration Inequalities and Applications MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.65/15.070J Fall 013 Lecture 1 10/1/013 Martngale Concentraton Inequaltes and Applcatons Content. 1. Exponental concentraton for martngales wth bounded ncrements.

More information

Outline. Bayesian Networks: Maximum Likelihood Estimation and Tree Structure Learning. Our Model and Data. Outline

Outline. Bayesian Networks: Maximum Likelihood Estimation and Tree Structure Learning. Our Model and Data. Outline Outlne Bayesan Networks: Maxmum Lkelhood Estmaton and Tree Structure Learnng Huzhen Yu janey.yu@cs.helsnk.f Dept. Computer Scence, Unv. of Helsnk Probablstc Models, Sprng, 200 Notces: I corrected a number

More information

The optimal delay of the second test is therefore approximately 210 hours earlier than =2.

The optimal delay of the second test is therefore approximately 210 hours earlier than =2. THE IEC 61508 FORMULAS 223 The optmal delay of the second test s therefore approxmately 210 hours earler than =2. 8.4 The IEC 61508 Formulas IEC 61508-6 provdes approxmaton formulas for the PF for smple

More information

Module 9. Lecture 6. Duality in Assignment Problems

Module 9. Lecture 6. Duality in Assignment Problems Module 9 1 Lecture 6 Dualty n Assgnment Problems In ths lecture we attempt to answer few other mportant questons posed n earler lecture for (AP) and see how some of them can be explaned through the concept

More information

C/CS/Phy191 Problem Set 3 Solutions Out: Oct 1, 2008., where ( 00. ), so the overall state of the system is ) ( ( ( ( 00 ± 11 ), Φ ± = 1

C/CS/Phy191 Problem Set 3 Solutions Out: Oct 1, 2008., where ( 00. ), so the overall state of the system is ) ( ( ( ( 00 ± 11 ), Φ ± = 1 C/CS/Phy9 Problem Set 3 Solutons Out: Oct, 8 Suppose you have two qubts n some arbtrary entangled state ψ You apply the teleportaton protocol to each of the qubts separately What s the resultng state obtaned

More information

Winter 2008 CS567 Stochastic Linear/Integer Programming Guest Lecturer: Xu, Huan

Winter 2008 CS567 Stochastic Linear/Integer Programming Guest Lecturer: Xu, Huan Wnter 2008 CS567 Stochastc Lnear/Integer Programmng Guest Lecturer: Xu, Huan Class 2: More Modelng Examples 1 Capacty Expanson Capacty expanson models optmal choces of the tmng and levels of nvestments

More information

Decision-making and rationality

Decision-making and rationality Reslence Informatcs for Innovaton Classcal Decson Theory RRC/TMI Kazuo URUTA Decson-makng and ratonalty What s decson-makng? Methodology for makng a choce The qualty of decson-makng determnes success or

More information

Affine transformations and convexity

Affine transformations and convexity Affne transformatons and convexty The purpose of ths document s to prove some basc propertes of affne transformatons nvolvng convex sets. Here are a few onlne references for background nformaton: http://math.ucr.edu/

More information

Vapnik-Chervonenkis theory

Vapnik-Chervonenkis theory Vapnk-Chervonenks theory Rs Kondor June 13, 2008 For the purposes of ths lecture, we restrct ourselves to the bnary supervsed batch learnng settng. We assume that we have an nput space X, and an unknown

More information

Conjugacy and the Exponential Family

Conjugacy and the Exponential Family CS281B/Stat241B: Advanced Topcs n Learnng & Decson Makng Conjugacy and the Exponental Famly Lecturer: Mchael I. Jordan Scrbes: Bran Mlch 1 Conjugacy In the prevous lecture, we saw conjugate prors for the

More information

8.6 The Complex Number System

8.6 The Complex Number System 8.6 The Complex Number System Earler n the chapter, we mentoned that we cannot have a negatve under a square root, snce the square of any postve or negatve number s always postve. In ths secton we want

More information

Strong Markov property: Same assertion holds for stopping times τ.

Strong Markov property: Same assertion holds for stopping times τ. Brownan moton Let X ={X t : t R + } be a real-valued stochastc process: a famlty of real random varables all defned on the same probablty space. Defne F t = nformaton avalable by observng the process up

More information

Appendix B. Criterion of Riemann-Stieltjes Integrability

Appendix B. Criterion of Riemann-Stieltjes Integrability Appendx B. Crteron of Remann-Steltes Integrablty Ths note s complementary to [R, Ch. 6] and [T, Sec. 3.5]. The man result of ths note s Theorem B.3, whch provdes the necessary and suffcent condtons for

More information

SELECTED PROOFS. DeMorgan s formulas: The first one is clear from Venn diagram, or the following truth table:

SELECTED PROOFS. DeMorgan s formulas: The first one is clear from Venn diagram, or the following truth table: SELECTED PROOFS DeMorgan s formulas: The frst one s clear from Venn dagram, or the followng truth table: A B A B A B Ā B Ā B T T T F F F F T F T F F T F F T T F T F F F F F T T T T The second one can be

More information

1 Definition of Rademacher Complexity

1 Definition of Rademacher Complexity COS 511: Theoretcal Machne Learnng Lecturer: Rob Schapre Lecture #9 Scrbe: Josh Chen March 5, 2013 We ve spent the past few classes provng bounds on the generalzaton error of PAClearnng algorths for the

More information

Lecture 3 Stat102, Spring 2007

Lecture 3 Stat102, Spring 2007 Lecture 3 Stat0, Sprng 007 Chapter 3. 3.: Introducton to regresson analyss Lnear regresson as a descrptve technque The least-squares equatons Chapter 3.3 Samplng dstrbuton of b 0, b. Contnued n net lecture

More information

Inner Product. Euclidean Space. Orthonormal Basis. Orthogonal

Inner Product. Euclidean Space. Orthonormal Basis. Orthogonal Inner Product Defnton 1 () A Eucldean space s a fnte-dmensonal vector space over the reals R, wth an nner product,. Defnton 2 (Inner Product) An nner product, on a real vector space X s a symmetrc, blnear,

More information

Cathy Walker March 5, 2010

Cathy Walker March 5, 2010 Cathy Walker March 5, 010 Part : Problem Set 1. What s the level of measurement for the followng varables? a) SAT scores b) Number of tests or quzzes n statstcal course c) Acres of land devoted to corn

More information

Learning from Data 1 Naive Bayes

Learning from Data 1 Naive Bayes Learnng from Data 1 Nave Bayes Davd Barber dbarber@anc.ed.ac.uk course page : http://anc.ed.ac.uk/ dbarber/lfd1/lfd1.html c Davd Barber 2001, 2002 1 Learnng from Data 1 : c Davd Barber 2001,2002 2 1 Why

More information

The Order Relation and Trace Inequalities for. Hermitian Operators

The Order Relation and Trace Inequalities for. Hermitian Operators Internatonal Mathematcal Forum, Vol 3, 08, no, 507-57 HIKARI Ltd, wwwm-hkarcom https://doorg/0988/mf088055 The Order Relaton and Trace Inequaltes for Hermtan Operators Y Huang School of Informaton Scence

More information

18.1 Introduction and Recap

18.1 Introduction and Recap CS787: Advanced Algorthms Scrbe: Pryananda Shenoy and Shjn Kong Lecturer: Shuch Chawla Topc: Streamng Algorthmscontnued) Date: 0/26/2007 We contnue talng about streamng algorthms n ths lecture, ncludng

More information

Here is the rationale: If X and y have a strong positive relationship to one another, then ( x x) will tend to be positive when ( y y)

Here is the rationale: If X and y have a strong positive relationship to one another, then ( x x) will tend to be positive when ( y y) Secton 1.5 Correlaton In the prevous sectons, we looked at regresson and the value r was a measurement of how much of the varaton n y can be attrbuted to the lnear relatonshp between y and x. In ths secton,

More information

Section 8.3 Polar Form of Complex Numbers

Section 8.3 Polar Form of Complex Numbers 80 Chapter 8 Secton 8 Polar Form of Complex Numbers From prevous classes, you may have encountered magnary numbers the square roots of negatve numbers and, more generally, complex numbers whch are the

More information

Note on EM-training of IBM-model 1

Note on EM-training of IBM-model 1 Note on EM-tranng of IBM-model INF58 Language Technologcal Applcatons, Fall The sldes on ths subject (nf58 6.pdf) ncludng the example seem nsuffcent to gve a good grasp of what s gong on. Hence here are

More information

LINEAR REGRESSION ANALYSIS. MODULE VIII Lecture Indicator Variables

LINEAR REGRESSION ANALYSIS. MODULE VIII Lecture Indicator Variables LINEAR REGRESSION ANALYSIS MODULE VIII Lecture - 7 Indcator Varables Dr. Shalabh Department of Maematcs and Statstcs Indan Insttute of Technology Kanpur Indcator varables versus quanttatve explanatory

More information

princeton univ. F 17 cos 521: Advanced Algorithm Design Lecture 7: LP Duality Lecturer: Matt Weinberg

princeton univ. F 17 cos 521: Advanced Algorithm Design Lecture 7: LP Duality Lecturer: Matt Weinberg prnceton unv. F 17 cos 521: Advanced Algorthm Desgn Lecture 7: LP Dualty Lecturer: Matt Wenberg Scrbe: LP Dualty s an extremely useful tool for analyzng structural propertes of lnear programs. Whle there

More information

An (almost) unbiased estimator for the S-Gini index

An (almost) unbiased estimator for the S-Gini index An (almost unbased estmator for the S-Gn ndex Thomas Demuynck February 25, 2009 Abstract Ths note provdes an unbased estmator for the absolute S-Gn and an almost unbased estmator for the relatve S-Gn for

More information

Complete subgraphs in multipartite graphs

Complete subgraphs in multipartite graphs Complete subgraphs n multpartte graphs FLORIAN PFENDER Unverstät Rostock, Insttut für Mathematk D-18057 Rostock, Germany Floran.Pfender@un-rostock.de Abstract Turán s Theorem states that every graph G

More information

Density matrix. c α (t)φ α (q)

Density matrix. c α (t)φ α (q) Densty matrx Note: ths s supplementary materal. I strongly recommend that you read t for your own nterest. I beleve t wll help wth understandng the quantum ensembles, but t s not necessary to know t n

More information

Hidden Markov Models

Hidden Markov Models Hdden Markov Models Namrata Vaswan, Iowa State Unversty Aprl 24, 204 Hdden Markov Model Defntons and Examples Defntons:. A hdden Markov model (HMM) refers to a set of hdden states X 0, X,..., X t,...,

More information

Uncertainty and auto-correlation in. Measurement

Uncertainty and auto-correlation in. Measurement Uncertanty and auto-correlaton n arxv:1707.03276v2 [physcs.data-an] 30 Dec 2017 Measurement Markus Schebl Federal Offce of Metrology and Surveyng (BEV), 1160 Venna, Austra E-mal: markus.schebl@bev.gv.at

More information

Hidden Markov Models & The Multivariate Gaussian (10/26/04)

Hidden Markov Models & The Multivariate Gaussian (10/26/04) CS281A/Stat241A: Statstcal Learnng Theory Hdden Markov Models & The Multvarate Gaussan (10/26/04) Lecturer: Mchael I. Jordan Scrbes: Jonathan W. Hu 1 Hdden Markov Models As a bref revew, hdden Markov models

More information

Polynomials. 1 More properties of polynomials

Polynomials. 1 More properties of polynomials Polynomals 1 More propertes of polynomals Recall that, for R a commutatve rng wth unty (as wth all rngs n ths course unless otherwse noted), we defne R[x] to be the set of expressons n =0 a x, where a

More information

Dr. Shalabh Department of Mathematics and Statistics Indian Institute of Technology Kanpur

Dr. Shalabh Department of Mathematics and Statistics Indian Institute of Technology Kanpur Analyss of Varance and Desgn of Experment-I MODULE VII LECTURE - 3 ANALYSIS OF COVARIANCE Dr Shalabh Department of Mathematcs and Statstcs Indan Insttute of Technology Kanpur Any scentfc experment s performed

More information

DS-GA 1002 Lecture notes 5 Fall Random processes

DS-GA 1002 Lecture notes 5 Fall Random processes DS-GA Lecture notes 5 Fall 6 Introducton Random processes Random processes, also known as stochastc processes, allow us to model quanttes that evolve n tme (or space n an uncertan way: the trajectory of

More information

Generalized Linear Methods

Generalized Linear Methods Generalzed Lnear Methods 1 Introducton In the Ensemble Methods the general dea s that usng a combnaton of several weak learner one could make a better learner. More formally, assume that we have a set

More information