Interesting Probability Problems


Jonathan Mostovoy, University of Toronto

August 9, 2016

Contents

1 Chapter 1 Questions: 1a), 1b)
2 Chapter 2 Questions: 2a), 2b)
3 Chapter 3 Questions: 3a), 3b), 3c), 3d), 3e), 3f)
4 Chapter 4 Questions: 4a), 4b), 4c)
5 Chapter 5 Questions: 5a), 5b)
6 Non-Textbook Problems: 6a) A, 6b) B, 6c) C, 6d) D

1 Chapter 1 Questions

1a) 1.8.7

A deck of 52 cards contains four aces. If the cards are shuffled and distributed in a random manner to four players so that each player receives 13 cards, what is the probability that all four aces will be received by the same player?

Answer: Fix a player $i \in \{1, \dots, 4\}$. The four aces occupy 4 of the 52 positions in the deck, and all $\binom{52}{4}$ sets of positions are equally likely; the aces all land in player $i$'s hand in $\binom{13}{4}$ of these. Since the four hands are disjoint, there are $4\binom{13}{4}$ position sets in which one player receives all four aces. Therefore:

$$\Pr(\text{4 aces received by the same player}) = \frac{4\binom{13}{4}}{\binom{52}{4}} = \frac{4\,\binom{48}{9,\,13,\,13,\,13}}{\binom{52}{13,\,13,\,13,\,13}} \approx 0.0106$$
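As a quick sanity check of 1a), the sketch below (assuming Python's standard library only) computes the exact value and compares it with a Monte Carlo estimate:

```python
import random
from math import comb

# Exact answer: 4 * C(13,4) / C(52,4)
exact = 4 * comb(13, 4) / comb(52, 4)

# Monte Carlo check: shuffle a 52-card deck into four 13-card hands and
# count how often one hand contains all four aces (cards 0-3 here).
def one_hand_has_all_aces() -> bool:
    deck = list(range(52))
    random.shuffle(deck)
    return any({0, 1, 2, 3} <= set(deck[13 * i:13 * (i + 1)]) for i in range(4))

trials = 200_000
estimate = sum(one_hand_has_all_aces() for _ in range(trials)) / trials
print(f"exact = {exact:.6f}, simulated = {estimate:.6f}")  # both ~0.0106
```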

1b)

Let $A_1, \dots, A_n$ be $n$ arbitrary events. Show that the probability that exactly one of these $n$ events will occur is:

$$\sum_{i=1}^{n}\Pr(A_i) - 2\sum_{i<j}\Pr(A_i \cap A_j) + 3\sum_{i<j<k}\Pr(A_i \cap A_j \cap A_k) - \cdots + (-1)^{n+1}\,n\,\Pr(A_1 \cap A_2 \cap \cdots \cap A_n)$$

Proof. Let $P(n) = \sum_{i=1}^{n}\Pr\big(A_i \setminus (A_1 \cup \cdots \cup A_{i-1} \cup A_{i+1} \cup \cdots \cup A_n)\big)$, the probability that exactly one of the $n$ events occurs; we prove by induction on $n$ that $P(n)$ equals the displayed sum.

We begin at $n = 2$. Write $A = (A \setminus B) \cup (A \cap B)$ and $B = (B \setminus A) \cup (A \cap B)$; in each case the two pieces are disjoint, since $(A \setminus B) \cap (A \cap B) = \emptyset$. Thus $\Pr(A) = \Pr(A \setminus B) + \Pr(A \cap B)$ and likewise for $B$, so

$$\Pr(A) + \Pr(B) - 2\Pr(A \cap B) = \Pr(A \setminus B) + \Pr(B \setminus A) = P(2)$$

For the inductive step, assume the formula is valid for $P(n)$; then for $n + 1$:

$$P(n+1) = \sum_{i=1}^{n+1}\Pr\big(A_i \setminus (A_1 \cup \cdots \cup A_{i-1} \cup A_{i+1} \cup \cdots \cup A_{n+1})\big)$$

By the inductive hypothesis we know $P(n)$, so all we need to do is adjoin the terms involving $A_{n+1}$: add its own probability and remove all of its intersections with the other events (in a sense, take away the overlaps that $A_{n+1}$ could have with everything else). Grouping the new terms with the old ones of the same order, the coefficients combine to give

$$P(n+1) = \sum_{i=1}^{n+1}\Pr(A_i) - 2\sum_{i<j}\Pr(A_i \cap A_j) + 3\sum_{i<j<k}\Pr(A_i \cap A_j \cap A_k) - \cdots + (-1)^{n+2}(n+1)\Pr(A_1 \cap \cdots \cap A_{n+1})$$

with all indices now ranging over $\{1, \dots, n+1\}$, which completes the induction.

2 Chapter 2 Questions

2a) 2.1.4

A machine produces defective parts with three different probabilities depending on its state of repair. If the machine is in good working order, it produces defective parts with probability 0.02. If it is wearing down, it produces defective parts with probability 0.1. If it needs maintenance, it produces defective parts with probability 0.3. The probability that the machine is in good working order is 0.8, the probability that it is wearing down is 0.1, and the probability that it needs maintenance is 0.1. Compute the probability that a randomly selected part will be defective.

Answer: We first define the notation $g, w, b, d$ for good working order, wearing down, needs maintenance (bad), and production of a defective part, respectively. We summarize the given information as: $\Pr(d \mid g) = 0.02$, $\Pr(d \mid w) = 0.1$, $\Pr(d \mid b) = 0.3$, $\Pr(g) = 0.8$, $\Pr(w) = 0.1 = \Pr(b)$. We now recall the Law of Total Probability, which states: given events $B_1, \dots, B_k$ that form a partition of the space $S$ with $\Pr(B_j) > 0$ for $j = 1, \dots, k$, then for every event $A$ in $S$:

$$\Pr(A) = \sum_{j=1}^{k}\Pr(B_j)\Pr(A \mid B_j)$$

Since $g$, $w$ and $b$ are mutually exclusive and their probabilities sum to 1, they form a partition, and our answer is a direct application of the Law of Total Probability:

$$\Pr(d) = \sum_{j \in \{g, w, b\}}\Pr(j)\Pr(d \mid j) = (0.8)(0.02) + (0.1)(0.1) + (0.1)(0.3) = 0.056$$
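A minimal sketch (plain Python, no dependencies) confirming the arithmetic in 2a):

```python
# 2a): Law of Total Probability, Pr(d) = sum_j Pr(j) * Pr(d | j)
states = {"good": (0.8, 0.02), "wearing": (0.1, 0.10), "maintenance": (0.1, 0.30)}
pr_defective = sum(p_state * p_defect for p_state, p_defect in states.values())
print(pr_defective)  # 0.056
```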

2b)

Suppose that $A_1, \dots, A_k$ form a sequence of $k$ independent events. Let $B_1, \dots, B_k$ be another sequence of $k$ events such that for each value of $j$ ($j = 1, \dots, k$), either $B_j = A_j$ or $B_j = A_j^c$. Prove that $B_1, \dots, B_k$ are also independent events. Hint: Use an induction argument based on the number of events $B_j$ for which $B_j = A_j^c$.

Proof. Let $n$, $0 \le n \le k$, denote the number of $B_j$'s for which $B_j = A_j^c$ holds, leaving $k - n$ cases where $B_j = A_j$. For $n = 0$, the desired relation holds trivially, because we assume $A_1, \dots, A_k$ are independent. For the higher-order cases, we recall the identity $\Pr(A \cap B^c) = \Pr(A) - \Pr(A \cap B)$ and the fact that if $A_1, \dots, A_l$ are independent, then so is any subcollection of them.

Thus, through induction, we assume $B_1, \dots, B_k$ are independent when $n$ of the $B_j$'s satisfy $B_j = A_j^c$, and consider $n + 1$, where some $B_i$ now becomes $B_i^c$. The required factorization follows from:

$$\Pr(B_1 \cdots B_{i-1}B_{i+1}\cdots B_k \cap B_i^c) = \Pr(B_1 \cdots B_{i-1}B_{i+1}\cdots B_k) - \Pr(B_1 \cdots B_{i-1}B_{i+1}\cdots B_k \cap B_i)$$
$$= \Pr(B_1)\cdots\Pr(B_{i-1})\Pr(B_{i+1})\cdots\Pr(B_k)\big(1 - \Pr(B_i)\big) = \Pr(B_1)\cdots\Pr(B_{i-1})\Pr(B_i^c)\Pr(B_{i+1})\cdots\Pr(B_k)$$

The same factorization applies to every subcollection, which is what independence requires.

3 Chapter 3 Questions

3a) 3.2.3

An ice cream seller takes 20 gallons of ice cream in her truck each day. Let $X$ stand for the number of gallons that she sells. The probability is 0.1 that $X = 20$. If she doesn't sell all 20 gallons, the distribution of $X$ follows a continuous distribution with a p.d.f. of the form:

$$f(x) = \begin{cases} cx & \text{for } 0 < x < 20 \\ 0 & \text{otherwise} \end{cases}$$

where $c$ is a constant that makes $\Pr(X < 20) = 0.9$. Find the constant $c$ so that $\Pr(X < 20) = 0.9$ as described above.

Answer: All we need to remember is that the integral of the p.d.f. over the continuous part of the sample space must equal the probability of that part, which here is 0.9 rather than the usual 1. Thus:

$$\int_0^{20} cx\,dx = 200c = 0.9 \implies c = \frac{9}{2000} = 0.0045$$
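The same constant falls out of a symbolic integration; a minimal sketch assuming the sympy package is available:

```python
import sympy as sp

c, x = sp.symbols("c x", positive=True)
# The continuous part of the distribution carries probability 0.9,
# so the p.d.f. cx must integrate to 0.9 over (0, 20).
eq = sp.Eq(sp.integrate(c * x, (x, 0, 20)), sp.Rational(9, 10))
print(sp.solve(eq, c))  # [9/2000], i.e. c = 0.0045
```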

3b)

Prove that the quantile function $F^{-1}$ of a general random variable $X$ has the following three properties that are analogous to properties of the c.d.f.:

1. $F^{-1}$ is a non-decreasing function of $p$ for $0 < p < 1$.
2. Let $x_0 = \lim_{p \to 0,\, p > 0} F^{-1}(p)$ and $x_1 = \lim_{p \to 1,\, p < 1} F^{-1}(p)$. Then $x_0$ equals the greatest lower bound on the set of numbers $c$ such that $\Pr(X \le c) > 0$, and $x_1$ equals the least upper bound on the set of numbers $d$ such that $\Pr(X \ge d) > 0$.
3. $F^{-1}$ is continuous from the left; that is, $F^{-1}(p) = F^{-1}(p^-)$ for all $0 < p < 1$.

Proofs:

1. Proof. Assume $p, q \in (0, 1)$ and $p \le q$. Then $\{x \in \mathbb{R} : F(x) \ge q\} \subseteq \{x \in \mathbb{R} : F(x) \ge p\}$, and the minimum over the smaller set cannot be smaller, so by the definition of the quantile function:

$$F^{-1}(p) = \min\{x \in \mathbb{R} : F(x) \ge p\} \le \min\{x \in \mathbb{R} : F(x) \ge q\} = F^{-1}(q)$$

2. Proof. For $x_0$: let $z_1 > z_2 > z_3 > \cdots$ be a decreasing sequence of numbers such that $\lim_{n \to \infty} z_n = 0$. Then, since $F^{-1}$ is non-decreasing,

$$x_0 = \lim_{p \to 0^+} F^{-1}(p) = \lim_{n \to \infty}\min\{x \in \mathbb{R} : F(x) \ge z_n\} = \inf\{x \in \mathbb{R} : F(x) > 0\}$$

which is precisely the greatest lower bound of the set of numbers $c$ satisfying $\Pr(X \le c) = F(c) > 0$.

For $x_1$: let $y_1 < y_2 < y_3 < \cdots$ be an increasing sequence of numbers such that $\lim_{n \to \infty} y_n = 1$. Then

$$x_1 = \lim_{p \to 1^-} F^{-1}(p) = \lim_{n \to \infty}\min\{x \in \mathbb{R} : F(x) \ge y_n\}$$

which is precisely the least upper bound of the set of numbers $d$ satisfying $\Pr(X \ge d) = 1 - F(d^-) > 0$: any $d$ above this limit has $F$ equal to 1 to its left, so $\Pr(X \ge d) = 0$, while any $d$ below it is exceeded by some $F^{-1}(y_n)$, so $\Pr(X \ge d) > 0$.

3. Proof. Let $y_1 < y_2 < y_3 < \cdots$ be an increasing sequence of numbers such that $\lim_{n \to \infty} y_n = p$. Since $F(x) \ge p$ exactly when $F(x) \ge y_n$ for every $n$, we immediately see that:

$$\{x \in \mathbb{R} : F(x) \ge p\} = \bigcap_{n=1}^{\infty}\{x \in \mathbb{R} : F(x) \ge y_n\} \implies F^{-1}(p) = \lim_{n \to \infty} F^{-1}(y_n) = F^{-1}(p^-)$$

3c)

Suppose that $X$ and $Y$ have a continuous joint distribution for which the joint p.d.f. is defined as follows:

$$f(x, y) = \begin{cases} cy^2 & \text{for } 0 \le x \le 2 \text{ and } 0 \le y \le 1 \\ 0 & \text{otherwise} \end{cases}$$

Determine:

1. the value of the constant $c$
2. $\Pr(X + Y > 2)$
3. $\Pr(Y < 1/2)$
4. $\Pr(X \le 1)$
5. $\Pr(X = 3Y)$

Answers:

1. $\displaystyle\int_0^1\!\!\int_0^2 cy^2\,dx\,dy = \frac{2c}{3} = 1 \implies c = \frac{3}{2}$

2. $\displaystyle\Pr(X + Y > 2) = \int_0^1\!\!\int_{2-y}^{2}\frac{3}{2}y^2\,dx\,dy = \int_0^1 \frac{3}{2}y^3\,dy = \frac{3}{8}$

3. $\displaystyle\Pr(Y < 1/2) = \int_0^{1/2}\!\!\int_0^2 \frac{3}{2}y^2\,dx\,dy = \frac{1}{8}$

4. $\displaystyle\Pr(X \le 1) = \int_0^1\!\!\int_0^1 \frac{3}{2}y^2\,dy\,dx = \frac{1}{2}$

5. $\Pr(X = 3Y) = 0$, since in an $n$-dimensional continuous probability space, the probability of an $(n-1)$-dimensional event is 0; i.e., since $X = 3Y$ is a line in a 2-dimensional continuous probability space, the probability that this happens must be 0.
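The 3c) integrals are easy to check numerically; a sketch assuming scipy is installed:

```python
from scipy.integrate import dblquad

# Joint p.d.f. from 3c) with c = 3/2; dblquad's integrand takes (y, x) and,
# for each x in the outer limits, integrates y between the two callables.
f = lambda y, x: 1.5 * y**2

total, _ = dblquad(f, 0, 2, lambda x: 0, lambda x: 1)       # 1
p_sum, _ = dblquad(f, 1, 2, lambda x: 2 - x, lambda x: 1)   # Pr(X+Y > 2) = 3/8
p_half, _ = dblquad(f, 0, 2, lambda x: 0, lambda x: 0.5)    # Pr(Y < 1/2) = 1/8
p_x, _ = dblquad(f, 0, 1, lambda x: 0, lambda x: 1)         # Pr(X <= 1) = 1/2
print(total, p_sum, p_half, p_x)
```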

3d)

Suppose that the joint p.d.f. of $X$ and $Y$ is as follows:

$$f(x, y) = \begin{cases} 24xy & \text{for } x \ge 0,\ y \ge 0 \text{ and } x + y \le 1 \\ 0 & \text{otherwise} \end{cases}$$

Are $X$ and $Y$ independent?

Answer: We recall Theorem 3.5.5, which states: for a joint p.d.f. $f(x, y)$, the random variables $X$ and $Y$ are independent if and only if $f(x, y) = h_1(x)h_2(y)$, where $h_i$ is a nonnegative function depending only on its own argument. For every point within the defined triangle we can exhibit two functions which work, e.g.

$$h_1(x) = \begin{cases} kx & x \in [0, 1] \\ 0 & \text{otherwise} \end{cases} \qquad h_2(y) = \begin{cases} \frac{24}{k}y & y \in [0, 1] \\ 0 & \text{otherwise} \end{cases}$$

for any $k > 0$. However, if we choose a point within the unit square but outside our triangle (where $x + y > 1$), we need $f(x, y) = 0$, yet $h_1(x) > 0$ and $h_2(y) > 0$; this contradiction shows that $X$ and $Y$ are NOT independent. Thus, we can conclude the following generalization: if the domain of a joint p.d.f. $f(x_1, x_2, \dots)$ constrains $x_i$ through at least one other variable $x_j$, $j \ne i$, then there is some dependency amongst the variables $x_1, x_2, \dots$.

3e)

Let the conditional p.d.f. of $X$ given $Y$ be $g_1(x \mid y) = 3x^2/y^3$ for $0 < x < y$ and 0 otherwise. Let the marginal p.d.f. of $Y$ be $f_2(y)$, where $f_2(y) = 0$ for $y \le 0$ but is otherwise unspecified. Let $Z = X/Y$. Prove that $Z$ and $Y$ are independent and find the marginal p.d.f. of $Z$.

Proof. By the definition of conditional probability, the joint p.d.f. of $(X, Y)$ is

$$f(x, y) = g_1(x \mid y)f_2(y) = \begin{cases} \frac{3x^2}{y^3}f_2(y) & \text{if } 0 < x < y \\ 0 & \text{otherwise} \end{cases}$$

We recall that if $Z = X/Y$, we can define the dummy second random variable $W = Y$, so that the inverse transformation is $x = zw$, $y = w$, with Jacobian:

$$J = \det\begin{pmatrix} \frac{\partial x}{\partial z} & \frac{\partial x}{\partial w} \\[2pt] \frac{\partial y}{\partial z} & \frac{\partial y}{\partial w} \end{pmatrix} = \det\begin{pmatrix} w & z \\ 0 & 1 \end{pmatrix} = w$$

Thus,

$$g(z, w) = f(zw, w)\,|J| = \frac{3(zw)^2}{w^3}f_2(w)\,w = 3z^2 f_2(w), \qquad 0 < z < 1,\ w > 0$$

where the bounds on the variables follow from those established in the question ($0 < x < y \iff 0 < z < 1$). Since $g(z, w) = f_Z(z)f_2(w)$ factors, we may conclude independence. Further, the marginal p.d.f. of $Z$ is:

$$f_Z(z) = \begin{cases} 3z^2 & \text{if } z \in (0, 1) \\ 0 & \text{otherwise} \end{cases}$$

using $\int f_2(w)\,dw = 1$, since $f_2$ is a proper probability density.
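A quick simulation illustrating 3e); the Exp(1) marginal for $Y$ is an arbitrary choice, and the sampling step uses the conditional c.d.f. $G_1(x \mid y) = (x/y)^3$, so $X = Y U^{1/3}$ for $U \sim \text{Uniform}(0,1)$. A sketch assuming numpy:

```python
import numpy as np

# Sample Y from an arbitrary positive marginal (Exp(1) here), then X | Y by
# inverse c.d.f.: G1(x|y) = (x/y)^3  =>  X = Y * U^(1/3), so Z = X/Y = U^(1/3).
rng = np.random.default_rng(2)
y = rng.exponential(1.0, 500_000)
z = rng.uniform(0.0, 1.0, y.size) ** (1 / 3)   # Z = X/Y, p.d.f. 3z^2 on (0,1)
print(np.corrcoef(z, y)[0, 1])                 # ~0, consistent with independence
print(z.mean(), 3 / 4)                         # E[Z] = integral of z*3z^2 = 3/4
```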

3f)

Let $X_1, X_2$ be two independent random variables, each with p.d.f. $f(x) = e^{-x}$ for $x > 0$ and $f(x) = 0$ for $x \le 0$. Let $Z = X_1 - X_2$ and $W = X_1/X_2$.

1. Find the joint p.d.f. of $X_1$ and $Z$.
2. Prove that the conditional p.d.f. of $X_1$ given $Z = 0$ is:
$$h_1(x_1 \mid 0) = \begin{cases} 2e^{-2x_1} & \text{for } x_1 > 0 \\ 0 & \text{otherwise} \end{cases}$$
3. Find the joint p.d.f. of $X_1$ and $W$.
4. Prove that the conditional p.d.f. of $X_1$ given $W = 1$ is:
$$h_2(x_1 \mid 1) = \begin{cases} 4x_1 e^{-2x_1} & \text{for } x_1 > 0 \\ 0 & \text{otherwise} \end{cases}$$
5. Notice that $\{Z = 0\} = \{W = 1\}$, but the conditional distribution of $X_1$ given $Z = 0$ is not the same as the conditional distribution of $X_1$ given $W = 1$. This discrepancy is known as the Borel paradox. In light of the discussion about how conditional p.d.f.'s are not like conditioning on events of probability 0, show how "$Z$ very close to 0" is not the same as "$W$ very close to 1". Hint: Draw a set of axes for $x_1$ and $x_2$, and draw the two sets $\{(x_1, x_2) : |x_1 - x_2| < \epsilon\}$ and $\{(x_1, x_2) : |x_1/x_2 - 1| < \epsilon\}$ and see how different they are.

1. Answer: First, let us define the Jacobian from the two equations $X_1 = V$ and $X_2 = V - Z$:

$$J = \det\begin{pmatrix} \frac{\partial x_1}{\partial v} & \frac{\partial x_1}{\partial z} \\[2pt] \frac{\partial x_2}{\partial v} & \frac{\partial x_2}{\partial z} \end{pmatrix} = \det\begin{pmatrix} 1 & 0 \\ 1 & -1 \end{pmatrix} = -1 \implies g(v, z) = f(x_1)f(x_2)\,|J| = e^{-v}e^{-(v-z)} = e^{-2v}e^{z}$$

for $v > 0$ and $v > z$.

2. Proof. By definition, $h_1(x_1 \mid 0) = g(x_1, 0)/g_2(0)$. Thus, we first find the marginal $g_2(z)$:

$$g_2(z) = \int_{\max(0, z)}^{\infty} e^{-2v}e^{z}\,dv = e^{z}\left[-\tfrac{1}{2}e^{-2v}\right]_{v = \max(0, z)}^{\infty} = \begin{cases} \tfrac{1}{2}e^{-z} & \text{if } z \ge 0 \\ \tfrac{1}{2}e^{z} & \text{if } z < 0 \end{cases}$$

$$\implies h_1(x_1 \mid 0) = \frac{e^{-2x_1}}{1/2} = 2e^{-2x_1} \text{ for } x_1 > 0, \text{ and } 0 \text{ otherwise}$$

3. Answer: First, let us define the Jacobian from the two equations $X_1 = V$ and $X_2 = V/W$:

$$J = \det\begin{pmatrix} 1 & 0 \\ \frac{1}{w} & -\frac{v}{w^2} \end{pmatrix} = -\frac{v}{w^2} \implies g(v, w) = f(x_1)f(x_2)\,|J| = \frac{v}{w^2}e^{-v}e^{-v/w} = \frac{v}{w^2}e^{-v\left(1 + \frac{1}{w}\right)}$$

for $v, w > 0$.

4. Proof. By definition, $h_2(x_1 \mid 1) = g(x_1, 1)/g_2(1)$, where now $g_2(w)$ denotes the marginal of $W$:

$$g_2(w) = \int_0^{\infty}\frac{v}{w^2}e^{-v\left(1 + \frac{1}{w}\right)}\,dv = \frac{1}{w^2}\cdot\frac{1}{\left(1 + \frac{1}{w}\right)^2} = \frac{1}{(1 + w)^2} \implies g_2(1) = \frac{1}{4}$$

$$\implies h_2(x_1 \mid 1) = \frac{x_1 e^{-2x_1}}{1/4} = 4x_1 e^{-2x_1} \text{ for } x_1 > 0, \text{ and } 0 \text{ otherwise}$$

5. Sketching the two sets shows the difference: $\{(x_1, x_2) : |x_1 - x_2| < \epsilon\}$ is a band of constant width around the diagonal, while $\{(x_1, x_2) : |x_1/x_2 - 1| < \epsilon\}$ is a wedge through the origin whose width grows with $x_1$. Conditioning on each "thickened" event therefore weights the sample space differently, which is why $h_1(x_1 \mid z = 0) \ne h_2(x_1 \mid w = 1)$.
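The paradox can also be seen numerically by conditioning on the two thickened events and comparing conditional means; a sketch using only the standard library:

```python
import random

# Compare X1 conditioned on |X1 - X2| < eps (Z near 0) versus conditioned on
# |X1/X2 - 1| < eps (W near 1); the conditional means should differ, matching
# the means of 2e^{-2x} (1/2) and 4x e^{-2x} (1).
random.seed(0)
eps, z_cond, w_cond = 0.01, [], []
for _ in range(1_000_000):
    x1, x2 = random.expovariate(1.0), random.expovariate(1.0)
    if abs(x1 - x2) < eps:
        z_cond.append(x1)
    if abs(x1 / x2 - 1.0) < eps:
        w_cond.append(x1)
print(sum(z_cond) / len(z_cond))  # ~0.5
print(sum(w_cond) / len(w_cond))  # ~1.0
```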

4 Chapter 4 Questions

4a)

Let $X$ be a random variable with mean $\mu$ and variance $\sigma^2$, and let $\psi_1(t)$ denote the m.g.f. of $X$ for $-\infty < t < \infty$. Let $c$ be a given positive constant, and let $Y$ be a random variable for which the m.g.f. is:

$$\psi_2(t) = e^{c(\psi_1(t) - 1)} \quad \text{for } -\infty < t < \infty$$

Find expressions for the mean and the variance of $Y$ in terms of the mean and the variance of $X$.

Answer: We first summarize: $\psi_1(0) = 1$, $\psi_1'(0) = \mu$ and $\psi_1''(0) = \sigma^2 + \mu^2$. We compute:

$$\left.\frac{d\psi_2(t)}{dt}\right|_{t=0} = c\,\psi_1'(0)\,e^{c(\psi_1(0) - 1)} = c\mu$$

$$\left.\frac{d^2\psi_2(t)}{dt^2}\right|_{t=0} = c\,\psi_1''(0)\,e^{c(\psi_1(0) - 1)} + \big(c\,\psi_1'(0)\big)^2 e^{c(\psi_1(0) - 1)} = c(\sigma^2 + \mu^2) + c^2\mu^2$$

Therefore, $\text{Mean} = c\mu$ and $\text{Variance} = c(\sigma^2 + \mu^2) + c^2\mu^2 - (c\mu)^2 = c(\sigma^2 + \mu^2)$.
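Since $e^{c(\psi_1(t)-1)}$ is the m.g.f. of a compound Poisson sum $Y = X_1 + \cdots + X_N$ with $N \sim \text{Poisson}(c)$, the result can be checked by simulation. A sketch assuming numpy, with $X \sim N(\mu, \sigma^2)$ an arbitrary choice for the test:

```python
import numpy as np

# Simulate Y = X_1 + ... + X_N with N ~ Poisson(c); its m.g.f. is
# e^{c(psi1(t)-1)}, so the sample mean and variance should approach
# c*mu and c*(sigma^2 + mu^2) respectively.
rng = np.random.default_rng(1)
c, mu, sigma = 3.0, 2.0, 1.5
counts = rng.poisson(c, size=200_000)
ys = np.array([rng.normal(mu, sigma, n).sum() for n in counts])
print(ys.mean(), c * mu)                    # ~6.0
print(ys.var(), c * (sigma**2 + mu**2))     # ~18.75
```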

4b)

Suppose that $X$ and $Y$ are random variables such that $E(Y \mid X) = aX + b$. Assuming that $\text{Cov}(X, Y)$ exists and that $0 < \text{Var}(X) < \infty$, determine expressions for $a$ and $b$ in terms of $E(X)$, $E(Y)$, $\text{Var}(X)$, and $\text{Cov}(X, Y)$.

Answer: We recall $E(E(Y \mid X)) = E(Y)$, so $E(Y) = E(aX + b) = aE(X) + b$; this is our first equation. Next, multiplying by $X$ before taking expectations: $E(XE(Y \mid X)) = E(E(XY \mid X)) = E(XY)$, while $E(X(aX + b)) = aE(X^2) + bE(X)$. Therefore we have our second equation: $E(XY) = aE(X^2) + bE(X)$. So we must solve the following linear equations:

1. $E(XY) = aE(X^2) + bE(X)$
2. $E(Y) = aE(X) + b$

Multiplying the second equation by $E(X)$ and subtracting it from the first yields:

$$E(XY) - E(X)E(Y) = a\big(E(X^2) - E(X)^2\big) \implies a = \frac{E(XY) - E(X)E(Y)}{E(X^2) - E(X)^2} = \frac{\text{Cov}(X, Y)}{\text{Var}(X)}$$

$$\implies b = E(Y) - aE(X) = E(Y) - E(X)\,\frac{\text{Cov}(X, Y)}{\text{Var}(X)}$$

4c)

Suppose that $X_1, \dots, X_n$ are random variables for which $\text{Var}(X_i)$ has the same value $\sigma^2$ for $i = 1, \dots, n$ and $\rho(X_i, X_j)$ has the same value $\rho$ for every pair of values $i$ and $j$ such that $i \ne j$. Prove that $\rho \ge -\frac{1}{n-1}$.

Proof. Let us first note that for the sum $Z = X_1 + \cdots + X_n$:

$$\text{Var}(Z) = \sum_{i=1}^{n}\text{Var}(X_i) + 2\sum_{i<j}\text{Cov}(X_i, X_j)$$

In this scenario, $\text{Var}(X_i) = \sigma^2$ and $\text{Cov}(X_i, X_j) = \rho\sigma^2$, and there are $\binom{n}{2}$ pairs, so:

$$\text{Var}(Z) = n\sigma^2 + 2\binom{n}{2}\rho\sigma^2 = n\sigma^2 + n(n-1)\rho\sigma^2$$

We next recall $\text{Var}(Y) \ge 0$ for any random variable $Y$. Therefore, recalling $\sigma^2 > 0$ and $n > 1$:

$$n\sigma^2 + n(n-1)\rho\sigma^2 \ge 0 \implies n(n-1)\rho\sigma^2 \ge -n\sigma^2 \implies \rho \ge -\frac{1}{n-1}$$
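The bound in 4c) is exactly the point at which the equicorrelation covariance matrix stops being positive semi-definite, which is easy to see numerically; a sketch assuming numpy:

```python
import numpy as np

# The equicorrelation matrix (1 - rho) I + rho J has eigenvalues
# 1 + (n-1) rho (once) and 1 - rho (n-1 times); a negative eigenvalue
# appears exactly when rho < -1/(n-1).
n = 5
for rho in (0.3, -1 / (n - 1), -1 / (n - 1) - 0.01):
    m = (1 - rho) * np.eye(n) + rho * np.ones((n, n))
    print(f"rho = {rho:+.4f}, min eigenvalue = {np.linalg.eigvalsh(m).min():+.4f}")
```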

5 Chapter 5 Questions

5a)

In this exercise, we shall prove that the three assumptions underlying the Poisson process model do indeed imply that occurrences happen according to a Poisson process. What we need to show is that, for each $t$, the number of occurrences during a time interval of length $t$ has the Poisson distribution with mean $\lambda t$. Let $X$ stand for the number of occurrences during a particular time interval of length $t$. Feel free to use the following extension of the exponential limit: for all real $a$, $\lim_{u \to 0}(1 + au + o(u))^{1/u} = e^a$.

1. For each positive integer $n$, divide the time interval into $n$ disjoint subintervals of length $t/n$ each. For $i = 1, \dots, n$, let $Y_i = 1$ if exactly one arrival occurs in the $i$th subinterval, and let $A_i$ be the event that two or more occurrences occur during the $i$th subinterval. Let $W_n = \sum_{i=1}^{n} Y_i$. For each non-negative integer $k$, show that we can write $\Pr(X = k) = \Pr(W_n = k) + \Pr(B)$, where $B \subseteq \bigcup_{i=1}^{n} A_i$.
2. Show that $\lim_{n \to \infty}\Pr\left(\bigcup_{i=1}^{n} A_i\right) = 0$. Hint: Show that $\Pr\left(\bigcap_{i=1}^{n} A_i^c\right) = (1 + o(u))^{1/u}$ where $u = 1/n$.
3. Show that $\lim_{n \to \infty}\Pr(W_n = k) = e^{-\lambda t}\frac{(\lambda t)^k}{k!}$. Hint: $\lim_{n \to \infty}\frac{n!}{n^k (n-k)!} = 1$.
4. Show that $X$ has the Poisson distribution with mean $\lambda t$.

1. Proof. From Chapter 1 we know $\{X = k\} = (\{X = k\} \cap A) \cup (\{X = k\} \cap A^c)$ for any event $A$, with the two pieces disjoint. Let $A = \bigcup_{i=1}^{n} A_i$. On $A^c$ every subinterval holds at most one arrival, so counting the subintervals with exactly one arrival counts all arrivals, i.e. $\{X = k\} \cap A^c = \{W_n = k\} \cap A^c$. Hence

$$\Pr(X = k) = \Pr(W_n = k) + \big[\Pr(\{X = k\} \cap A) - \Pr(\{W_n = k\} \cap A)\big] = \Pr(W_n = k) + \Pr(B)$$

where the correction term involves only events inside $A = \bigcup_{i=1}^{n} A_i$ and is the $\Pr(B)$ of the statement.

2. Proof. As stated in part 1, the $n$ subintervals are disjoint, so $A_1, \dots, A_n$ are independent and $\Pr(A_i) = \Pr(A_1)$ for all $i$. Therefore,

$$\Pr\Big(\bigcap_{i=1}^{n} A_i^c\Big) = \prod_{i=1}^{n}\Pr(A_i^c) = \big[1 - \Pr(A_1)\big]^n$$

It is an assumption of the model that $\Pr(A_1) = o(1/n) = o(u)$ with $u = 1/n$, so

$$\lim_{n \to \infty}\Pr\Big(\bigcap_{i=1}^{n} A_i^c\Big) = \lim_{u \to 0}(1 + o(u))^{1/u} = e^0 = 1 \implies \lim_{n \to \infty}\Pr\Big(\bigcup_{i=1}^{n} A_i\Big) = 0$$

3. Proof. We recall that $W_n = \sum_{i=1}^{n} Y_i$ is a sum of $n$ independent Bernoulli random variables with parameter $p = \frac{\lambda t}{n} + o(1/n)$, so:

$$\Pr(W_n = k) = \binom{n}{k}\left(\frac{\lambda t}{n} + o(1/n)\right)^k\left(1 - \frac{\lambda t}{n} - o(1/n)\right)^{n-k}$$

Next, we note $\lim_{n \to \infty} n^k\left(\frac{\lambda t}{n} + o(1/n)\right)^k = (\lambda t)^k$ and $\lim_{n \to \infty}\left(1 - \frac{\lambda t}{n} - o(1/n)\right)^{n-k} = e^{-\lambda t}$. Therefore, using the hint,

$$\lim_{n \to \infty}\Pr(W_n = k) = \lim_{n \to \infty}\frac{n!}{n^k (n-k)!}\cdot\frac{(\lambda t)^k e^{-\lambda t}}{k!} = e^{-\lambda t}\frac{(\lambda t)^k}{k!}$$

4. Proof. From part 1, $\Pr(X = k) = \Pr(W_n = k) + \Pr(B)$ for every $n$, and the left side does not depend on $n$, so

$$\Pr(X = k) = \lim_{n \to \infty}\Pr(W_n = k) + \lim_{n \to \infty}\Pr(B) = e^{-\lambda t}\frac{(\lambda t)^k}{k!} + 0$$

by parts 2-3, i.e. $X \sim \text{Poisson}(\lambda t)$.
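Part 3 is the classical binomial-to-Poisson limit, which can be watched numerically with the standard library:

```python
from math import comb, exp, factorial

# Pr(W_n = k) for W_n ~ Binomial(n, lambda*t/n) converges to the
# Poisson(lambda*t) p.m.f. as the number of subintervals n grows.
lam_t, k = 2.0, 3
for n in (10, 100, 10_000):
    p = lam_t / n
    print(n, comb(n, k) * p**k * (1 - p) ** (n - k))
print("Poisson limit:", exp(-lam_t) * lam_t**k / factorial(k))  # ~0.18045
```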

5b)

Review the derivation of the Black-Scholes formula (5.6.18). For this exercise, assume that our stock price at time $u$ in the future is $S_u = S_0 e^{\mu u + W_u}$, where $W_u$ has the gamma distribution with parameters $\alpha u$ and $\beta$, with $\beta > 1$. Let $r$ be the risk-free interest rate.

1. Prove that $e^{-ru}E(S_u) = S_0 \iff \mu = r - \alpha\log\left(\frac{\beta}{\beta - 1}\right)$.
2. Assume that $\mu = r - \alpha\log\left(\frac{\beta}{\beta - 1}\right)$. Let $R$ be 1 minus the c.d.f. of the gamma distribution with parameters $\alpha u$ and 1. Prove that the risk-neutral price for the option to buy one share of the stock for the price $q$ at time $u$ is $S_0 R(c[\beta - 1]) - qe^{-ru}R(c\beta)$, where:
$$c = \log\left(\frac{q}{S_0}\right) + \alpha u \log\left(\frac{\beta}{\beta - 1}\right) - ru$$
3. Find the price for the option being considered when $u = 1$, $q = S_0$, $r = 0.06$, $\alpha = 1$, and $\beta = 2$.

1. Proof. We recall that the m.g.f. of the Gamma($\alpha u$, $\beta$) distribution is $\psi(t) = \left(\frac{\beta}{\beta - t}\right)^{\alpha u}$ for $t < \beta$. Therefore,

$$E(S_u) = E\big(S_0 e^{\mu u + W_u}\big) = S_0 e^{\mu u}E\big(e^{W_u}\big) = S_0 e^{\mu u}\left(\frac{\beta}{\beta - 1}\right)^{\alpha u}$$

Thus,

$$S_0 = e^{-ru}E(S_u) = e^{-ru}S_0 e^{\mu u}\left(\frac{\beta}{\beta - 1}\right)^{\alpha u} \iff 0 = -ru + \mu u + \alpha u \log\left(\frac{\beta}{\beta - 1}\right) \iff \mu = r - \alpha\log\left(\frac{\beta}{\beta - 1}\right)$$

2. Proof. We recall that the value of the option at time $u$ will be $h(S_u)$, where $h(s) = s - q$ if $s > q$ and $0$ otherwise. Therefore,

$$h(S_u) > 0 \iff W_u > \log\left(\frac{q}{S_0}\right) - \mu u = \log\left(\frac{q}{S_0}\right) + \alpha u \log\left(\frac{\beta}{\beta - 1}\right) - ru = c$$

The risk-neutral price of the option is the present value of $E(h(S_u))$, which equals:

$$e^{-ru}E(h(S_u)) = e^{-ru}\int_c^{\infty}\big(S_0 e^{\mu u + w} - q\big)\frac{\beta^{\alpha u}}{\Gamma(\alpha u)}w^{\alpha u - 1}e^{-\beta w}\,dw$$

We split the integrand into two parts at the $q$. The second integral is then just a constant times the tail integral of a gamma p.d.f. (the substitution $x = \beta w$ turns it into a Gamma($\alpha u$, 1) tail), namely:

$$qe^{-ru}\int_c^{\infty}\frac{\beta^{\alpha u}}{\Gamma(\alpha u)}w^{\alpha u - 1}e^{-\beta w}\,dw = qe^{-ru}R(c\beta)$$

The first integral is:

$$e^{-ru}S_0 e^{\mu u}\int_c^{\infty}\frac{\beta^{\alpha u}}{\Gamma(\alpha u)}w^{\alpha u - 1}e^{-(\beta - 1)w}\,dw = e^{-ru}S_0 e^{\mu u}\left(\frac{\beta}{\beta - 1}\right)^{\alpha u}\int_c^{\infty}\frac{(\beta - 1)^{\alpha u}}{\Gamma(\alpha u)}w^{\alpha u - 1}e^{-(\beta - 1)w}\,dw = S_0 R(c[\beta - 1])$$

where the prefactor collapses to $S_0$ by part 1. Combining these two integrals yields:

$$S_0 R(c[\beta - 1]) - qe^{-ru}R(c\beta)$$

3. Since $q = S_0$, we get

$$c = \log(1) + \alpha u \log\left(\frac{\beta}{\beta - 1}\right) - ru = \log 2 - 0.06$$

With $\alpha u = 1$ the gamma tail is exponential, $R(x) = e^{-x}$, so substituting $c$ into $S_0 R(c[\beta - 1]) - qe^{-ru}R(c\beta)$ gives

$$S_0\big[R(c) - e^{-0.06}R(2c)\big] = S_0\left[e^{0.06 - \log 2} - e^{-0.06}e^{0.12 - 2\log 2}\right] = S_0 e^{0.06}\left(\frac{1}{2} - \frac{1}{4}\right) \approx 0.2655\,S_0$$
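A numeric check of part 3 using the gamma survival function; a sketch assuming scipy, with $S_0$ normalized to 1:

```python
from math import exp, log
from scipy.stats import gamma

# Price = S0 * R(c*(beta-1)) - q * e^{-ru} * R(c*beta), where R is the
# survival function of Gamma(alpha*u, 1).
S0, u, r, alpha, beta = 1.0, 1.0, 0.06, 1.0, 2.0
q = S0
c = log(q / S0) + alpha * u * log(beta / (beta - 1)) - r * u
R = lambda x: gamma.sf(x, a=alpha * u)   # shape alpha*u, scale 1
price = S0 * R(c * (beta - 1)) - q * exp(-r * u) * R(c * beta)
print(price)  # ~0.2655, i.e. e^{0.06}/4 for these parameters
```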

6 Non-Textbook Problems

6a) A

For an event $B$ with $P(B) > 0$, define $Q(A) = P(A \mid B)$ for any event $A$. Show that $Q$ satisfies Axioms 1-3 of probability and conclude $Q$ is a probability.

Let us first recall the first 3 Axioms of Probability:
1.) For every event $A$, $\Pr(A) \ge 0$.
2.) $\Pr(S) = 1$.
3.) If $A_1, A_2, \dots$ is a countably infinite sequence of disjoint events, then $\Pr\left(\bigcup_{i=1}^{\infty} A_i\right) = \sum_{i=1}^{\infty}\Pr(A_i)$.

Proof. Since $\Pr$ is a probability function, we know that $\Pr(X) \ge 0$ for every event $X$. Furthermore, we recall the formal definition of conditional probability: $\Pr(A \mid B) = \frac{\Pr(A \cap B)}{\Pr(B)}$. Since both numerator and denominator are non-negative and $\Pr(B) > 0$, we have $Q(A) \ge 0$, so Axiom 1 holds. For Axiom 2, $S \cap B = B$, so $Q(S) = \frac{\Pr(S \cap B)}{\Pr(B)} = \frac{\Pr(B)}{\Pr(B)} = 1$.

For the 3rd Axiom: if $A_1, A_2, \dots$ is a countably infinite sequence of disjoint events, then for $i \ne j$, $(A_i \cap B) \cap (A_j \cap B) = (A_i \cap A_j) \cap B = \emptyset \cap B = \emptyset$, so the events $A_i \cap B$ are themselves disjoint (the steps made due to the distributive law and the definition of disjointness respectively). Let us call this finding Rule Z. Next,

$$Q\Big(\bigcup_{i=1}^{\infty} A_i\Big) = \frac{\Pr\big(\big(\bigcup_{i=1}^{\infty} A_i\big) \cap B\big)}{\Pr(B)} = \frac{\Pr\big(\bigcup_{i=1}^{\infty}(A_i \cap B)\big)}{\Pr(B)} \quad \text{(by the distributive law)}$$

$$= \frac{\sum_{i=1}^{\infty}\Pr(A_i \cap B)}{\Pr(B)} \quad \text{(by Rule Z and Axiom 3 for } \Pr\text{)} \quad = \sum_{i=1}^{\infty}\frac{\Pr(A_i \cap B)}{\Pr(B)} = \sum_{i=1}^{\infty} Q(A_i)$$

and thus completes our proof.

6b) B

Assume that $X_1, X_2, \dots$ are i.i.d. random variables having $E(|X_j|^p) < \infty$ for $1 < p \le 2$. Let $\mu = E(X_j)$.

1. Show that $\bar{X}_n = \frac{X_1 + \cdots + X_n}{n} \to \mu$ in $L^p$ as $n \to \infty$.
2. Show that $\bar{X}_n \to \mu$ almost surely.

1. First, we recognize that the function $f(x) = |x|^p$ is convex for $p \in (1, 2]$. Therefore, we can establish the lower bound from Jensen's Inequality, $E(|\bar{X}_n|^p) \ge |E(\bar{X}_n)|^p$, and, by the theorems of Section 4.2, $\text{plim}(\bar{X}_n) = \mu$. We establish the upper bound by convexity of the average: $E(|\bar{X}_n|^p) \le \frac{1}{n}\sum_{i=1}^{n}E(|X_i|^p) = E(|X_1|^p) < \infty$ (also by the Section 4.2 findings). Combining the upper and lower bounds squeezes $E(|\bar{X}_n - \mu|^p)$ to 0, so $\bar{X}_n \to \mu$ in $L^p$ as $n \to \infty$.

2. By our previous findings, these bounds hold not only for probability limits but also for almost-sure convergence: since $E(|X_j|) \le \big(E(|X_j|^p)\big)^{1/p} < \infty$ by Jensen's Inequality, the strong law of large numbers applies, therefore $\Pr\big(\lim_{n \to \infty}\bar{X}_n = \mu\big) = 1$.
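Both modes of convergence in 6b) are easy to illustrate in simulation; a sketch assuming numpy, with $X \sim \text{Exp}(1)$ (so $\mu = 1$) and $p = 1.5$ as arbitrary choices:

```python
import numpy as np

# 6b) illustration: the L^p error of the sample mean shrinks with n, and a
# single long trajectory of running means settles at mu (a.s. convergence).
rng = np.random.default_rng(3)
p, mu = 1.5, 1.0
for n in (10, 100, 1000, 10_000):
    xbar = rng.exponential(mu, size=(1000, n)).mean(axis=1)
    print(n, np.mean(np.abs(xbar - mu) ** p))   # E|Xbar_n - mu|^p -> 0

path = np.cumsum(rng.exponential(mu, 100_000)) / np.arange(1, 100_001)
print(path[-1])  # one trajectory of running means, ~1.0
```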

6c) C

Two random variables $X$ and $Y$ have the bivariate normal distribution if the joint density is:

$$f_{X,Y}(x, y) = \frac{1}{2\pi\sigma_x\sigma_y\sqrt{1 - \rho^2}}\exp\left(-\frac{1}{2(1 - \rho^2)}\left[\left(\frac{x - \mu_x}{\sigma_x}\right)^2 - 2\rho\left(\frac{x - \mu_x}{\sigma_x}\right)\left(\frac{y - \mu_y}{\sigma_y}\right) + \left(\frac{y - \mu_y}{\sigma_y}\right)^2\right]\right)$$

1. Compute the marginal probability density function of $X$.
2. Show the conditional distribution of $Y$ given $X = x$ is $N(\nu, \tau^2)$ and find $\nu$ and $\tau^2$.
3. Show that $X$ and $Y$ are independent if and only if $\text{Cov}(X, Y) = 0$.

1. Proof. Let $Q(x, y) = \frac{1}{1 - \rho^2}\left[\left(\frac{x - \mu_x}{\sigma_x}\right)^2 - 2\rho\left(\frac{x - \mu_x}{\sigma_x}\right)\left(\frac{y - \mu_y}{\sigma_y}\right) + \left(\frac{y - \mu_y}{\sigma_y}\right)^2\right]$. Completing the square in $y$, we can re-write $Q(x, y)$ as:

$$Q(x, y) = \left(\frac{x - \mu_x}{\sigma_x}\right)^2 + \left(\frac{y - \alpha(x)}{\sigma_y\sqrt{1 - \rho^2}}\right)^2, \qquad \text{where } \alpha(x) = \mu_y + \rho\frac{\sigma_y}{\sigma_x}(x - \mu_x)$$

Thus, since $f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\,dy$, we now have:

$$f_X(x) = \frac{1}{\sqrt{2\pi}\,\sigma_x}e^{-\frac{1}{2}\left(\frac{x - \mu_x}{\sigma_x}\right)^2}\int_{-\infty}^{\infty}\frac{1}{\sqrt{2\pi}\,\sigma_y\sqrt{1 - \rho^2}}e^{-\frac{1}{2}\left(\frac{y - \alpha(x)}{\sigma_y\sqrt{1 - \rho^2}}\right)^2}\,dy$$

We next recognize that the integrand is the p.d.f. of the $N(\alpha(x), \sigma_y^2(1 - \rho^2))$ distribution, so the integral equals 1. Therefore,

$$f_X(x) = \frac{1}{\sqrt{2\pi}\,\sigma_x}e^{-\frac{1}{2}\left(\frac{x - \mu_x}{\sigma_x}\right)^2}, \qquad \text{i.e. } X \sim N(\mu_x, \sigma_x^2)$$

2. Proof. We recall $f_{Y \mid X}(y \mid x) = \frac{f_{X,Y}(x, y)}{f_X(x)}$. Thus, from part 1, we can immediately substitute in $f_X(x)$ and the re-written joint density:

$$f_{Y \mid X}(y \mid x) = \frac{1}{\sqrt{2\pi}\,\sigma_y\sqrt{1 - \rho^2}}e^{-\frac{1}{2}\left(\frac{y - \alpha(x)}{\sigma_y\sqrt{1 - \rho^2}}\right)^2}$$

which is the $N(\nu, \tau^2)$ density with $\nu = \alpha(x) = \mu_y + \rho\frac{\sigma_y}{\sigma_x}(x - \mu_x)$ and $\tau^2 = \sigma_y^2(1 - \rho^2)$.

3. Proof. We recall Corollary 3.5.1, which states that two variables are independent if and only if $f_{X,Y}(x, y) = f_X(x)f_Y(y)$. By the symmetry of $x$ and $y$ in the bivariate normal distribution, $Y \sim N(\mu_y, \sigma_y^2)$, so

$$f_X(x)f_Y(y) = \frac{1}{2\pi\sigma_x\sigma_y}e^{-\frac{1}{2}\left[\left(\frac{x - \mu_x}{\sigma_x}\right)^2 + \left(\frac{y - \mu_y}{\sigma_y}\right)^2\right]}$$

which is exactly the joint density with $\rho = 0$. Now, for the bivariate normal, $\text{Cov}(X, Y) = \rho\sigma_x\sigma_y$. In one direction, independence always implies $\text{Cov}(X, Y) = 0$. In the other, $\text{Cov}(X, Y) = 0$ forces $\rho = 0$, which makes the cross term and the $\sqrt{1 - \rho^2}$ factor disappear, so the joint density visibly factors into $f_X(x)f_Y(y)$ and $X$ and $Y$ are independent.
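A simulation check of part 2, comparing the empirical mean and variance of $Y$ given $X \approx x_0$ against $\nu$ and $\tau^2$; a sketch assuming numpy, with all parameter values arbitrary:

```python
import numpy as np

# Draw bivariate normal pairs and condition on X falling in a thin window
# around x0; the conditional mean/variance of Y should match
# nu = mu_y + rho*(sigma_y/sigma_x)*(x0 - mu_x) and tau^2 = sigma_y^2(1-rho^2).
rng = np.random.default_rng(4)
mu_x, mu_y, sx, sy, rho, x0 = 1.0, -2.0, 2.0, 3.0, 0.6, 2.5
cov = [[sx**2, rho * sx * sy], [rho * sx * sy, sy**2]]
x, y = rng.multivariate_normal([mu_x, mu_y], cov, 2_000_000).T
sel = y[np.abs(x - x0) < 0.01]
print(sel.mean(), mu_y + rho * sy / sx * (x0 - mu_x))  # ~ -0.65
print(sel.var(), sy**2 * (1 - rho**2))                 # ~ 5.76
```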

6d) D

Suppose $Y_i \stackrel{\text{ind.}}{\sim} \text{exponential}(\mu_i)$, where $\mu_i = \beta x_i$ is the mean of $Y_i$, with $\beta > 0$ and $x_i > 0$ known.

1. Find the $\beta$ maximizing $\prod_{i=1}^{n} f_{Y_i}(y_i)$.
2. Show that the $\hat{\beta}$ found in 1. converges to $\beta$ in probability.
3. Show that $\text{Var}(\hat{\beta}) \to 0$.

1. Proof. We know $f_{Y_i}(y_i) = \frac{1}{\beta x_i}e^{-\frac{y_i}{\beta x_i}}$. Therefore, by standard MLE practice, the likelihood and log-likelihood are

$$L = \prod_{i=1}^{n}\frac{1}{\beta x_i}e^{-\frac{y_i}{\beta x_i}} \implies \log L = -n\log\beta - \sum_{i=1}^{n}\log x_i - \frac{1}{\beta}\sum_{i=1}^{n}\frac{y_i}{x_i}$$

Maximizing:

$$\frac{\partial \log L}{\partial \beta} = -\frac{n}{\beta} + \frac{1}{\beta^2}\sum_{i=1}^{n}\frac{y_i}{x_i} = 0 \implies \hat{\beta} = \frac{1}{n}\sum_{i=1}^{n}\frac{y_i}{x_i}$$

2. First, noting $E(Y_i) = \mu_i = \beta x_i$, we know $E(\hat{\beta}) = \frac{1}{n}\sum_{i=1}^{n}E\left(\frac{Y_i}{x_i}\right) = \frac{1}{n}\sum_{i=1}^{n}\frac{\beta x_i}{x_i} = \frac{1}{n}(n\beta) = \beta$. Moreover, since the standard deviation of an exponential equals its mean, $\text{Var}(Y_i) = \mu_i^2$, so $\text{Var}(Y_i/x_i) = \beta^2$ and, by independence, $\text{Var}(\hat{\beta}) = \beta^2/n$. Chebyshev's Inequality then gives, for any $\epsilon > 0$,

$$\Pr\big(|\hat{\beta} - \beta| \ge \epsilon\big) \le \frac{\text{Var}(\hat{\beta})}{\epsilon^2} = \frac{\beta^2}{n\epsilon^2} \to 0 \quad \text{as } n \to \infty$$

so $\hat{\beta} \to \beta$ in probability.

3. From the computation above,

$$\text{Var}(\hat{\beta}) = \frac{1}{n^2}\sum_{i=1}^{n}\frac{\text{Var}(Y_i)}{x_i^2} = \frac{1}{n^2}\sum_{i=1}^{n}\frac{(\beta x_i)^2}{x_i^2} = \frac{\beta^2}{n} \to 0 \quad \text{as } n \to \infty$$

consistent with the convergence in probability shown in part 2.
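The estimator is easy to exercise in simulation; a sketch assuming numpy, with the covariates $x_i$ and the true $\beta$ arbitrary:

```python
import numpy as np

# 6d): beta_hat = (1/n) * sum(y_i / x_i) for Y_i ~ Exponential(mean beta*x_i);
# across repeated samples it should be unbiased with variance ~ beta^2 / n.
rng = np.random.default_rng(5)
beta, n, reps = 2.0, 400, 5000
x = rng.uniform(0.5, 3.0, n)                    # arbitrary known covariates
y = rng.exponential(beta * x, size=(reps, n))   # rows are repeated samples
beta_hat = (y / x).mean(axis=1)
print(beta_hat.mean())                 # ~2.0 (unbiased)
print(beta_hat.var(), beta**2 / n)     # both ~0.01
```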
