Hints and Answers to Selected Exercises in Fundamentals of Probability: A First Course, Anirban DasGupta, Springer, 2010



2 Chapter The number of sample points are: a) 3760; b) 560; c) 6840; d) 9; e) / 5 Yes, because the total possible number of initials is less than 0,000 6 a) P (AB)+P(BC)+P(AC) 3P (ABC) 7 a) EF c G c ;b)egf c ;d)e F G 9 a) Mutually exclusive; c) Not mutually exclusive 0 a) Head on the first toss; c) Same outcome on the first two tosses 3/4 3 / 4 5/6 5 (3!) 4 /9! 6 a) 0 0 ; b) 0!/ /6 /3 Use inclusion-exclusion formula with A i as the event that the handyman gets day i of the week off, i =,,, 7 3 Just count the sample points 5 Use inclusion-exclusion formula with A i as the event that the ith child gets chosen everyday of the week For example, P (A )=(/3) 7 6 / (8!) /8; /; 9/3; 43/64 33 (a) is more likely 35 (7 6) ( 0 6!!(3 6 ) 8! 6 ) Think of the complement of the event, which is that the m shoes are of m different colors, and from each of these m colors, you would choose either the left or the right shoe 40 4 (n r+)r!(n r)! n! (n k+) (n+)(n+)
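Several of the Chapter 1 hints rest on the inclusion-exclusion formula, including the identity P(exactly two of A, B, C) = P(AB) + P(BC) + P(AC) - 3P(ABC). That identity can be checked by brute-force enumeration; the sample space and events below are illustrative choices, not taken from the exercises:

```python
# Check P(exactly two of A, B, C) = P(AB) + P(BC) + P(AC) - 3*P(ABC)
# by enumerating a small uniform sample space.

omega = range(1, 61)                      # uniform on {1, ..., 60}
A = {x for x in omega if x % 2 == 0}      # divisible by 2
B = {x for x in omega if x % 3 == 0}      # divisible by 3
C = {x for x in omega if x % 5 == 0}      # divisible by 5

def prob(event):
    return len(event) / 60

# Direct count of outcomes lying in exactly two of the three events.
exactly_two = {x for x in omega
               if sum(x in E for E in (A, B, C)) == 2}

formula = prob(A & B) + prob(B & C) + prob(A & C) - 3 * prob(A & B & C)

print(prob(exactly_two), formula)   # the two values agree
```

The same enumeration pattern works for any finite-sample-space identity of this kind.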

3 3) 3) 43 a) (6 ( 5 3) ; b) ( 6 ( 5 3)( 39 3) 44 Use inclusion-exclusion formula with A i as the event that the hand of the ith player is void in the particular suit For example, P (A )= (39 47 /4 Chapter 3) ( 5 3) =05 withm = (365) 3 Use Stir- n 5 Solve the equation m(m ) (m n+) m n ling s approximation to find the root Chapter 3 3 5/6 3 5/6 33 /3 34 P (B A) 8/9 36 5% of days 38 0/9 39 Consider the events A i = North has i aces and South has 4 i, and look at P (A 4 i= A i )andp(a 3 4 i= A i ) ( 3 ) ( 59 ) 3 The winning probability of Sam is j=0 ( ) j Find one of the other two probabilities, and then the third one by subtraction 35 Consider the events A i = The last face to show up is face i, and observe that P ( 6 i=a i )=6P(A ) 36 First solve the equation p 4 +( p) 4 = / weeks 30 Both strategies are equally good 3 /8; use Bayes theorem 3 x/(x + y) 34 p/(6 5p) 36 a) 000; b)
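Several of the Chapter 3 hints say "use Bayes' theorem"; a tiny worked instance of the mechanics (the two-coin setup below is illustrative, not one of the book's exercises):

```python
# Bayes' theorem: P(H | D) = P(D | H) P(H) / P(D), illustrated with
# a fair coin vs. a two-headed coin, chosen uniformly at random.

prior_fair, prior_twohead = 0.5, 0.5
p_heads_given_fair, p_heads_given_twohead = 0.5, 1.0

# Total probability formula for the observed data (heads).
p_heads = (prior_fair * p_heads_given_fair
           + prior_twohead * p_heads_given_twohead)

posterior_twohead = prior_twohead * p_heads_given_twohead / p_heads
print(posterior_twohead)   # 2/3: heads is evidence for the two-headed coin
```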

4 Consider the probability that the first cell remains empty and the other two do not, and prove that this probability is 3y( x)( x y) Then find the required conditional probability 39 List the sample points and find P (b <ac), where b is the off-diagonal element, by direct counting The probability of this event is about 008, which is small 334 Chapter 4 (N r)(n r ) (N )(N ) 4 p(0) = p(4) = /6; p() = p(3) = /4; p() = 3/8 F (x) =0(x<0); = /6(0 x<); = 5/6( x<); = /6( x< 3); = 5/6(3 x<4); = (x 4) 4 p(0) = p(3) = /6; p() = 5/8; p() = /9; p(4) = /9; p(5) = /8 43 p(0) = /; p() = /4; p() = /8; p(3) = p(4) = /6 44 Use the total probability formula; eg, P (X =)= 6 E(X) =7/ (a) P (h(x) =±) = 4/3; P (h(x) =0)=5/3; E(h(X)) = 0 48 First prove that the variance of X must be zero 6n= n n,etc; 40 The expected value is 3n(n )5n 6 n 4 (a) /5; (b) 4/ n Consider indicator random variables I i of the events that the pair of cards marked as i remain in the jar Show that E(I i )= (N m ) ( N m ) 40 Show that P (X x) = 0 for x < 5 and is equal to (x 5) ( 0 5 ) for x =5, 6,, 0 From here, find the pmf of X 4 Show that P (X >n)= (366 n) for n =, 3,, 365, and (365) n then apply the tailsum formula log 4
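The Chapter 4 hints that say "consider indicator random variables" are using linearity of expectation: if X = I_1 + ... + I_n, then E(X) is the sum of the P(A_i), regardless of dependence between the events. A quick illustrative check (the fixed-points example is not from the book):

```python
import random

random.seed(0)

# Indicator method: the expected number of fixed points of a uniformly
# random permutation of n items is n * (1/n) = 1, for every n, even
# though the indicator events are dependent.

n, trials = 10, 100_000
total = 0
for _ in range(trials):
    perm = list(range(n))
    random.shuffle(perm)
    total += sum(perm[i] == i for i in range(n))   # count fixed points

print(total / trials)   # close to 1
```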

5 Differentiate E(X a) with respect to a 48 Write n= np (X >n)= n= n( j=n+ p(j)) as j= p(j)( j n= n), and simplify 49 Mean = (n +)/; variance = (n )/ Find medians separately for odd and even n 43 (a) At most, people; (b) at most 55,555 people 433 First show that the maximum possible value of E(X )ife(x) =µ is µm Then apply the Paley-Zygmund inequality 434 Look at random variables whose pmf go to zero at the rate x p+ 437 Show that {f(x ) c} and {g(x ) d} are independent events for any c, d 438 A sufficient condition is that E(X )=E(X )=0 Chapter 5 5 pgf is s/( s), s < ; the mgf is e t /( e t ),t<log 5 For example, G(s) = s, s 0 54 e bt ψ(at) 55 Use the fact that x x for all x 57 Use the fact that E[X(X ) (X k +)]=G (k) () For example, σ = G () () + G () () [G () ()] 59 Use the fact that e tx is a convex function of x for any t 5 κ = p; κ = p( p); κ 3 = p 3p +p 3 ; κ 4 = p 7p +p 3 6p 4 Chapter (n = 0); 0445(n = 30); 03(n = 50) The limit can be proved to be zero 6 p(0) = p(3) = 0; p() = 038 The distribution is not Binomial ; 075 The most likely value is zero (n =);0687(n =);05973(n =3);5845(n =4);05757(n = 5) 68 For p /
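The Chapter 5 hint on factorial moments, E[X(X-1)...(X-k+1)] = G^(k)(1), gives the variance as Var(X) = G''(1) + G'(1) - [G'(1)]^2. A sketch for a binomial variable, where the pgf derivatives at 1 can be computed directly as factorial moments of the pmf:

```python
from math import comb

# Var(X) = G''(1) + G'(1) - G'(1)^2, checked for X ~ Binomial(n, p)
# against the closed form np(1-p). The parameter values are arbitrary.

n, p = 8, 0.3
pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

G1 = sum(k * pk for k, pk in enumerate(pmf))            # G'(1)  = E[X]
G2 = sum(k * (k - 1) * pk for k, pk in enumerate(pmf))  # G''(1) = E[X(X-1)]

variance = G2 + G1 - G1**2
print(variance, n * p * (1 - p))   # both equal np(1-p)
```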

6 6 (k/n) n ((k )/N ) n,k =,,,N for with replacement sampling; ( n) ( k k n ),k = n,,n for without replacement sampling ( N n) 6 Compute a Poisson approximation with n = 00 and p = (00 00) = The final answer is The probability of getting an A is (0) 5 ; the probability of getting a B is ( ) 5 4 (0) 5 (08) + ( ) 6 4 (0) 5 (08),etc 66 Binomial(00, 75) 69 Find the sum ( k=0 P (X) = Y = k) andsimplifyittoe λ ( p) n n n! k=0( λp p )k /[(k!) (n k)!] 60 Solve two appropriate equations to first show that p = 0988,n = 86, and so the variance is np( p) = Use the skewness and kurtosis formulas for a general Poisson distribution; the answers are 0447; No You have to first prove that if three such values exist, they must be consecutive integers 630 Compute a Poisson approximation to P (X ) The answer is No +e 634 λ 637 (a) 0839; (b) 00803; (c) Expand (q+p) n and (q p) n by using the Binomial expansion theorem and add, with q = p Chapter 7 7 It is a density for k = 7 c = 73 (a) c =;(b)forx>0,f(x) =/+x x 4 /; (c) 085; 085;
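The Chapter 6 hints that ask for a "Poisson approximation" rely on the fact that Binomial(n, p) probabilities are close to Poisson(np) probabilities when n is large and p is small. A sketch with arbitrary illustrative parameters (not the values in the exercises):

```python
from math import comb, exp, factorial

# Compare Binomial(n, p) with Poisson(np) for large n, small p.

n, p = 100, 0.02
lam = n * p

binom = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(6)]
poisson = [exp(-lam) * lam**k / factorial(k) for k in range(6)]

for k in range(6):
    print(k, round(binom[k], 4), round(poisson[k], 4))
```

The pointwise differences here are a few thousandths, which is why the approximation is adequate for the back-of-envelope answers in these hints.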

7 74 F (x) =0(x<0), = p( cos(x))+( p)sinx(0 x π), =(x> π) Median solves p( cos(x)) + ( p)sinx =05 If p =05, the median is π 4 76 (a) c = 4/6875; (b) 4074%; %; (c) 9 minutes 77 F (x) =0(x<0), = x/5(0 x 5), =(x>5); expected payout is $87500 This distribution is a mixed distribution 79 Density function of the ratio of the two lengths is, 0 <v<; +v expected value is log = 0693 You should find the density by first finding the CDF 70 One example is f(x) =, 0 <x< x 7 One example is f(x) =,x x 73 Quantile function is p 76 (a) Mean = 0; variance = / 77 c =/5; mean is 3π 5π and the variance is 69π 00 ] Quantile function is tan [(p )π 75th percentile equals 70 Use the trigonometric identity arctan x + arctan y = π + arctan x+y xy for all x, y > 0,xy >, and plug in suitable special values of x, y 7 The density function of Y = is e y y 3/ X π,y >0 73 First prove that E[f(X)] must belong to the range space of f and then use the intermediate value theorem In the standard normal case, x 0 = log Use the formula for the area of a triangle in pp 439 of the text 77 α/ Γ( α+ ) π 730 µ = x 0 [e h(t)dt 0 ]dx n 73 n+ 733 n 735 Use Lyapounov s inequality suitably 736 Use Jensen s inequality suitably 737 I(x) =0(x ), = x log x (x >) Chapter 8 8 a =/,b=/4 8 a =,b=5 7

8 83 /3 85 (a) x 3 3x is a strictly monotone function of x on (0, ), taking values in (, 0) The equation x 3 3x = y has a unique real root, say x = g(y) The density function of Y = X 3 3X is g (y), <y<0 (b) f Y (y) = y, 0 <y</4; (c) f Y (y) = πy 3/4, 0 <y< y 86 (a) X can be simulated as X = U /5 ; ( ) ; (b) X can be simulated as X = sin πu ( ) (c) X can be simulated as X = log ( U) if U > and X = ( ) +log U if U 87 Substitute β =6 α into the equation F (0) = 0, and solve for α numerically Then find β =6 α n j=0 ( 89 The nth moment equals n j) (j+)(n j+) n 8 Write the mean absolute deviation as E( X µ ) =c m m+n 0 ( m m+n x)x m ( x) n dx + c m (x m m+n m+n )xm ( x) n dx, and then integrate term by term by expanding ( x) n Here c is the normalizing constant of the Beta density 8 α = 0985,β = 08548,P(X >) = 0086 This problem requires numerical work 84 e / The density is e (y )/,y 87 Use the fact that n i= X i G(n, ) and then follow the same steps as in e 8 The mean is n i= m i and the second moment is n i= m i (m i +) 83 The number of centuries is log( 5/N ) log,wheren =0 5 Thisworks out to approximately 836 centuries 84 P (X >λ) =e, which does not depend on λ 85 (a) The expected profit is E[(c c )min(x, t)] c 3 t; (b) The optimum value of t satisfies F (t) = c 3 c c,provided0< c 3 c c < 88 The density function of Y = e X is a standard exponential density 8
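One of the Chapter 8 hints simulates X by the quantile (inverse-CDF) method as X = U^(1/5). Assuming the intended CDF is F(x) = x^5 on (0, 1), the quantile function is F^(-1)(u) = u^(1/5), and E(X) = ∫ x · 5x^4 dx = 5/6, which a seeded Monte Carlo run can sanity-check (sample size and seed are arbitrary):

```python
import random
from statistics import mean

random.seed(1)

# Inverse-CDF simulation: if F(x) = x^5 on (0, 1), then X = U^(1/5)
# with U ~ Uniform(0, 1) has CDF F. Check E[X] = 5/6 by simulation.

samples = [random.random() ** 0.2 for _ in range(200_000)]
print(mean(samples))   # close to 5/6 ≈ 0.8333
```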

9 89 The density function of Y =loglog is X e ey e y, <y< 830 The density function of Y = θ is X αyα, 0 <y 83 (a) 00465; (b) 00009; (c) this is the probability that X>Y where X, Y are independent Poisson variables with means 4/3 and ; see Theorem 63 in text for more details on how to compute it 83 s t 833 (a) The answer is yes; (b) the answer is no 834 The distribution is Bin(n, u) t Chapter ; ; 0357; 05 9 ; 5 93, The density of Y = is Z y π e y, <y< It is uniformly bounded 94 Z +:all;z 3 : all -3; Z 3 : all zero 95 The density of Y = φ(z) is, 0 <y log(πy ) π Φ(x) = +erf(x/ ) 90 A :668%; B :47%; C :383%; D :857%; F :8% 9 If X denotes the diameter of a ball bearing, then we want E(X 48 X 5) and Var(X 48 X 5) These are the mean and the variance of a truncated distribution (section 4) For example, E(X X 5) equals xf(x)dx This is equal to 5 (why?) The variance P (48 X 5) calculation should be done by first calculating E(X 48 X 5) 93 Y = g(z) has a mixed distribution Its CDF is F Y (y) =0(y< a), = Φ( a)(y = a), =Φ(y)( a <y<a), =(y a) oz 95 First prove that E[Φ(X)] = P (Z < X), where Z is an independent standard normal variable Since Z X N( µ, +σ ), P (Z < X)= µ P (Z X<0) = Φ( +σ ) 98 Median = e µ ;mode=e µ σ 90 0, 0, 0,
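The ball-bearing hint in Chapter 9 asks for the mean of a truncated normal distribution. For X ~ N(mu, sigma^2), E(X | a <= X <= b) = mu + sigma * (phi(alpha) - phi(beta)) / (Phi(beta) - Phi(alpha)) with alpha = (a - mu)/sigma and beta = (b - mu)/sigma. The endpoints 4.8 and 5.2 are reconstructed from the garbled hint, and mu = 5, sigma = 0.1 are illustrative assumptions; the point of the hint's "why?" is that a symmetric interval about the mean gives conditional mean mu:

```python
from math import erf, exp, pi, sqrt

def phi(x):   # standard normal density
    return exp(-x * x / 2) / sqrt(2 * pi)

def Phi(x):   # standard normal CDF
    return 0.5 * (1 + erf(x / sqrt(2)))

# Truncated normal mean; mu and sigma below are assumed for illustration.
mu, sigma, a, b = 5.0, 0.1, 4.8, 5.2
alpha, beta = (a - mu) / sigma, (b - mu) / sigma

trunc_mean = mu + sigma * (phi(alpha) - phi(beta)) / (Phi(beta) - Phi(alpha))
print(trunc_mean)   # 5.0, since the interval is symmetric about mu
```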

10 9 (a) 00037; (b) About at x = 65 Chapter 0 0 Exact = 03770; normal approximation without continuity correction = 0643; normal approximation with continuity correction = Exact = 088; normal approximation = The first person has to make at least 44 correct predictions and the second person has to make at least 434 correct predictions Find the probability of the intersection of these two events 06 P (Z )= This equals P ( n i= (X i Xi ) > 0) Use the central limit theorem on these new variables X i Xi The answers are: n =0:00339; n =0: 00049; n =30: reservations can be allowed rolls should suffice 05 First show that the mean and the variance of log X are - and The final answers are 038 for (a) and for (b) Formula for 99% confidence interval is (X +335)± X Formula for 95% confidence interval is (X +9) ± X For X =5, 95% confidence interval is 69 ± 478, and 99% confidence interval is 835 ± (a) n; (b)99n; (c) Use the CLT for each of the totals scored by Tom and Sara and then find a normal approximation for the difference of these two totals The final answer is (a) P (Z > 4) 0; (b) use normal approximation for Binomial with n =5,p= 054 6x( x)dx; the final answer is To find the third moment of S n,expand(x + X + + X n ) 3 and 0
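The Chapter 10 hints compare exact binomial probabilities with the normal approximation, with and without the continuity correction. A sketch with arbitrary illustrative values (not the exercise's parameters) showing why the correction helps:

```python
from math import comb, erf, sqrt

def norm_cdf(x):
    return 0.5 * (1 + erf(x / sqrt(2)))

# Exact Binomial(n, 1/2) CDF vs. the normal approximation.
n, p, k = 100, 0.5, 55
mu, sigma = n * p, sqrt(n * p * (1 - p))

exact = sum(comb(n, j) for j in range(k + 1)) / 2**n      # P(X <= 55)
plain = norm_cdf((k - mu) / sigma)                        # no correction
corrected = norm_cdf((k + 0.5 - mu) / sigma)              # with correction

print(round(exact, 4), round(plain, 4), round(corrected, 4))
```

The corrected value tracks the exact probability far more closely than the uncorrected one, which is the pattern the chapter's numerical answers reflect.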

11 take expectations term by term Then simplify 07 The roundoff error on one transaction is uniformly distributed on { 50, 49, 48,, 0,, 49} Thisgivesµ = 05 Now find σ,andthen use a normal approximation to P ( 5 S n 5), with n = It is about 45 times more likely that you will get exactly 50 heads than that you will get more than sixty heads 03 It is just a bit more likely that you will get exactly 0 sixes Chapter The joint pmf of (X, Y )isp(, 0) = /7,p(3, 0) = p(3, ) = p(4, ) = p(6, ) = p(6, ) = /7, and p(x, y) = 0 otherwise E(X) =6/7; Var(X) = /49; E(Y )=;Var(Y )=6/7; ρ X,Y 059 The joint pmf of (X, Y )isp(0, 4) = p(3, ) = p(4, 4) = /6,p(, ) = /4,p(, 0) = 3/8,p(3, ) = 3/6, and p(x, y) = 0 otherwise E(Y )=3/6 3 (a) c = /36; (b) They are independent; (c) E(X) = E(Y ) =,E(XY )=4 4 (a) c = /5; (b) They are not independent; (c) E(X) = 53/5; E(Y ) = 67/5; E(XY ) = 47/5 6 The joint pmf of (X, Y )isp(, 3) = p(, 4) = p(, 5) = p(, 3) = p(, 5) = p(, 6) = p(3, 4) = p(3, 5) = p(3, 7) = p(4, 5) = p(4, 6) = p(4, 7) = /, and p(x, y) = 0 otherwise X, Y are not independent 7 The direct method of calculating E(X Y = y) is to use the formula x xp(x,y) )( 39 )( 3 x 3 x y )( 6+x 3 y E(X Y = y) = In this problem, p(x, y) x p(x,y) =( 3 x 0,x + y 3 However, you can avoid the calculation by logically arguing how many clubs South (or any other player) should get, if y clubs have already been picked up by North You can also logically see that E(X Y =3) and E(Y X = 3) must be the same 9 E(Y )=0 0 Var(X) = λ + λ Use the fact that P (X >Y)andP (Y > X) are equal in this problem, and the fact that P (X >Y)+P (Y > X)+P (X = Y )= Nowfind P (X = Y ) and then conclude that P (X Y )= p ; P (X >Y)= p p ),x,y,

12 3 pmf of X + Y is P (X + Y =)=P (X + Y =8)=004; P (X + Y = 3) = P (X+Y =7)=0; P (X+Y =4)=P (X+Y =6)=0; P (X+Y = 5) = 06 4 (a) They are not independent; (b) 0; (c) 0 7 X Poi(λ + η); Y Poi(µ + η) X and Y are not independent The joint pmf is found by simplifying the sum P (X = x, Y = y) = min(x,y) w=0 e λ λ x w (x w)! e µ µ y w (y w)! e η η w w! 8 (a) 35+y; (b)35y; (c) y 0 The joint pmf of (X, Y )isp(, 0) = /; p(3, 0) = p(3, ) = p(4, 0) = p(4, ) = /8 E(X Y =0)=35 You can argue logically what E(X Y =) should be You can prove this by using the result in problem 6 Show, by using problem 6, that E(XY )=E Y [E(XY Y = y)] E(Y )E(X), and so Cov(X, Y ) 0 3 Use the two iterated expectation formulas E(X) =E Y [E(X Y = y)] and E(XY )=E Y [E(XY Y = y)] 4 Without loss of generality, you may assume that each of X, Y takes thevalues0, Now try to represent the covariance between X and Y in terms of P (X =0,Y =0) P (X =0)P (Y = 0) Finally show that if P (X =0,Y =0)=P (X =0)P (Y = 0), then that would force X, Y to be independent 6 First show that for any two random variables U, V with finite variance, Cov(U, V )= [P (U u, V v) P (U u)p (V v)]dudv Apply this to U = g(x),v = h(x) 7 The joint mgf is ( 6 et + 6 et + 3 )4 The value of the covariance is 9 8 The joint mgf is ψ(t,t )=e λ µ η+λet +µe t +ηe t +t Find E(XY ) by evaluating t t ψ at (t,t )=(0, 0) Chapter (a) c = 4; (b) They are independent; (c) f X (x) =x, 0 x ; f Y (y) =y, 0 y,e(x) =E(Y )=/3; E(XY )=4/9 (a) c = 4; (b) They are not independent; (c) f X (x) =x( x), 0 x ; f Y (y) =y( y), 0 y ; E(X) =E(Y )=/5; E(XY )=/5

13 3 (a) c = ; (b) They are not independent; (c) f X (x) =e x,x > 0; f Y (y) =ye y,y > 0; E(X) =;E(Y )=;(d)e(x Y = y) =y/; (e) E(Y X = x) =x + 5 (a) c = 3 ;(b)no;(c)f 4π X(x) = 3( 4 x ), x ; you can now find f Y (y),f Z (z) by using a suitable symmetry in the formula for the joint density; (d) E(X Y = y) = 0 for any y; (e) Similarly, E(Y X = x) = 0 for any x; (f)0 6 y y + 8 (you can get the answer without doing any hard calculations) 9 E(X X + Y )= 5 8 π; use the fact that X and X + Y are independent X+Y 0 3 8/55 The joint distribution of (X, Y ) does not have a density The joint distribution is a distribution on the line segment y =x, 0 x You can write the joint CDF of (X, Y )asp(x x, Y y) =P (X x, X y) = P (X x, X y ) and simplify this last line 4 E(X n )= 0 n a, b satisfy a(σ + ρσ σ )+b(σ + ρσ σ )=0 8 /4 0 /4 Note the similarity of this problem to Example 7 in the text 048 If we denote X +Y = U, X Y = V,Y = W,then(U, V, W )doesnot have a joint density The joint distribution of (U, V, W ) is a distribution on the plane w = u v You can characterize the joint distribution of (U, V, W ) by writing the joint distribution of (U, V ) which is a bivariate normal, and the conditional distribution of W given U = u, V = v is a one point distribution at w = u v 3 The correlation coefficient between X and X is µ µ +σ 5 (a) Y N(, ); (b) ;(c)x Y = y N( y, ) 6 The mean is 3/4, and the variance is 3/80 7 (a) 3 π x y,x + y ; (b) 3 4 ( x ), x 3

14 σφ( x µ σ ) Φ( x µ σ ) 9 The mean residual life is E(X X >x)=µ + 30 You have to find E(Y X >40) This is different from E(Y X = 40 40) E(Y X > 40) equals yp(x,y)dxdy p(x, y) is a bivariate normal P (X>40) density, and you will be able to simplify the integral in the numerator The denominator is easy because X is marginally normal 3 (a) The densities of U, V, W are 3( u) (0 <u<), 6v( v)(0 < v < ), 3w (0 < w < ) (b) U U[0, ] and T = V has the density V W U t(0 <t<) and V are independent (c) E( U )=/; E( V )=/3 V W V W 33 U has an Exponential density with mean The densities of V,W 3 are 6e v ( e v )(v >0) and 3e w ( e w ) (w>0) T = W U has the density e t ( e t )(t >0) 34 (a) The epoch T of the last departure has the density 4 λ (e t/λ e t/λ t λ e t/λ ); (b) /; (c) The total time T spentbymaryatthepost office has the density λ e t/λ ( e t/λ ) 36 (a) (09) n ( + n); (b) Find the smallest n such that 9 (099)n (098) n 0 This gives n = 57 Chapter 3 3 U = XY has the density f U (u) =( u), 0 u ; V = X has the Y density f V (v) = (0 v ), = (v>) 3 3v 3 3 Z = X + Y has the density f Z (z) =z (0 z ), =z z ( <z ) V = X Y has the density f V (v) = v ( v 0), =( v) (0 < v ) W = X Y has the density f W (w) =( w), 0 w 33 (a) No; (b) c =/; (c) Z = X + Y has the density f Z (z) = z e z,z >0, which is a Gamma density u log u 34 (a) No; (b) c =8;(c)U = XY has the density f U (u) =, 0 < u< 35 T = NXY has a mixed distribution T can be equal to zero with a positive probability; P (T =0)= Conditional on T>0,T has a density 4 You have to find this by conditioning separately on N =andn =,and take the mixture of the two densities that you will get for (XY )=XY and (XY )=XY To complete this problem, first show that V = XY has the density f V (v) = log v, 0 <v<, and then proceed as outlined 4

15 above 36 P (m = X) =+ a 3a(b+c) ; P (m = Z) = a(3b a) ; P (m = Y )canbe 6bc 6bc found by subtraction 38 E( XY XY )=E( X +Y X )=0 +Y 39 It is better to first find the CDF of X + Y This is easy for t, but must be done carefully for t> The CDF of X + Y is P (X + Y t) = πt(0 <t ), and P 4 (X + Y t) =t( π arctan t ) + t for 4 <t Of course, for t,p(x + Y t) = To evaluate E( X + Y ), apply its definition, and find the answer by numerical integration 3 The point P Q has the planar coordinates (U, V )=(X Y,Z W ) First show that the joint density of (U, V ) in polar coordinates is (arccos( r ) r r ), 0 r, π <θ<π This will lead to the π 4 density of r as 4 r(arccos( r ) r r ), 0 r We need E(r), which is π 4 4 π 0 r (arccos( r ) r r 8 )dr Thisintegralequals =090545, which 4 45π is the needed average distance 3 /6 33 P ( X < ) = /4; P (X <Y)=/ They would have been the Y same if Y was a nonnegative random variable 34 The answer is arctan( σ) π τ 36 By using the general formula for the density of a product in the text, first show that the density of U = XY is of the form c c u γ u xα δ γ ( x) β (x u) δ dx, wherec,c are the normalizing constants of the Beta densities for X, Y Now simplify this expression in terms of the F Hypergeometric function 38 The density of U = XY is f U (u) = log u, <u< π u 30 This can be proved by simply using the Jacobian formula You have to be a bit careful and take the cases Y and Y> separately, and then put the two cases together 3 The density of W = X + Y + Z is w g(w),w >0 3w 3 The required density is,w >0 (+w) 4 34 The density of Z = X + Y is f Z (z) = e z (0 z ), = (e )e z,z > 35 The density of Z = X + Y is f Z (z) =Φ( z µ z µ ) Φ( ) σ σ 5

16 37 Given z 0, let n be such that n z<n+ Then the density of Z = X + Y at z is f Z (z) = e λ λ n n! 38 (a) c = r ;(b)no;(c)rhas the density,r >0, θ U[ π, π], π (+r ) 3/ and r, θ are independently distributed; (d) 330 The correlation is The correlation is 0 33 The density of Z = X + Y is f Z (z) = e z/µ e z/λ,z >0; the density µ λ of W = X Y is f W (w) = λ+µ ew/µ (w<0), and = λ+µ e w/λ (w>0) 335 If X U[0, ], then n,n, are each uniform on {0,,,, 9}, and they are independent 336 The integer part of X and the fractional part of X are independent in this case The integer part has the pmf P ( X = n) =e n ( e ),n = 0,,,, and the fractional part has the CDF P ({X} y) = ey, 0 e y (e ) y Note the interesting fact that the integer part therefore has a Geometric distribution 338 The density of R = X + X + + Xn n+3 4Γ( is f R (r) = ) πγ( n ) r n (+r ) (n+3)/,r > 0 You can find the density of X + X + + X n from this by a simple transformation 6

17 Hints and Answers to Supplementary Problems Appendix I: Word Problems I 7576; 85 I 00; 07; 07 I3 /6 I4 /3 I5 /6; /3; / I I8 The winning probabilities of A, B, C are 36/9, 30/9, 5/9 respectively I I 9/9 I I6 For with replacement sampling, the probability is /9; for without replacement sampling, the probability is 3/55 ( I7 4 ( 39 3) ( 5 3) ( 4 ( 6 3) ( 5 3) + ( 4 ( 3) 3 ( 5 3) I9 0 x )( 3 ) I0 ( 39 x=5 ( 5 x ) ( 6 4) 6 x=3 ( 6 x) I I3 4! 4 4 [ 0 ( 5 x )( 0 x) 5 x=0 ( 50 0) I6 ] I8 (06) 5 I9 /5 I30 Denoting the selection of a green ball by G and that of a white ball by W, the favorable sample points are GGGGG, GGGW G, GW GGG, GW GW G Compute the probability of each of these sample points and add I3 /3 I ; 058; 0559 I35 (a) ; (b) all numbers in [, ] are medians of the distribution; (c) First prove that n(n ) n= =4 n I36 Var( X ) =E(X ) [E( X )] E(X ) E(X) (since E( X ) 7

18 E(X) ) =Var(X) I38 This is the negative hypergeometric distribution (see Exercise 65) (a) To find the mass function, show that P (X >x)= (39 x ),x =,,, 39, ( 5 x ) and then find the mass function by subtraction; (b) By the tailsum formula, E(X) =+ 39 ( 39 x ) x= =53/4 = 3786 ( 5 x ) I40 Consider a two valued random variable X with the mass function 0005 P (X = µ 0005) = δ, P (X = N) =δ If δ =,thenx has N+0005 µ mean µ NowtakeN to be sufficiently large I4 Since E(X) = µ = E(X ), and E(X ) (E(X)) for any random variable X, it follows that µ µ, ie, 0 µ Therefore, Var(X) =E(X ) (E(X)) = µ µ = µ( µ) 4 I44 (a) E(X) = E(Y ) = 4/3; (b) Var(X) = Var(Y ) = 8/9; (c) P (Z =0)=4/9; P (Z =)=4/7; P (Z =)=/7; (d) E(Z) =6/7 I45 The variance is 67 I47 Mean = 4405; variance = 363 I49 For example, take X such that P (X =0)=,P(X = 000) = ) ( 5 3) I50 4 ( I5 (a) 0; (b) Show that P (N >n)= 0 9 ( n),n =,,, 0 0 n Then, by the tailsum formula, E(N) =+ 0 n= P (N >n) I5 9/6 I54 log I55 The mgf equals ψ(t) = +et +e t +e 3t +e 4t +e 5t The mean is zero 5e t I56 E(Y )= (n )+4( n 4)+ = n;themgfofy equals ψ(t) = + +(n )e t +( n 4)e 4t + n 4 n I57 The mgf of XY is p + pe λ(et ) I59 The mgf is n i= ( p i + p i e t ); the variance is n i= p i ( p i ) I60 X takes the values ± with probability each, I6 (a) The mgf is e λ n (λe t ) x x=0 + P (X >n); (b) The limit equals the x! mgf of X itself, ie, e λ(et ) I64 The nth factorial moment is λ n I65 (a) c = ;(b)p(0) = /8; p() = /; p() = /4; p(3) = /8; (c) 8 /8 8
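Several of the word-problem hints (I38, I51, and the expected-maximum problem) invoke the tailsum formula for a nonnegative integer-valued X: E(X) = sum over n >= 0 of P(X > n). A sketch checking it for a geometric variable (the choice of distribution is illustrative):

```python
# Tailsum formula check for X ~ Geometric(p) on {1, 2, ...},
# where P(X > n) = q^n and E(X) = 1/p.

p, q = 0.3, 0.7

# Direct expectation, truncated far enough into the tail to be exact
# to floating-point precision.
direct = sum(k * p * q ** (k - 1) for k in range(1, 500))

# Tailsum: sum_{n >= 0} P(X > n) = sum_{n >= 0} q^n.
tailsum = sum(q ** n for n in range(500))

print(direct, tailsum, 1 / p)   # all three agree
```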

19 I66 e (s )λ I68 Yes, it is possible Take a random variable X with the stated property and take Y = X I70 66 I7 For sending one parcel, the expected cost is $950 For sending two parcels, the expected cost is $50 It is better to send one parcel I73 The distribution of X is a binomial with parameters 3 and Therefore, the mean is and the variance is I75 e 5/6 p I77 θ I79 (a) 00835; (b) 0007; (c) k = 5 in each case I80 (a) (39 3) ; (b) 03664; (c) ( 5 3) I8 P (max(x, Y ) n) = 4 n 5 n + n,n 0 By using the tailsum formula now, E(max(X, Y )) = n=0 (4 n + 5 n n )= 9 =4 I83 The mgf of X Y is e λ µ+λet +µe t I85 The first four moments are, 6,, 94 I86 The first four moments are 5, 75, 65, 075 I87 0 I88 Use the tailsum method to find the mean I90 Suppose the couple has N children First prove that P (N > n)= ( r n x=n r+ x) p x ( p) n x,r n r Then find the pmf of N by subtraction 587 I9 =4 630 I9 No; because by problem I93, Var(XY ) > Var(X)Var(Y ) = E(X)E(Y ) = E(XY ) I94 X must be ± with probability each I96 06 I97 (a) The density is f(x) =0(x<), =(x )/3( x ), = /3( x 4), =(5 x)/3(4 x 5), =0(x>5); (b) The CDF is F (x) =0(x ), =(x ) /6( <x ), = x/3 /( <x 4), = 5x/3 (x + 9)/6(4 <x<5), =(x 5); (c) 3; (d) 3 3 I98 (a) c = 4; (b) 0; 075; 075; P (C A B) = 0 and is undefined 0 I00 F (x) =0(x<0), =09(x =0), = 0 e x (x >0) I0 (a) 0; (b) ; (c) prove that x 3 x x > 0ifx> So the 9

20 required probability is P (X >); (d) ( + e 3 ); (e) 0 I03 The density of Y = e X is f(y) = y ( y<), and 0 log y e otherwise I05 The density of Y = g(x) isf(y) =e y + e y, 0 <y<and0 y otherwise I07 Y has the CDF F (y) = +y, ( y ), =0(y< ), =(y>) In other words, Y has the same distribution as Z I09 (a) σ equals 53; (b) 696; (c) I0 The shortest interval with probability 5 under each of these three distributions is [, ] I 6 I4 Mean equals log,medianequals, and variance equals n n n I5 3 I6 4 I7 log( arctan X) is one such transformation π I8 (a) c = ;(b) π 4 π π I0 square inches I π log I3 n 770 log(φ(4)) I4 a< I5 The function has a minima of zero and a maxima of I7 Hazard rates with a bathtub shape can be generated by using densities of the form cx d e xp,x > 0,p,d > 0 Of course, this is not the only possible way to generate bathtub hazard rates I9 (09) 0 I30 (09544) 0 I3 The deciles are e 8,e 084,e 055,e 053,,e 053,e 055,e 084,e 8 I3 The name is lognormal I33 No, because the mean µ would be zero, and then the three percentile values become inconsistent I35 For a 90% confidence interval, n approximately 7σ ; for a 95% confidence interval, n approximately 384σ ; for a 99% confidence interval, n approximately 663σ I

21 I37 For Z 5, the mean, median, and mode are all zero; for Z, themean is, the median is 0675, and the mode is zero; for Z, themeanis πe π +Φ(), the median is 05, and the mode is zero I39 The distribution is approximately a normal distribution with mean and variance 6 36 I4 n approximately 3393 I I44 A normal approximation cannot be used here, because this density does not have a finite mean I I49 The conditional expectation of the number of heads in the last 5 tosses given that the number of heads in the first 5 tosses is x is given by the formula 5 + x 3 I50 By using the iterated variance formula, you can easily prove that Var(Y ) Var(X) I5 (a) Each of X, Y, Z takes the values ± with probability each; (b) each of these joint pmfs assigns probability to the four points (±, ±); (c) 4 each correlation is zero; (d) I5 (a) The variance is 3n ; (b) the limit is zero by Chebyshev s inequality 4 I 53 The percentages of A, B, C, D,andF would be 8%, 359%, 686%, 359% and 8% So the required probability is 40! (008) 8 (0359) 8 (0686) 8 (0359) 8 (008) 8 (8!) 5 I55 0 I56 (a) ( k ); (b) min(x, k); (c) The covariance between X and min(x, k) is (k +) k The variance of min(x, k) is + k 4 k 4k k Using these, the correlation between X and min(x, k) is This converges to when k k k 4 k (k ) k ; (c) Show that the co- The correlation is I57 (a) The expected values are N+ variance between the two numbers drawn is N+ (f) 0 N(N+ n ;(b) N N ; I59 (a) ;(b) ;(c) 6 6 I60 E(X X X 3 )=n(n )(n )p p p 3 I6 For a general k, the conditional expectation is 4 k 3 I64 (a) E(X Y = y) = 3( y); (b) E(Y X = x) = 3 ( x); (c) 4 4

22 E(XY )= 9 56 ;(d)e(x Y )= 35 I65 (a) ;(b)0;(c) φ(c)φ(c); (d) arcsin( 3 4 π 0 ) I66 The density of Z = X Y is f Z (z) = z, z I67 A necessary and sufficient condition that you can make a triangle with three segments of lengths a, b, c is that the sum of any two of a, b, c is at least as large as the third one Using this, you can show that the required probability in this problem is log I68 I69 The joint density of U = X, V = XY,W = XY Z is f U,V,W (u, v, w) = e u u v w v,u,v,w >0 uv I7 (a) Show that Cov(X + X,X X ) = 0, and that it follows from this that X + X,X X are independent; (b) since X + X,X X are independent, any function of X + X and any function of X X are independent; (c) write X + X and X X in terms of the polar coordinates, from which it will follow that they are independent I73 (a) E(XY )=, Var(XY )= 5 4 ;(b)e(x Y )=0;(c)0;(d)c = I I76 I77 3 I78 Yes, the expectation is finite and equals I80 (a) A bound for the density of Z = X + Y is f Z (z) π for all z; (b) no, the density of XY may not be bounded Think of a random variable Y with a density converging to very sharply at y =0 4 I8 9 θ( p) I83 p+θ( p) I84 The mean equals 0( e 0 ) by applying the iterated expectation formula; the variance has to be found by applying the iterated variance formula, and the terms in the formula have to be found numerically The variance is approximately 90 I85

23 Appendix I: True-False Problems T 4 T 6 F 8 T 9 F F 3 F 4 T 5 F 6 T 8 F 9 F T 3 T 5 T 7 T 8 F 30 T 3 F 33 F 35 T 36 T 39 T 40 T 4 T 45 T 46 T 47 T 49 T 50 F 5 T 5 F 53 T 3

24 55 F 57 T 59 F 6 T 63 T 64 T 65 T 66 T 68 T 70 T 7 F 7 T 73 T 76 T 77 T 78 T 79 F 8 T 8 T 84 F 86 F 87 T 88 F 90 T 9 F 94 F 95 T 97 T 99 F 00 T 0 T (the expected value is 49) 03 T 04 T 06 F 4

25 07 T 09 T 0 F T 3 T 4 F (ρ can also be 08) 6 T 8 T 9 F T 5


Summary of basic probability theory Math 218, Mathematical Statistics D Joyce, Spring 2016 8. For any two events E and F, P (E) = P (E F ) + P (E F c ). Summary of basic probability theory Math 218, Mathematical Statistics D Joyce, Spring 2016 Sample space. A sample space consists of a underlying

More information

Actuarial Science Exam 1/P

Actuarial Science Exam 1/P Actuarial Science Exam /P Ville A. Satopää December 5, 2009 Contents Review of Algebra and Calculus 2 2 Basic Probability Concepts 3 3 Conditional Probability and Independence 4 4 Combinatorial Principles,

More information

Formulas for probability theory and linear models SF2941

Formulas for probability theory and linear models SF2941 Formulas for probability theory and linear models SF2941 These pages + Appendix 2 of Gut) are permitted as assistance at the exam. 11 maj 2008 Selected formulae of probability Bivariate probability Transforms

More information

18.440: Lecture 28 Lectures Review

18.440: Lecture 28 Lectures Review 18.440: Lecture 28 Lectures 17-27 Review Scott Sheffield MIT 1 Outline Continuous random variables Problems motivated by coin tossing Random variable properties 2 Outline Continuous random variables Problems

More information

SUMMARY OF PROBABILITY CONCEPTS SO FAR (SUPPLEMENT FOR MA416)

SUMMARY OF PROBABILITY CONCEPTS SO FAR (SUPPLEMENT FOR MA416) SUMMARY OF PROBABILITY CONCEPTS SO FAR (SUPPLEMENT FOR MA416) D. ARAPURA This is a summary of the essential material covered so far. The final will be cumulative. I ve also included some review problems

More information

Introduction to Machine Learning

Introduction to Machine Learning What does this mean? Outline Contents Introduction to Machine Learning Introduction to Probabilistic Methods Varun Chandola December 26, 2017 1 Introduction to Probability 1 2 Random Variables 3 3 Bayes

More information

1 Presessional Probability

1 Presessional Probability 1 Presessional Probability Probability theory is essential for the development of mathematical models in finance, because of the randomness nature of price fluctuations in the markets. This presessional

More information

Chapter 2. Probability

Chapter 2. Probability 2-1 Chapter 2 Probability 2-2 Section 2.1: Basic Ideas Definition: An experiment is a process that results in an outcome that cannot be predicted in advance with certainty. Examples: rolling a die tossing

More information

STAT 414: Introduction to Probability Theory

STAT 414: Introduction to Probability Theory STAT 414: Introduction to Probability Theory Spring 2016; Homework Assignments Latest updated on April 29, 2016 HW1 (Due on Jan. 21) Chapter 1 Problems 1, 8, 9, 10, 11, 18, 19, 26, 28, 30 Theoretical Exercises

More information

Practice Examination # 3

Practice Examination # 3 Practice Examination # 3 Sta 23: Probability December 13, 212 This is a closed-book exam so do not refer to your notes, the text, or any other books (please put them on the floor). You may use a single

More information

LIST OF FORMULAS FOR STK1100 AND STK1110

LIST OF FORMULAS FOR STK1100 AND STK1110 LIST OF FORMULAS FOR STK1100 AND STK1110 (Version of 11. November 2015) 1. Probability Let A, B, A 1, A 2,..., B 1, B 2,... be events, that is, subsets of a sample space Ω. a) Axioms: A probability function

More information

Chapter 5. Chapter 5 sections

Chapter 5. Chapter 5 sections 1 / 43 sections Discrete univariate distributions: 5.2 Bernoulli and Binomial distributions Just skim 5.3 Hypergeometric distributions 5.4 Poisson distributions Just skim 5.5 Negative Binomial distributions

More information

MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems

MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems Review of Basic Probability The fundamentals, random variables, probability distributions Probability mass/density functions

More information

Math 3215 Intro. Probability & Statistics Summer 14. Homework 5: Due 7/3/14

Math 3215 Intro. Probability & Statistics Summer 14. Homework 5: Due 7/3/14 Math 325 Intro. Probability & Statistics Summer Homework 5: Due 7/3/. Let X and Y be continuous random variables with joint/marginal p.d.f. s f(x, y) 2, x y, f (x) 2( x), x, f 2 (y) 2y, y. Find the conditional

More information

3. Probability and Statistics

3. Probability and Statistics FE661 - Statistical Methods for Financial Engineering 3. Probability and Statistics Jitkomut Songsiri definitions, probability measures conditional expectations correlation and covariance some important

More information

MAS113 Introduction to Probability and Statistics. Proofs of theorems

MAS113 Introduction to Probability and Statistics. Proofs of theorems MAS113 Introduction to Probability and Statistics Proofs of theorems Theorem 1 De Morgan s Laws) See MAS110 Theorem 2 M1 By definition, B and A \ B are disjoint, and their union is A So, because m is a

More information

t x 1 e t dt, and simplify the answer when possible (for example, when r is a positive even number). In particular, confirm that EX 4 = 3.

t x 1 e t dt, and simplify the answer when possible (for example, when r is a positive even number). In particular, confirm that EX 4 = 3. Mathematical Statistics: Homewor problems General guideline. While woring outside the classroom, use any help you want, including people, computer algebra systems, Internet, and solution manuals, but mae

More information

Spring 2012 Math 541B Exam 1

Spring 2012 Math 541B Exam 1 Spring 2012 Math 541B Exam 1 1. A sample of size n is drawn without replacement from an urn containing N balls, m of which are red and N m are black; the balls are otherwise indistinguishable. Let X denote

More information

Math 493 Final Exam December 01

Math 493 Final Exam December 01 Math 493 Final Exam December 01 NAME: ID NUMBER: Return your blue book to my office or the Math Department office by Noon on Tuesday 11 th. On all parts after the first show enough work in your exam booklet

More information

Chapter 4. Chapter 4 sections

Chapter 4. Chapter 4 sections Chapter 4 sections 4.1 Expectation 4.2 Properties of Expectations 4.3 Variance 4.4 Moments 4.5 The Mean and the Median 4.6 Covariance and Correlation 4.7 Conditional Expectation SKIP: 4.8 Utility Expectation

More information

Midterm Exam 1 Solution

Midterm Exam 1 Solution EECS 126 Probability and Random Processes University of California, Berkeley: Fall 2015 Kannan Ramchandran September 22, 2015 Midterm Exam 1 Solution Last name First name SID Name of student on your left:

More information

Probability Models. 4. What is the definition of the expectation of a discrete random variable?

Probability Models. 4. What is the definition of the expectation of a discrete random variable? 1 Probability Models The list of questions below is provided in order to help you to prepare for the test and exam. It reflects only the theoretical part of the course. You should expect the questions

More information

STAT 418: Probability and Stochastic Processes

STAT 418: Probability and Stochastic Processes STAT 418: Probability and Stochastic Processes Spring 2016; Homework Assignments Latest updated on April 29, 2016 HW1 (Due on Jan. 21) Chapter 1 Problems 1, 8, 9, 10, 11, 18, 19, 26, 28, 30 Theoretical

More information

n px p x (1 p) n x. p x n(n 1)... (n x + 1) x!

n px p x (1 p) n x. p x n(n 1)... (n x + 1) x! Lectures 3-4 jacques@ucsd.edu 7. Classical discrete distributions D. The Poisson Distribution. If a coin with heads probability p is flipped independently n times, then the number of heads is Bin(n, p)

More information

Chp 4. Expectation and Variance

Chp 4. Expectation and Variance Chp 4. Expectation and Variance 1 Expectation In this chapter, we will introduce two objectives to directly reflect the properties of a random variable or vector, which are the Expectation and Variance.

More information

Set Theory Digression

Set Theory Digression 1 Introduction to Probability 1.1 Basic Rules of Probability Set Theory Digression A set is defined as any collection of objects, which are called points or elements. The biggest possible collection of

More information

Probability and Distributions

Probability and Distributions Probability and Distributions What is a statistical model? A statistical model is a set of assumptions by which the hypothetical population distribution of data is inferred. It is typically postulated

More information

Introduction to Machine Learning

Introduction to Machine Learning Introduction to Machine Learning Introduction to Probabilistic Methods Varun Chandola Computer Science & Engineering State University of New York at Buffalo Buffalo, NY, USA chandola@buffalo.edu Chandola@UB

More information

n! (k 1)!(n k)! = F (X) U(0, 1). (x, y) = n(n 1) ( F (y) F (x) ) n 2

n! (k 1)!(n k)! = F (X) U(0, 1). (x, y) = n(n 1) ( F (y) F (x) ) n 2 Order statistics Ex. 4. (*. Let independent variables X,..., X n have U(0, distribution. Show that for every x (0,, we have P ( X ( < x and P ( X (n > x as n. Ex. 4.2 (**. By using induction or otherwise,

More information

CME 106: Review Probability theory

CME 106: Review Probability theory : Probability theory Sven Schmit April 3, 2015 1 Overview In the first half of the course, we covered topics from probability theory. The difference between statistics and probability theory is the following:

More information

STAT/MA 416 Answers Homework 6 November 15, 2007 Solutions by Mark Daniel Ward PROBLEMS

STAT/MA 416 Answers Homework 6 November 15, 2007 Solutions by Mark Daniel Ward PROBLEMS STAT/MA 4 Answers Homework November 5, 27 Solutions by Mark Daniel Ward PROBLEMS Chapter Problems 2a. The mass p, corresponds to neither of the first two balls being white, so p, 8 7 4/39. The mass p,

More information

n! (k 1)!(n k)! = F (X) U(0, 1). (x, y) = n(n 1) ( F (y) F (x) ) n 2

n! (k 1)!(n k)! = F (X) U(0, 1). (x, y) = n(n 1) ( F (y) F (x) ) n 2 Order statistics Ex. 4.1 (*. Let independent variables X 1,..., X n have U(0, 1 distribution. Show that for every x (0, 1, we have P ( X (1 < x 1 and P ( X (n > x 1 as n. Ex. 4.2 (**. By using induction

More information

Course: ESO-209 Home Work: 1 Instructor: Debasis Kundu

Course: ESO-209 Home Work: 1 Instructor: Debasis Kundu Home Work: 1 1. Describe the sample space when a coin is tossed (a) once, (b) three times, (c) n times, (d) an infinite number of times. 2. A coin is tossed until for the first time the same result appear

More information

This does not cover everything on the final. Look at the posted practice problems for other topics.

This does not cover everything on the final. Look at the posted practice problems for other topics. Class 7: Review Problems for Final Exam 8.5 Spring 7 This does not cover everything on the final. Look at the posted practice problems for other topics. To save time in class: set up, but do not carry

More information

Statistics for Economists. Lectures 3 & 4

Statistics for Economists. Lectures 3 & 4 Statistics for Economists Lectures 3 & 4 Asrat Temesgen Stockholm University 1 CHAPTER 2- Discrete Distributions 2.1. Random variables of the Discrete Type Definition 2.1.1: Given a random experiment with

More information

Lectures on Elementary Probability. William G. Faris

Lectures on Elementary Probability. William G. Faris Lectures on Elementary Probability William G. Faris February 22, 2002 2 Contents 1 Combinatorics 5 1.1 Factorials and binomial coefficients................. 5 1.2 Sampling with replacement.....................

More information

2 (Statistics) Random variables

2 (Statistics) Random variables 2 (Statistics) Random variables References: DeGroot and Schervish, chapters 3, 4 and 5; Stirzaker, chapters 4, 5 and 6 We will now study the main tools use for modeling experiments with unknown outcomes

More information

Statistics Examples. Cathal Ormond

Statistics Examples. Cathal Ormond Statistics Examples Cathal Ormond Contents Probability. Odds: Betting...................................... Combinatorics: kdm.................................. Hypergeometric: Card Games.............................4

More information

Lecture 2: Repetition of probability theory and statistics

Lecture 2: Repetition of probability theory and statistics Algorithms for Uncertainty Quantification SS8, IN2345 Tobias Neckel Scientific Computing in Computer Science TUM Lecture 2: Repetition of probability theory and statistics Concept of Building Block: Prerequisites:

More information

Math Bootcamp 2012 Miscellaneous

Math Bootcamp 2012 Miscellaneous Math Bootcamp 202 Miscellaneous Factorial, combination and permutation The factorial of a positive integer n denoted by n!, is the product of all positive integers less than or equal to n. Define 0! =.

More information

ECE 302 Division 2 Exam 2 Solutions, 11/4/2009.

ECE 302 Division 2 Exam 2 Solutions, 11/4/2009. NAME: ECE 32 Division 2 Exam 2 Solutions, /4/29. You will be required to show your student ID during the exam. This is a closed-book exam. A formula sheet is provided. No calculators are allowed. Total

More information

STAT Chapter 5 Continuous Distributions

STAT Chapter 5 Continuous Distributions STAT 270 - Chapter 5 Continuous Distributions June 27, 2012 Shirin Golchi () STAT270 June 27, 2012 1 / 59 Continuous rv s Definition: X is a continuous rv if it takes values in an interval, i.e., range

More information

Things to remember when learning probability distributions:

Things to remember when learning probability distributions: SPECIAL DISTRIBUTIONS Some distributions are special because they are useful They include: Poisson, exponential, Normal (Gaussian), Gamma, geometric, negative binomial, Binomial and hypergeometric distributions

More information

Notes for Math 324, Part 19

Notes for Math 324, Part 19 48 Notes for Math 324, Part 9 Chapter 9 Multivariate distributions, covariance Often, we need to consider several random variables at the same time. We have a sample space S and r.v. s X, Y,..., which

More information

P (x). all other X j =x j. If X is a continuous random vector (see p.172), then the marginal distributions of X i are: f(x)dx 1 dx n

P (x). all other X j =x j. If X is a continuous random vector (see p.172), then the marginal distributions of X i are: f(x)dx 1 dx n JOINT DENSITIES - RANDOM VECTORS - REVIEW Joint densities describe probability distributions of a random vector X: an n-dimensional vector of random variables, ie, X = (X 1,, X n ), where all X is are

More information

MAS113 Introduction to Probability and Statistics. Proofs of theorems

MAS113 Introduction to Probability and Statistics. Proofs of theorems MAS113 Introduction to Probability and Statistics Proofs of theorems Theorem 1 De Morgan s Laws) See MAS110 Theorem 2 M1 By definition, B and A \ B are disjoint, and their union is A So, because m is a

More information

(y 1, y 2 ) = 12 y3 1e y 1 y 2 /2, y 1 > 0, y 2 > 0 0, otherwise.

(y 1, y 2 ) = 12 y3 1e y 1 y 2 /2, y 1 > 0, y 2 > 0 0, otherwise. 54 We are given the marginal pdfs of Y and Y You should note that Y gamma(4, Y exponential( E(Y = 4, V (Y = 4, E(Y =, and V (Y = 4 (a With U = Y Y, we have E(U = E(Y Y = E(Y E(Y = 4 = (b Because Y and

More information

Algorithms for Uncertainty Quantification

Algorithms for Uncertainty Quantification Algorithms for Uncertainty Quantification Tobias Neckel, Ionuț-Gabriel Farcaș Lehrstuhl Informatik V Summer Semester 2017 Lecture 2: Repetition of probability theory and statistics Example: coin flip Example

More information

Continuous Random Variables

Continuous Random Variables Continuous Random Variables Recall: For discrete random variables, only a finite or countably infinite number of possible values with positive probability. Often, there is interest in random variables

More information

Random Variables. Cumulative Distribution Function (CDF) Amappingthattransformstheeventstotherealline.

Random Variables. Cumulative Distribution Function (CDF) Amappingthattransformstheeventstotherealline. Random Variables Amappingthattransformstheeventstotherealline. Example 1. Toss a fair coin. Define a random variable X where X is 1 if head appears and X is if tail appears. P (X =)=1/2 P (X =1)=1/2 Example

More information

Raquel Prado. Name: Department of Applied Mathematics and Statistics AMS-131. Spring 2010

Raquel Prado. Name: Department of Applied Mathematics and Statistics AMS-131. Spring 2010 Raquel Prado Name: Department of Applied Mathematics and Statistics AMS-131. Spring 2010 Final Exam (Type B) The midterm is closed-book, you are only allowed to use one page of notes and a calculator.

More information

Final Exam # 3. Sta 230: Probability. December 16, 2012

Final Exam # 3. Sta 230: Probability. December 16, 2012 Final Exam # 3 Sta 230: Probability December 16, 2012 This is a closed-book exam so do not refer to your notes, the text, or any other books (please put them on the floor). You may use the extra sheets

More information

STAT 512 sp 2018 Summary Sheet

STAT 512 sp 2018 Summary Sheet STAT 5 sp 08 Summary Sheet Karl B. Gregory Spring 08. Transformations of a random variable Let X be a rv with support X and let g be a function mapping X to Y with inverse mapping g (A = {x X : g(x A}

More information

Probability and Statistics Notes

Probability and Statistics Notes Probability and Statistics Notes Chapter Five Jesse Crawford Department of Mathematics Tarleton State University Spring 2011 (Tarleton State University) Chapter Five Notes Spring 2011 1 / 37 Outline 1

More information

ORF 245 Fundamentals of Statistics Chapter 4 Great Expectations

ORF 245 Fundamentals of Statistics Chapter 4 Great Expectations ORF 245 Fundamentals of Statistics Chapter 4 Great Expectations Robert Vanderbei Fall 2014 Slides last edited on October 20, 2014 http://www.princeton.edu/ rvdb Definition The expectation of a random variable

More information

UC Berkeley Department of Electrical Engineering and Computer Sciences. EECS 126: Probability and Random Processes

UC Berkeley Department of Electrical Engineering and Computer Sciences. EECS 126: Probability and Random Processes UC Berkeley Department of Electrical Engineering and Computer Sciences EECS 6: Probability and Random Processes Problem Set 3 Spring 9 Self-Graded Scores Due: February 8, 9 Submit your self-graded scores

More information

EE514A Information Theory I Fall 2013

EE514A Information Theory I Fall 2013 EE514A Information Theory I Fall 2013 K. Mohan, Prof. J. Bilmes University of Washington, Seattle Department of Electrical Engineering Fall Quarter, 2013 http://j.ee.washington.edu/~bilmes/classes/ee514a_fall_2013/

More information

1 Basic continuous random variable problems

1 Basic continuous random variable problems Name M362K Final Here are problems concerning material from Chapters 5 and 6. To review the other chapters, look over previous practice sheets for the two exams, previous quizzes, previous homeworks and

More information

Expectation, inequalities and laws of large numbers

Expectation, inequalities and laws of large numbers Chapter 3 Expectation, inequalities and laws of large numbers 3. Expectation and Variance Indicator random variable Let us suppose that the event A partitions the sample space S, i.e. A A S. The indicator

More information

Practice Midterm 2 Partial Solutions

Practice Midterm 2 Partial Solutions 8.440 Practice Midterm 2 Partial Solutions. (20 points) Let X and Y be independent Poisson random variables with parameter. Compute the following. (Give a correct formula involving sums does not need to

More information

Stochastic Models of Manufacturing Systems

Stochastic Models of Manufacturing Systems Stochastic Models of Manufacturing Systems Ivo Adan Organization 2/47 7 lectures (lecture of May 12 is canceled) Studyguide available (with notes, slides, assignments, references), see http://www.win.tue.nl/

More information

Class 8 Review Problems solutions, 18.05, Spring 2014

Class 8 Review Problems solutions, 18.05, Spring 2014 Class 8 Review Problems solutions, 8.5, Spring 4 Counting and Probability. (a) Create an arrangement in stages and count the number of possibilities at each stage: ( ) Stage : Choose three of the slots

More information

Lecture 2: Review of Probability

Lecture 2: Review of Probability Lecture 2: Review of Probability Zheng Tian Contents 1 Random Variables and Probability Distributions 2 1.1 Defining probabilities and random variables..................... 2 1.2 Probability distributions................................

More information

3. Review of Probability and Statistics

3. Review of Probability and Statistics 3. Review of Probability and Statistics ECE 830, Spring 2014 Probabilistic models will be used throughout the course to represent noise, errors, and uncertainty in signal processing problems. This lecture

More information

Brief Review of Probability

Brief Review of Probability Maura Department of Economics and Finance Università Tor Vergata Outline 1 Distribution Functions Quantiles and Modes of a Distribution 2 Example 3 Example 4 Distributions Outline Distribution Functions

More information

BASICS OF PROBABILITY

BASICS OF PROBABILITY October 10, 2018 BASICS OF PROBABILITY Randomness, sample space and probability Probability is concerned with random experiments. That is, an experiment, the outcome of which cannot be predicted with certainty,

More information

1.1 Review of Probability Theory

1.1 Review of Probability Theory 1.1 Review of Probability Theory Angela Peace Biomathemtics II MATH 5355 Spring 2017 Lecture notes follow: Allen, Linda JS. An introduction to stochastic processes with applications to biology. CRC Press,

More information

Statistics STAT:5100 (22S:193), Fall Sample Final Exam B

Statistics STAT:5100 (22S:193), Fall Sample Final Exam B Statistics STAT:5 (22S:93), Fall 25 Sample Final Exam B Please write your answers in the exam books provided.. Let X, Y, and Y 2 be independent random variables with X N(µ X, σ 2 X ) and Y i N(µ Y, σ 2

More information

Bivariate distributions

Bivariate distributions Bivariate distributions 3 th October 017 lecture based on Hogg Tanis Zimmerman: Probability and Statistical Inference (9th ed.) Bivariate Distributions of the Discrete Type The Correlation Coefficient

More information

Assignment 1 (Sol.) Introduction to Machine Learning Prof. B. Ravindran. 1. Which of the following tasks can be best solved using Clustering.

Assignment 1 (Sol.) Introduction to Machine Learning Prof. B. Ravindran. 1. Which of the following tasks can be best solved using Clustering. Assignment 1 (Sol.) Introduction to Machine Learning Prof. B. Ravindran 1. Which of the following tasks can be best solved using Clustering. (a) Predicting the amount of rainfall based on various cues

More information

1 Random Variable: Topics

1 Random Variable: Topics Note: Handouts DO NOT replace the book. In most cases, they only provide a guideline on topics and an intuitive feel. 1 Random Variable: Topics Chap 2, 2.1-2.4 and Chap 3, 3.1-3.3 What is a random variable?

More information

Joint Distribution of Two or More Random Variables

Joint Distribution of Two or More Random Variables Joint Distribution of Two or More Random Variables Sometimes more than one measurement in the form of random variable is taken on each member of the sample space. In cases like this there will be a few

More information

Chapter 5 continued. Chapter 5 sections

Chapter 5 continued. Chapter 5 sections Chapter 5 sections Discrete univariate distributions: 5.2 Bernoulli and Binomial distributions Just skim 5.3 Hypergeometric distributions 5.4 Poisson distributions Just skim 5.5 Negative Binomial distributions

More information

EE4601 Communication Systems

EE4601 Communication Systems EE4601 Communication Systems Week 2 Review of Probability, Important Distributions 0 c 2011, Georgia Institute of Technology (lect2 1) Conditional Probability Consider a sample space that consists of two

More information

UCSD ECE250 Handout #20 Prof. Young-Han Kim Monday, February 26, Solutions to Exercise Set #7

UCSD ECE250 Handout #20 Prof. Young-Han Kim Monday, February 26, Solutions to Exercise Set #7 UCSD ECE50 Handout #0 Prof. Young-Han Kim Monday, February 6, 07 Solutions to Exercise Set #7. Minimum waiting time. Let X,X,... be i.i.d. exponentially distributed random variables with parameter λ, i.e.,

More information

THE ROYAL STATISTICAL SOCIETY HIGHER CERTIFICATE EXAMINATION MODULE 2

THE ROYAL STATISTICAL SOCIETY HIGHER CERTIFICATE EXAMINATION MODULE 2 THE ROYAL STATISTICAL SOCIETY HIGHER CERTIFICATE EXAMINATION NEW MODULAR SCHEME introduced from the eaminations in 7 MODULE SPECIMEN PAPER B AND SOLUTIONS The time for the eamination is ½ hours The paper

More information

Lecture 5: Moment generating functions

Lecture 5: Moment generating functions Lecture 5: Moment generating functions Definition 2.3.6. The moment generating function (mgf) of a random variable X is { x e tx f M X (t) = E(e tx X (x) if X has a pmf ) = etx f X (x)dx if X has a pdf

More information

1 Exercises for lecture 1

1 Exercises for lecture 1 1 Exercises for lecture 1 Exercise 1 a) Show that if F is symmetric with respect to µ, and E( X )

More information

3. DISCRETE RANDOM VARIABLES

3. DISCRETE RANDOM VARIABLES IA Probability Lent Term 3 DISCRETE RANDOM VARIABLES 31 Introduction When an experiment is conducted there may be a number of quantities associated with the outcome ω Ω that may be of interest Suppose

More information

Chapter 2. Discrete Distributions

Chapter 2. Discrete Distributions Chapter. Discrete Distributions Objectives ˆ Basic Concepts & Epectations ˆ Binomial, Poisson, Geometric, Negative Binomial, and Hypergeometric Distributions ˆ Introduction to the Maimum Likelihood Estimation

More information

Probability reminders

Probability reminders CS246 Winter 204 Mining Massive Data Sets Probability reminders Sammy El Ghazzal selghazz@stanfordedu Disclaimer These notes may contain typos, mistakes or confusing points Please contact the author so

More information

Chapter 5 Joint Probability Distributions

Chapter 5 Joint Probability Distributions Applied Statistics and Probability for Engineers Sixth Edition Douglas C. Montgomery George C. Runger Chapter 5 Joint Probability Distributions 5 Joint Probability Distributions CHAPTER OUTLINE 5-1 Two

More information

MA 519 Probability: Review

MA 519 Probability: Review MA 519 : Review Yingwei Wang Department of Mathematics, Purdue University, West Lafayette, IN, USA Contents 1 How to compute the expectation? 1.1 Tail........................................... 1. Index..........................................

More information

Week 9 The Central Limit Theorem and Estimation Concepts

Week 9 The Central Limit Theorem and Estimation Concepts Week 9 and Estimation Concepts Week 9 and Estimation Concepts Week 9 Objectives 1 The Law of Large Numbers and the concept of consistency of averages are introduced. The condition of existence of the population

More information

TEST CODE: MMA (Objective type) 2015 SYLLABUS

TEST CODE: MMA (Objective type) 2015 SYLLABUS TEST CODE: MMA (Objective type) 2015 SYLLABUS Analytical Reasoning Algebra Arithmetic, geometric and harmonic progression. Continued fractions. Elementary combinatorics: Permutations and combinations,

More information