Order Statistics; Conditional Expectation

Department of Mathematics
Ma 3/103 KC Border
Introduction to Probability and Statistics, Winter 2017

Lecture 14: Order Statistics; Conditional Expectation

Relevant textbook passages: Pitman [5]: Section 4.6; Larsen–Marx [2].

14.1 Order statistics

Given a random vector (X_1, ..., X_n) on the probability space (S, E, P), for each s ∈ S, sort the components into a vector (X_(1)(s), ..., X_(n)(s)) satisfying

$$ X_{(1)}(s) \le X_{(2)}(s) \le \cdots \le X_{(n)}(s). $$

The vector (X_(1), ..., X_(n)) is called the vector of order statistics of (X_1, ..., X_n). Equivalently,

$$ X_{(k)} = \min\bigl\{ \max\{X_j : j \in J\} : J \subset \{1, \dots, n\} \ \&\ |J| = k \bigr\}. $$

Order statistics play an important role in the study of auctions, among other things.

14.2 Marginal Distribution of Order Statistics

From here on, we shall assume that the original random variables X_1, ..., X_n are independent and identically distributed. Let X_1, ..., X_n be independent and identically distributed random variables with common cumulative distribution function F, and let (X_(1), ..., X_(n)) be the vector of order statistics of X_1, ..., X_n. By breaking the event (X_(k) ≤ x) into simple disjoint subevents, we get

$$ \bigl(X_{(k)} \le x\bigr) = \bigl(X_{(n)} \le x\bigr) \cup \bigl(X_{(n)} > x,\ X_{(n-1)} \le x\bigr) \cup \cdots \cup \bigl(X_{(n)} > x, \dots, X_{(k+1)} > x,\ X_{(k)} \le x\bigr). $$

Each of these subevents is disjoint from the ones above it, and each has a binomial probability: the event (X_(n) > x, ..., X_(j+1) > x, X_(j) ≤ x) occurs when n − j of the random variables are > x and j are ≤ x, so

$$ P\bigl(X_{(n)} > x, \dots, X_{(j+1)} > x,\ X_{(j)} \le x\bigr) = \binom{n}{j} F(x)^j \bigl(1 - F(x)\bigr)^{n-j}. $$

Thus: the cdf of the k-th order statistic from a sample of n is

$$ F_{(k,n)}(x) = P\bigl(X_{(k)} \le x\bigr) = \sum_{j=k}^{n} \binom{n}{j} F(x)^j \bigl(1 - F(x)\bigr)^{n-j}. \tag{1} $$

14.3 Marginal Density of Order Statistics

[Pitman [5]: p. 326]

If F has a density f = F′, then we can calculate the marginal density of X_(k) by differentiating the cdf F_(k,n):

$$
\begin{aligned}
\frac{d}{dx} F_{(k,n)}(x)
&= \frac{d}{dx} \sum_{j=k}^{n} \binom{n}{j} F(x)^j \bigl(1-F(x)\bigr)^{n-j} \\
&= \sum_{j=k}^{n} \binom{n}{j} \Bigl( j\,F(x)^{j-1} \bigl(1-F(x)\bigr)^{n-j} f(x) - (n-j)\,F(x)^j \bigl(1-F(x)\bigr)^{n-j-1} f(x) \Bigr) \\
&= \sum_{j=k}^{n} \frac{n!}{(j-1)!\,(n-j)!}\, F(x)^{j-1} \bigl(1-F(x)\bigr)^{n-j} f(x)
 - \sum_{j=k}^{n-1} \frac{n!}{j!\,(n-j-1)!}\, F(x)^{j} \bigl(1-F(x)\bigr)^{n-j-1} f(x) \\
&= \frac{n!}{(k-1)!\,(n-k)!}\, F(x)^{k-1} \bigl(1-F(x)\bigr)^{n-k} f(x) \\
&\qquad + \sum_{j=k+1}^{n} \frac{n!}{(j-1)!\,(n-j)!}\, F(x)^{j-1} \bigl(1-F(x)\bigr)^{n-j} f(x)
 - \sum_{j=k}^{n-1} \frac{n!}{j!\,(n-j-1)!}\, F(x)^{j} \bigl(1-F(x)\bigr)^{n-j-1} f(x).
\end{aligned}
$$

The last two terms above cancel, since using the change of variables i = j − 1,

$$ \sum_{j=k+1}^{n} \frac{n!}{(j-1)!\,(n-j)!}\, F(x)^{j-1} \bigl(1-F(x)\bigr)^{n-j} = \sum_{i=k}^{n-1} \frac{n!}{i!\,(n-i-1)!}\, F(x)^{i} \bigl(1-F(x)\bigr)^{n-i-1}. $$

So the density of the k-th order statistic from a sample of size n is:

$$ f_{(k,n)}(x) = \frac{n!}{(k-1)!\,(n-k)!}\, F(x)^{k-1} \bigl(1-F(x)\bigr)^{n-k} f(x) = n \binom{n-1}{k-1} F(x)^{k-1} \bigl(1-F(x)\bigr)^{n-k} f(x). \tag{2} $$

Note: I could have written $k\binom{n}{k}$ instead of $n\binom{n-1}{k-1}$, but then the binomial coefficient would have seemed more mysterious.

14.4 Some special order statistics

The 1st order statistic X_(1) from a sample of size n is just the minimum X_(1) = min{X_1, ..., X_n}. The event that min{X_1, ..., X_n} ≤ x is just the event that at least one X_i ≤ x. The complement of this event is that all X_i > x, which has probability (1 − F(x))^n, so the cdf is

$$ F_{(1,n)}(x) = 1 - \bigl(1 - F(x)\bigr)^{n} $$

and its density is

$$ f_{(1,n)}(x) = n \bigl(1 - F(x)\bigr)^{n-1} f(x). $$

The n-th order statistic X_(n) from a sample of size n is just the maximum X_(n) = max{X_1, ..., X_n}. The event that max{X_1, ..., X_n} ≤ x is just the event that all X_i ≤ x, so its cdf is

$$ F_{(n,n)}(x) = F(x)^{n} $$

and its density is

$$ f_{(n,n)}(x) = n F(x)^{n-1} f(x). $$

The (n−1)-st order statistic is the second-highest value. Its cdf is

$$ F_{(n-1,n)}(x) = n \bigl(1 - F(x)\bigr) F(x)^{n-1} + F(x)^{n} = n F(x)^{n-1} - (n-1) F(x)^{n}, $$

and its density is

$$ f_{(n-1,n)}(x) = n(n-1) \bigl(1 - F(x)\bigr) F(x)^{n-2} f(x). $$

In the study of auctions, the second-highest bid is of special interest. Indeed, a second-price auction awards the item to the highest bidder, but the price is the second-highest bid. A standard auction provides incentives for bidders to bid less than their values, so the distribution of bids and the distribution of values is not the same. Nevertheless it can be shown (see my online notes) that the expected revenue to a seller in an auction with n bidders with independent and identically distributed values is just the expectation of the second-highest order statistic for the distribution of values.
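These formulas for the extreme order statistics are easy to check empirically. The following sketch (my own illustration, not from the text) simulates the minimum, maximum, and second-highest of n = 5 i.i.d. Uniform[0,1] draws (so F(x) = x) and compares the empirical cdfs at a point with the formulas above.

```python
import random

random.seed(0)
n, trials, x = 5, 200_000, 0.7

# Tally the events {X_(1) <= x}, {X_(n-1) <= x}, {X_(n) <= x}.
count_min = count_2nd = count_max = 0
for _ in range(trials):
    draws = sorted(random.random() for _ in range(n))
    count_min += draws[0] <= x       # minimum, X_(1)
    count_2nd += draws[-2] <= x      # second-highest, X_(n-1)
    count_max += draws[-1] <= x      # maximum, X_(n)

F = x  # Uniform[0,1] cdf at x
print(count_min / trials, 1 - (1 - F) ** n)                      # F_(1,n)(x)
print(count_2nd / trials, n * F ** (n - 1) - (n - 1) * F ** n)   # F_(n-1,n)(x)
print(count_max / trials, F ** n)                                # F_(n,n)(x)
```

With 200,000 trials the empirical frequencies agree with the closed forms to about three decimal places.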

14.5 Uniform order statistics and the Beta function

[Pitman [5]]

For a Uniform[0,1] distribution, F(t) = t and f(t) = 1 on [0, 1]. In this case equation (2) tells us: the density f_(k,n) of the k-th order statistic for n independent Uniform[0,1] random variables is

$$ f_{(k,n)}(t) = n \binom{n-1}{k-1} t^{k-1} (1-t)^{n-k}. $$

Since f_(k,n) is a density,

$$ \int_0^1 n \binom{n-1}{k-1} t^{k-1} (1-t)^{n-k}\,dt = 1, $$

or

$$ \int_0^1 t^{k-1} (1-t)^{n-k}\,dt = \frac{(k-1)!\,(n-k)!}{n!}. \tag{3} $$

Now change variables by setting r = k and s = n − r + 1, so s − 1 = n − r and n = r + s − 1. Then rewrite (3) as

$$ \int_0^1 t^{r-1} (1-t)^{s-1}\,dt = \frac{(r-1)!\,(s-1)!}{(r+s-1)!} = \frac{\Gamma(r)\,\Gamma(s)}{\Gamma(r+s)}. $$

Recall that the Gamma function is a continuous version of the factorial, and has the property that Γ(s+1) = sΓ(s) for every s > 0, and Γ(m) = (m−1)! for every natural number m. This fact suggests (to at least some people) the following definition.

14.5.1 Definition. The Beta function is defined for r, s > 0 (not necessarily integers) by

$$ B(r, s) = \int_0^1 t^{r-1} (1-t)^{s-1}\,dt = \frac{\Gamma(r)\,\Gamma(s)}{\Gamma(r+s)}. $$

So for integers r and s, we see that

$$ B(r+1, s+1) = \frac{\Gamma(r+1)\,\Gamma(s+1)}{\Gamma(r+s+2)} = \frac{r!\,s!}{(r+s+1)!} = \frac{1}{(r+s+1)\binom{r+s}{r}}. $$

14.5.2 Definition. The beta(r, s) distribution has the density

$$ f(x) = \frac{1}{B(r,s)}\, x^{r-1} (1-x)^{s-1} $$

on the interval [0, 1] and zero elsewhere.
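The Beta–Gamma identity is easy to spot-check numerically. The sketch below (mine, not the text's) compares a midpoint Riemann sum for the integral defining B(r, s) with the Gamma-function expression, using math.gamma from the standard library.

```python
import math

def beta_integral(r, s, steps=100_000):
    """Midpoint Riemann sum for B(r,s) = integral of t^(r-1) (1-t)^(s-1) on [0,1]."""
    h = 1.0 / steps
    return h * sum(((i + 0.5) * h) ** (r - 1) * (1 - (i + 0.5) * h) ** (s - 1)
                   for i in range(steps))

def beta_gamma(r, s):
    """B(r,s) via the Gamma function identity."""
    return math.gamma(r) * math.gamma(s) / math.gamma(r + s)

# Works for non-integer arguments too, as Definition 14.5.1 allows.
for r, s in [(2, 3), (2.5, 1.5), (4, 4)]:
    print(r, s, beta_integral(r, s), beta_gamma(r, s))
```

For instance B(2, 3) = 1!·2!/4! = 1/12, and both computations return 0.08333… to high accuracy.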

Note that for integer r and s, the density of the beta(r+1, s+1) distribution is

$$ f(x) = (r+s+1) \binom{r+s}{r} x^{r} (1-x)^{s}, $$

which is (r+s+1) times the Binomial probability of r successes and s failures in r+s trials, where the probability of success is x.

The mean of a beta(r, s) distribution is r/(r+s).

Proof:

$$
\int_0^1 x f(x)\,dx
= \frac{1}{B(r,s)} \int_0^1 x^{r} (1-x)^{s-1}\,dx
= \frac{B(r+1, s)}{B(r,s)}
= \frac{\Gamma(r+1)\,\Gamma(s)}{\Gamma(r+1+s)} \cdot \frac{\Gamma(r+s)}{\Gamma(r)\,\Gamma(s)}
= \frac{r\,\Gamma(r)\,\Gamma(r+s)}{(r+s)\,\Gamma(r+s)\,\Gamma(r)}
= \frac{r}{r+s}.
$$

Thus for a Uniform[0,1] distribution, the (k, n) order statistic has a beta(k, n−k+1) distribution and so has mean

$$ \frac{k}{n+1}. $$

[Application to breaking a unit length bar into n+1 pieces by choosing n breaking points: the expectation of the k-th breaking point is at k/(n+1), so each piece has expected length 1/(n+1).]

14.6 The war of attrition

In the 1970s, the ethologist John Maynard Smith [3] began to apply game theory to problems of animal behavior. One application was to the settlement of intraspecies conflict. In some species (e.g., peafowl), conflicts are not settled by violent means, but by means of displays. The rivals will fan their tails, and eventually one will depart, leaving to the other whatever was the source of the conflict. Maynard Smith modeled this as a war of attrition.

In the war of attrition game there are two rival contestants i = 1, 2 for a prize of value v. Each chooses a length of time t_i at random according to a common probability distribution with cumulative distribution function F. Waiting is costly, and the cost of waiting a length of time t is ct. The rivals continue their displays until the lesser time elapses, and that animal leaves. The distribution is a symmetric equilibrium distribution if it has the following properties:

(i)

Each rival, knowing that the opponent has drawn a time t_i from the distribution specified by F, is also willing to choose a time specified by F.

(ii) When the time t_i has elapsed, and contestant i's opponent has not left, then i does not have an incentive to stay longer, and so will leave.

Suppose contestant 2 chooses a waiting time s at random according to an exponential distribution with parameter λ. Now consider contestant 1's decision. Suppose contestant 1 chooses to wait a length of time t. If s < t, which happens with probability 1 − e^{−λt}, he wins the prize and receives v, but he also incurs a waiting cost cs. If s > t, which happens with probability e^{−λt}, then his cost is ct and he does not get the prize. The expected total payoff φ(t) to contestant 1 is then

$$ \varphi(t) = v\bigl(1 - e^{-\lambda t}\bigr) - \int_0^t cs\,\lambda e^{-\lambda s}\,ds - ct\,e^{-\lambda t}. $$

Rather than integrate this to find its value, let's see how it depends on t by computing its derivative. Recalling the Fundamental Theorem of Calculus, we see that

$$ \varphi'(t) = v\lambda e^{-\lambda t} - \bigl( ct\lambda e^{-\lambda t} + c e^{-\lambda t} - c\lambda t e^{-\lambda t} \bigr) = (v\lambda - c)\,e^{-\lambda t}. $$

Now if λ is chosen so that the expected waiting cost is equal to the value of the prize,

$$ \lambda = \frac{c}{v}, \quad \text{so that } v\lambda - c = 0, $$

then φ′(t) = 0 for all t. That is, contestant 1 receives the same expected payoff regardless of when he leaves. As a result he is perfectly content to choose his waiting time at random according to the same exponential distribution. Thus an exponentially distributed waiting time with parameter λ = c/v satisfies property (i) of a symmetric equilibrium distribution. (It is easy to see that φ(0) = 0, so the expected payoff is zero. This makes sense, as the expected payoff is the same for both players, and they both can't win.)

To verify property (ii) of a symmetric equilibrium distribution, suppose some length of time passes and neither contestant has dropped out. Since the exponential distribution is memoryless, each contestant can redo the calculation above, and conclude there is no advantage to choosing a different time to leave.
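The indifference result can also be checked by simulation. The sketch below (my illustration; the values of v and c are arbitrary) estimates contestant 1's expected payoff for several waiting times t against an Exponential(c/v) opponent; by the calculation above, every estimate should hover near zero.

```python
import random

random.seed(1)
v, c = 10.0, 2.0        # prize value and waiting cost rate (arbitrary)
lam = c / v             # equilibrium rate lambda = c/v
trials = 200_000

def expected_payoff(t):
    """Monte Carlo estimate of contestant 1's payoff from waiting t
    against an opponent whose time s is Exponential(lam)."""
    total = 0.0
    for _ in range(trials):
        s = random.expovariate(lam)
        # Win v but pay c*s if the opponent leaves first; else pay c*t.
        total += (v - c * s) if s < t else (-c * t)
    return total / trials

results = {t: expected_payoff(t) for t in (0.5, 2.0, 8.0)}
print(results)
```

Each estimate is within Monte Carlo noise of zero, so contestant 1 is indeed indifferent over waiting times.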
Thus contestant i is happy to leave at time t_i. If both contestants choose t_i according to this λ, then we have an equilibrium.

The length of the contest is min{T_1, T_2}. Now min{T_1, T_2} > t if and only if T_1 > t and T_2 > t. Thus

$$ P\bigl(\min\{T_1, T_2\} > t\bigr) = P(T_1 > t\ \&\ T_2 > t) = P(T_1 > t)\,P(T_2 > t) = e^{-\lambda t} e^{-\lambda t} = e^{-2\lambda t}, $$

which is an exponential survival function with parameter 2λ. Thus the expected length of the contest is 1/(2λ). So the winner and loser both expect to wait 1/(2λ), and the expected total cost incurred is equal to v, the value of the prize.

Note that this model implies that the observed length of contest durations should be exponentially distributed, provided c and v are the same in each contest. The noted game theorist Robert Rosenthal once told me that in fact the duration of display contests among dung flies is exponentially distributed, but he didn't mention any citations. A little digging found a paper by Parker and Thompson [4] where they find qualified support for this distribution, but argue that an asymmetric model would fit better.

14.7 The Winner's Curse

The Winner's Curse is a phenomenon that was first observed in oil leases. The Federal government claims all land on the continental shelf up to 200 miles offshore. It would auction off the

right to drill for oil on tracts, and oil companies found that despite their best efforts to employ scientific methods to estimate the value of the lease, they systematically lost money on their leases.

The explanation is straightforward. The value of a lease is an unknown number V, and each company i gets an estimate X_i of the value of V. Let's suppose that each estimate is an independent draw from the same distribution F. If a company bids based on the distribution F, it will be subject to the winner's curse. Namely, the winner will be the company with the largest X_i. But the distribution of the largest value of X_i is the n-th order statistic, the maximum, which has cdf F^n, not F. In order to avoid the winner's curse you have to take into account the fact that you win only when your X_i = X_(n).

14.8 The σ-algebra of events generated by random variables

Recall that the Borel sets of the real numbers are the members of the smallest σ-algebra that contains all the intervals. Given a set X_1, ..., X_n of random variables defined on the probability space (S, E, P) and intervals I_1, ..., I_n,

$$ (X_1 \in I_1,\ X_2 \in I_2,\ \dots,\ X_n \in I_n) $$

is an event. The smallest σ-algebra that contains all these events, as the intervals I_j range over all intervals, is called the σ-algebra of events generated by X_1, ..., X_n, and is denoted σ(X_1, ..., X_n). A function g: S → R is σ(X_1, ..., X_n)-measurable if for every interval I, the set g^{−1}(I) belongs to σ(X_1, ..., X_n).

The following theorem is beyond the scope of this course, but may be found, for instance, in Aliprantis–Border [1] or M. M. Rao [6, Theorem 1.2.3, p. 4].

14.8.1 Theorem. Let X = (X_1, ..., X_n) be a random vector on (S, E, P) and let g: S → R. Then the function g is σ(X_1, ..., X_n)-measurable if and only if there exists a Borel function h: R^n → R such that g = h ∘ X.

This means there is a one-to-one correspondence between X-measurable functions and functions that depend only on X.
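Returning to the winner's curse: a small simulation (mine, with made-up numbers) makes the overshoot vivid. Each of n firms gets an unbiased estimate of the true value V, yet the winning estimate, being the maximum X_(n), is systematically too high.

```python
import random

random.seed(2)
n, trials = 10, 100_000
V = 100.0                    # true (unknown) lease value, hypothetical

single = winning = 0.0
for _ in range(trials):
    # Each firm's estimate is unbiased: V plus Normal(0, 10) noise.
    estimates = [V + random.gauss(0, 10) for _ in range(n)]
    single += estimates[0]        # any one firm's estimate
    winning += max(estimates)     # the winner's estimate, X_(n)

print(single / trials)    # close to V = 100
print(winning / trials)   # well above V
```

A single estimate averages about 100, while the winner's estimate averages roughly 115: the winner is the firm whose error happened to be largest.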
As a corollary we have:

14.8.2 Corollary. The set of X-measurable functions is a vector subspace of the space of random variables.

14.9 Conditioning on the value of a Random Variable: The discrete case

[Pitman [5]: Section 6.1]

Let X and Y be discrete random variables with joint pmf p(x, y), and let p_X and p_Y be the respective marginals. Then (Y = y) and (X = x) are events, so the conditional probability of (Y = y) given (X = x) is

$$ P(Y = y \mid X = x) = \frac{P(Y = y,\ X = x)}{P(X = x)} = \frac{p(x,y)}{\sum_{y'} p(x, y')} = \frac{p(x,y)}{p_X(x)}. \tag{4} $$

This is a function of x, known as the conditional pmf of Y given X = x, and it defines the conditional distribution of Y given X = x.

14.9.1 Example (Pitman [5, Exercise 6.1.5, p. 399]). Let X and Y be independent Poisson random variables with parameters μ and λ. What is the distribution of X given X + Y = n?
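Before working this out analytically, we can tally the answer by simulation. The sketch below (my own illustration, with arbitrary μ, λ, and n) generates Poisson pairs, keeps those with X + Y = n, and compares the empirical conditional pmf with a Binomial(n, μ/(μ+λ)) pmf, which the calculation that follows confirms is the exact answer.

```python
import math, random

random.seed(3)
mu, lam, n = 2.0, 3.0, 6      # hypothetical parameter values

def poisson(rate):
    """Knuth's multiplication method for a Poisson(rate) variate
    (fine for small rates)."""
    L, k, p = math.exp(-rate), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

counts, hits = [0] * (n + 1), 0
for _ in range(300_000):
    x, y = poisson(mu), poisson(lam)
    if x + y == n:            # condition on the event (X + Y = n)
        hits += 1
        counts[x] += 1

p = mu / (mu + lam)
for k in range(n + 1):
    binom = math.comb(n, k) * p**k * (1 - p) ** (n - k)
    print(k, counts[k] / hits, binom)
```

The empirical frequencies track the Binomial(6, 0.4) pmf to a couple of decimal places.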

You may or may not recall what the distribution of X + Y is, so let's just roll out the old convolution formula (recalling that X and Y are always nonnegative):

$$
\begin{aligned}
P(X + Y = n) &= \sum_{k=0}^{n} p(k, n-k) && \text{convolution \& nonnegativity} \\
&= \sum_{k=0}^{n} e^{-\mu} \frac{\mu^k}{k!}\; e^{-\lambda} \frac{\lambda^{n-k}}{(n-k)!} && \text{independence} \\
&= \frac{e^{-(\mu+\lambda)}}{n!} \sum_{k=0}^{n} \binom{n}{k} \mu^k \lambda^{n-k} && \text{arithmetic} \\
&= \frac{e^{-(\mu+\lambda)}\,(\mu+\lambda)^n}{n!}, && \text{Binomial Theorem}
\end{aligned}
$$

which is a Poisson(μ + λ) distribution. So

$$
\begin{aligned}
P(X = k \mid X + Y = n)
&= \frac{P(X = k,\ X + Y = n)}{P(X + Y = n)} && \text{defn.} \\
&= \frac{P(X = k,\ Y = n-k)}{P(X + Y = n)} && \text{arithmetic} \\
&= \frac{P(X = k)\,P(Y = n-k)}{P(X + Y = n)} && \text{independence} \\
&= \frac{\bigl(e^{-\mu}\mu^k/k!\bigr)\bigl(e^{-\lambda}\lambda^{n-k}/(n-k)!\bigr)}{e^{-(\mu+\lambda)}(\mu+\lambda)^n/n!} && \text{Poisson} \\
&= \binom{n}{k} \frac{\mu^k \lambda^{n-k}}{(\mu+\lambda)^n}
= \binom{n}{k} \Bigl(\frac{\mu}{\mu+\lambda}\Bigr)^{k} \Bigl(\frac{\lambda}{\mu+\lambda}\Bigr)^{n-k}.
\end{aligned}
$$

This is just the probability that a Binomial(n, p) random variable is equal to k, when p = μ/(μ+λ).

14.9.2 Application to the Poisson arrival process

In the Poisson process we discussed last time, N_t is the number of arrivals in the interval [0, t], and it has a Poisson(λt) distribution, where λ is the arrival rate (the rate in the Exponential(λ) waiting time distribution). Now let 0 < s < t. What can we say about P(N_s = k | N_t = n)? Well, N_s is a Poisson(λs) random variable and it is independent of N_t − N_s, which is distributed according to the Poisson(λ(t−s)) pmf. Moreover N_t = N_s + (N_t − N_s), so according to Example 14.9.1 that we just worked out, P(N_s = k | N_t = n) is the Binomial

probability

$$
P(N_s = k \mid N_t = n)
= \binom{n}{k} \Bigl(\frac{\lambda s}{\lambda s + \lambda(t-s)}\Bigr)^{k} \Bigl(\frac{\lambda(t-s)}{\lambda s + \lambda(t-s)}\Bigr)^{n-k}
= \binom{n}{k} \Bigl(\frac{s}{t}\Bigr)^{k} \Bigl(1 - \frac{s}{t}\Bigr)^{n-k}.
$$

When you think about it, one interpretation of the Poisson process is that arrivals are uniformly scattered in an interval, so the probability of hitting [0, s] given that [0, t] has been hit is just s/t. (Remember s < t.) So the probability of getting k hits on [0, s] given n hits on [0, t] is given by the Binomial with probability of success s/t.

14.10 Conditional Expectation

In one way, conditional expectation is quite simple. It is the expectation of a random variable with respect to a conditional distribution. For instance,

$$ E(Y \mid X = x) = \sum_y y\, P(Y = y \mid X = x) = \sum_y y\, \frac{p(x,y)}{p(x)}. $$

Note that we have used the common convention not to subscript the probability mass functions, but to use the names of the arguments, x or y or (x, y), to indicate whether we are talking about the marginal distribution of X or Y, or the joint distribution of the vector (X, Y).

14.10.1 Example. Continuing with the previous example, Example 14.9.1: Let X and Y be independent Poisson random variables with parameters μ and λ. Then we saw that the distribution of X given X + Y = n was a Binomial(n, μ/(μ+λ)) distribution:

$$ P(X = k \mid X + Y = n) = \binom{n}{k} \Bigl(\frac{\mu}{\mu+\lambda}\Bigr)^{k} \Bigl(\frac{\lambda}{\mu+\lambda}\Bigr)^{n-k}. $$

Now a Binomial(n, μ/(μ+λ)) random variable has expectation nμ/(μ+λ), so

$$ E(X \mid X + Y = n) = \frac{n\mu}{\mu+\lambda}. $$

Similarly

$$ E(Y \mid X + Y = n) = \frac{n\lambda}{\mu+\lambda}, $$

which implies the comforting conclusion that

$$ E(X \mid X + Y = n) + E(Y \mid X + Y = n) = n. $$

14.11 Conditional Expectation, Part 2

So far we have defined E(Y | X = x) for discrete random variables. This quantity depends on x, so we can write it as a function of x. Let's use the name υ for this function, because y is the Latin equivalent of the Greek υ (upsilon). [Pitman [5]: p. 402]

$$ \upsilon(x) = E(Y \mid X = x). $$

The random variable υ(X) is known as the conditional expectation of Y given X, which is written

$$ E(Y \mid X) = \upsilon(X). $$

The thing to see is that this is a random variable, since it is a function of the random variable X. By Theorem 14.8.1, E(Y | X) is σ(X)-measurable. Another way to say this is that E(Y | X) is a random variable that equals E(Y | X = x) when X = x. In terms of the probability space (S, E, P) on which X is defined, we have

$$ X(s) = x \implies E(Y \mid X)(s) = \upsilon\bigl(X(s)\bigr) = E(Y \mid X = x). $$

14.11.1 Example. Let S be the six-point sample space consisting of (x, y) pairs

$$ S = \{(0,0),\ (0,1),\ (0,2),\ (1,0),\ (1,1),\ (1,2)\}, $$

with probability measure given in a diagram where the numbers in each box represent the probability of the sample point. [Diagram of the six box probabilities omitted.] Let X and Y be random variables on S defined by X(x, y) = x and Y(x, y) = y. Then, reading off the diagram,

$$ p_X(0) = p_X(1) = \tfrac{1}{2}, \qquad \text{so } E X = 0 \cdot p_X(0) + 1 \cdot p_X(1) = \tfrac{1}{2}, $$

and

$$ p_Y(0) = \tfrac{1}{2}, \quad p_Y(1) = \tfrac{1}{3}, \quad p_Y(2) = \tfrac{1}{6}, $$

so

$$ E Y = 2\,p_Y(2) + 1\,p_Y(1) + 0\,p_Y(0) = \tfrac{1}{3} + \tfrac{1}{3} = \tfrac{2}{3}. $$

So

$$
\begin{aligned}
E(Y \mid X = 0)
&= 2\,P(Y = 2 \mid X = 0) + 1\cdot P(Y = 1 \mid X = 0) + 0\cdot P(Y = 0 \mid X = 0) \\
&= \frac{2\,P(Y = 2,\ X = 0) + P(Y = 1,\ X = 0)}{P(X = 0)} = \frac{11}{12}.
\end{aligned}
$$

Similarly E(Y | X = 1) = 5/12. Thus the random variable E(Y | X) is defined on S by

$$ E(Y \mid X)(x, y) = \begin{cases} \tfrac{11}{12} & x = 0, \\[2pt] \tfrac{5}{12} & x = 1, \end{cases} $$

which can be represented in the diagram, where now the numbers in each box represent the value of the random variable E(Y | X).

The expectation of the random variable E(Y | X) is

$$ E\bigl(E(Y \mid X)\bigr) = \tfrac{1}{2}\cdot\tfrac{11}{12} + \tfrac{1}{2}\cdot\tfrac{5}{12} = \tfrac{2}{3} = E Y. $$

14.12 Conditional Expectation is a Positive Linear Operator Too

Ordinary expectation is a positive linear operator that assigns each random variable a real number. Conditional expectation assigns random variables another random variable, but it is also linear and positive: [Pitman [5]: p. 402]

$$ E(aY + bZ \mid X) = a\,E(Y \mid X) + b\,E(Z \mid X), \qquad Y \ge 0 \implies E(Y \mid X) \ge 0. $$
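The bookkeeping in the example above is easy to mechanize. The sketch below uses one joint pmf consistent with the marginals stated in the example (the example's own diagram did not survive transcription, so treat these cell probabilities as an illustration); it computes E(Y | X) as a function of x, and checks that its expectation equals E Y, using exact rational arithmetic.

```python
from fractions import Fraction as F

# A joint pmf p(x, y) on {0,1} x {0,1,2} consistent with the marginals
# p_X(0) = p_X(1) = 1/2 and p_Y = (1/2, 1/3, 1/6); illustrative values.
p = {(0, 0): F(1, 6),  (0, 1): F(5, 24), (0, 2): F(1, 8),
     (1, 0): F(1, 3),  (1, 1): F(1, 8),  (1, 2): F(1, 24)}

pX = {x: sum(q for (a, _), q in p.items() if a == x) for x in (0, 1)}
EY = sum(y * q for (_, y), q in p.items())

# E(Y | X = x) = sum_y y p(x, y) / p_X(x)
condEY = {x: sum(y * p[(x, y)] for y in (0, 1, 2)) / pX[x] for x in (0, 1)}

# E(E(Y|X)) should recover E Y exactly.
EEYX = sum(condEY[x] * pX[x] for x in (0, 1))
print(condEY, EY, EEYX)
```

With these cell probabilities the conditional expectations come out to 11/12 and 5/12, and E(E(Y|X)) = 2/3 = E Y exactly.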

14.13 Iterated Conditional Expectation

Since E(Y | X) is a random variable, we can take its expectation. [Pitman [5]: p. 403]

$$ E\bigl(E(Y \mid X)\bigr) = E Y. $$

More remarkable is the following generalization.

14.13.1 Theorem. For a (Borel) function φ,

$$ E\bigl(\varphi(X)\,E(Y \mid X)\bigr) = E\bigl(\varphi(X)\,Y\bigr). $$

Proof: Let υ(x) = E(Y | X = x). Then

$$ \upsilon(x) = \frac{\sum_y y\,p(x,y)}{p(x)}, $$

and so

$$
\begin{aligned}
E\bigl(\varphi(X)\,E(Y \mid X)\bigr)
&= E\bigl(\varphi(X)\,\upsilon(X)\bigr)
= \sum_x \varphi(x)\,\upsilon(x)\,p(x) \\
&= \sum_x \varphi(x) \Bigl( \frac{\sum_y y\,p(x,y)}{p(x)} \Bigr) p(x)
= \sum_x \varphi(x) \sum_y y\,p(x,y) \\
&= \sum_{(x,y)} \varphi(x)\,y\,p(x,y)
= E\bigl(\varphi(X)\,Y\bigr).
\end{aligned}
$$

In light of Theorem 14.8.1 we have the following corollary.

14.13.2 Corollary. If Z is σ(X)-measurable, that is, if we can write Z = φ(X) for some (Borel) function φ, then

$$ E\bigl(Z\,E(Y \mid X)\bigr) = E(ZY). $$

14.14 Conditional Expectation is an Orthogonal Projection

Recall Corollary 14.8.2, which states that the space of σ(X)-measurable random variables is a vector subspace. If we further restrict attention to L², the space of square-integrable random variables, we have the inner product defined by (X, Y) = E(XY). Recall from your linear algebra class that in an inner product space, the orthogonal projection of a vector y on a subspace M is the unique vector y_M such that y_M ∈ M and (y − y_M) ⊥ M. It turns out that conditional expectation with respect to X is the orthogonal projection onto the subspace of σ(X)-measurable random variables.
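The identity E(φ(X) E(Y|X)) = E(φ(X) Y) can be verified by brute force on a small discrete joint distribution. The sketch below (hypothetical randomly generated cell probabilities, my illustration) does exactly the finite sums from the proof, in exact rational arithmetic, for an arbitrary φ.

```python
from fractions import Fraction as F
import random

random.seed(4)

# A hypothetical joint pmf on a 3 x 3 grid with random rational weights.
pts = [(x, y) for x in (0, 1, 2) for y in (0, 1, 2)]
w = [F(random.randint(1, 9)) for _ in pts]
p = {pt: wi / sum(w) for pt, wi in zip(pts, w)}

def E(f):
    """Expectation of f(x, y) under the joint pmf p."""
    return sum(f(x, y) * q for (x, y), q in p.items())

pX = {x: sum(q for (a, _), q in p.items() if a == x) for x in (0, 1, 2)}
condEY = {x: sum(y * p[(x, y)] for y in (0, 1, 2)) / pX[x] for x in (0, 1, 2)}

phi = lambda x: x * x + 1                    # any function of x alone
lhs = E(lambda x, y: phi(x) * condEY[x])     # E(phi(X) E(Y|X))
rhs = E(lambda x, y: phi(x) * y)             # E(phi(X) Y)
print(lhs, rhs, lhs == rhs)
```

Because the arithmetic is exact, the two sides agree exactly, not just to rounding error; taking φ ≡ 1 recovers E(E(Y|X)) = E Y.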

14.14.1 Theorem. Let X, Y, and Z have finite variances, and assume that Z is σ(X)-measurable. Then

$$ E\Bigl( \bigl(Y - E(Y \mid X)\bigr)\, Z \Bigr) = 0. $$

Proof: Expand (Y − E(Y | X)) Z to get

$$ E\bigl( YZ - E(Y \mid X)\,Z \bigr) = E(YZ) - E\bigl( E(Y \mid X)\,Z \bigr), $$

since expectation is a linear operator. By Corollary 14.13.2, E(E(Y | X) Z) = E(YZ). Substituting this in the previous equation proves the theorem.

What this says is that the random variable Y − E(Y | X) is orthogonal to Z for every σ(X)-measurable random variable Z. Since E(Y | X) is itself σ(X)-measurable, we have:

E(Y | X) is the orthogonal projection of Y onto the vector space of σ(X)-measurable random variables, where the inner product (X, Y) is given by E(XY).

14.15 Conditional Expectation and Densities

When X and Y have a joint density, the definition of P(Y = y | X = x) seems ill-defined: since when X has a density, P(X = x) = 0 for every x, we cannot define P(Y = y | X = x) as P(Y = y, X = x)/P(X = x), since that would entail division by zero. It is beyond the scope of this course to prove it, but the following approach works. Given an interval B, a real number x, and ε > 0, consider

$$ P\bigl(Y \in B \mid X \in (x-\varepsilon,\,x+\varepsilon)\bigr) = \frac{P\bigl(Y \in B,\ X \in (x-\varepsilon,\,x+\varepsilon)\bigr)}{P\bigl(X \in (x-\varepsilon,\,x+\varepsilon)\bigr)}. $$

If the marginal density f_X of X is positive and continuous at x, then the denominator is positive, so we are no longer dividing by zero. What we want is for this to tend to a limit as ε tends to zero, and in fact it does, a result known as the Radon–Nikodym Theorem. Define

$$ f_Y(y \mid X = x) = \frac{f(x, y)}{f_X(x)}, $$

so

$$ f(x, y) = f_Y(y \mid X = x)\, f_X(x). $$

For a function h: R → R,

$$ E\bigl(h(Y) \mid X = x\bigr) = \int h(y)\, f_Y(y \mid X = x)\,dy = \int h(y)\, \frac{f(x,y)}{f_X(x)}\,dy. $$
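The ε-window construction can be watched in action. The sketch below (a hypothetical joint density of my choosing) samples from f(x, y) = x + y on the unit square by rejection; then f_X(x) = x + 1/2 and a direct integration gives E(Y | X = x) = (x/2 + 1/3)/(x + 1/2). Averaging Y over samples with X in a small window around x approximates this limit.

```python
import random

random.seed(5)

def draw():
    """Rejection sampling from f(x, y) = x + y on [0,1]^2 (f <= 2)."""
    while True:
        x, y = random.random(), random.random()
        if 2 * random.random() <= x + y:
            return x, y

x0, eps = 0.3, 0.02
total, cnt = 0.0, 0
for _ in range(300_000):
    x, y = draw()
    if abs(x - x0) < eps:       # condition on X in (x0 - eps, x0 + eps)
        cnt += 1
        total += y

# Compare the windowed average with the exact conditional expectation.
print(total / cnt, (x0 / 2 + 1 / 3) / (x0 + 1 / 2))
```

Shrinking ε (with enough samples) drives the windowed average toward the exact value, illustrating the limit whose existence the text asserts.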

14.16 Conditioning with Several Variables

Let Y, X_1, ..., X_n have joint density f(y, x_1, ..., x_n). The conditional density of Y given X_1 = x_1, ..., X_n = x_n is then

$$ f_Y(y \mid x_1, \dots, x_n) = \frac{f(y, x_1, \dots, x_n)}{f_{X_1, \dots, X_n}(x_1, \dots, x_n)}, $$

and we may speak of E(Y | X_1, ..., X_n), etc. Similarly,

$$ f_{X_1, \dots, X_n}(x_1, \dots, x_n \mid Y = y) = \frac{f(y, x_1, \dots, x_n)}{f_Y(y)}, $$

and we may speak of E(X_1, ..., X_n | Y), etc.

14.17 Conditional Independence

If

$$ f_{X,Y}(x, y \mid Z = z) = f_X(x \mid Z = z)\, f_Y(y \mid Z = z), $$

we say that X and Y are conditionally independent given Z = z. If this is true for all z, we say that X and Y are conditionally independent given Z.

Here is a transparent but artificial example.

14.17.1 Example. The outcome space has eight points, each equally probable. The values of the random variables X, Y, and Z are given in the following table.

        s1  s2  s3  s4  s5  s6  s7  s8
    Z    0   0   0   0   1   1   1   1
    X    0   0   1   1   1   1   2   2
    Y    0   1   0   1   1   2   1   2

You can see that X and Y are not independent. For example,

$$ P(X = 2,\ Y = 0) = 0 \ne P(X = 2)\,P(Y = 0) = \tfrac{1}{4}\cdot\tfrac{1}{4} = \tfrac{1}{16}. $$

But X and Y are conditionally independent given Z. Here is the distribution of X and Y conditional on the values of Z:

    Z = 0:              Z = 1:
          X=0   X=1           X=1   X=2
    Y=0   1/4   1/4     Y=1   1/4   1/4
    Y=1   1/4   1/4     Y=2   1/4   1/4

It is easy to verify conditional independence.

A common way that dependent, but conditionally independent, random variables can arise is like this. Let U, V, Z be independent random variables, and let X = Z + U and Y = Z + V. Then usually X and Y are not independent, but they are conditionally independent given Z.

Bibliography

[1] C. D. Aliprantis and K. C. Border. 2006. Infinite dimensional analysis: A hitchhiker's guide, 3d. ed. Berlin: Springer-Verlag.

[2] R. J. Larsen and M. L. Marx. 2012. An introduction to mathematical statistics and its applications, fifth ed. Boston: Prentice Hall.

[3] J. Maynard Smith. 1974. The theory of games and the evolution of animal conflicts. Journal of Theoretical Biology 47(1): 209–221. DOI: 10.1016/0022-5193(74)90110-6

[4] G. A. Parker and E. A. Thompson. 1980. Dung fly struggles: a test of the war of attrition. Behavioral Ecology and Sociobiology 7(1). DOI: 10.1007/BF00302516

[5] J. Pitman. 1993. Probability. Springer Texts in Statistics. New York, Berlin, and Heidelberg: Springer.

[6] M. M. Rao. 1981. Foundations of stochastic analysis. Mineola, NY: Dover. Reprint of the 1981 Academic Press edition.


Thus far, we ve made four key assumptions that have greatly simplified our analysis: Econ 85 Advanced Micro Theory I Dan Quint Fall 29 Lecture 7 Thus far, we ve made four key assumptions that have greatly simplified our analysis:. Risk-neutral bidders 2. Ex-ante symmetric bidders 3. Independent

More information

MAT 271E Probability and Statistics

MAT 271E Probability and Statistics MAT 271E Probability and Statistics Spring 2011 Instructor : Class Meets : Office Hours : Textbook : Supp. Text : İlker Bayram EEB 1103 ibayram@itu.edu.tr 13.30 16.30, Wednesday EEB? 10.00 12.00, Wednesday

More information

SUMMARY OF PROBABILITY CONCEPTS SO FAR (SUPPLEMENT FOR MA416)

SUMMARY OF PROBABILITY CONCEPTS SO FAR (SUPPLEMENT FOR MA416) SUMMARY OF PROBABILITY CONCEPTS SO FAR (SUPPLEMENT FOR MA416) D. ARAPURA This is a summary of the essential material covered so far. The final will be cumulative. I ve also included some review problems

More information

Probability. Lecture Notes. Adolfo J. Rumbos

Probability. Lecture Notes. Adolfo J. Rumbos Probability Lecture Notes Adolfo J. Rumbos October 20, 204 2 Contents Introduction 5. An example from statistical inference................ 5 2 Probability Spaces 9 2. Sample Spaces and σ fields.....................

More information

1 Review of Probability and Distributions

1 Review of Probability and Distributions Random variables. A numerically valued function X of an outcome ω from a sample space Ω X : Ω R : ω X(ω) is called a random variable (r.v.), and usually determined by an experiment. We conventionally denote

More information

Introduction to Probability and Statistics Winter 2015

Introduction to Probability and Statistics Winter 2015 CALIFORNIA INSTITUTE OF TECHNOLOGY Ma 3/13 KC Border Introduction to Probability and Statistics Winter 215 Lecture 11: The Poisson Process Relevant textbook passages: Pitman [15]: Sections 2.4,3.8, 4.2

More information

Lecture 4a: Continuous-Time Markov Chain Models

Lecture 4a: Continuous-Time Markov Chain Models Lecture 4a: Continuous-Time Markov Chain Models Continuous-time Markov chains are stochastic processes whose time is continuous, t [0, ), but the random variables are discrete. Prominent examples of continuous-time

More information

S n = x + X 1 + X X n.

S n = x + X 1 + X X n. 0 Lecture 0 0. Gambler Ruin Problem Let X be a payoff if a coin toss game such that P(X = ) = P(X = ) = /2. Suppose you start with x dollars and play the game n times. Let X,X 2,...,X n be payoffs in each

More information

Probability Models. 4. What is the definition of the expectation of a discrete random variable?

Probability Models. 4. What is the definition of the expectation of a discrete random variable? 1 Probability Models The list of questions below is provided in order to help you to prepare for the test and exam. It reflects only the theoretical part of the course. You should expect the questions

More information

Chapter 8: An Introduction to Probability and Statistics

Chapter 8: An Introduction to Probability and Statistics Course S3, 200 07 Chapter 8: An Introduction to Probability and Statistics This material is covered in the book: Erwin Kreyszig, Advanced Engineering Mathematics (9th edition) Chapter 24 (not including

More information

1.1 Review of Probability Theory

1.1 Review of Probability Theory 1.1 Review of Probability Theory Angela Peace Biomathemtics II MATH 5355 Spring 2017 Lecture notes follow: Allen, Linda JS. An introduction to stochastic processes with applications to biology. CRC Press,

More information

7 Random samples and sampling distributions

7 Random samples and sampling distributions 7 Random samples and sampling distributions 7.1 Introduction - random samples We will use the term experiment in a very general way to refer to some process, procedure or natural phenomena that produces

More information

MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems

MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems Review of Basic Probability The fundamentals, random variables, probability distributions Probability mass/density functions

More information

This exam is closed book and closed notes. (You will have access to a copy of the Table of Common Distributions given in the back of the text.

This exam is closed book and closed notes. (You will have access to a copy of the Table of Common Distributions given in the back of the text. TEST #3 STA 5326 December 4, 214 Name: Please read the following directions. DO NOT TURN THE PAGE UNTIL INSTRUCTED TO DO SO Directions This exam is closed book and closed notes. (You will have access to

More information

DS-GA 1002 Lecture notes 2 Fall Random variables

DS-GA 1002 Lecture notes 2 Fall Random variables DS-GA 12 Lecture notes 2 Fall 216 1 Introduction Random variables Random variables are a fundamental tool in probabilistic modeling. They allow us to model numerical quantities that are uncertain: the

More information

1 Random Variable: Topics

1 Random Variable: Topics Note: Handouts DO NOT replace the book. In most cases, they only provide a guideline on topics and an intuitive feel. 1 Random Variable: Topics Chap 2, 2.1-2.4 and Chap 3, 3.1-3.3 What is a random variable?

More information

MAS223 Statistical Inference and Modelling Exercises

MAS223 Statistical Inference and Modelling Exercises MAS223 Statistical Inference and Modelling Exercises The exercises are grouped into sections, corresponding to chapters of the lecture notes Within each section exercises are divided into warm-up questions,

More information

Sample Spaces, Random Variables

Sample Spaces, Random Variables Sample Spaces, Random Variables Moulinath Banerjee University of Michigan August 3, 22 Probabilities In talking about probabilities, the fundamental object is Ω, the sample space. (elements) in Ω are denoted

More information

Random variables. DS GA 1002 Probability and Statistics for Data Science.

Random variables. DS GA 1002 Probability and Statistics for Data Science. Random variables DS GA 1002 Probability and Statistics for Data Science http://www.cims.nyu.edu/~cfgranda/pages/dsga1002_fall17 Carlos Fernandez-Granda Motivation Random variables model numerical quantities

More information

Random Variables. Definition: A random variable (r.v.) X on the probability space (Ω, F, P) is a mapping

Random Variables. Definition: A random variable (r.v.) X on the probability space (Ω, F, P) is a mapping Random Variables Example: We roll a fair die 6 times. Suppose we are interested in the number of 5 s in the 6 rolls. Let X = number of 5 s. Then X could be 0, 1, 2, 3, 4, 5, 6. X = 0 corresponds to the

More information

STAT Chapter 5 Continuous Distributions

STAT Chapter 5 Continuous Distributions STAT 270 - Chapter 5 Continuous Distributions June 27, 2012 Shirin Golchi () STAT270 June 27, 2012 1 / 59 Continuous rv s Definition: X is a continuous rv if it takes values in an interval, i.e., range

More information

2. Suppose (X, Y ) is a pair of random variables uniformly distributed over the triangle with vertices (0, 0), (2, 0), (2, 1).

2. Suppose (X, Y ) is a pair of random variables uniformly distributed over the triangle with vertices (0, 0), (2, 0), (2, 1). Name M362K Final Exam Instructions: Show all of your work. You do not have to simplify your answers. No calculators allowed. There is a table of formulae on the last page. 1. Suppose X 1,..., X 1 are independent

More information

Normal Random Variables and Probability

Normal Random Variables and Probability Normal Random Variables and Probability An Undergraduate Introduction to Financial Mathematics J. Robert Buchanan 2015 Discrete vs. Continuous Random Variables Think about the probability of selecting

More information

Statistics for Economists. Lectures 3 & 4

Statistics for Economists. Lectures 3 & 4 Statistics for Economists Lectures 3 & 4 Asrat Temesgen Stockholm University 1 CHAPTER 2- Discrete Distributions 2.1. Random variables of the Discrete Type Definition 2.1.1: Given a random experiment with

More information

Notes for Math 324, Part 19

Notes for Math 324, Part 19 48 Notes for Math 324, Part 9 Chapter 9 Multivariate distributions, covariance Often, we need to consider several random variables at the same time. We have a sample space S and r.v. s X, Y,..., which

More information

Multivariate distributions

Multivariate distributions CHAPTER Multivariate distributions.. Introduction We want to discuss collections of random variables (X, X,..., X n ), which are known as random vectors. In the discrete case, we can define the density

More information

Definition: A random variable X is a real valued function that maps a sample space S into the space of real numbers R. X : S R

Definition: A random variable X is a real valued function that maps a sample space S into the space of real numbers R. X : S R Random Variables Definition: A random variable X is a real valued function that maps a sample space S into the space of real numbers R. X : S R As such, a random variable summarizes the outcome of an experiment

More information

Class 8 Review Problems solutions, 18.05, Spring 2014

Class 8 Review Problems solutions, 18.05, Spring 2014 Class 8 Review Problems solutions, 8.5, Spring 4 Counting and Probability. (a) Create an arrangement in stages and count the number of possibilities at each stage: ( ) Stage : Choose three of the slots

More information

Stat 512 Homework key 2

Stat 512 Homework key 2 Stat 51 Homework key October 4, 015 REGULAR PROBLEMS 1 Suppose continuous random variable X belongs to the family of all distributions having a linear probability density function (pdf) over the interval

More information

Continuous Distributions

Continuous Distributions A normal distribution and other density functions involving exponential forms play the most important role in probability and statistics. They are related in a certain way, as summarized in a diagram later

More information

1 Basic continuous random variable problems

1 Basic continuous random variable problems Name M362K Final Here are problems concerning material from Chapters 5 and 6. To review the other chapters, look over previous practice sheets for the two exams, previous quizzes, previous homeworks and

More information

Sampling Distributions

Sampling Distributions Sampling Distributions In statistics, a random sample is a collection of independent and identically distributed (iid) random variables, and a sampling distribution is the distribution of a function of

More information

Part IA Probability. Definitions. Based on lectures by R. Weber Notes taken by Dexter Chua. Lent 2015

Part IA Probability. Definitions. Based on lectures by R. Weber Notes taken by Dexter Chua. Lent 2015 Part IA Probability Definitions Based on lectures by R. Weber Notes taken by Dexter Chua Lent 2015 These notes are not endorsed by the lecturers, and I have modified them (often significantly) after lectures.

More information

Problem 1. Problem 2. Problem 3. Problem 4

Problem 1. Problem 2. Problem 3. Problem 4 Problem Let A be the event that the fungus is present, and B the event that the staph-bacteria is present. We have P A = 4, P B = 9, P B A =. We wish to find P AB, to do this we use the multiplication

More information

(y 1, y 2 ) = 12 y3 1e y 1 y 2 /2, y 1 > 0, y 2 > 0 0, otherwise.

(y 1, y 2 ) = 12 y3 1e y 1 y 2 /2, y 1 > 0, y 2 > 0 0, otherwise. 54 We are given the marginal pdfs of Y and Y You should note that Y gamma(4, Y exponential( E(Y = 4, V (Y = 4, E(Y =, and V (Y = 4 (a With U = Y Y, we have E(U = E(Y Y = E(Y E(Y = 4 = (b Because Y and

More information

Discrete Random Variables

Discrete Random Variables Chapter 5 Discrete Random Variables Suppose that an experiment and a sample space are given. A random variable is a real-valued function of the outcome of the experiment. In other words, the random variable

More information

MAS113 Introduction to Probability and Statistics. Proofs of theorems

MAS113 Introduction to Probability and Statistics. Proofs of theorems MAS113 Introduction to Probability and Statistics Proofs of theorems Theorem 1 De Morgan s Laws) See MAS110 Theorem 2 M1 By definition, B and A \ B are disjoint, and their union is A So, because m is a

More information

04. Random Variables: Concepts

04. Random Variables: Concepts University of Rhode Island DigitalCommons@URI Nonequilibrium Statistical Physics Physics Course Materials 215 4. Random Variables: Concepts Gerhard Müller University of Rhode Island, gmuller@uri.edu Creative

More information

X 1 ((, a]) = {ω Ω : X(ω) a} F, which leads us to the following definition:

X 1 ((, a]) = {ω Ω : X(ω) a} F, which leads us to the following definition: nna Janicka Probability Calculus 08/09 Lecture 4. Real-valued Random Variables We already know how to describe the results of a random experiment in terms of a formal mathematical construction, i.e. the

More information

Notes on Continuous Random Variables

Notes on Continuous Random Variables Notes on Continuous Random Variables Continuous random variables are random quantities that are measured on a continuous scale. They can usually take on any value over some interval, which distinguishes

More information

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 8 10/1/2008 CONTINUOUS RANDOM VARIABLES

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 8 10/1/2008 CONTINUOUS RANDOM VARIABLES MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 8 10/1/2008 CONTINUOUS RANDOM VARIABLES Contents 1. Continuous random variables 2. Examples 3. Expected values 4. Joint distributions

More information

Lecture 2: Review of Probability

Lecture 2: Review of Probability Lecture 2: Review of Probability Zheng Tian Contents 1 Random Variables and Probability Distributions 2 1.1 Defining probabilities and random variables..................... 2 1.2 Probability distributions................................

More information

Chapter 2 Random Variables

Chapter 2 Random Variables Stochastic Processes Chapter 2 Random Variables Prof. Jernan Juang Dept. of Engineering Science National Cheng Kung University Prof. Chun-Hung Liu Dept. of Electrical and Computer Eng. National Chiao Tung

More information

ACM 116: Lectures 3 4

ACM 116: Lectures 3 4 1 ACM 116: Lectures 3 4 Joint distributions The multivariate normal distribution Conditional distributions Independent random variables Conditional distributions and Monte Carlo: Rejection sampling Variance

More information

Common ontinuous random variables

Common ontinuous random variables Common ontinuous random variables CE 311S Earlier, we saw a number of distribution families Binomial Negative binomial Hypergeometric Poisson These were useful because they represented common situations:

More information

More on Distribution Function

More on Distribution Function More on Distribution Function The distribution of a random variable X can be determined directly from its cumulative distribution function F X. Theorem: Let X be any random variable, with cumulative distribution

More information

MA 519 Probability: Review

MA 519 Probability: Review MA 519 : Review Yingwei Wang Department of Mathematics, Purdue University, West Lafayette, IN, USA Contents 1 How to compute the expectation? 1.1 Tail........................................... 1. Index..........................................

More information

Sampling Distributions

Sampling Distributions In statistics, a random sample is a collection of independent and identically distributed (iid) random variables, and a sampling distribution is the distribution of a function of random sample. For example,

More information

ECE 313: Conflict Final Exam Tuesday, May 13, 2014, 7:00 p.m. 10:00 p.m. Room 241 Everitt Lab

ECE 313: Conflict Final Exam Tuesday, May 13, 2014, 7:00 p.m. 10:00 p.m. Room 241 Everitt Lab University of Illinois Spring 1 ECE 313: Conflict Final Exam Tuesday, May 13, 1, 7: p.m. 1: p.m. Room 1 Everitt Lab 1. [18 points] Consider an experiment in which a fair coin is repeatedly tossed every

More information

Continuous-time Markov Chains

Continuous-time Markov Chains Continuous-time Markov Chains Gonzalo Mateos Dept. of ECE and Goergen Institute for Data Science University of Rochester gmateosb@ece.rochester.edu http://www.ece.rochester.edu/~gmateosb/ October 23, 2017

More information

Why study probability? Set theory. ECE 6010 Lecture 1 Introduction; Review of Random Variables

Why study probability? Set theory. ECE 6010 Lecture 1 Introduction; Review of Random Variables ECE 6010 Lecture 1 Introduction; Review of Random Variables Readings from G&S: Chapter 1. Section 2.1, Section 2.3, Section 2.4, Section 3.1, Section 3.2, Section 3.5, Section 4.1, Section 4.2, Section

More information

Discrete Random Variables

Discrete Random Variables Discrete Random Variables An Undergraduate Introduction to Financial Mathematics J. Robert Buchanan 2014 Introduction The markets can be thought of as a complex interaction of a large number of random

More information

Probability Theory and Statistics. Peter Jochumzen

Probability Theory and Statistics. Peter Jochumzen Probability Theory and Statistics Peter Jochumzen April 18, 2016 Contents 1 Probability Theory And Statistics 3 1.1 Experiment, Outcome and Event................................ 3 1.2 Probability............................................

More information

Lecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable

Lecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable Lecture Notes 1 Probability and Random Variables Probability Spaces Conditional Probability and Independence Random Variables Functions of a Random Variable Generation of a Random Variable Jointly Distributed

More information

CS 361: Probability & Statistics

CS 361: Probability & Statistics February 26, 2018 CS 361: Probability & Statistics Random variables The discrete uniform distribution If every value of a discrete random variable has the same probability, then its distribution is called

More information

STAT2201. Analysis of Engineering & Scientific Data. Unit 3

STAT2201. Analysis of Engineering & Scientific Data. Unit 3 STAT2201 Analysis of Engineering & Scientific Data Unit 3 Slava Vaisman The University of Queensland School of Mathematics and Physics What we learned in Unit 2 (1) We defined a sample space of a random

More information

Probability Midterm Exam 2:15-3:30 pm Thursday, 21 October 1999

Probability Midterm Exam 2:15-3:30 pm Thursday, 21 October 1999 Name: 2:15-3:30 pm Thursday, 21 October 1999 You may use a calculator and your own notes but may not consult your books or neighbors. Please show your work for partial credit, and circle your answers.

More information

Continuous Distributions

Continuous Distributions Chapter 5 Continuous Distributions 5.1 Density and Distribution Functions In many situations random variables can take any value on the real line or in a certain subset of the real line. For concrete examples,

More information

Poisson Processes. Stochastic Processes. Feb UC3M

Poisson Processes. Stochastic Processes. Feb UC3M Poisson Processes Stochastic Processes UC3M Feb. 2012 Exponential random variables A random variable T has exponential distribution with rate λ > 0 if its probability density function can been written

More information

Basic concepts of probability theory

Basic concepts of probability theory Basic concepts of probability theory Random variable discrete/continuous random variable Transform Z transform, Laplace transform Distribution Geometric, mixed-geometric, Binomial, Poisson, exponential,

More information

CS 237: Probability in Computing

CS 237: Probability in Computing CS 237: Probability in Computing Wayne Snyder Computer Science Department Boston University Lecture 11: Geometric Distribution Poisson Process Poisson Distribution Geometric Distribution The Geometric

More information

Chapter 2: Discrete Distributions. 2.1 Random Variables of the Discrete Type

Chapter 2: Discrete Distributions. 2.1 Random Variables of the Discrete Type Chapter 2: Discrete Distributions 2.1 Random Variables of the Discrete Type 2.2 Mathematical Expectation 2.3 Special Mathematical Expectations 2.4 Binomial Distribution 2.5 Negative Binomial Distribution

More information

Microeconomics II Lecture 4: Incomplete Information Karl Wärneryd Stockholm School of Economics November 2016

Microeconomics II Lecture 4: Incomplete Information Karl Wärneryd Stockholm School of Economics November 2016 Microeconomics II Lecture 4: Incomplete Information Karl Wärneryd Stockholm School of Economics November 2016 1 Modelling incomplete information So far, we have studied games in which information was complete,

More information

Continuous Random Variables

Continuous Random Variables Continuous Random Variables Recall: For discrete random variables, only a finite or countably infinite number of possible values with positive probability. Often, there is interest in random variables

More information

Inference for Stochastic Processes

Inference for Stochastic Processes Inference for Stochastic Processes Robert L. Wolpert Revised: June 19, 005 Introduction A stochastic process is a family {X t } of real-valued random variables, all defined on the same probability space

More information

Differentiation. Timur Musin. October 10, University of Freiburg 1 / 54

Differentiation. Timur Musin. October 10, University of Freiburg 1 / 54 Timur Musin University of Freiburg October 10, 2014 1 / 54 1 Limit of a Function 2 2 / 54 Literature A. C. Chiang and K. Wainwright, Fundamental methods of mathematical economics, Irwin/McGraw-Hill, Boston,

More information

Contents 1. Contents

Contents 1. Contents Contents 1 Contents 6 Distributions of Functions of Random Variables 2 6.1 Transformation of Discrete r.v.s............. 3 6.2 Method of Distribution Functions............. 6 6.3 Method of Transformations................

More information