Sampling Distributions


(Probability and Statistics II, February 6, 2018)

In statistics, a random sample is a collection of independent and identically distributed (iid) random variables, and a sampling distribution is the distribution of a function of a random sample. For example, the sample mean and the sample variance are functions of a random sample. The term statistic refers to an individual function of a random sample, and understanding sampling distributions is a major undertaking of statistics.

Moment generating function. Let $X$ be a random variable. The moment generating function (mgf) of $X$ is given by $M(t) = E[e^{tX}]$. The mgf $M(t)$ is a function of $t$ defined on some open interval $(c_0, c_1)$ around $0$ with $c_0 < 0 < c_1$. In particular, when $X$ is a continuous random variable having the pdf $f(x)$, the mgf $M(t)$ can be expressed as
$$ M(t) = \int_{-\infty}^{\infty} e^{tx} f(x)\,dx. $$
The most significant property of the moment generating function is that it uniquely determines the distribution.

MGF of normal distribution. Suppose that $X$ is a standard normal random variable. Then we can compute the mgf of $X$ as follows.
$$ M_X(t) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{tx} e^{-x^2/2}\,dx = e^{t^2/2}\,\frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-(x-t)^2/2}\,dx = \exp\left(\frac{t^2}{2}\right), $$
by completing the square $tx - \frac{x^2}{2} = -\frac{1}{2}(x - t)^2 + \frac{t^2}{2}$. Since $X = (Y - \mu)/\sigma$ becomes a standard normal random variable and $Y = \sigma X + \mu$, the mgf of $Y$ can be given by
$$ M_Y(t) = e^{\mu t} M_X(\sigma t) = \exp\left(\frac{\sigma^2 t^2}{2} + \mu t\right) \quad \text{for } -\infty < t < \infty. $$

MGF of gamma distribution. Suppose now that $X$ is a gamma random variable with parameter $(\alpha, \lambda)$. Then the mgf $M_X(t)$ of $X$ can be computed as follows.
$$ M_X(t) = \int_0^{\infty} e^{tx} \frac{\lambda^{\alpha}}{\Gamma(\alpha)} x^{\alpha-1} e^{-\lambda x}\,dx = \frac{\lambda^{\alpha}}{\Gamma(\alpha)} \int_0^{\infty} x^{\alpha-1} e^{-(\lambda - t)x}\,dx = \left(\frac{\lambda}{\lambda - t}\right)^{\alpha} \frac{1}{\Gamma(\alpha)} \int_0^{\infty} ((\lambda - t)x)^{\alpha-1} e^{-(\lambda - t)x} (\lambda - t)\,dx = \left(\frac{\lambda}{\lambda - t}\right)^{\alpha} \frac{1}{\Gamma(\alpha)} \int_0^{\infty} u^{\alpha-1} e^{-u}\,du = \left(\frac{\lambda}{\lambda - t}\right)^{\alpha}, $$
where $(\lambda - t)x$ is substituted by $u$, and $(\lambda - t)\,dx$ by $du$. It should be noted that this mgf $M_X(t)$ is defined on the open interval $(-\infty, \lambda)$.

Joint moment generating function. Let $X$ and $Y$ be random variables having a joint density function $f(x, y)$. Then we can define the joint moment generating function $M(s, t)$ of $X$ and $Y$
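The two closed forms above are easy to sanity-check numerically. The following sketch (plain Python; the integration range $[-12, 12]$ and step count are arbitrary numerical choices) approximates the standard normal mgf by a midpoint Riemann sum and compares it against $\exp(t^2/2)$.

```python
import math

def normal_mgf_numeric(t, lo=-12.0, hi=12.0, n=200_000):
    """Midpoint Riemann sum for M(t) = integral of e^{tx} * phi(x) dx,
    where phi is the standard normal pdf."""
    h = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * h
        total += math.exp(t * x) * math.exp(-x * x / 2) / math.sqrt(2 * math.pi)
    return total * h

# Compare with the closed form M_X(t) = exp(t^2 / 2) derived above.
for t in (-1.0, 0.0, 0.5, 1.5):
    assert abs(normal_mgf_numeric(t) - math.exp(t * t / 2)) < 1e-6
```

The same midpoint-sum idea applies to the gamma mgf on $(0, \infty)$, provided $t < \lambda$ so that the integrand decays.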

by
$$ M(s, t) = E\left[e^{sX + tY}\right] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} e^{sx + ty} f(x, y)\,dx\,dy. $$
If $X$ and $Y$ are independent, then the joint mgf $M(s, t)$ becomes
$$ M(s, t) = E\left[e^{sX + tY}\right] = E\left[e^{sX}\right] E\left[e^{tY}\right] = M_X(s)\,M_Y(t). $$
Moreover, it is known that if the joint mgf $M(s, t)$ of $X$ and $Y$ is of the form $M_1(s)\,M_2(t)$, then $X$ and $Y$ are independent and have the respective mgf's $M_1(s)$ and $M_2(t)$.

Sum of independent random variables. Let $X$ and $Y$ be independent random variables having the respective probability density functions $f_X(x)$ and $f_Y(y)$. Then the cumulative distribution function $F_Z(z)$ of the random variable $Z = X + Y$ can be given as follows.
$$ F_Z(z) = P(X + Y \le z) = \int_{-\infty}^{\infty} \left[\int_{-\infty}^{z - x} f_Y(y)\,dy\right] f_X(x)\,dx = \int_{-\infty}^{\infty} f_X(x) F_Y(z - x)\,dx, $$
where $F_Y$ is the cdf of $Y$. By differentiating $F_Z(z)$, we can obtain the pdf $f_Z(z)$ of $Z$ as
$$ f_Z(z) = \frac{d}{dz} F_Z(z) = \int_{-\infty}^{\infty} f_X(x) \left(\frac{d}{dz} F_Y(z - x)\right) dx = \int_{-\infty}^{\infty} f_X(x) f_Y(z - x)\,dx. $$

Convolution. The function of $z$
$$ (f * g)(z) = \int_{-\infty}^{\infty} f(x)\, g(z - x)\,dx $$
is called the convolution of $f$ and $g$. Thus, the pdf $f_Z(z)$ for $Z = X + Y$ is given by the convolution of the pdf's $f_X(x)$ and $f_Y(y)$. Then we can easily verify
$$ M_Z(t) = \int_{-\infty}^{\infty} e^{tz} \left[\int_{-\infty}^{\infty} f_X(x) f_Y(z - x)\,dx\right] dz = M_X(t)\,M_Y(t). $$
Therefore, as we will see in the examples below, it is often much easier to construct the moment generating function in order to find the distribution of a sum of independent random variables.

Example 1. Let $X$ and $Y$ be independent normal random variables with the respective parameters $(\mu_x, \sigma_x^2)$ and $(\mu_y, \sigma_y^2)$. Find the distribution of $Z = X + Y$.

Solution. We can calculate
$$ M_Z(t) = M_X(t)\,M_Y(t) = \exp\left(\frac{\sigma_x^2 t^2}{2} + \mu_x t\right) \exp\left(\frac{\sigma_y^2 t^2}{2} + \mu_y t\right) = \exp\left(\frac{(\sigma_x^2 + \sigma_y^2) t^2}{2} + (\mu_x + \mu_y) t\right), $$
which is the mgf of the normal distribution with parameter $(\mu_x + \mu_y,\ \sigma_x^2 + \sigma_y^2)$. Thus, we find that $Z$ is a normal random variable with parameter $(\mu_x + \mu_y,\ \sigma_x^2 + \sigma_y^2)$.
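As an empirical companion to Example 1, the seeded Monte Carlo sketch below uses arbitrary illustrative parameters ($\mu_x = 1$, $\sigma_x = 2$, $\mu_y = -3$, $\sigma_y = 1$, chosen only for this example) and checks that the sum has mean $\mu_x + \mu_y = -2$ and variance $\sigma_x^2 + \sigma_y^2 = 5$.

```python
import random
import statistics

random.seed(0)  # seeded so the check is reproducible
# Z = X + Y with X ~ N(1, 2^2) and Y ~ N(-3, 1^2), independent draws.
z = [random.gauss(1, 2) + random.gauss(-3, 1) for _ in range(200_000)]

# The mgf argument predicts Z ~ N(-2, 5).
assert abs(statistics.fmean(z) - (-2)) < 0.05
assert abs(statistics.pvariance(z) - 5) < 0.1
```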

Example 2. Let $X$ and $Y$ be independent gamma random variables with the respective parameters $(\alpha_1, \lambda)$ and $(\alpha_2, \lambda)$. Find the distribution of $Z = X + Y$.

Solution. We can calculate
$$ M_Z(t) = M_X(t)\,M_Y(t) = \left(\frac{\lambda}{\lambda - t}\right)^{\alpha_1} \left(\frac{\lambda}{\lambda - t}\right)^{\alpha_2} = \left(\frac{\lambda}{\lambda - t}\right)^{\alpha_1 + \alpha_2}, $$
which is the mgf of the gamma distribution with parameter $(\alpha_1 + \alpha_2, \lambda)$. Thus, $Z$ is a gamma random variable with parameter $(\alpha_1 + \alpha_2, \lambda)$.

Chi-square distribution. The gamma distribution
$$ f(x) = \frac{1}{2^{n/2}\,\Gamma(n/2)}\, x^{n/2 - 1} e^{-x/2}, \quad x \ge 0, $$
with parameter $(n/2, 1/2)$ is called the chi-square distribution with $n$ degrees of freedom (df). Since it has only one parameter $n$, the mgf $M(t)$ of the chi-square distribution can be simply expressed as
$$ M(t) = \left(\frac{1/2}{1/2 - t}\right)^{n/2} = (1 - 2t)^{-n/2}. $$

Chi-square distribution with one degree of freedom. Let $X$ be a standard normal random variable, and let $Y = X^2$ be the square of $X$. Then we can express the cdf $F_Y$ of $Y$ as
$$ F_Y(t) = P(X^2 \le t) = P(-\sqrt{t} \le X \le \sqrt{t}) = \Phi(\sqrt{t}) - \Phi(-\sqrt{t}) $$
in terms of the standard normal cdf $\Phi$. By differentiating the cdf, we can obtain the pdf $f_Y$ as follows.
$$ f_Y(t) = \frac{d}{dt} F_Y(t) = \frac{1}{\sqrt{2\pi}}\, t^{-1/2} e^{-t/2} = \frac{1}{2^{1/2}\,\Gamma(1/2)}\, t^{-1/2} e^{-t/2}, $$
where we use $\Gamma(1/2) = \sqrt{\pi}$. Thus, the square $X^2$ of $X$ is a chi-square random variable with 1 degree of freedom.

Chi-square distribution with $n$ df. Let $X_1, \ldots, X_n$ be iid standard normal random variables. Since the $X_i^2$'s have the gamma distribution with parameter $(1/2, 1/2)$, the sum $Y = \sum_{i=1}^{n} X_i^2$ has the gamma distribution with parameter $(n/2, 1/2)$. That is, the sum $Y$ has the chi-square distribution with $n$ degrees of freedom.

Student's t-distribution. Let $X$ be a chi-square random variable with $n$ degrees of freedom, and let $Y$ be a standard normal random variable. Suppose that $X$ and $Y$ are independent. Then the distribution of the quotient
$$ Z = \frac{Y}{\sqrt{X/n}} $$
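A seeded simulation can illustrate the last construction: summing $n$ squared standard normals should produce a chi-square variable with $n$ df, whose mean is $n$ and variance is $2n$. The choices $n = 4$ and the sample size below are arbitrary.

```python
import random
import statistics

random.seed(3)  # seeded so the check is reproducible
n, trials = 4, 200_000
# Each draw is X_1^2 + ... + X_n^2 with X_i iid standard normal.
y = [sum(random.gauss(0, 1) ** 2 for _ in range(n)) for _ in range(trials)]

# chi-square with n df, i.e. gamma(n/2, 1/2): mean n, variance 2n.
assert abs(statistics.fmean(y) - n) < 0.05
assert abs(statistics.pvariance(y) - 2 * n) < 0.25
```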

is called the Student's t-distribution with $n$ degrees of freedom. The pdf $f_Z(z)$ is given by
$$ f_Z(z) = \frac{\Gamma((n+1)/2)}{\sqrt{n\pi}\,\Gamma(n/2)}\, (1 + z^2/n)^{-(n+1)/2}, \quad -\infty < z < \infty. $$

F-distribution. Let $U$ and $V$ be independent chi-square random variables with $r_1$ and $r_2$ degrees of freedom, respectively. Then the distribution of the quotient
$$ Z = \frac{U/r_1}{V/r_2} $$
is called the F-distribution with $(r_1, r_2)$ degrees of freedom. The pdf $f_Z(z)$ is given by
$$ f_Z(z) = \frac{\Gamma((r_1 + r_2)/2)}{\Gamma(r_1/2)\,\Gamma(r_2/2)}\, (r_1/r_2)^{r_1/2}\, z^{r_1/2 - 1} \left(1 + \frac{r_1}{r_2} z\right)^{-(r_1 + r_2)/2}, \quad 0 < z < \infty. $$

Quotient of two random variables. Let $X$ and $Y$ be independent random variables having the respective pdf's $f_X(x)$ and $f_Y(y)$. Then the cdf $F_Z(z)$ of the quotient $Z = Y/X$ can be computed as follows.
$$ F_Z(z) = P(Y/X \le z) = P(Y \ge zX,\, X < 0) + P(Y \le zX,\, X > 0) = \int_{-\infty}^{0} \left[\int_{xz}^{\infty} f_Y(y)\,dy\right] f_X(x)\,dx + \int_{0}^{\infty} \left[\int_{-\infty}^{xz} f_Y(y)\,dy\right] f_X(x)\,dx. $$
By differentiating, we obtain
$$ f_Z(z) = \frac{d}{dz} F_Z(z) = \int_{-\infty}^{0} [-x f_Y(xz)] f_X(x)\,dx + \int_{0}^{\infty} [x f_Y(xz)] f_X(x)\,dx = \int_{-\infty}^{\infty} |x|\, f_Y(xz) f_X(x)\,dx. $$

Derivation of t-distribution. Let $X$ be a chi-square random variable with $n$ degrees of freedom. Then the pdf of the random variable $U = \sqrt{X/n}$ is given by
$$ f_U(x) = \frac{d}{dx} P\left(\sqrt{X/n} \le x\right) = \frac{d}{dx} F_X(n x^2) = 2nx\, f_X(n x^2) = \frac{n^{n/2}}{2^{n/2 - 1}\,\Gamma(n/2)}\, x^{n-1} e^{-n x^2/2} $$
for $x \ge 0$; otherwise, $f_U(x) = 0$.

Derivation of t-distribution, continued. Suppose that $Y$ is a standard normal random variable and independent of $X$. Then the quotient
$$ Z = \frac{Y}{\sqrt{X/n}} $$
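Since the constant $\Gamma((n+1)/2)\big/\big(\sqrt{n\pi}\,\Gamma(n/2)\big)$ is exactly what makes the t density integrate to one, a quick numerical check of that normalization is a useful sanity test. The sketch below codes the pdf as displayed and integrates it by a midpoint rule over a wide interval (the range and step count are arbitrary numerical choices).

```python
import math

def t_pdf(z, n):
    # Student t density with n degrees of freedom, as displayed in the text.
    return (math.gamma((n + 1) / 2)
            / (math.sqrt(n * math.pi) * math.gamma(n / 2))
            * (1 + z * z / n) ** (-(n + 1) / 2))

def integrate(f, lo, hi, steps=200_000):
    # Midpoint Riemann sum; adequate here since the density is smooth.
    h = (hi - lo) / steps
    return h * sum(f(lo + (i + 0.5) * h) for i in range(steps))

# Even for n = 1 (the heavy-tailed Cauchy case) the mass outside
# [-2000, 2000] is only about 2/(2000*pi), so the total should be near 1.
for n in (1, 3, 10):
    total = integrate(lambda z: t_pdf(z, n), -2000.0, 2000.0)
    assert abs(total - 1) < 1e-2
```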

has a t-distribution with $n$ degrees of freedom. The pdf $f_Z(z)$ can be computed as follows.
$$ f_Z(z) = \int_0^{\infty} x\, f_Y(xz) f_U(x)\,dx = \frac{n^{n/2}}{\sqrt{2\pi}\, 2^{n/2 - 1}\,\Gamma(n/2)} \int_0^{\infty} x^{n} e^{-(z^2 + n)x^2/2}\,dx = \frac{1}{\sqrt{n\pi}\,\Gamma(n/2)}\, (1 + z^2/n)^{-(n+1)/2} \int_0^{\infty} t^{(n-1)/2} e^{-t}\,dt = \frac{\Gamma((n+1)/2)}{\sqrt{n\pi}\,\Gamma(n/2)}\, (1 + z^2/n)^{-(n+1)/2} $$
for $-\infty < z < \infty$, where we substitute $t = (z^2 + n)x^2/2$.

Derivation of F-distribution. Let $U$ and $V$ be independent chi-square random variables with $r_1$ and $r_2$ degrees of freedom, respectively. Then the quotient
$$ Z = \frac{U/r_1}{V/r_2} $$
has the F-distribution with $(r_1, r_2)$ degrees of freedom. By putting $Y = U/r_1$ and $X = V/r_2$, we can compute the pdf $f_Z(z)$ as follows.
$$ f_Z(z) = \int_0^{\infty} x\, f_Y(xz) f_X(x)\,dx = \frac{r_1^{r_1/2}\, r_2^{r_2/2}}{2^{(r_1 + r_2)/2}\,\Gamma(r_1/2)\,\Gamma(r_2/2)}\, z^{r_1/2 - 1} \int_0^{\infty} x^{(r_1 + r_2)/2 - 1} e^{-(r_1 z + r_2)x/2}\,dx = \frac{r_1^{r_1/2}\, r_2^{r_2/2}}{\Gamma(r_1/2)\,\Gamma(r_2/2)}\, \frac{z^{r_1/2 - 1}}{(r_1 z + r_2)^{(r_1 + r_2)/2}} \int_0^{\infty} t^{(r_1 + r_2)/2 - 1} e^{-t}\,dt = \frac{\Gamma((r_1 + r_2)/2)}{\Gamma(r_1/2)\,\Gamma(r_2/2)}\, (r_1/r_2)^{r_1/2}\, z^{r_1/2 - 1} \left(1 + \frac{r_1}{r_2} z\right)^{-(r_1 + r_2)/2}, \quad 0 < z < \infty, $$
where we substitute $t = (r_1 z + r_2)x/2$.

Population and random sample. The recorded data values $x_1, \ldots, x_n$ are typically considered as the observed values of iid random variables $X_1, \ldots, X_n$ having a common probability distribution $f(x)$. Specifically, we may assume that $f(x)$ is a normal distribution with parameter $(\mu, \sigma^2)$. In a typical statistical problem it is useful to envisage a population from which the sample is to be drawn. A random sample is chosen at random from the population, ensuring that the sample is representative of the population.

Sample mean. Let $X_1, \ldots, X_n$ be iid normal random variables with parameter $(\mu, \sigma^2)$. The function of the random sample
$$ \bar{X} = \frac{1}{n} \sum_{i=1}^{n} X_i $$
is called the sample mean. Then we obtain
$$ E[\bar{X}] = \mu \quad \text{and} \quad \operatorname{Var}(\bar{X}) = \frac{\sigma^2}{n}. $$
Furthermore, $\bar{X}$ has the normal distribution with parameter $(\mu, \sigma^2/n)$.
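A seeded Monte Carlo sketch (with arbitrary illustrative values $\mu = 2$, $\sigma = 3$, $n = 9$) checks the two facts just stated: $E[\bar{X}] = \mu$ and $\operatorname{Var}(\bar{X}) = \sigma^2/n$.

```python
import random
import statistics

random.seed(4)  # seeded so the check is reproducible
mu, sigma, n, trials = 2.0, 3.0, 9, 100_000
# Each trial draws a sample of size n and records its sample mean.
xbars = [statistics.fmean(random.gauss(mu, sigma) for _ in range(n))
         for _ in range(trials)]

assert abs(statistics.fmean(xbars) - mu) < 0.02            # E[Xbar] = mu
assert abs(statistics.pvariance(xbars) - sigma ** 2 / n) < 0.03  # sigma^2/n = 1
```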

Sample variance. The function of the random sample
$$ S^2 = \frac{1}{n-1} \sum_{i=1}^{n} (X_i - \bar{X})^2 $$
is called the sample variance. The sample variance $S^2$ can be rewritten as
$$ S^2 = \frac{1}{n-1} \left(\sum_{i=1}^{n} X_i^2 - n\bar{X}^2\right) = \frac{1}{n-1} \left(\sum_{i=1}^{n} (X_i - \mu)^2 - n(\bar{X} - \mu)^2\right). $$
In particular, we can obtain
$$ E[S^2] = \frac{1}{n-1} \left(\sum_{i=1}^{n} E[(X_i - \mu)^2] - n E[(\bar{X} - \mu)^2]\right) = \frac{1}{n-1} \left(\sum_{i=1}^{n} \operatorname{Var}(X_i) - n \operatorname{Var}(\bar{X})\right) = \sigma^2. $$

Theorem 3. $\bar{X}$ has the normal distribution with parameter $(\mu, \sigma^2/n)$, and $\bar{X}$ and $S^2$ are independent. Furthermore,
$$ W_{n-1} = \frac{(n-1)S^2}{\sigma^2} = \frac{1}{\sigma^2} \sum_{i=1}^{n} (X_i - \bar{X})^2 $$
has the chi-square distribution with $(n-1)$ degrees of freedom.

Rough explanation of Theorem 3. Assuming that $\bar{X}$ and $S^2$ are independent (which is not so easy to prove), we can show that the random variable $W_{n-1}$ has the chi-square distribution with $(n-1)$ degrees of freedom. By introducing
$$ V = \frac{1}{\sigma^2} \sum_{i=1}^{n} (X_i - \mu)^2 \quad \text{and} \quad W_1 = \frac{n}{\sigma^2} (\bar{X} - \mu)^2, $$
we can write $V = W_{n-1} + W_1$. Since $W_{n-1}$ and $W_1$ are independent, the mgf $M_V(t)$ of $V$ can be expressed as $M_V(t) = M_{W_{n-1}}(t)\,M_{W_1}(t)$, where $M_{W_{n-1}}(t)$ and $M_{W_1}(t)$ are the mgf's of $W_{n-1}$ and $W_1$, respectively. Observe that $V$ has the chi-square distribution with $n$ degrees of freedom, and $W_1$ has the chi-square distribution with 1 degree of freedom. Thus, we can obtain
$$ M_{W_{n-1}}(t) = \frac{M_V(t)}{M_{W_1}(t)} = \frac{(1 - 2t)^{-n/2}}{(1 - 2t)^{-1/2}} = (1 - 2t)^{-(n-1)/2}, $$
which is the mgf of the chi-square distribution with $(n-1)$ degrees of freedom.

t-statistic. Assuming iid normal random variables $X_1, \ldots, X_n$ with parameter $(\mu, \sigma^2)$, we can construct a standard normal random variable
$$ Z_1 = \frac{\bar{X} - \mu}{\sigma/\sqrt{n}} $$
and the chi-square random variable $W_{n-1}$ of Theorem 3. Furthermore, the function of the random sample
$$ T = \frac{\bar{X} - \mu}{S/\sqrt{n}} = \frac{(\bar{X} - \mu)\big/(\sigma/\sqrt{n})}{\sqrt{\dfrac{(n-1)S^2/\sigma^2}{n-1}}} = \frac{Z_1}{\sqrt{W_{n-1}/(n-1)}} $$
has the t-distribution with $(n-1)$ degrees of freedom, and it is called the t-statistic. The statistic $S = \sqrt{S^2}$ is called the sample standard deviation.

Sampling distributions under normal assumption.
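The distributional claim of Theorem 3 can be probed by simulation. With arbitrary illustrative values $\sigma = 2$ and $n = 8$, the statistic $W = (n-1)S^2/\sigma^2$ should have mean $n - 1 = 7$ and variance $2(n - 1) = 14$, the moments of a $\chi^2_7$ variable; this also checks $E[S^2] = \sigma^2$ along the way.

```python
import random
import statistics

random.seed(5)  # seeded so the check is reproducible
sigma, n, trials = 2.0, 8, 100_000
w = []
for _ in range(trials):
    xs = [random.gauss(0.0, sigma) for _ in range(n)]
    xbar = sum(xs) / n
    s2 = sum((x - xbar) ** 2 for x in xs) / (n - 1)  # sample variance, n-1 divisor
    w.append((n - 1) * s2 / sigma ** 2)

# chi-square with n-1 df: mean n-1 and variance 2(n-1).
assert abs(statistics.fmean(w) - (n - 1)) < 0.1
assert abs(statistics.pvariance(w) - 2 * (n - 1)) < 0.5
```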

(a) The sample mean $\bar{X}$ is normally distributed with parameter $(\mu, \sigma^2/n)$, and
$$ Z_1 = \frac{\sqrt{n}(\bar{X} - \mu)}{\sigma} \sim N(0, 1). $$

(b) The statistic
$$ W_{n-1} = \frac{1}{\sigma^2} \sum_{i=1}^{n} (X_i - \bar{X})^2 = \frac{(n-1)S^2}{\sigma^2} \sim \chi^2_{n-1} $$
has a chi-square distribution with $(n-1)$ df, and is independent of $\bar{X}$ (Theorem 3).

(c) The t-statistic has the t-distribution with $(n-1)$ df:
$$ \frac{\bar{X} - \mu}{S/\sqrt{n}} = \frac{\sqrt{n}(\bar{X} - \mu)/\sigma}{\sqrt{\dfrac{(n-1)S^2/\sigma^2}{n-1}}} = \frac{Z_1}{\sqrt{W_{n-1}/(n-1)}} \sim t_{n-1}. $$

Critical points. Given a random variable $X$, the critical point for level $\alpha$ is defined as the value $c$ satisfying $P(X > c) = \alpha$.

(a) Suppose that a random variable $X$ has the chi-square distribution with $n$ degrees of freedom. Then the critical point of the chi-square distribution, denoted by $\chi^2_{\alpha,n}$, is defined by $P(X > \chi^2_{\alpha,n}) = \alpha$.

(b) Suppose that a random variable $Z$ has the t-distribution with $m$ degrees of freedom. Then the critical point is defined as the value $t_{\alpha,m}$ satisfying $P(Z > t_{\alpha,m}) = \alpha$.

Critical points, continued. Suppose that a random variable $Z$ has the t-distribution with $m$ degrees of freedom. Since the t-distribution is symmetric, the critical point $t_{\alpha,m}$ can be equivalently given by $P(Z < -t_{\alpha,m}) = \alpha$. Together we can obtain the expression
$$ P(-t_{\alpha/2,m} \le Z \le t_{\alpha/2,m}) = 1 - P(Z < -t_{\alpha/2,m}) - P(Z > t_{\alpha/2,m}) = 1 - \alpha. $$

Extrema. Let $X_1, \ldots, X_n$ be independent and identically distributed random variables (iid random variables) with the common cdf $F(x)$ and pdf $f(x)$. Then we can define the random variable $V = \min(X_1, \ldots, X_n)$, the minimum of $X_1, \ldots, X_n$. To find the distribution of $V$, consider the survival function $P(V > x)$ of $V$ and calculate as follows.
$$ P(V > x) = P(X_1 > x, \ldots, X_n > x) = P(X_1 > x) \cdots P(X_n > x) = [1 - F(x)]^n. $$
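Critical points are usually read from tables, but they can also be recovered numerically from the pdf alone. The sketch below builds a midpoint-rule cdf and bisects for the $c$ with $P(X > c) = \alpha$; it reproduces the tabled value $\chi^2_{0.05,30} \approx 43.77$. The bracketing interval and step counts are arbitrary numerical choices, not part of the theory.

```python
import math

def chi2_pdf(x, n):
    # chi-square pdf with n df: a gamma(n/2, 1/2) density, as in the text.
    return x ** (n / 2 - 1) * math.exp(-x / 2) / (2 ** (n / 2) * math.gamma(n / 2))

def critical_point(pdf, alpha, lo, hi, steps=4000):
    """Bisection for the level-alpha critical point c with P(X > c) = alpha,
    using a midpoint-rule cdf on [lo, x]; a numerical sketch, not a library
    routine."""
    def cdf(x):
        h = (x - lo) / steps
        return h * sum(pdf(lo + (i + 0.5) * h) for i in range(steps))
    a, b = lo, hi
    for _ in range(40):
        mid = (a + b) / 2
        if cdf(mid) < 1 - alpha:
            a = mid
        else:
            b = mid
    return (a + b) / 2

# Tabled value: chi^2_{0.05,30} is about 43.77.
c = critical_point(lambda x: chi2_pdf(x, 30), 0.05, 0.0, 200.0)
assert abs(c - 43.77) < 0.05
```

The same `critical_point` helper works for the t density once its pdf is supplied, which is how the critical points in the exercises below could be cross-checked without tables.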

Extrema, continued. Since $F_V(x) = 1 - P(V > x)$, we obtain the cdf $F_V(x)$ and the pdf $f_V(x)$ of $V$:
$$ F_V(x) = 1 - [1 - F(x)]^n \quad \text{and} \quad f_V(x) = \frac{d}{dx} F_V(x) = n f(x) [1 - F(x)]^{n-1}. $$
In particular, if $X_1, \ldots, X_n$ are independent exponential random variables with the common parameter $\lambda$, then the minimum $\min(X_1, \ldots, X_n)$ has the density $(n\lambda) e^{-(n\lambda)x}$, which is again an exponential density function, with parameter $n\lambda$.

Order statistics. Let $X_1, \ldots, X_n$ be iid random variables with the common cdf $F(x)$ and pdf $f(x)$. When we sort $X_1, \ldots, X_n$ as $X_{(1)} < X_{(2)} < \cdots < X_{(n)}$, the random variable $X_{(k)}$ is called the k-th order statistic.

Theorem 4. The pdf of the k-th order statistic $X_{(k)}$ is given by
$$ f_k(x) = \frac{n!}{(k-1)!\,(n-k)!}\, f(x)\, F^{k-1}(x)\, [1 - F(x)]^{n-k}. \tag{1.1} $$

Proof of theorem: Infinitesimal probability. If $f$ is continuous at $x$ and $\delta > 0$ is small, then the probability that a random variable $X$ falls into the small interval $[x, x + \delta]$ is approximated by $\delta f(x)$. In differential notation, we can write $P(x \le X \le x + dx) = f(x)\,dx$. Now consider the k-th order statistic $X_{(k)}$ and the event that $x \le X_{(k)} \le x + dx$. This event occurs when $(k-1)$ of the $X_i$'s are less than $x$, one falls in $[x, x + dx]$, and $(n-k)$ of the $X_i$'s are greater than $x + dx$. Since there are $\binom{n}{k-1,\,1,\,n-k}$ arrangements of the $X_i$'s, we can obtain the probability of this event as
$$ f_k(x)\,dx = \frac{n!}{(k-1)!\,1!\,(n-k)!}\, (F(x))^{k-1} (f(x)\,dx) [1 - F(x)]^{n-k}, $$
which implies (1.1).

Beta distribution. When $X_1, \ldots, X_n$ are iid uniform random variables on $[0, 1]$, the pdf of the k-th order statistic $X_{(k)}$ becomes
$$ f_k(x) = \frac{n!}{(k-1)!\,(n-k)!}\, x^{k-1} (1 - x)^{n-k}, \quad 0 \le x \le 1, $$
which is known as the beta distribution with parameters $\alpha = k$ and $\beta = n - k + 1$.
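One consequence of the beta form is that $E[X_{(k)}] = k/(n+1)$ for uniform order statistics (the mean of a Beta$(k, n-k+1)$ variable). The seeded sketch below checks this for all $k$ with $n = 5$; the sample size and $n$ are arbitrary choices.

```python
import random

random.seed(1)  # seeded so the check is reproducible
n, trials = 5, 100_000
sums = [0.0] * n
for _ in range(trials):
    xs = sorted(random.random() for _ in range(n))  # order statistics of U(0,1)
    for k in range(n):
        sums[k] += xs[k]

# E[X_(k)] = k / (n + 1), the mean of Beta(k, n - k + 1).
for k in range(n):
    assert abs(sums[k] / trials - (k + 1) / (n + 1)) < 0.01
```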

Exercises

Problem 1. Consider a sample $X_1, \ldots, X_n$ of normally distributed random variables with variance $\sigma^2 = 5$. Suppose that $n = 31$.
(a) What is the value of $c$ for which $P(S^2 \le c) = 0.90$?
(b) What is the value of $c$ for which $P(S^2 \le c) = 0.95$?

Problem 2. Consider a sample $X_1, \ldots, X_n$ of normally distributed random variables with mean $\mu$. Suppose that $n = 16$.
(a) What is the value of $c$ for which $P(-c \le 4(\bar{X} - \mu)/S \le c) = 0.95$?
(b) What is the value of $c$ for which $P(-c \le 4(\bar{X} - \mu)/S \le c) = 0.99$?

Problem 3. Suppose that $X_1, \ldots, X_{10}$ are iid normal random variables with parameters $\mu = 0$ and $\sigma^2 > 0$.
(a) Let $Z = X_1 + X_2 + X_3 + X_4$. Then what is the distribution of $Z$?
(b) Let $W = X_5^2 + X_6^2 + X_7^2 + X_8^2 + X_9^2 + X_{10}^2$. Then what is the distribution of $W/\sigma^2$?
(c) Describe the distribution of the random variable $\dfrac{Z/2}{\sqrt{W/6}}$.
(d) Find the value $c$ for which $P\left(\dfrac{Z}{\sqrt{W}} \le c\right) = 0.95$.

Problem 4. Suppose that a system's components are connected in series and have lifetimes that are independent exponential random variables $X_1, \ldots, X_n$ with respective parameters $\lambda_1, \ldots, \lambda_n$.
(a) Let $V$ be the lifetime of the system. Show that $V = \min(X_1, \ldots, X_n)$.
(b) Show that the random variable $V$ is exponentially distributed with parameter $\sum_{i=1}^{n} \lambda_i$.

Problem 5. Each component $A_i$ of the following system has a lifetime that is an iid exponential random variable $X_i$ with parameter $\lambda$.

A1  A3  A5
A2  A4  A6

(a) Let $V$ be the lifetime of the system. Show that
$$ V = \max\big(\min(X_1, X_2),\ \min(X_3, X_4),\ \min(X_5, X_6)\big). $$

(b) Find the cdf and the pdf of the random variable $V$.

Problem 6. A card contains $n$ chips and has an error-correcting mechanism such that the card still functions if a single chip fails but does not function if two or more chips fail. If each chip has an independent and exponentially distributed lifetime with parameter $\lambda$, find the pdf of the card's lifetime.

Problem 7. Suppose that a queue has $n$ servers and that the length of time to complete a job is an exponential random variable. If a job is at the top of the queue and will be handled by the next available server, what is the distribution of the waiting time until service? What is the distribution of the waiting time until service of the next job in the queue?

Optional problems

Joint distribution of order statistics. Let $U$ and $V$ be iid uniform random variables on $[0, 1]$, and let $X = \min(U, V)$ and $Y = \max(U, V)$. We have discussed that the probability $P(x \le X \le x + dx)$ is approximated by $f(x)\,dx$. Similarly, we can find the following formula in the case of a joint distribution:
$$ P(x \le X \le x + dx,\ y \le Y \le y + dy) = f(x, y)\,dx\,dy. $$
In particular, we can obtain
$$ P(x \le U \le x + dx,\ y \le V \le y + dy) = dx\,dy. $$

Problem 8. By showing that
$$ P(x \le X \le x + dx,\ y \le Y \le y + dy) = P(x \le U \le x + dx,\ y \le V \le y + dy) + P(x \le V \le x + dx,\ y \le U \le y + dy), $$
justify that the joint density function $f(x, y)$ of $X$ and $Y$ is given by
$$ f(x, y) = \begin{cases} 2 & \text{if } 0 \le x \le y \le 1; \\ 0 & \text{otherwise.} \end{cases} $$

Problem 9. Continue Problem 8, and answer the following questions.
(a) Find the conditional densities $f_{X|Y}(x \mid y)$ and $f_{Y|X}(y \mid x)$.
(b) Find the conditional expectations $E(X \mid Y = y)$ and $E(Y \mid X = x)$.

Problem 10. Let
$$ f(x, y) = \begin{cases} \binom{n}{x} y^x (1 - y)^{n - x} & \text{if } x = 0, 1, \ldots, n \text{ and } 0 \le y \le 1; \\ 0 & \text{otherwise.} \end{cases} $$
(a) Find the marginal density $f_Y(y) = \sum_{x=0}^{n} f(x, y)$ for $0 \le y \le 1$, and the conditional frequency function $f_{X|Y}(x \mid \theta)$ given $Y = \theta$. Then identify the distribution for $f_{X|Y}(x \mid \theta)$.
(b) Find the marginal frequency function $f_X(x) = \int_0^1 f(x, y)\,dy$.
(c) Given $X = k$, find the conditional density $f_{Y|X}(y \mid k)$, and identify the distribution for $f_{Y|X}(y \mid k)$.

In Bayesian statistics (later in this semester) $f_Y(y)$ is called a prior distribution of the parameter $\theta$ of the binomial distribution $(n, \theta)$, and $f_{Y|X}(y \mid k)$ is a posterior distribution of the parameter $\theta$ provided that we have observed $X = k$ from the binomial distribution.
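Before deriving $f(x, y)$ analytically in Problem 8, a seeded simulation can make the claimed answer plausible: if $f(x, y) = 2$ on the triangle $0 \le x \le y \le 1$, then $E[X] = 1/3$ and $E[Y] = 2/3$, which should match the empirical means of $\min(U, V)$ and $\max(U, V)$.

```python
import random
import statistics

random.seed(6)  # seeded so the check is reproducible
trials = 200_000
xs, ys = [], []
for _ in range(trials):
    u, v = random.random(), random.random()
    xs.append(min(u, v))   # X = min(U, V)
    ys.append(max(u, v))   # Y = max(U, V)

# Under f(x, y) = 2 on the triangle: E[X] = 1/3 and E[Y] = 2/3.
assert abs(statistics.fmean(xs) - 1 / 3) < 0.005
assert abs(statistics.fmean(ys) - 2 / 3) < 0.005
```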

Answers to exercises

Problem 1. Observe that $30 S^2/5 = 6 S^2$ has a $\chi^2$-distribution with 30 degrees of freedom.
(a) $P(S^2 \le c) = P(6S^2 \le 6c) = 0.90$ implies $6c = \chi^2_{0.10,30} \approx 40.26$. Thus, $c \approx 6.7$.
(b) $P(S^2 \le c) = P(6S^2 \le 6c) = 0.95$ implies $6c = \chi^2_{0.05,30} \approx 43.77$. Thus, $c \approx 7.3$.

Problem 2. Observe that
$$ \frac{4(\bar{X} - \mu)}{S} = \frac{\bar{X} - \mu}{S/\sqrt{16}} $$
has a t-distribution with 15 degrees of freedom.
(a) $c = t_{0.025,15}$.
(b) $c = t_{0.005,15}$.

Problem 3. (a) $N(0, 4\sigma^2)$.
(b) Since $W/\sigma^2 = (X_5/\sigma)^2 + \cdots + (X_{10}/\sigma)^2$ and each $X_i/\sigma$ has the standard normal distribution, $W/\sigma^2$ has the chi-square distribution with 6 degrees of freedom.
(c) We can write $\dfrac{Z/2}{\sqrt{W/6}} = \dfrac{Z/(2\sigma)}{\sqrt{(W/\sigma^2)/6}}$. Since $Z/(2\sigma)$ is independent of $W/\sigma^2$ and has the standard normal distribution, $\dfrac{Z/2}{\sqrt{W/6}}$ has the t-distribution with 6 degrees of freedom.
(d) Since $P\left(\dfrac{Z}{\sqrt{W}} \le c\right) = P\left(\dfrac{Z/2}{\sqrt{W/6}} \le \dfrac{\sqrt{6}}{2}\, c\right)$, the result of (c) implies that $\dfrac{\sqrt{6}}{2}\, c = t_{0.05,6}$. Thus, $c = \dfrac{2\, t_{0.05,6}}{\sqrt{6}}$.

Problem 4. (b) $P(V > t) = P(X_1 > t) \cdots P(X_n > t) = e^{-\lambda_1 t} \cdots e^{-\lambda_n t} = \exp\left(-\left(\sum_{i=1}^{n} \lambda_i\right) t\right)$ is the survival function of $V$. Therefore, $V$ is an exponential random variable with parameter $\sum_{i=1}^{n} \lambda_i$.

Problem 5. (b) Let $Y_1 = \min(X_1, X_2)$, $Y_2 = \min(X_3, X_4)$, and $Y_3 = \min(X_5, X_6)$. Then $Y_1$, $Y_2$ and $Y_3$ are iid exponential random variables with parameter $2\lambda$. Since $V = Y_{(3)}$, we obtain
$$ f_V(t) = 6\lambda\, e^{-2\lambda t} (1 - e^{-2\lambda t})^2, \quad t > 0. $$
By integrating it, we can find the cdf $F_V(t) = (1 - e^{-2\lambda t})^3$.

Problem 6. Let $X_i$ denote the lifetime of the i-th chip, and let $V$ denote the lifetime of the card. Then the $X_i$'s are iid exponential random variables with parameter $\lambda$, and $V = X_{(2)}$. Thus, we obtain
$$ f_V(t) = n(n-1)\lambda \left[e^{-\lambda(n-1)t} - e^{-\lambda n t}\right], \quad t > 0. $$

Problem 7. Each of the $n$ servers completes a job independently in an exponentially distributed time with parameter $\lambda$. Then it is known that the sojourn time between departures is exponential with parameter $n\lambda$, and that the sojourn times are independent of each other. Therefore, (i) the distribution of the waiting time at the top of the queue is exponential with parameter $n\lambda$, and (ii) the distribution of the waiting time for the second job in the queue is given by the sum of two iid exponential random variables, that is, the gamma distribution with parameter $(2, n\lambda)$.
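The answer to Problem 5(b) can be cross-checked by a seeded simulation (with $\lambda = 1$ and an arbitrary test point $t_0 = 0.7$): the empirical cdf of the system lifetime should agree with $(1 - e^{-2\lambda t})^3$.

```python
import math
import random

random.seed(2)  # seeded so the check is reproducible
lam, trials, t0 = 1.0, 200_000, 0.7
count = 0
for _ in range(trials):
    # Three series pairs in parallel: V = max of three pairwise minima.
    pair_mins = [min(random.expovariate(lam), random.expovariate(lam))
                 for _ in range(3)]
    if max(pair_mins) <= t0:
        count += 1

empirical = count / trials
theoretical = (1 - math.exp(-2 * lam * t0)) ** 3  # claimed cdf at t0
assert abs(empirical - theoretical) < 0.01
```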


Random Variables and Their Distributions Chapter 3 Random Variables and Their Distributions A random variable (r.v.) is a function that assigns one and only one numerical value to each simple event in an experiment. We will denote r.vs by capital

More information

Continuous Random Variables and Continuous Distributions

Continuous Random Variables and Continuous Distributions Continuous Random Variables and Continuous Distributions Continuous Random Variables and Continuous Distributions Expectation & Variance of Continuous Random Variables ( 5.2) The Uniform Random Variable

More information

Things to remember when learning probability distributions:

Things to remember when learning probability distributions: SPECIAL DISTRIBUTIONS Some distributions are special because they are useful They include: Poisson, exponential, Normal (Gaussian), Gamma, geometric, negative binomial, Binomial and hypergeometric distributions

More information

E[X n ]= dn dt n M X(t). ). What is the mgf? Solution. Found this the other day in the Kernel matching exercise: 1 M X (t) =

E[X n ]= dn dt n M X(t). ). What is the mgf? Solution. Found this the other day in the Kernel matching exercise: 1 M X (t) = Chapter 7 Generating functions Definition 7.. Let X be a random variable. The moment generating function is given by M X (t) =E[e tx ], provided that the expectation exists for t in some neighborhood of

More information

4. Distributions of Functions of Random Variables

4. Distributions of Functions of Random Variables 4. Distributions of Functions of Random Variables Setup: Consider as given the joint distribution of X 1,..., X n (i.e. consider as given f X1,...,X n and F X1,...,X n ) Consider k functions g 1 : R n

More information

Lecture 5: Moment generating functions

Lecture 5: Moment generating functions Lecture 5: Moment generating functions Definition 2.3.6. The moment generating function (mgf) of a random variable X is { x e tx f M X (t) = E(e tx X (x) if X has a pmf ) = etx f X (x)dx if X has a pdf

More information

4. CONTINUOUS RANDOM VARIABLES

4. CONTINUOUS RANDOM VARIABLES IA Probability Lent Term 4 CONTINUOUS RANDOM VARIABLES 4 Introduction Up to now we have restricted consideration to sample spaces Ω which are finite, or countable; we will now relax that assumption We

More information

SOLUTIONS TO MATH68181 EXTREME VALUES AND FINANCIAL RISK EXAM

SOLUTIONS TO MATH68181 EXTREME VALUES AND FINANCIAL RISK EXAM SOLUTIONS TO MATH68181 EXTREME VALUES AND FINANCIAL RISK EXAM Solutions to Question A1 a) The marginal cdfs of F X,Y (x, y) = [1 + exp( x) + exp( y) + (1 α) exp( x y)] 1 are F X (x) = F X,Y (x, ) = [1

More information

Stat 5101 Notes: Brand Name Distributions

Stat 5101 Notes: Brand Name Distributions Stat 5101 Notes: Brand Name Distributions Charles J. Geyer February 14, 2003 1 Discrete Uniform Distribution DiscreteUniform(n). Discrete. Rationale Equally likely outcomes. The interval 1, 2,..., n of

More information

ACM 116: Lectures 3 4

ACM 116: Lectures 3 4 1 ACM 116: Lectures 3 4 Joint distributions The multivariate normal distribution Conditional distributions Independent random variables Conditional distributions and Monte Carlo: Rejection sampling Variance

More information

Lecture 11. Multivariate Normal theory

Lecture 11. Multivariate Normal theory 10. Lecture 11. Multivariate Normal theory Lecture 11. Multivariate Normal theory 1 (1 1) 11. Multivariate Normal theory 11.1. Properties of means and covariances of vectors Properties of means and covariances

More information

STA2603/205/1/2014 /2014. ry II. Tutorial letter 205/1/

STA2603/205/1/2014 /2014. ry II. Tutorial letter 205/1/ STA263/25//24 Tutorial letter 25// /24 Distribution Theor ry II STA263 Semester Department of Statistics CONTENTS: Examination preparation tutorial letterr Solutions to Assignment 6 2 Dear Student, This

More information

Statistics for scientists and engineers

Statistics for scientists and engineers Statistics for scientists and engineers February 0, 006 Contents Introduction. Motivation - why study statistics?................................... Examples..................................................3

More information

Probability Distributions Columns (a) through (d)

Probability Distributions Columns (a) through (d) Discrete Probability Distributions Columns (a) through (d) Probability Mass Distribution Description Notes Notation or Density Function --------------------(PMF or PDF)-------------------- (a) (b) (c)

More information

Basics of Stochastic Modeling: Part II

Basics of Stochastic Modeling: Part II Basics of Stochastic Modeling: Part II Continuous Random Variables 1 Sandip Chakraborty Department of Computer Science and Engineering, INDIAN INSTITUTE OF TECHNOLOGY KHARAGPUR August 10, 2016 1 Reference

More information

1.12 Multivariate Random Variables

1.12 Multivariate Random Variables 112 MULTIVARIATE RANDOM VARIABLES 59 112 Multivariate Random Variables We will be using matrix notation to denote multivariate rvs and their distributions Denote by X (X 1,,X n ) T an n-dimensional random

More information

Northwestern University Department of Electrical Engineering and Computer Science

Northwestern University Department of Electrical Engineering and Computer Science Northwestern University Department of Electrical Engineering and Computer Science EECS 454: Modeling and Analysis of Communication Networks Spring 2008 Probability Review As discussed in Lecture 1, probability

More information

Chapter 2: Fundamentals of Statistics Lecture 15: Models and statistics

Chapter 2: Fundamentals of Statistics Lecture 15: Models and statistics Chapter 2: Fundamentals of Statistics Lecture 15: Models and statistics Data from one or a series of random experiments are collected. Planning experiments and collecting data (not discussed here). Analysis:

More information

Chapter 2 Continuous Distributions

Chapter 2 Continuous Distributions Chapter Continuous Distributions Continuous random variables For a continuous random variable X the probability distribution is described by the probability density function f(x), which has the following

More information

Order Statistics and Distributions

Order Statistics and Distributions Order Statistics and Distributions 1 Some Preliminary Comments and Ideas In this section we consider a random sample X 1, X 2,..., X n common continuous distribution function F and probability density

More information

MATH2715: Statistical Methods

MATH2715: Statistical Methods MATH2715: Statistical Methods Exercises IV (based on lectures 7-8, work week 5, hand in lecture Mon 30 Oct) ALL questions count towards the continuous assessment for this module. Q1. If a random variable

More information

MATH2715: Statistical Methods

MATH2715: Statistical Methods MATH2715: Statistical Methods Exercises V (based on lectures 9-10, work week 6, hand in lecture Mon 7 Nov) ALL questions count towards the continuous assessment for this module. Q1. If X gamma(α,λ), write

More information

1. Point Estimators, Review

1. Point Estimators, Review AMS571 Prof. Wei Zhu 1. Point Estimators, Review Example 1. Let be a random sample from. Please find a good point estimator for Solutions. There are the typical estimators for and. Both are unbiased estimators.

More information

Math Spring Practice for the Second Exam.

Math Spring Practice for the Second Exam. Math 4 - Spring 27 - Practice for the Second Exam.. Let X be a random variable and let F X be the distribution function of X: t < t 2 t < 4 F X (t) : + t t < 2 2 2 2 t < 4 t. Find P(X ), P(X ), P(X 2),

More information

Continuous random variables

Continuous random variables Continuous random variables Continuous r.v. s take an uncountably infinite number of possible values. Examples: Heights of people Weights of apples Diameters of bolts Life lengths of light-bulbs We cannot

More information

Exponential Distribution and Poisson Process

Exponential Distribution and Poisson Process Exponential Distribution and Poisson Process Stochastic Processes - Lecture Notes Fatih Cavdur to accompany Introduction to Probability Models by Sheldon M. Ross Fall 215 Outline Introduction Exponential

More information

Mathematical Statistics

Mathematical Statistics Mathematical Statistics Chapter Three. Point Estimation 3.4 Uniformly Minimum Variance Unbiased Estimator(UMVUE) Criteria for Best Estimators MSE Criterion Let F = {p(x; θ) : θ Θ} be a parametric distribution

More information

1 Introduction. P (n = 1 red ball drawn) =

1 Introduction. P (n = 1 red ball drawn) = Introduction Exercises and outline solutions. Y has a pack of 4 cards (Ace and Queen of clubs, Ace and Queen of Hearts) from which he deals a random of selection 2 to player X. What is the probability

More information

MATH c UNIVERSITY OF LEEDS Examination for the Module MATH2715 (January 2015) STATISTICAL METHODS. Time allowed: 2 hours

MATH c UNIVERSITY OF LEEDS Examination for the Module MATH2715 (January 2015) STATISTICAL METHODS. Time allowed: 2 hours MATH2750 This question paper consists of 8 printed pages, each of which is identified by the reference MATH275. All calculators must carry an approval sticker issued by the School of Mathematics. c UNIVERSITY

More information

System Simulation Part II: Mathematical and Statistical Models Chapter 5: Statistical Models

System Simulation Part II: Mathematical and Statistical Models Chapter 5: Statistical Models System Simulation Part II: Mathematical and Statistical Models Chapter 5: Statistical Models Fatih Cavdur fatihcavdur@uludag.edu.tr March 20, 2012 Introduction Introduction The world of the model-builder

More information

THE QUEEN S UNIVERSITY OF BELFAST

THE QUEEN S UNIVERSITY OF BELFAST THE QUEEN S UNIVERSITY OF BELFAST 0SOR20 Level 2 Examination Statistics and Operational Research 20 Probability and Distribution Theory Wednesday 4 August 2002 2.30 pm 5.30 pm Examiners { Professor R M

More information

Probability Theory and Statistics. Peter Jochumzen

Probability Theory and Statistics. Peter Jochumzen Probability Theory and Statistics Peter Jochumzen April 18, 2016 Contents 1 Probability Theory And Statistics 3 1.1 Experiment, Outcome and Event................................ 3 1.2 Probability............................................

More information

Random Vectors and Multivariate Normal Distributions

Random Vectors and Multivariate Normal Distributions Chapter 3 Random Vectors and Multivariate Normal Distributions 3.1 Random vectors Definition 3.1.1. Random vector. Random vectors are vectors of random 75 variables. For instance, X = X 1 X 2., where each

More information

Random variables and transform methods

Random variables and transform methods Chapter Random variables and transform methods. Discrete random variables Suppose X is a random variable whose range is {,,..., } and set p k = P (X = k) for k =,,..., so that its mean and variance are

More information

Probability Density Functions

Probability Density Functions Probability Density Functions Probability Density Functions Definition Let X be a continuous rv. Then a probability distribution or probability density function (pdf) of X is a function f (x) such that

More information

Statistics, Data Analysis, and Simulation SS 2015

Statistics, Data Analysis, and Simulation SS 2015 Statistics, Data Analysis, and Simulation SS 2015 08.128.730 Statistik, Datenanalyse und Simulation Dr. Michael O. Distler Mainz, 27. April 2015 Dr. Michael O. Distler

More information

1 Probability and Random Variables

1 Probability and Random Variables 1 Probability and Random Variables The models that you have seen thus far are deterministic models. For any time t, there is a unique solution X(t). On the other hand, stochastic models will result in

More information

This does not cover everything on the final. Look at the posted practice problems for other topics.

This does not cover everything on the final. Look at the posted practice problems for other topics. Class 7: Review Problems for Final Exam 8.5 Spring 7 This does not cover everything on the final. Look at the posted practice problems for other topics. To save time in class: set up, but do not carry

More information

t x 1 e t dt, and simplify the answer when possible (for example, when r is a positive even number). In particular, confirm that EX 4 = 3.

t x 1 e t dt, and simplify the answer when possible (for example, when r is a positive even number). In particular, confirm that EX 4 = 3. Mathematical Statistics: Homewor problems General guideline. While woring outside the classroom, use any help you want, including people, computer algebra systems, Internet, and solution manuals, but mae

More information

STAT 3610: Review of Probability Distributions

STAT 3610: Review of Probability Distributions STAT 3610: Review of Probability Distributions Mark Carpenter Professor of Statistics Department of Mathematics and Statistics August 25, 2015 Support of a Random Variable Definition The support of a random

More information

Some Special Distributions (Hogg Chapter Three)

Some Special Distributions (Hogg Chapter Three) Some Special Distributions Hogg Chapter Three STAT 45-: Mathematical Statistics I Fall Semester 5 Contents Binomial & Related Distributions. The Binomial Distribution............... Some advice on calculating

More information

STAT 512 sp 2018 Summary Sheet

STAT 512 sp 2018 Summary Sheet STAT 5 sp 08 Summary Sheet Karl B. Gregory Spring 08. Transformations of a random variable Let X be a rv with support X and let g be a function mapping X to Y with inverse mapping g (A = {x X : g(x A}

More information

STAT/MATH 395 A - PROBABILITY II UW Winter Quarter Moment functions. x r p X (x) (1) E[X r ] = x r f X (x) dx (2) (x E[X]) r p X (x) (3)

STAT/MATH 395 A - PROBABILITY II UW Winter Quarter Moment functions. x r p X (x) (1) E[X r ] = x r f X (x) dx (2) (x E[X]) r p X (x) (3) STAT/MATH 395 A - PROBABILITY II UW Winter Quarter 07 Néhémy Lim Moment functions Moments of a random variable Definition.. Let X be a rrv on probability space (Ω, A, P). For a given r N, E[X r ], if it

More information

CHAPTER 6 SOME CONTINUOUS PROBABILITY DISTRIBUTIONS. 6.2 Normal Distribution. 6.1 Continuous Uniform Distribution

CHAPTER 6 SOME CONTINUOUS PROBABILITY DISTRIBUTIONS. 6.2 Normal Distribution. 6.1 Continuous Uniform Distribution CHAPTER 6 SOME CONTINUOUS PROBABILITY DISTRIBUTIONS Recall that a continuous random variable X is a random variable that takes all values in an interval or a set of intervals. The distribution of a continuous

More information

Fundamentals of Statistics

Fundamentals of Statistics Chapter 2 Fundamentals of Statistics This chapter discusses some fundamental concepts of mathematical statistics. These concepts are essential for the material in later chapters. 2.1 Populations, Samples,

More information

Continuous random variables

Continuous random variables Continuous random variables Can take on an uncountably infinite number of values Any value within an interval over which the variable is definied has some probability of occuring This is different from

More information

Probability distributions. Probability Distribution Functions. Probability distributions (contd.) Binomial distribution

Probability distributions. Probability Distribution Functions. Probability distributions (contd.) Binomial distribution Probability distributions Probability Distribution Functions G. Jogesh Babu Department of Statistics Penn State University September 27, 2011 http://en.wikipedia.org/wiki/probability_distribution We discuss

More information

1 Review of di erential calculus

1 Review of di erential calculus Review of di erential calculus This chapter presents the main elements of di erential calculus needed in probability theory. Often, students taking a course on probability theory have problems with concepts

More information

BIOS 2083 Linear Models Abdus S. Wahed. Chapter 2 84

BIOS 2083 Linear Models Abdus S. Wahed. Chapter 2 84 Chapter 2 84 Chapter 3 Random Vectors and Multivariate Normal Distributions 3.1 Random vectors Definition 3.1.1. Random vector. Random vectors are vectors of random variables. For instance, X = X 1 X 2.

More information

Probability Distributions

Probability Distributions Lecture : Background in Probability Theory Probability Distributions The probability mass function (pmf) or probability density functions (pdf), mean, µ, variance, σ 2, and moment generating function (mgf)

More information

Actuarial Science Exam 1/P

Actuarial Science Exam 1/P Actuarial Science Exam /P Ville A. Satopää December 5, 2009 Contents Review of Algebra and Calculus 2 2 Basic Probability Concepts 3 3 Conditional Probability and Independence 4 4 Combinatorial Principles,

More information

Let X be a continuous random variable, < X < f(x) is the so called probability density function (pdf) if

Let X be a continuous random variable, < X < f(x) is the so called probability density function (pdf) if University of California, Los Angeles Department of Statistics Statistics 1A Instructor: Nicolas Christou Continuous probability distributions Let X be a continuous random variable, < X < f(x) is the so

More information

Exam P Review Sheet. for a > 0. ln(a) i=0 ari = a. (1 r) 2. (Note that the A i s form a partition)

Exam P Review Sheet. for a > 0. ln(a) i=0 ari = a. (1 r) 2. (Note that the A i s form a partition) Exam P Review Sheet log b (b x ) = x log b (y k ) = k log b (y) log b (y) = ln(y) ln(b) log b (yz) = log b (y) + log b (z) log b (y/z) = log b (y) log b (z) ln(e x ) = x e ln(y) = y for y > 0. d dx ax

More information

Lecture 11: Probability, Order Statistics and Sampling

Lecture 11: Probability, Order Statistics and Sampling 5-75: Graduate Algorithms February, 7 Lecture : Probability, Order tatistics and ampling Lecturer: David Whitmer cribes: Ilai Deutel, C.J. Argue Exponential Distributions Definition.. Given sample space

More information

[Chapter 6. Functions of Random Variables]

[Chapter 6. Functions of Random Variables] [Chapter 6. Functions of Random Variables] 6.1 Introduction 6.2 Finding the probability distribution of a function of random variables 6.3 The method of distribution functions 6.5 The method of Moment-generating

More information

Lecture 25: Review. Statistics 104. April 23, Colin Rundel

Lecture 25: Review. Statistics 104. April 23, Colin Rundel Lecture 25: Review Statistics 104 Colin Rundel April 23, 2012 Joint CDF F (x, y) = P [X x, Y y] = P [(X, Y ) lies south-west of the point (x, y)] Y (x,y) X Statistics 104 (Colin Rundel) Lecture 25 April

More information