Continuous Random Variables

Recall: For discrete random variables, only a finite or countably infinite number of possible values receive positive probability. Often, there is interest in random variables that can take on (at least theoretically) an uncountable number of possible values, e.g., the weight of a randomly selected person in a population, the length of time that a randomly selected light bulb works, or the error in experimentally measuring the speed of light.

Example (Uniform Spinner, LNp.2-14): Ω = (−π, π]. For (a, b] ⊂ Ω, P((a, b]) = (b − a)/(2π). Consider the random variables
  X: Ω → R with X(ω) = ω for ω ∈ Ω, and
  Y: Ω → R with Y(ω) = tan(ω) for ω ∈ Ω.
Then X and Y are random variables that take on an uncountable number of possible values. Notice that
  P_X({X = x}) = 0 for any x ∈ R,
but for a < b,
  P_X({X ∈ (a, b]}) = P((a, b]) = (b − a)/(2π) > 0.
Q: Can we still define a probability mass function for X? If not, what can play a role similar to the pmf for X?

Probability Density Function and Continuous Random Variable
Definition. A function f: R → R is called a probability density function (pdf) if
  1. f(x) ≥ 0 for all x ∈ (−∞, ∞), and
  2. ∫_{−∞}^{∞} f(x) dx = 1.
Definition. A random variable X is called continuous if there exists a pdf f such that for any set B of real numbers
  P_X({X ∈ B}) = ∫_B f(x) dx.
For example, P_X(a ≤ X ≤ b) = ∫_a^b f(x) dx.
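As a quick numerical illustration (not part of the original notes), the two defining conditions and P(a ≤ X ≤ b) = ∫_a^b f(x) dx can be checked for a candidate density. A minimal sketch, assuming Python with scipy is available; the density f(x) = 2x on (0, 1) is an arbitrary example:

```python
# Check that f(x) = 2x on (0, 1) is a pdf and compute P(0.25 <= X <= 0.5).
# Assumes scipy; the density is an arbitrary illustration, not from the notes.
from scipy.integrate import quad

f = lambda x: 2 * x if 0 <= x <= 1 else 0.0

total, _ = quad(f, 0, 1)        # integral of f over its support: should be 1
prob, _ = quad(f, 0.25, 0.5)    # P(0.25 <= X <= 0.5) = 0.5**2 - 0.25**2
print(total, prob)              # 1.0 and 0.1875
```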

Theorem. If f is a pdf, then there must exist a continuous random variable with pdf f.

Some properties:
  P_X({X = x}) = ∫_x^x f(y) dy = 0 for any x ∈ R.
  It does not matter whether the intervals are open or closed, i.e., P(X ∈ [a, b]) = P(X ∈ (a, b]) = P(X ∈ [a, b)) = P(X ∈ (a, b)).
  It is important to remember that the value of a pdf f(x) is NOT a probability itself; it is quite possible for a pdf to take values greater than 1.

Q: How should the value of a pdf f(x) be interpreted? For small dx,
  P(x − dx/2 ≤ X ≤ x + dx/2) = ∫_{x − dx/2}^{x + dx/2} f(y) dy ≈ f(x) dx.
So f(x) is a measure of how likely it is that X will be near x.

We can characterize the distribution of a continuous random variable in terms of its
  1. Probability Density Function (pdf),
  2. Cumulative Distribution Function (cdf),
  3. Moment Generating Function (mgf, Chapter 7).

Relation between the pdf and the cdf
Theorem. If F_X and f_X are the cdf and the pdf of a continuous random variable X, respectively, then
  F_X(x) = P(X ≤ x) = ∫_{−∞}^{x} f_X(y) dy for all −∞ < x < ∞, and
  f_X(x) = (d/dx) F_X(x) at continuity points x of f_X.

Some notes:
  For a < b, P(a < X ≤ b) = F_X(b) − F_X(a) = ∫_a^b f_X(x) dx.
  The cdf for continuous random variables has the same interpretation and properties as in the discrete case. The only difference is in plotting F_X: in the discrete case there are jumps, while in the continuous case F_X is a continuous nondecreasing function.
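The pdf/cdf relation can also be seen numerically: build F by integrating f, then recover f(x) as the derivative of F. A small sketch, again with the illustrative density f(x) = 2x on (0, 1) (scipy assumed; the step size h is arbitrary):

```python
# Build F from f by integration, then recover f(x) as the derivative of F.
# Assumes scipy; the density f(x) = 2x on (0, 1) is an arbitrary illustration.
from scipy.integrate import quad

f = lambda t: 2 * t if 0 <= t <= 1 else 0.0
F = lambda x: quad(f, 0, x)[0]            # F(x) = integral of f up to x (support starts at 0)

x, h = 0.6, 1e-5
deriv = (F(x + h) - F(x - h)) / (2 * h)   # central difference approximates F'(x)
print(deriv, f(x))                        # both close to 1.2
```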

Example (Uniform Distributions). If −∞ < α < β < ∞, then
  f(x) = 1/(β − α) if α < x ≤ β, and f(x) = 0 otherwise,
is a pdf since
  1. f(x) ≥ 0 for all x ∈ R, and
  2. ∫_{−∞}^{∞} f(x) dx = ∫_α^β 1/(β − α) dx = (β − α)/(β − α) = 1.
Its corresponding cdf is
  F(x) = ∫_{−∞}^{x} f(y) dy = 0 if x ≤ α, (x − α)/(β − α) if α < x ≤ β, and 1 if x > β. (exercise)
Conversely, it can easily be checked that F is a cdf and f(x) = F′(x) except at x = α and x = β. (The derivative does not exist at x = α and x = β, but this does not matter.)
An example of a uniform distribution is the r.v. X in the Uniform Spinner example, where α = −π and β = π.

Transformation
Q: For Y = g(X), how do we find the distribution of Y?
Suppose that X is a continuous random variable with cdf F_X and pdf f_X. Consider Y = g(X), where g is a strictly monotone (increasing or decreasing) function. Let R_Y be the range of g.
Note. Any strictly monotone function has an inverse function, i.e., g^{−1} exists on R_Y.
The cdf of Y, denoted by F_Y:
  1. Suppose that g is strictly increasing. For y ∈ R_Y,
     F_Y(y) = P(Y ≤ y) = P(g(X) ≤ y) = P(X ≤ g^{−1}(y)) = F_X(g^{−1}(y)).
  2. Suppose that g is strictly decreasing. For y ∈ R_Y,
     F_Y(y) = P(Y ≤ y) = P(g(X) ≤ y) = P(X ≥ g^{−1}(y)) = 1 − P(X < g^{−1}(y)) = 1 − F_X(g^{−1}(y)).
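The formula F_Y(y) = F_X(g^{−1}(y)) for strictly increasing g is easy to check by simulation. A minimal sketch, assuming numpy; the choices X ~ Uniform(0, 1) and g(x) = x² (strictly increasing on (0, 1), giving F_Y(y) = √y) are illustrative:

```python
# Empirical cdf of Y = X**2 versus the formula F_Y(y) = F_X(sqrt(y)) = sqrt(y).
# Assumes numpy; X ~ Uniform(0, 1) is an arbitrary illustration.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=100_000)   # X ~ Uniform(0, 1)
y = x ** 2                                # Y = g(X), g strictly increasing on (0, 1)

for t in (0.1, 0.5, 0.9):
    print(np.mean(y <= t), np.sqrt(t))    # empirical F_Y(t) vs. theoretical sqrt(t)
```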

Theorem. Let X be a continuous random variable whose cdf F_X possesses a unique inverse F_X^{−1}. Let Z = F_X(X); then Z has a uniform distribution on [0, 1].
Proof. For 0 ≤ z ≤ 1, F_Z(z) = P(F_X(X) ≤ z) = P(X ≤ F_X^{−1}(z)) = F_X(F_X^{−1}(z)) = z.
Theorem. Let U be a uniform random variable on [0, 1] and let F be a cdf which possesses a unique inverse F^{−1}. Let X = F^{−1}(U); then the cdf of X is F.
Proof. F_X(x) = P(F^{−1}(U) ≤ x) = P(U ≤ F(x)) = F_U(F(x)) = F(x).
The two theorems are useful for pseudo-random number generation in computer simulation:
  X is an r.v. with cdf F ⇒ F(X) is an r.v. with distribution Uniform(0, 1);
  X_1, …, X_n: r.v.'s with cdf F ⇒ F(X_1), …, F(X_n): r.v.'s with distribution Uniform(0, 1);
  U_1, …, U_n: r.v.'s with distribution Uniform(0, 1) ⇒ F^{−1}(U_1), …, F^{−1}(U_n): r.v.'s with cdf F.
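This is exactly the inverse-transform method of random number generation. A minimal sketch with the exponential cdf F(x) = 1 − e^{−λx}, whose inverse is F^{−1}(u) = −ln(1 − u)/λ (numpy assumed; λ = 2 and the sample size are arbitrary):

```python
# Generate Exponential(lam) variates from Uniform(0, 1) via X = F^{-1}(U),
# then map back with F to recover Uniform(0, 1).  Assumes numpy.
import numpy as np

lam = 2.0
rng = np.random.default_rng(1)
u = rng.uniform(size=100_000)
x = -np.log(1.0 - u) / lam          # F^{-1}(u) for the exponential cdf

print(x.mean(), 1 / lam)            # sample mean vs. 1/lam = 0.5

z = 1.0 - np.exp(-lam * x)          # F(X): should behave like Uniform(0, 1)
print(z.mean(), z.var())            # close to 1/2 and 1/12
```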

The pdf of Y, denoted by f_Y:
  1. Suppose that g is a differentiable, strictly increasing function. For y ∈ R_Y,
     f_Y(y) = (d/dy) F_X(g^{−1}(y)) = f_X(g^{−1}(y)) · dg^{−1}(y)/dy.
  2. Suppose that g is a differentiable, strictly decreasing function. For y ∈ R_Y,
     f_Y(y) = (d/dy)[1 − F_X(g^{−1}(y))] = −f_X(g^{−1}(y)) · dg^{−1}(y)/dy.
Theorem. Let X be a continuous random variable with pdf f_X. Let Y = g(X), where g is differentiable and strictly monotone. Then the pdf of Y, denoted by f_Y, is
  f_Y(y) = f_X(g^{−1}(y)) · |dg^{−1}(y)/dy|
for y such that y = g(x) for some x, and f_Y(y) = 0 otherwise.
Q: What is the role of dg^{−1}(y)/dy? How should it be interpreted?
Some examples. Given the pdf f_X of a random variable X:
  find the pdf f_Y of Y = aX + b, where a ≠ 0;
  find the pdf f_Y of Y = 1/X;
  find the cdf F_Y and pdf f_Y of Y = X².

Expectation, Mean, and Variance
Definition. If X has a pdf f_X, then the expectation of X is defined by
  E(X) = ∫_{−∞}^{∞} x f_X(x) dx,
provided that the integral converges absolutely.
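The definition can be evaluated numerically for a concrete density. A minimal sketch, assuming numpy/scipy; the illustrative density is again f(x) = 2x on (0, 1), for which E(X) = 2/3, and the draws are generated by the inverse-cdf method from the previous page (F(x) = x², so F^{−1}(u) = √u):

```python
# E(X) for the illustrative density f(x) = 2x on (0, 1), computed two ways:
# by numerical integration of x * f(x), and as the sample mean of draws
# generated by the inverse-cdf method.  Assumes numpy and scipy.
import numpy as np
from scipy.integrate import quad

f = lambda x: 2 * x if 0 <= x <= 1 else 0.0
EX, _ = quad(lambda x: x * f(x), 0, 1)

rng = np.random.default_rng(11)
samples = np.sqrt(rng.uniform(size=500_000))   # F^{-1}(U) = sqrt(U)
print(EX, samples.mean())                      # both close to 2/3
```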

Example (Uniform Distributions). If f_X(x) = 1/(β − α) for α < x ≤ β and 0 otherwise, then
  E(X) = ∫_α^β x/(β − α) dx = (α + β)/2.

Some properties of expectation
  Expectation of a Transformation. If Y = g(X), then
    E(Y) = ∫_{−∞}^{∞} y f_Y(y) dy = ∫_{−∞}^{∞} g(x) f_X(x) dx,
  provided that the integral converges absolutely. Proof: (homework)
  Expectation of a Linear Function. E(aX + b) = a·E(X) + b, since
    ∫_{−∞}^{∞} (ax + b) f_X(x) dx = a ∫_{−∞}^{∞} x f_X(x) dx + b ∫_{−∞}^{∞} f_X(x) dx = a·E(X) + b.

Definition. If X has a pdf f_X, then the expectation of X is also called the mean of X (or of f_X) and denoted by μ_X, so that
  μ_X = E(X) = ∫_{−∞}^{∞} x f_X(x) dx.
The variance of X is defined as
  Var(X) = E[(X − μ_X)²] = ∫_{−∞}^{∞} (x − μ_X)² f_X(x) dx,
and denoted by σ_X². The quantity σ_X is called the standard deviation.

Some properties of mean and variance
  The mean and variance for continuous random variables have the same intuitive interpretation as in the discrete case.
  Var(X) = E(X²) − [E(X)]².
  Variance of a Linear Function. Var(aX + b) = a² Var(X).
Theorem. For a nonnegative continuous random variable X,
  E(X) = ∫_0^∞ [1 − F_X(x)] dx = ∫_0^∞ P(X > x) dx.
Proof.
  E(X) = ∫_0^∞ x f_X(x) dx = ∫_0^∞ (∫_0^x 1 dt) f_X(x) dx = ∫_0^∞ ∫_0^x f_X(x) dt dx
       = ∫_0^∞ ∫_t^∞ f_X(x) dx dt = ∫_0^∞ [1 − F_X(t)] dt.
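The tail formula E(X) = ∫_0^∞ P(X > x) dx in the last theorem is easy to verify numerically. A sketch using X ~ Exponential(λ), where P(X > x) = e^{−λx} (scipy/numpy assumed; λ = 3 is arbitrary):

```python
# E(X) computed from the tail probability P(X > x) = exp(-lam * x).
# Assumes numpy and scipy.
import numpy as np
from scipy.integrate import quad

lam = 3.0
tail = lambda x: np.exp(-lam * x)      # 1 - F(x) for the exponential distribution
mean, _ = quad(tail, 0, np.inf)
print(mean, 1 / lam)                   # both equal 1/3
```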

Example (Uniform Distributions). For X ~ Uniform(α, β),
  E(X²) = ∫_α^β x²/(β − α) dx = (α² + αβ + β²)/3, so Var(X) = E(X²) − [E(X)]² = (β − α)²/12.

Reading: textbook, Sec 5.1, 5.2, 5.3, 5.7

Some Common Continuous Distributions

Uniform Distribution
Summary for X ~ Uniform(α, β):
  Pdf: f(x) = 1/(β − α) if α < x ≤ β, 0 otherwise.
  Cdf: F(x) = 0 if x ≤ α, (x − α)/(β − α) if α < x ≤ β, 1 if x > β.
  Parameters: −∞ < α < β < ∞
  Mean: E(X) = (α + β)/2
  Variance: Var(X) = (β − α)²/12

Exponential Distribution
For λ > 0, the function
  f(x) = λe^{−λx} if x ≥ 0, and 0 if x < 0,
is a pdf since (1) f(x) ≥ 0 for all x ∈ R, and (2)
  ∫_{−∞}^{∞} f(x) dx = ∫_0^∞ λe^{−λx} dx = [−e^{−λx}]_0^∞ = 1.
The distribution of a random variable X with this pdf is called the exponential distribution with parameter λ.
The cdf of an exponential r.v. is F(x) = 0 for x < 0, and for x ≥ 0,
  F(x) = P(X ≤ x) = ∫_0^x λe^{−λy} dy = [−e^{−λy}]_0^x = 1 − e^{−λx}.
Theorem. The mean and variance of an exponential distribution with parameter λ are μ = 1/λ and σ² = 1/λ².
Proof. With the substitution y = λx,
  E(X) = ∫_0^∞ x λe^{−λx} dx = ∫_0^∞ (y/λ)(λe^{−y})(1/λ) dy = (1/λ) ∫_0^∞ y e^{−y} dy = (1/λ)Γ(2) = 1/λ.
  E(X²) = ∫_0^∞ x² λe^{−λx} dx = ∫_0^∞ (y/λ)²(λe^{−y})(1/λ) dy = (1/λ²) ∫_0^∞ y² e^{−y} dy = (1/λ²)Γ(3) = 2/λ².
So Var(X) = E(X²) − [E(X)]² = 2/λ² − 1/λ² = 1/λ².
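A quick Monte Carlo check of μ = 1/λ and σ² = 1/λ² (numpy assumed; note that numpy parameterizes the exponential by its mean 1/λ, and λ = 2 is arbitrary):

```python
# Sample moments of Exponential(lam) versus the theoretical mean and variance.
# Assumes numpy; numpy's exponential takes the scale 1/lam, not the rate lam.
import numpy as np

lam = 2.0
rng = np.random.default_rng(3)
x = rng.exponential(scale=1 / lam, size=1_000_000)

print(x.mean(), 1 / lam)         # ~0.5
print(x.var(), 1 / lam ** 2)     # ~0.25
```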

Some properties
  The exponential distribution is often used to model the length of time until an event occurs.
  The parameter λ is called the rate and is the average number of events that occur per unit time. This gives an intuitive interpretation of E(X) = 1/λ.
Theorem (relationship between the exponential, gamma, and Poisson distributions, Sec. 9.1). Let
  1. T_1, T_2, T_3, … be independent and ~ exponential(λ),
  2. S_k = T_1 + ⋯ + T_k, k = 1, 2, 3, …,
  3. X_i be the number of S_k's that fall in the time interval (t_{i−1}, t_i], i = 1, …, m.
Then (i) X_1, …, X_m are independent, (ii) X_i ~ Poisson(λ(t_i − t_{i−1})), (iii) S_k ~ Gamma(k, λ), and (iv) the reverse statement is also true.
  The rate parameter λ is the same for both the Poisson and the exponential random variables.
  The exponential distribution can be thought of as the continuous analogue of the geometric distribution.
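Part (ii) of the theorem can be seen in a simulation: generate exponential interarrival times, form the partial sums S_k, and count how many fall in (0, t]; the counts should behave like Poisson(λt). A sketch (numpy assumed; λ = 2, t = 3, and 50 gaps per replication are arbitrary but ample choices):

```python
# Counts of exponential arrival times in (0, t] compared with Poisson(lam * t).
# Assumes numpy; all parameter values are illustrative.
import numpy as np

lam, t = 2.0, 3.0
rng = np.random.default_rng(4)

gaps = rng.exponential(1 / lam, size=(50_000, 50))  # 50 gaps/replication: plenty to cover (0, t]
arrival_times = np.cumsum(gaps, axis=1)             # S_1, S_2, ... in each row
counts = (arrival_times <= t).sum(axis=1)           # number of arrivals in (0, t]

print(counts.mean(), lam * t)                       # both close to 6
print(counts.var(), lam * t)                        # Poisson: variance equals the mean
```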

Theorem. The exponential distribution (like the geometric distribution) is memoryless, i.e., for s, t ≥ 0,
  P(X > s + t | X > t) = P(X > s).
This means that the distribution of the waiting time to the next event remains the same regardless of how long we have already been waiting. This only happens when events occur (or not) totally at random, i.e., independently of past history. Notice that it does not mean the two events {X > s+t} and {X > t} are independent.

Summary for X ~ Exponential(λ):
  Pdf: f(x) = λe^{−λx} if x ≥ 0, 0 if x < 0.
  Cdf: F(x) = 1 − e^{−λx} if x ≥ 0, 0 if x < 0.
  Parameters: λ > 0
  Mean: E(X) = 1/λ
  Variance: Var(X) = 1/λ²

Gamma Distribution
Gamma Function
Definition. For α > 0, the gamma function is defined as
  Γ(α) = ∫_0^∞ x^{α−1} e^{−x} dx.
  Γ(1) = 1 and Γ(1/2) = √π. (exercise)
  Γ(α + 1) = αΓ(α).
  Proof. By integration by parts,
    Γ(α + 1) = ∫_0^∞ x^α e^{−x} dx = [−x^α e^{−x}]_0^∞ + ∫_0^∞ αx^{α−1} e^{−x} dx = αΓ(α).
  Γ(α) = (α − 1)! if α is a positive integer.
  Γ(α/2) = √π (α − 1)! / (2^{α−1} ((α−1)/2)!) if α is an odd positive integer.
  The gamma function is a generalization of the factorial function.
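The recursion Γ(α + 1) = αΓ(α) and the special values above can be checked numerically. A minimal sketch (scipy assumed; α = 3.7 is arbitrary):

```python
# Gamma-function identities checked with scipy.special.gamma.
import math
from scipy.special import gamma

a = 3.7
print(gamma(a + 1), a * gamma(a))                    # recursion: the two values agree
print(gamma(1.0), gamma(0.5), math.sqrt(math.pi))    # Gamma(1) = 1, Gamma(1/2) = sqrt(pi)
print(gamma(5), math.factorial(4))                   # Gamma(n) = (n-1)! for integer n
```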

For α, λ > 0, the function
  f(x) = (λ^α/Γ(α)) x^{α−1} e^{−λx} if x ≥ 0, and 0 if x < 0,
is a pdf since (1) f(x) ≥ 0 for all x ∈ R, and (2), substituting y = λx,
  ∫_{−∞}^{∞} f(x) dx = ∫_0^∞ (λ^α/Γ(α)) x^{α−1} e^{−λx} dx = (1/Γ(α)) ∫_0^∞ y^{α−1} e^{−y} dy = Γ(α)/Γ(α) = 1.
The distribution of a random variable X with this pdf is called the gamma distribution with parameters α and λ.
The cdf of the gamma distribution can be expressed in terms of the incomplete gamma function, i.e., F(x) = 0 for x < 0, and for x ≥ 0,
  F(x) = ∫_0^x (λ^α/Γ(α)) y^{α−1} e^{−λy} dy = (1/Γ(α)) ∫_0^{λx} z^{α−1} e^{−z} dz = γ(α, λx)/Γ(α).
Theorem. The mean and variance of a gamma distribution with parameters α and λ are μ = α/λ and σ² = α/λ².
Proof.
  E(X) = ∫_0^∞ x (λ^α/Γ(α)) x^{α−1} e^{−λx} dx = (λ^α/Γ(α)) · (Γ(α+1)/λ^{α+1}) ∫_0^∞ (λ^{α+1}/Γ(α+1)) x^{(α+1)−1} e^{−λx} dx = Γ(α+1)/(λΓ(α)) = α/λ.
  E(X²) = ∫_0^∞ x² (λ^α/Γ(α)) x^{α−1} e^{−λx} dx = (λ^α/Γ(α)) · (Γ(α+2)/λ^{α+2}) ∫_0^∞ (λ^{α+2}/Γ(α+2)) x^{(α+2)−1} e^{−λx} dx = α(α+1)/λ².
So Var(X) = α(α+1)/λ² − (α/λ)² = α/λ².
(exercise) E(X^k) = Γ(α+k)/(λ^k Γ(α)) for 0 < k, and E(1/X^k) = λ^k Γ(α−k)/Γ(α) for 0 < k < α.

Some properties
  The gamma distribution can be used to model the waiting time until a number of random events occurs.
  When α = 1, it is exponential(λ).
  T_1, …, T_n: independent exponential(λ) r.v.'s ⇒ T_1 + ⋯ + T_n ~ Gamma(n, λ).
  The gamma distribution can be thought of as a continuous analogue of the negative binomial distribution.
  A summary:
                                          Discrete Time Version    Continuous Time Version
    number of events                      binomial                 Poisson
    waiting time until 1 event occurs     geometric                exponential
    waiting time until r events occur     negative binomial        gamma
  α is called the shape parameter and λ the scale parameter. (Q: How should α and λ be interpreted from the viewpoint of waiting time?)
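The property T_1 + ⋯ + T_n ~ Gamma(n, λ) noted above, together with the theorem μ = α/λ and σ² = α/λ², can be checked by simulation (numpy assumed; n = 5 and λ = 2 are arbitrary):

```python
# Sum of n independent exponentials: sample moments match Gamma(n, lam).
# Assumes numpy; parameter values are illustrative.
import numpy as np

n, lam = 5, 2.0
rng = np.random.default_rng(6)
s = rng.exponential(1 / lam, size=(200_000, n)).sum(axis=1)   # T_1 + ... + T_n

print(s.mean(), n / lam)          # ~2.5  (alpha / lam with alpha = n)
print(s.var(), n / lam ** 2)      # ~1.25 (alpha / lam**2)
```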

A special case of the gamma distribution occurs when α = n/2 and λ = 1/2 for some positive integer n. This is known as the chi-squared distribution with n degrees of freedom (Chapter 6).

Summary for X ~ Gamma(α, λ):
  Pdf: f(x) = (λ^α/Γ(α)) x^{α−1} e^{−λx} if x ≥ 0, 0 if x < 0.
  Cdf: F(x) = γ(α, λx)/Γ(α).
  Parameters: α, λ > 0
  Mean: E(X) = α/λ
  Variance: Var(X) = α/λ²

Beta Distribution
Beta Function: For α, β > 0,
  B(α, β) = ∫_0^1 x^{α−1}(1 − x)^{β−1} dx = Γ(α)Γ(β)/Γ(α+β).
The function
  f(x) = (Γ(α+β)/(Γ(α)Γ(β))) x^{α−1}(1 − x)^{β−1} if 0 ≤ x ≤ 1, and 0 otherwise,
is a pdf (exercise). The distribution of a random variable X with this pdf is called the beta distribution with parameters α and β.
The cdf of the beta distribution can be expressed in terms of the incomplete beta function, i.e., F(x) = 0 for x < 0, F(x) = 1 for x > 1, and for 0 ≤ x ≤ 1,
  F(x) = B(x; α, β)/B(α, β), where B(x; α, β) = ∫_0^x y^{α−1}(1 − y)^{β−1} dy.
Theorem. The mean and variance of a beta distribution with parameters α and β are
  μ = α/(α+β) and σ² = αβ/[(α+β)²(α+β+1)].
Proof.
  E(X) = ∫_0^1 x (Γ(α+β)/(Γ(α)Γ(β))) x^{α−1}(1−x)^{β−1} dx
       = (Γ(α+β)/(Γ(α)Γ(β))) · (Γ(α+1)Γ(β)/Γ(α+β+1)) ∫_0^1 (Γ(α+β+1)/(Γ(α+1)Γ(β))) x^{(α+1)−1}(1−x)^{β−1} dx
       = α/(α+β).
  E(X²) = ∫_0^1 x² (Γ(α+β)/(Γ(α)Γ(β))) x^{α−1}(1−x)^{β−1} dx
        = (Γ(α+β)/(Γ(α)Γ(β))) · (Γ(α+2)Γ(β)/Γ(α+β+2)) ∫_0^1 (Γ(α+β+2)/(Γ(α+2)Γ(β))) x^{(α+2)−1}(1−x)^{β−1} dx
        = α(α+1)/[(α+β)(α+β+1)].
So Var(X) = E(X²) − [E(X)]² = αβ/[(α+β)²(α+β+1)].
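The beta moments just derived are easy to confirm by simulation (numpy assumed; α = 2, β = 5 are arbitrary):

```python
# Sample mean and variance of Beta(a, b) versus the theoretical formulas.
# Assumes numpy; parameter values are illustrative.
import numpy as np

a, b = 2.0, 5.0
rng = np.random.default_rng(7)
x = rng.beta(a, b, size=1_000_000)

print(x.mean(), a / (a + b))                            # ~0.2857
print(x.var(), a * b / ((a + b)**2 * (a + b + 1)))      # ~0.0255
```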

Some properties
  When α = β = 1, the beta distribution is the same as the Uniform(0, 1).
  Whenever α = β, the beta distribution is symmetric about x = 0.5, i.e., f(0.5 − δ) = f(0.5 + δ). As the common value of α and β increases, the distribution becomes more peaked at x = 0.5 and there is less probability outside of the central portion.
  When β > α, values close to 0 become more likely than those close to 1; when β < α, values close to 1 are more likely than those close to 0. (Q: How does this connect with E(X)?)

Summary for X ~ Beta(α, β):
  Pdf: f(x) = (Γ(α+β)/(Γ(α)Γ(β))) x^{α−1}(1 − x)^{β−1} if 0 ≤ x ≤ 1, 0 otherwise.
  Cdf: F(x) = B(x; α, β)/B(α, β).
  Parameters: α, β > 0
  Mean: E(X) = α/(α+β)
  Variance: Var(X) = αβ/[(α+β)²(α+β+1)]

Normal Distribution
For μ ∈ R and σ > 0, the function
  f(x) = (1/(√(2π) σ)) e^{−(x−μ)²/(2σ²)}, −∞ < x < ∞,
is a pdf since (1) f(x) ≥ 0 for all x ∈ R, and (2)
  ∫_{−∞}^{∞} (1/(√(2π) σ)) e^{−(x−μ)²/(2σ²)} dx = ∫_{−∞}^{∞} (1/√(2π)) e^{−y²/2} dy = 1,
where the last equality follows from a polar-coordinate computation: with I = ∫_{−∞}^{∞} e^{−x²/2} dx,
  I² = ∫_{−∞}^{∞} e^{−x²/2} dx ∫_{−∞}^{∞} e^{−y²/2} dy = ∫∫_{R²} e^{−(x²+y²)/2} dx dy
     = ∫_0^{2π} ∫_0^∞ e^{−r²/2} r dr dθ = 2π ∫_0^∞ r e^{−r²/2} dr = 2π [−e^{−r²/2}]_0^∞ = 2π,
so I = √(2π).
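The polar-coordinate result says ∫ e^{−x²/2} dx = √(2π), so the N(μ, σ²) density integrates to 1 for any μ and σ. A quick numerical check (scipy/numpy assumed; μ = 1, σ = 2 are arbitrary):

```python
# Numerical check that I = sqrt(2*pi) and that the normal pdf integrates to 1.
# Assumes numpy and scipy; mu and sigma are arbitrary illustrations.
import numpy as np
from scipy.integrate import quad

I, _ = quad(lambda x: np.exp(-x**2 / 2), -np.inf, np.inf)
print(I, np.sqrt(2 * np.pi))                      # both ~2.5066

mu, sigma = 1.0, 2.0
pdf = lambda x: np.exp(-(x - mu)**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)
total, _ = quad(pdf, -np.inf, np.inf)
print(total)                                      # 1.0
```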

The distribution of a random variable X with this pdf is called the normal (Gaussian) distribution with parameters μ and σ, denoted by N(μ, σ²).
  The normal pdf is a bell-shaped curve. It is symmetric about the point μ, i.e., f(μ + δ) = f(μ − δ), and falls off at a rate determined by σ.
  The pdf has a maximum at x = μ (this can be shown by differentiation), and the maximum height is 1/(√(2π) σ).
  The cdf of the normal distribution does not have a closed form.
Theorem. The mean and variance of a N(μ, σ²) distribution are μ and σ², respectively. (μ: location parameter; σ²: scale parameter)
Proof. Substituting y = (x − μ)/σ,
  E(X) = ∫_{−∞}^{∞} x (1/(√(2π)σ)) e^{−(x−μ)²/(2σ²)} dx = ∫_{−∞}^{∞} (σy + μ)(1/√(2π)) e^{−y²/2} dy
       = (σ/√(2π)) ∫_{−∞}^{∞} y e^{−y²/2} dy + μ ∫_{−∞}^{∞} (1/√(2π)) e^{−y²/2} dy = 0 + μ·1 = μ.
  E(X²) = ∫_{−∞}^{∞} x² (1/(√(2π)σ)) e^{−(x−μ)²/(2σ²)} dx = ∫_{−∞}^{∞} (σy + μ)² (1/√(2π)) e^{−y²/2} dy
        = σ² ∫_{−∞}^{∞} y² (1/√(2π)) e^{−y²/2} dy + (2σμ/√(2π)) ∫_{−∞}^{∞} y e^{−y²/2} dy + μ² ∫_{−∞}^{∞} (1/√(2π)) e^{−y²/2} dy
        = σ²·1 + 0 + μ²·1 = σ² + μ².
So Var(X) = E(X²) − [E(X)]² = σ².

Some properties
  The normal distribution is one of the most widely used distributions. It can be used to model the distribution of many natural phenomena.
Theorem. Suppose that X ~ N(μ, σ²). The random variable Y = aX + b, where a ≠ 0, is also normally distributed with parameters aμ + b and a²σ², i.e., Y ~ N(aμ + b, a²σ²).
Proof.
  f_Y(y) = f_X((y − b)/a) · 1/|a| = (1/(√(2π) |a|σ)) e^{−[y − (aμ+b)]²/(2a²σ²)}.
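The closure of the normal family under linear maps, Y = aX + b ~ N(aμ + b, a²σ²), can be illustrated by simulation (numpy assumed; the parameter values are arbitrary):

```python
# Moments of Y = a*X + b for X ~ N(mu, sigma^2).  Assumes numpy.
import numpy as np

mu, sigma, a, b = 1.0, 2.0, -3.0, 4.0
rng = np.random.default_rng(8)
y = a * rng.normal(mu, sigma, size=1_000_000) + b

print(y.mean(), a * mu + b)        # ~1.0
print(y.std(), abs(a) * sigma)     # ~6.0
```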

Corollary. If X ~ N(μ, σ²), then Z = (X − μ)/σ is a normal random variable with parameters 0 and 1, i.e., N(0, 1), which is called the standard normal distribution.
  The N(0, 1) distribution is very important since properties of any other normal distribution can be found from those of the standard normal. The cdf of N(0, 1) is usually denoted by Φ.
Theorem. Suppose that X ~ N(μ, σ²). The cdf of X is F_X(x) = Φ((x − μ)/σ).
Proof. F_X(x) = P(X ≤ x) = P((X − μ)/σ ≤ (x − μ)/σ) = F_Z((x − μ)/σ) = Φ((x − μ)/σ).
Example. Suppose that X ~ N(μ, σ²). For −∞ < a < b < ∞,
  P(a < X < b) = P((a − μ)/σ < (X − μ)/σ < (b − μ)/σ) = P((a − μ)/σ < Z < (b − μ)/σ)
              = P(Z < (b − μ)/σ) − P(Z < (a − μ)/σ) = Φ((b − μ)/σ) − Φ((a − μ)/σ).
Table 5.1 (textbook, p. 21) gives values of Φ(z). To read the table:
  1. Find the first value of z, up to the first decimal place, in the left-hand column.
  2. Find the second decimal place across the top row.
  3. The value of Φ(z) is where the row from the first step and the column from the second step intersect.
For values greater than z = 3.49, Φ(z) ≈ 1. For negative values of z, use Φ(−z) = 1 − Φ(z).
The normal distribution plays a central role in the limit theorems of probability (e.g., the CLT, Chapter 8).
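In software, a standard normal cdf routine can be used in place of Table 5.1. A sketch of P(a < X < b) = Φ((b − μ)/σ) − Φ((a − μ)/σ) (scipy assumed; μ = 100, σ = 15, a = 85, b = 130 are arbitrary):

```python
# P(a < X < b) for X ~ N(mu, sigma^2) via the standard normal cdf Phi.
# Assumes scipy; parameter values are illustrative.
from scipy.stats import norm

mu, sigma = 100.0, 15.0
a, b = 85.0, 130.0
p = norm.cdf((b - mu) / sigma) - norm.cdf((a - mu) / sigma)
print(p)     # Phi(2) - Phi(-1) = 0.9772... - 0.1587... ~ 0.8186
```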

Normal Approximation to the Binomial
Recall: the Poisson approximation to the binomial.
Theorem. Suppose that X_n ~ binomial(n, p). Define Z_n = (X_n − np)/√(np(1−p)). Then, as n → ∞, the distribution of Z_n converges to the N(0, 1) distribution, i.e.,
  F_{Z_n}(z) = P(Z_n ≤ z) → Φ(z).
Proof. It is a special case of the CLT in Chapter 8.
  Plot the pmf of X_n ~ binomial(n, p) and superimpose the pdf of Y_n ~ N(μ_n, σ_n²) with μ_n = np and σ_n² = np(1−p). When n is sufficiently large, the normal pdf approximates the binomial pmf, and Z_n corresponds to (Y_n − μ_n)/σ_n.
  The size of n needed to achieve a good approximation depends on the value of p. For p near 0.5, moderate n is enough; for p close to zero or one, much larger n is required.
Continuity Correction
Q: Why do we need a continuity correction?
Ans. The binomial(n, p) is a discrete random variable and we are approximating it with a continuous random variable. For example, suppose X ~ binomial(50, 0.4) and we want to find P(X = 18), which is larger than 0. With the normal pdf, however, P(Y = 18) = 0 since we are using a continuous distribution. Instead, we make a continuity correction and approximate P(X = 18) by P(17.5 < Y < 18.5), whose value can be obtained from Table 5.1. Similarly, a probability such as P(X ≤ 18) is approximated by P(Y ≤ 18.5).
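The continuity correction for the binomial(50, 0.4) example can be computed directly: P(X = 18) is approximated by P(17.5 < Y < 18.5) with Y ~ N(np, np(1 − p)). A sketch comparing the approximation with the exact pmf (scipy assumed):

```python
# Normal approximation with continuity correction versus the exact binomial pmf.
# Assumes scipy.
from scipy.stats import norm, binom

n, p = 50, 0.4
mu = n * p                           # 20
sd = (n * p * (1 - p)) ** 0.5        # sqrt(12)

approx = norm.cdf((18.5 - mu) / sd) - norm.cdf((17.5 - mu) / sd)
exact = binom.pmf(18, n, p)
print(approx, exact)                 # the two values should be close (about 0.1)
```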

Summary for X ~ Normal(μ, σ²):
  Pdf: f(x) = (1/(√(2π) σ)) e^{−(x−μ)²/(2σ²)}, −∞ < x < ∞.
  Cdf: no closed form, but usually denoted by Φ((x − μ)/σ).
  Parameters: μ ∈ R and σ > 0
  Mean: E(X) = μ
  Variance: Var(X) = σ²

Weibull Distribution
For α, β > 0 and ν ∈ R, the function
  f(x) = (β/α)((x − ν)/α)^{β−1} e^{−((x−ν)/α)^β} if x ≥ ν, and 0 if x < ν,
is a pdf since (1) f(x) ≥ 0 for all x ∈ R, and (2), substituting y = ((x − ν)/α)^β,
  ∫_{−∞}^{∞} f(x) dx = ∫_ν^∞ (β/α)((x − ν)/α)^{β−1} e^{−((x−ν)/α)^β} dx = ∫_0^∞ e^{−y} dy = [−e^{−y}]_0^∞ = 1.
The distribution of a random variable X with this pdf is called the Weibull distribution with parameters α, β, and ν.
(exercise) The cdf of the Weibull distribution is
  F(x) = 1 − e^{−((x−ν)/α)^β} if x ≥ ν, and 0 if x < ν.
Theorem. The mean and variance of a Weibull distribution with parameters α, β, and ν are
  μ = αΓ(1 + 1/β) + ν and σ² = α²[Γ(1 + 2/β) − Γ(1 + 1/β)²].
Proof. With the substitution y = ((x − ν)/α)^β, i.e., x = αy^{1/β} + ν,
  E(X) = ∫_ν^∞ x (β/α)((x − ν)/α)^{β−1} e^{−((x−ν)/α)^β} dx = ∫_0^∞ (αy^{1/β} + ν) e^{−y} dy
       = α ∫_0^∞ y^{1/β} e^{−y} dy + ν ∫_0^∞ e^{−y} dy = αΓ(1/β + 1) + ν.
  E(X²) = ∫_ν^∞ x² (β/α)((x − ν)/α)^{β−1} e^{−((x−ν)/α)^β} dx = ∫_0^∞ (αy^{1/β} + ν)² e^{−y} dy
        = α² ∫_0^∞ y^{2/β} e^{−y} dy + 2αν ∫_0^∞ y^{1/β} e^{−y} dy + ν² ∫_0^∞ e^{−y} dy
        = α²Γ(2/β + 1) + 2ανΓ(1/β + 1) + ν².
So Var(X) = E(X²) − [E(X)]² = α²[Γ(1 + 2/β) − Γ(1 + 1/β)²].
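The Weibull mean and variance formulas can be checked against simulation. A sketch (numpy/scipy assumed; numpy's weibull(β) draws the standard case α = 1, ν = 0, so the three-parameter variable is obtained as α·W + ν; the parameter values are arbitrary):

```python
# Sample moments of a three-parameter Weibull versus the gamma-function formulas.
# Assumes numpy and scipy; parameter values are illustrative.
import numpy as np
from scipy.special import gamma

alpha, beta, nu = 2.0, 1.5, 1.0
rng = np.random.default_rng(9)
x = alpha * rng.weibull(beta, size=1_000_000) + nu   # scale, then shift the standard Weibull

print(x.mean(), alpha * gamma(1 + 1 / beta) + nu)
print(x.var(), alpha**2 * (gamma(1 + 2 / beta) - gamma(1 + 1 / beta)**2))
```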

Some properties
  The Weibull distribution is widely used to model lifetimes.
  α: scale parameter; β: shape parameter; ν: location parameter.
Theorem. If X ~ exponential(λ), then Y = α(λX)^{1/β} + ν is distributed as Weibull with parameters α, β, and ν. (exercise)

Cauchy Distribution
For μ ∈ R and σ > 0, the function
  f(x) = (1/π) · σ/(σ² + (x − μ)²), −∞ < x < ∞,
is a pdf since (1) f(x) ≥ 0 for all x ∈ R, and (2), substituting y = (x − μ)/σ,
  ∫_{−∞}^{∞} f(x) dx = ∫_{−∞}^{∞} (1/π) σ/(σ² + (x − μ)²) dx = (1/π) ∫_{−∞}^{∞} 1/(1 + y²) dy = (1/π)[tan^{−1}(y)]_{−∞}^{∞} = 1.
The distribution of a random variable X with this pdf is called the Cauchy distribution with parameters μ and σ, denoted by Cauchy(μ, σ).
The cdf of the Cauchy distribution is
  F(x) = ∫_{−∞}^{x} (1/π) σ/(σ² + (y − μ)²) dy = 1/2 + (1/π) tan^{−1}((x − μ)/σ) for −∞ < x < ∞. (exercise)
The mean and variance of the Cauchy distribution do not exist because the defining integral does not converge absolutely.
Some properties
  Cauchy is a heavy-tailed distribution.
  μ: location parameter; σ: scale parameter.
Theorem. If X ~ Cauchy(μ, σ), then aX + b ~ Cauchy(aμ + b, |a|σ). (exercise)

Reading: textbook, Sec 5.4, 5.5, 5.6
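As a final illustration for this chapter: since the Cauchy cdf has the explicit inverse F^{−1}(u) = μ + σ tan(π(u − 1/2)), Cauchy variates can be generated by the inverse-transform method introduced earlier, and the nonexistent mean shows up in simulation as an unstable sample mean (while the sample median stays near μ). A sketch (numpy assumed; μ = 0, σ = 1 are arbitrary):

```python
# Inverse-cdf sampling from Cauchy(mu, sigma); the median is stable, the mean is not.
# Assumes numpy; parameter values are illustrative.
import numpy as np

mu, sigma = 0.0, 1.0
rng = np.random.default_rng(10)
u = rng.uniform(size=1_000_000)
x = mu + sigma * np.tan(np.pi * (u - 0.5))   # F^{-1}(U)

print(np.median(x))    # close to mu = 0
print(x.mean())        # erratic: the Cauchy mean does not exist
```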
