1 Review of Probability and Distributions

Random variables. A numerically valued function $X$ of an outcome $\omega$ from a sample space $\Omega$,
$$X : \Omega \to \mathbb{R} : \omega \mapsto X(\omega),$$
is called a random variable (r.v.), and is usually determined by an experiment. We conventionally denote random variables by uppercase letters $X, Y, Z, U, V, \ldots$ from the end of the alphabet.

In particular, a discrete random variable is a random variable that can take values on a finite set $\{a_1, a_2, \ldots, a_n\}$ of real numbers (usually integers), or on a countably infinite set $\{a_1, a_2, a_3, \ldots\}$. A statement such as "$X = a_i$" is an event, since $\{\omega : X(\omega) = a_i\}$ is a subset of the sample space $\Omega$. Thus, we can consider the probability of the event $\{X = a_i\}$, denoted by $P(X = a_i)$. The function $p(a_i) := P(X = a_i)$ over the possible values $a_1, a_2, \ldots$ of $X$ is called a frequency function, or a probability mass function.

Continuous random variables. A continuous random variable is a random variable whose possible values are real numbers such as 78.6, 5.7, 0.24, and so on. Examples of continuous random variables include temperature, height, the diameter of a metal cylinder, etc. In what follows, a random variable always means a continuous random variable, unless it is explicitly said to be discrete.

The probability distribution of a random variable $X$ specifies how its values are distributed over the real numbers. This is completely characterized by the cumulative distribution function (cdf). The cdf
$$F(t) := P(X \le t)$$
represents the probability that the random variable $X$ is less than or equal to $t$. We then say that the random variable $X$ is distributed as $F$.

Probability distributions. It is often the case that the probability of the random variable $X$ being in any particular range of values is given by the area under a curve over that range of values. This curve is called the probability density function (pdf) of the random variable $X$, denoted by $f(x)$. Thus, the probability that $a \le X \le b$ can be expressed as
$$P(a \le X \le b) = \int_a^b f(x)\,dx.$$
Furthermore, we have the following relationship between the cdf and the pdf:
$$F(t) = P(X \le t) = \int_{-\infty}^{t} f(x)\,dx.$$

Joint distributions. When a pair $(X, Y)$ of random variables is considered, a joint density function $f(x, y)$ is used to compute probabilities constructed from the random variables $X$ and $Y$ simultaneously. Given the joint density function $f(x, y)$, the distribution of each of $X$ and $Y$ is called a marginal distribution. The marginal density functions of $X$ and $Y$, denoted by $f_X(x)$ and $f_Y(y)$, are given respectively by
$$f_X(x) = \int_{-\infty}^{\infty} f(x, y)\,dy \quad\text{and}\quad f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\,dx.$$
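As a small numerical illustration of these relations, the following sketch (not part of the lecture note; it assumes Python with NumPy/SciPy and uses the standard normal density as the example pdf) computes $P(a \le X \le b)$ and the cdf value $F(b)$ as integrals of $f$:

```python
import numpy as np
from scipy import integrate, stats

f = stats.norm(loc=0.0, scale=1.0).pdf        # example density f(x): standard normal

a, b = -1.0, 2.0
prob, _ = integrate.quad(f, a, b)             # P(a <= X <= b) = integral of f from a to b
cdf_at_b, _ = integrate.quad(f, -np.inf, b)   # F(b) = integral of f from -infinity to b

print(prob)        # approximately 0.8186
print(cdf_at_b)    # approximately 0.9772, agrees with stats.norm.cdf(b)
```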

Conditional probability distributions. Suppose that two random variables $X$ and $Y$ have a joint density function $f(x, y)$. If $f_X(x) > 0$, then we can define the conditional density function $f_{Y|X}(y \mid x)$ given $X = x$ by
$$f_{Y|X}(y \mid x) = \frac{f(x, y)}{f_X(x)}.$$
Similarly, we can define the conditional density function $f_{X|Y}(x \mid y)$ given $Y = y$ by
$$f_{X|Y}(x \mid y) = \frac{f(x, y)}{f_Y(y)} \quad\text{if } f_Y(y) > 0.$$
Then clearly we have the relation
$$f(x, y) = f_{Y|X}(y \mid x)\, f_X(x) = f_{X|Y}(x \mid y)\, f_Y(y).$$

Independence of random variables. If the joint density function $f(x, y)$ of continuous random variables $X$ and $Y$ can be expressed in the form
$$f(x, y) = f_X(x)\, f_Y(y) \quad\text{for all } x, y,$$
then $X$ and $Y$ are said to be independent.

Expectation. Let $X$ be a random variable, and let $f$ be the pdf of $X$. Then the expectation (or expected value, or mean) of the random variable $X$ is given by
$$E[X] = \int_{-\infty}^{\infty} x f(x)\,dx. \tag{1}$$
We denote the expected value (1) by $E(X)$, $\mu$, or $\mu_X$. For a function $g$, we define the expectation $E[g(X)]$ of the function $g(X)$ of the random variable $X$ by
$$E[g(X)] = \int_{-\infty}^{\infty} g(x) f(x)\,dx.$$
One can think of the expectation $E(X)$ as an operation on a random variable $X$ which returns the average value of $X$. It is also useful to observe that this is a linear operator satisfying
$$E\left[a + \sum_{i=1}^{n} b_i X_i\right] = a + \sum_{i=1}^{n} b_i E[X_i]$$
for random variables $X_1, \ldots, X_n$ and constants $a$ and $b_1, \ldots, b_n$.

Variance and covariance. The variance of a random variable $X$, denoted by $\mathrm{Var}(X)$, is the expected value of the squared difference between the random variable and its expected value $E(X)$. Thus, it is given by
$$\mathrm{Var}(X) := E[(X - E(X))^2] = E[X^2] - (E[X])^2.$$
The square root $\sqrt{\mathrm{Var}(X)}$ is called the standard error (SE) (or standard deviation (SD)) of the random variable $X$. Now suppose that we have two random variables $X$ and $Y$. Then the covariance of $X$ and $Y$ can be defined as
$$\mathrm{Cov}(X, Y) := E\big((X - \mu_X)(Y - \mu_Y)\big) = E(XY) - E(X)\,E(Y),$$
where $\mu_X = E(X)$ and $\mu_Y = E(Y)$. The covariance $\mathrm{Cov}(X, Y)$ measures the strength of the dependence between the two random variables.
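A minimal simulation sketch (illustrative only; it assumes Python with NumPy and an arbitrary example pair in which $Y$ equals $X$ plus independent noise) can be used to check the identities $\mathrm{Var}(X) = E[(X - E X)^2]$ and $\mathrm{Cov}(X, Y) = E(XY) - E(X)E(Y)$ numerically:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
x = rng.normal(1.0, 2.0, size=n)          # X ~ N(1, 4), so E[X] = 1, Var(X) = 4
y = x + rng.normal(0.0, 1.0, size=n)      # Y = X + independent noise

ex, ey = x.mean(), y.mean()
var_x = np.mean((x - ex) ** 2)            # Var(X) as E[(X - E X)^2]
cov_xy = np.mean(x * y) - ex * ey         # Cov(X, Y) as E[XY] - E[X] E[Y]

print(ex, var_x)   # close to 1.0 and 4.0
print(cov_xy)      # close to 4.0, since Cov(X, X + noise) = Var(X)
```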

In contrast to the expectation, the variance is not a linear operator. The variance of $aX + b$ with constants $a$ and $b$ is given by
$$\mathrm{Var}(aX + b) = a^2\,\mathrm{Var}(X).$$
For two random variables $X$ and $Y$, we have
$$\mathrm{Var}(X + Y) = \mathrm{Var}(X) + \mathrm{Var}(Y) + 2\,\mathrm{Cov}(X, Y). \tag{2}$$
However, if $X$ and $Y$ are independent, then $\mathrm{Cov}(X, Y) = 0$, and consequently $\mathrm{Var}(X + Y) = \mathrm{Var}(X) + \mathrm{Var}(Y)$. In general, when we have a sequence of independent random variables $X_1, \ldots, X_n$, we have
$$\mathrm{Var}(X_1 + \cdots + X_n) = \mathrm{Var}(X_1) + \cdots + \mathrm{Var}(X_n).$$

Normal distributions. A normal distribution is represented by a family of distributions which have the same general shape, sometimes described as bell shaped. The normal distribution has the pdf
$$f(x) := \frac{1}{\sigma\sqrt{2\pi}} \exp\left[-\frac{(x - \mu)^2}{2\sigma^2}\right], \tag{3}$$
which depends upon two parameters $\mu$ and $\sigma^2$. Here $\pi = 3.14159\ldots$ is the famous pi (the ratio of the circumference of a circle to its diameter), and $\exp(u)$ is the exponential function $e^u$ with the base $e = 2.71828\ldots$ of the natural logarithm. Three much-celebrated integrals are the following:
$$\frac{1}{\sigma\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-\frac{(x-\mu)^2}{2\sigma^2}}\,dx = 1, \tag{4}$$
$$\frac{1}{\sigma\sqrt{2\pi}} \int_{-\infty}^{\infty} x\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}\,dx = \mu,$$
$$\frac{1}{\sigma\sqrt{2\pi}} \int_{-\infty}^{\infty} (x - \mu)^2\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}\,dx = \sigma^2.$$
The integral (4) guarantees that the density function (3) always represents a probability density, no matter what values of the parameters $\mu$ and $\sigma^2$ are chosen. We say that a random variable $X$ is normally distributed with parameter $(\mu, \sigma^2)$ when $X$ has the density function (3). The parameter $\mu$, called the mean (or location parameter), provides the center of the density, and the density function $f(x)$ is symmetric around $\mu$. The parameter $\sigma$ is a standard deviation (or a scale parameter); small values of $\sigma$ lead to high peaks but sharp drops, while larger values of $\sigma$ lead to flatter densities. The shorthand notation $X \sim N(\mu, \sigma^2)$ is often used to express that $X$ is a normal random variable with parameter $(\mu, \sigma^2)$.

Linear transform of a normal random variable. One of the important properties of the normal distribution is that if $X$ is a normal random variable with parameter $(\mu, \sigma^2)$ [that is, the pdf of $X$ is given by the density function (3)], then $Y = aX + b$ is also a normal random variable, having parameter $(a\mu + b, (a\sigma)^2)$. In particular,
$$\frac{X - \mu}{\sigma} \tag{5}$$
becomes a normal random variable with parameter $(0, 1)$, called the z-score. Then the normal density $\phi(x)$ with parameter $(0, 1)$ becomes
$$\phi(x) := \frac{1}{\sqrt{2\pi}}\, e^{-\frac{x^2}{2}},$$
which is known as the standard normal density. We define the cdf $\Phi$ by
$$\Phi(t) := \int_{-\infty}^{t} \phi(x)\,dx.$$
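The variance identities and the z-score can be checked by simulation. The sketch below (illustrative only, with assumed parameters $\mu = 75$ and $\sigma = 10$) compares both sides of $\mathrm{Var}(aX + b) = a^2 \mathrm{Var}(X)$ and of equation (2), and then standardizes a normal sample via (5):

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, n = 75.0, 10.0, 500_000

x = rng.normal(mu, sigma, size=n)
y = 0.5 * x + rng.normal(0.0, 5.0, size=n)   # Y is correlated with X

a, b = 3.0, -2.0
print(np.var(a * x + b), a**2 * np.var(x))   # Var(aX + b) vs a^2 Var(X): nearly equal

cov_xy = np.cov(x, y)[0, 1]                  # sample covariance of X and Y
print(np.var(x + y), np.var(x) + np.var(y) + 2 * cov_xy)   # both sides of (2)

z = (x - mu) / sigma                         # z-score (5)
print(z.mean(), z.std())                     # close to 0 and 1
```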

Sample statistics. Let $X_1, \ldots, X_n$ be independent and identically distributed random variables ("iid random variables" for short). Then the random variable
$$\bar{X} = \frac{1}{n} \sum_{i=1}^{n} X_i$$
is called the sample mean, and the random variable
$$S^2 = \frac{1}{n-1} \sum_{i=1}^{n} (X_i - \bar{X})^2$$
is called the sample variance.

Sample statistics under normal assumption. Assume that $X_1, \ldots, X_n$ are iid normal random variables with parameter $(\mu, \sigma^2)$. Then the sample statistics $\bar{X}$ and $S^2$ have the following properties:

(a) $E[\bar{X}] = \mu$ and $\mathrm{Var}(\bar{X}) = \dfrac{\sigma^2}{n}$.

(b) The sample variance $S^2$ can be rewritten as
$$S^2 = \frac{1}{n-1}\left(\sum_{i=1}^{n} X_i^2 - n\bar{X}^2\right) = \frac{1}{n-1}\left(\sum_{i=1}^{n} (X_i - \mu)^2 - n(\bar{X} - \mu)^2\right).$$
In particular, we can obtain
$$E[S^2] = \frac{1}{n-1}\left(\sum_{i=1}^{n} E[(X_i - \mu)^2] - n\,E[(\bar{X} - \mu)^2]\right) = \frac{1}{n-1}\left(\sum_{i=1}^{n} \mathrm{Var}(X_i) - n\,\mathrm{Var}(\bar{X})\right) = \sigma^2.$$

(c) $\bar{X}$ has the normal distribution with parameter $(\mu, \sigma^2/n)$, and $\bar{X}$ and $S^2$ are independent. Furthermore,
$$W_n = \frac{(n-1)S^2}{\sigma^2} = \frac{1}{\sigma^2} \sum_{i=1}^{n} (X_i - \bar{X})^2$$
has the chi-square distribution with $(n-1)$ degrees of freedom.

Transformation of one random variable. Let $X$ be a random variable with pdf $f_X(x)$, and let $g(x)$ be a differentiable and strictly monotonic function (that is, either strictly increasing or strictly decreasing) on the real line. Then the inverse function $g^{-1}$ exists, and the random variable $Y = g(X)$ has the pdf
$$f_Y(y) = f_X\big(g^{-1}(y)\big)\, \left|\frac{d}{dy} g^{-1}(y)\right|. \tag{6}$$

Linear transform of a random variable. Let $X$ be a random variable with pdf $f_X(x)$. Then we can define the linear transform $Y = aX + b$ with real values $a \neq 0$ and $b$. To compute the density function $f_Y(y)$ of the random variable $Y$, we can apply (6) to obtain
$$f_Y(y) = f_X\!\left(\frac{y - b}{a}\right) \frac{1}{|a|}.$$
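The properties (a)-(c) of the sample statistics can be examined empirically. The following sketch (illustrative only, assuming NumPy and the sample size and parameters shown) simulates many samples of size $n$ and checks $E[\bar X] = \mu$, $\mathrm{Var}(\bar X) = \sigma^2/n$, $E[S^2] = \sigma^2$, and that $(n-1)S^2/\sigma^2$ has mean $n-1$, as a chi-square variable with $n-1$ degrees of freedom should:

```python
import numpy as np

rng = np.random.default_rng(2)
mu, sigma, n, reps = 5.0, 2.0, 10, 100_000

samples = rng.normal(mu, sigma, size=(reps, n))
xbar = samples.mean(axis=1)                  # sample mean of each sample
s2 = samples.var(axis=1, ddof=1)             # sample variance with divisor n - 1

print(xbar.mean(), xbar.var())               # close to mu = 5.0 and sigma^2/n = 0.4
print(s2.mean())                             # close to sigma^2 = 4.0
print(((n - 1) * s2 / sigma**2).mean())      # close to n - 1 = 9, the chi-square mean
```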

For example, suppose that $X$ is a normal random variable with parameter $(\mu, \sigma^2)$. Then the density function $f_Y(y)$ of the linear transform $Y = aX + b$ becomes
$$f_Y(y) = \frac{1}{|a|\,\sigma\sqrt{2\pi}} \exp\left[-\frac{\big(y - (a\mu + b)\big)^2}{2(a\sigma)^2}\right],$$
which implies that $Y$ is a normal random variable with parameter $(a\mu + b, (a\sigma)^2)$.

Change of variables in a double integral. For a rectangle $A = \{(x, y) : a_1 \le x \le b_1,\ a_2 \le y \le b_2\}$ on a plane, the integration of a function $f(x, y)$ over $A$ is formally written as
$$\iint_A f(x, y)\,dx\,dy = \int_{a_2}^{b_2} \int_{a_1}^{b_1} f(x, y)\,dx\,dy. \tag{7}$$
Suppose that a transformation $(g_1(x, y), g_2(x, y))$ is differentiable and has the inverse transformation $(h_1(u, v), h_2(u, v))$ satisfying
$$\begin{cases} x = h_1(u, v) \\ y = h_2(u, v) \end{cases} \quad\text{if and only if}\quad \begin{cases} u = g_1(x, y) \\ v = g_2(x, y). \end{cases} \tag{8}$$
Then we can state the change of variables in the double integral (7) as follows: (i) define the inverse image of $A$ by $h^{-1}(A) = g(A) = \{(g_1(x, y), g_2(x, y)) : (x, y) \in A\}$; (ii) calculate the Jacobian
$$J_h(u, v) = \det \begin{bmatrix} \dfrac{\partial}{\partial u} h_1(u, v) & \dfrac{\partial}{\partial v} h_1(u, v) \\[2pt] \dfrac{\partial}{\partial u} h_2(u, v) & \dfrac{\partial}{\partial v} h_2(u, v) \end{bmatrix};$$
(iii) finally, if it does not vanish, then we can obtain
$$\iint_A f(x, y)\,dx\,dy = \iint_{g(A)} f\big(h_1(u, v), h_2(u, v)\big)\, |J_h(u, v)|\,du\,dv.$$

Transformation of two random variables. Suppose that (a) random variables $X$ and $Y$ have a joint density function $f_{X,Y}(x, y)$, (b) a transformation $(g_1(x, y), g_2(x, y))$ is differentiable, and (c) its inverse transformation $(h_1(u, v), h_2(u, v))$ satisfies (8). Then the random variables $U = g_1(X, Y)$ and $V = g_2(X, Y)$ have the joint density function
$$f_{U,V}(u, v) = f_{X,Y}\big(h_1(u, v), h_2(u, v)\big)\, |J_h(u, v)|.$$

Order statistics. Let $X_1, \ldots, X_n$ be independent and identically distributed random variables (iid random variables) with the common cdf $F(x)$ and pdf $f(x)$. Then we can define the random variable $V = \min(X_1, \ldots, X_n)$, the minimum of $X_1, \ldots, X_n$. To find the distribution of $V$, consider the survival function $P(V > x)$ of $V$ and calculate as follows:
$$P(V > x) = P(X_1 > x, \ldots, X_n > x) = P(X_1 > x) \cdots P(X_n > x) = [1 - F(x)]^n.$$
Thus, we can obtain the cdf $F_V(x)$ and the pdf $f_V(x)$ of $V$:
$$F_V(x) = 1 - [1 - F(x)]^n \quad\text{and}\quad f_V(x) = \frac{d}{dx} F_V(x) = n f(x)[1 - F(x)]^{n-1}.$$
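The formula $F_V(x) = 1 - [1 - F(x)]^n$ for the minimum can be checked by simulation. In the sketch below (illustrative only), the $X_i$ are taken to be iid Uniform$(0, 1)$, so that $F(x) = x$:

```python
import numpy as np

rng = np.random.default_rng(3)
n, reps = 5, 200_000
v = rng.uniform(0.0, 1.0, size=(reps, n)).min(axis=1)   # minimum of each sample

for x in (0.1, 0.2, 0.5):
    empirical = np.mean(v <= x)              # empirical F_V(x)
    theoretical = 1.0 - (1.0 - x) ** n       # 1 - [1 - F(x)]^n with F(x) = x
    print(x, empirical, theoretical)         # the two columns should nearly agree
```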

In particular, if $X_1, \ldots, X_n$ are independent exponential random variables with the common parameter $\lambda$, then the minimum $\min(X_1, \ldots, X_n)$ has the density $(n\lambda)e^{-(n\lambda)x}$, which is again an exponential density function, with parameter $n\lambda$.

Let $X_1, \ldots, X_n$ be iid random variables with the common cdf $F(x)$ and pdf $f(x)$. When we sort $X_1, \ldots, X_n$ as $X_{(1)} < X_{(2)} < \cdots < X_{(n)}$, the random variable $X_{(k)}$ is called the $k$-th order statistic. Then we have the following theorem.

Theorem 1. The pdf of the $k$-th order statistic $X_{(k)}$ is given by
$$f_k(x) = \frac{n!}{(k-1)!\,(n-k)!}\, f(x)\, F^{k-1}(x)\, [1 - F(x)]^{n-k}.$$

Beta distribution. When $X_1, \ldots, X_n$ are iid uniform random variables on $[0, \theta]$, the pdf of $X_{(k)}/\theta$ becomes
$$f_k(x) = \frac{n!}{(k-1)!\,(n-k)!}\, x^{k-1}(1 - x)^{n-k}, \quad 0 \le x \le 1,$$
which is known as the beta distribution with parameters $\alpha = k$ and $\beta = n - k + 1$.

Uniform on $[0, \theta]$:
$$f(x) = \frac{1}{\theta}, \quad 0 \le x \le \theta; \qquad E[X] = \frac{\theta}{2}, \quad \mathrm{Var}(X) = \frac{\theta^2}{12}.$$

Beta$(k, n+1-k)$, the distribution of $X_{(k)}/\theta$:
$$f(x) = \frac{\Gamma(n+1)}{\Gamma(k)\,\Gamma(n+1-k)}\, x^{k-1}(1 - x)^{n-k}, \quad 0 < x < 1; \qquad E[X] = \frac{k}{n+1}, \quad \mathrm{Var}(X) = \frac{k(n+1-k)}{(n+1)^2(n+2)}.$$

Assignments.

1. Suppose that $X$ has the pdf $f(x) = (x+2)/18$, $-2 \le x \le 4$. Calculate $E[X]$, $E[(X+2)^3]$, and $E[6X - 2(X+2)^3]$.

2. Let $f(x) = 6x(1-x)$, $0 \le x \le 1$, be a pdf for $X$. Find the mean and the variance.

3. Complete the following calculations using the Interactive Toolbox:
   (a) If $X \sim N(75, 100)$, calculate $P(X \le 60)$ and $P(70 \le X \le 100)$.
   (b) If $X \sim N(\mu, \sigma^2)$, find $b$ so that $P(-b < (X - \mu)/\sigma < b) = \ldots$

4. Suppose that $X$ and $Y$ have the joint density $f(x, y) = 2e^{-x-y}$, $0 < x < y < \infty$. Find the joint density of $U = 2X$ and $V = Y - X$.

5. Suppose that $X$ and $Y$ have the joint density $f(x, y) = 8xy$, $0 < x < y < 1$. Find the joint density of $U = X/Y$ and $V = Y$.

6. Let $X_{(k)}$, $k = 1, 2, 3, 4$, be the order statistics of a random sample of size 4 from the pdf $f(x) = e^{-x}$, $0 < x < \infty$. Find $P(X_{(4)} > 3)$.
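The beta distribution of the scaled order statistic can also be verified numerically. The following sketch (illustrative only, with assumed values $n = 7$, $k = 3$, $\theta = 10$) compares the simulated mean and variance of $X_{(k)}/\theta$ with $k/(n+1)$ and $k(n+1-k)/\big((n+1)^2(n+2)\big)$:

```python
import numpy as np

rng = np.random.default_rng(4)
n, k, theta, reps = 7, 3, 10.0, 200_000

x = np.sort(rng.uniform(0.0, theta, size=(reps, n)), axis=1)
w = x[:, k - 1] / theta                      # k-th order statistic, scaled by theta

print(w.mean(), k / (n + 1))                                 # both about 0.375
print(w.var(), k * (n + 1 - k) / ((n + 1) ** 2 * (n + 2)))   # both about 0.026
```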

Summary of discrete random variables. The relations (a)-(g) below connect the Bernoulli, geometric, negative binomial, binomial, and Poisson distributions summarized afterwards.

(a) $X$ is the number of Bernoulli trials until the first success occurs.

(b) $X$ is the number of Bernoulli trials until the $r$-th success occurs. If $X_1, \ldots, X_r$ are independent geometric random variables with parameter $p$, then $X = \sum_{k=1}^{r} X_k$ becomes a negative binomial random variable with parameter $(r, p)$.

(c) $r = 1$.

(d) $X$ is the number of successes in $n$ Bernoulli trials. If $X_1, \ldots, X_n$ are independent Bernoulli trials, then $X = \sum_{k=1}^{n} X_k$ becomes a binomial random variable with $(n, p)$.

(e) If $X_1$ and $X_2$ are independent binomial random variables with respective parameters $(n_1, p)$ and $(n_2, p)$, then $X = X_1 + X_2$ is also a binomial random variable with $(n_1 + n_2, p)$.

(f) The Poisson approximation is used for the binomial distribution by letting $n \to \infty$ and $p \to 0$ while $\lambda = np$ is held fixed.

(g) If $X_1$ and $X_2$ are independent Poisson random variables with respective parameters $\lambda_1$ and $\lambda_2$, then $X = X_1 + X_2$ is also a Poisson random variable with $\lambda_1 + \lambda_2$.

Bernoulli trial (the probability $p$ of success and the probability $q := 1 - p$ of failure):
$$P(X = 1) = p, \quad P(X = 0) = q; \qquad E[X] = p, \quad \mathrm{Var}(X) = pq.$$

Geometric$(p)$:
$$P(X = i) = q^{i-1} p, \quad i = 1, 2, \ldots; \qquad E[X] = \frac{1}{p}, \quad \mathrm{Var}(X) = \frac{q}{p^2}.$$

Binomial$(n, p)$:
$$P(X = i) = \binom{n}{i} p^i q^{n-i}, \quad i = 0, \ldots, n; \qquad E[X] = np, \quad \mathrm{Var}(X) = npq.$$

Negative binomial$(r, p)$:
$$P(X = i) = \binom{i-1}{r-1} p^r q^{i-r}, \quad i = r, r+1, \ldots; \qquad E[X] = \frac{r}{p}, \quad \mathrm{Var}(X) = \frac{rq}{p^2}.$$

Poisson$(\lambda)$:
$$P(X = i) = e^{-\lambda} \frac{\lambda^i}{i!}, \quad i = 0, 1, \ldots; \qquad E[X] = \lambda, \quad \mathrm{Var}(X) = \lambda.$$
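Relation (f), the Poisson approximation, can be inspected directly. The sketch below (illustrative only, using SciPy's binom and poisson with assumed values $n = 1000$ and $p = 0.003$) compares the two probability mass functions for small counts:

```python
from scipy import stats

n, p = 1000, 0.003        # large n, small p
lam = n * p               # lambda = np stays moderate

for i in range(7):
    # binomial probability vs its Poisson approximation at each count i
    print(i, stats.binom.pmf(i, n, p), stats.poisson.pmf(i, lam))
```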

Summary of continuous random variables. The relations (a)-(g) below connect the normal, chi-square, exponential, and gamma distributions summarized afterwards.

(a) If $X_1$ and $X_2$ are independent normal random variables with respective parameters $(\mu_1, \sigma_1^2)$ and $(\mu_2, \sigma_2^2)$, then $X = X_1 + X_2$ is a normal random variable with $(\mu_1 + \mu_2, \sigma_1^2 + \sigma_2^2)$.

(b) If $X_1, \ldots, X_m$ are iid normal random variables with parameter $(\mu, \sigma^2)$, then $X = \sum_{k=1}^{m} \left(\dfrac{X_k - \mu}{\sigma}\right)^2$ is a chi-square ($\chi^2$) random variable with $m$ degrees of freedom.

(c) $n = \dfrac{m}{2}$ and $\lambda = \dfrac{1}{2}$.

(d) If $X_1, \ldots, X_n$ are independent exponential random variables with respective parameters $\lambda_1, \ldots, \lambda_n$, then $X = \min\{X_1, \ldots, X_n\}$ is an exponential random variable with parameter $\sum_{i=1}^{n} \lambda_i$.

(e) If $X_1, \ldots, X_n$ are iid exponential random variables with parameter $\lambda$, then $X = \sum_{k=1}^{n} X_k$ is a gamma random variable with $(n, \lambda)$.

(f) $n = 1$.

(g) If $X_1$ and $X_2$ are independent random variables having gamma distributions with respective parameters $(n_1, \lambda)$ and $(n_2, \lambda)$, then $X = X_1 + X_2$ is a gamma random variable with parameter $(n_1 + n_2, \lambda)$.

Normal$(\mu, \sigma^2)$:
$$f(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}, \quad -\infty < x < \infty; \qquad E[X] = \mu, \quad \mathrm{Var}(X) = \sigma^2.$$

Chi-square (with $m$ degrees of freedom):
$$f(x) = \frac{\tfrac{1}{2}\, e^{-\frac{x}{2}} \left(\tfrac{x}{2}\right)^{\frac{m}{2}-1}}{\Gamma\!\left(\tfrac{m}{2}\right)}, \quad x \ge 0; \qquad E[X] = m, \quad \mathrm{Var}(X) = 2m.$$

Exponential$(\lambda)$:
$$f(x) = \lambda e^{-\lambda x}, \quad x \ge 0; \qquad E[X] = \frac{1}{\lambda}, \quad \mathrm{Var}(X) = \frac{1}{\lambda^2}.$$

Gamma$(n, \lambda)$:
$$f(x) = \frac{\lambda e^{-\lambda x} (\lambda x)^{n-1}}{\Gamma(n)}, \quad x \ge 0; \qquad E[X] = \frac{n}{\lambda}, \quad \mathrm{Var}(X) = \frac{n}{\lambda^2}.$$

Here $\Gamma(n) = (n-1)!$ when $n$ is a positive integer.
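Relations (b) and (e) can be checked by simulation. The sketch below (illustrative only, assuming NumPy/SciPy and the parameter values shown) sums squared z-scores and iid exponentials and compares the results with the chi-square and gamma distributions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
reps, m, n, lam = 100_000, 4, 3, 2.0

z = rng.normal(size=(reps, m))
chi2_sample = (z ** 2).sum(axis=1)                  # relation (b): sum of m squared z-scores
print(chi2_sample.mean(), chi2_sample.var())        # about m = 4 and 2m = 8

e = rng.exponential(scale=1.0 / lam, size=(reps, n))
gamma_sample = e.sum(axis=1)                        # relation (e): sum of n Exponential(lam)
print(gamma_sample.mean(), gamma_sample.var())      # about n/lam = 1.5 and n/lam^2 = 0.75

# Goodness-of-fit checks against the corresponding SciPy distributions
print(stats.kstest(chi2_sample, stats.chi2(df=m).cdf).pvalue)
print(stats.kstest(gamma_sample, stats.gamma(a=n, scale=1.0 / lam).cdf).pvalue)
```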
