Introduction to Probability Theory


1 Introduction to Probability Theory
Ping Yu
Department of Economics, University of Hong Kong

2 Outline
1 Foundations
2 Random Variables
3 Expectation
4 Multivariate Random Variables
5 Conditional Distributions and Expectation
6 Transformations
7 The Normal and Related Distributions

3 Foundations

4 Foundations: Founder of Modern Probability Theory
Andrey N. Kolmogorov (1903–1987), Russian

5 Foundations: Sample Space, Event and Probability
The set Ω of all possible outcomes of an experiment is called the sample space for the experiment.
- Take the simple example of tossing a coin. There are two outcomes, heads and tails, so we can write Ω = {H, T}.
- If two coins are tossed in sequence, we can write the four outcomes as Ω = {HH, HT, TH, TT}.
An event A is any collection of possible outcomes of an experiment. An event is a subset of Ω, including Ω itself and the null set ∅.
- Continuing the two-coin example, one event is A = {HH, HT}, the event that the first coin is heads.
We say that A and B are disjoint or mutually exclusive if A ∩ B = ∅.
- For example, the sets {HH, HT} and {TH} are disjoint.
If the sets A_1, A_2, ... are pairwise disjoint and ∪_{i=1}^∞ A_i = Ω, then the collection A_1, A_2, ... is called a partition of Ω.
A probability function P assigns probabilities (numbers between 0 and 1) to events A in Ω.
- P satisfies P(Ω) = 1, P(A^c) = 1 - P(A), and P(A) ≤ P(B) if A ⊆ B. (This implies P(∅) = 0.)
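To make the two-coin example concrete, here is a short Python sketch (not part of the original slides) that enumerates the sample space and assigns equal probability to each outcome; the helper names and the fair-coin assumption are illustrative.

```python
from itertools import product

# Sample space for two coins tossed in sequence: {HH, HT, TH, TT}
omega = [a + b for a, b in product("HT", repeat=2)]

# Equal probabilities, assuming a fair coin tossed twice
prob = {outcome: 1 / len(omega) for outcome in omega}

def P(event):
    """Probability of an event, i.e., a subset of the sample space."""
    return sum(prob[o] for o in event)

A = {"HH", "HT"}   # first coin is heads
B = {"TH"}         # disjoint from A

print(P(set(omega)))           # P(Omega) = 1.0
print(P(A))                    # 0.5
print(A & B)                   # set() -- A and B are disjoint
print(P(A | B), P(A) + P(B))   # additivity for disjoint events: 0.75, 0.75
```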

6 Foundations: Conditional Probability and Independence of Two Events
If P(B) > 0, the conditional probability of the event A given the event B is
    P(A|B) = P(A ∩ B) / P(B).
- Rearranging the definition, we can write P(A ∩ B) = P(A|B)P(B).
We say that the events A and B are statistically independent when P(A ∩ B) = P(A)P(B), which implies P(A|B) = P(A), i.e., the occurrence of B carries no information about the likelihood of event A.
We say that the collection of events A_1, ..., A_k are mutually independent when for any subset {A_i : i ∈ I},
    P(∩_{i∈I} A_i) = ∏_{i∈I} P(A_i).

7 Foundations: Bayes Rule or Theorem
Theorem (Bayes Rule or Theorem). For any set B and any partition A_1, A_2, ... of the sample space, for each i = 1, 2, ...,
    P(A_i|B) = P(B|A_i)P(A_i) / ∑_j P(B|A_j)P(A_j).
Example (Monty Hall Problem). The Monty Hall problem is a brain teaser, first appearing on the American television game show Let's Make a Deal and named after its original host, Monty Hall.
Problem: Suppose you're given the choice of three doors: behind one door is a car; behind the others, goats. You pick a door, say No. 1, and the host, who knows what's behind the doors, opens another door, say No. 3, which has a goat. He then says to you, "Do you want to pick door No. 2?" Is it to your advantage to switch your choice?

8 Foundations: Thomas Bayes (1701–1761), English
Thomas Bayes (1701–1761) was an English statistician. He never published what would eventually become his most famous accomplishment; his notes were edited and published after his death by Richard Price.

9 Foundations
Figure: Tree showing the probability of every possible outcome if the player initially picks Door 1.

10 Foundations: Monty Hall Problem: A Rigorous Analysis
Consider the events C1, C2 and C3, indicating that the car is behind door 1, 2 or 3, respectively. All three events have probability 1/3.
The player initially choosing door 1 is described by the event X1. As the first choice of the player is independent of the position of the car, the conditional probabilities are also P(Ci|X1) = 1/3, i = 1, 2, 3.
The host opening door 3 is described by H3. For this event it holds that P(H3|C1,X1) = 1/2, P(H3|C2,X1) = 1 and P(H3|C3,X1) = 0.
If the player initially selects door 1 and the host opens door 3, the conditional probability of winning by switching is
    P(C2|H3,X1) = P(H3|C2,X1)P(C2 ∩ X1) / P(H3 ∩ X1)
                = P(H3|C2,X1)P(C2 ∩ X1) / ∑_{i=1}^3 P(H3|Ci,X1)P(Ci ∩ X1)
                = P(H3|C2,X1) / ∑_{i=1}^3 P(H3|Ci,X1)
                = 1 / (1/2 + 1 + 0) = 2/3.
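As a sanity check on the 2/3 result, here is a small Monte Carlo simulation (a sketch, not from the slides) of the Monty Hall game; the function names, seed behaviour and sample size are arbitrary choices.

```python
import random

def play(switch):
    """Simulate one round of the Monty Hall game; return True if the player wins the car."""
    car = random.randrange(3)    # door hiding the car
    pick = random.randrange(3)   # player's initial choice
    # Host opens a door that is neither the player's pick nor the car,
    # choosing at random when he has two options (as in the analysis above)
    opened = random.choice([d for d in range(3) if d != pick and d != car])
    if switch:
        pick = next(d for d in range(3) if d != pick and d != opened)
    return pick == car

n = 100_000
print(sum(play(switch=True) for _ in range(n)) / n)   # approximately 2/3
print(sum(play(switch=False) for _ in range(n)) / n)  # approximately 1/3
```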

11 Random Variables

12 Random Variables: Random Variables and CDFs
A random variable (r.v.) X is a function from a sample space Ω into the real line.
- The r.v. transforms the abstract elements in Ω into analyzable real values.
- Notation: we denote r.v.'s by uppercase letters such as X, and use lowercase letters such as x for potential values and realized values.
- Caution: be careful about the difference between a random variable and its realized value. The former is a function from outcomes to values, while the latter is a value associated with a specific outcome.
For a r.v. X we define its cumulative distribution function (cdf) as F(x) = P(X ≤ x).
- Notation: sometimes we write this as F_X(x) to denote that it is the cdf of X.
The support of a r.v. X is the closure of the set of possible values that X can take, denoted supp(X).

13 Random Variables: Discrete Random Variables
The r.v. X is discrete if F(x) is a step function.
- The support of a discrete r.v. consists of finitely or countably many values, x_1, ..., x_J, where J can be ∞.
The probability function for X takes the form of the probability mass function (pmf)
    P(X = x_j) = p_j, j = 1, ..., J,   (1)
where 0 ≤ p_j ≤ 1 and ∑_{j=1}^J p_j = 1.
- F(x) = ∑_{j=1}^J p_j 1(x_j ≤ x), where 1(·) is the indicator function, which equals one when the event in parentheses is true and zero otherwise.
A famous discrete r.v. is the Bernoulli (or binary) r.v., where J = 2, x_1 = 0, x_2 = 1, p_2 = p and p_1 = 1 - p. [Figure here]
- The Bernoulli distribution is often used to model sex, employment status, and other dichotomies.
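A minimal Python sketch (not in the slides) of a Bernoulli pmf and its step-function cdf, using the indicator-sum representation F(x) = ∑_j p_j 1(x_j ≤ x); p = 0.5 is chosen for concreteness.

```python
p = 0.5
support = [0, 1]            # x_1 = 0, x_2 = 1
pmf = {0: 1 - p, 1: p}      # p_1 = 1 - p, p_2 = p

def cdf(x):
    """F(x) = sum_j p_j * 1(x_j <= x): a step function."""
    return sum(pmf[xj] for xj in support if xj <= x)

assert abs(sum(pmf.values()) - 1) < 1e-12    # the probabilities sum to one
print(cdf(-0.5), cdf(0), cdf(0.3), cdf(1))   # 0, 0.5, 0.5, 1.0
```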

14 Random Variables: Jacob Bernoulli (1655–1705), Swiss
Jacob Bernoulli (1655–1705) was one of the many prominent mathematicians in the Bernoulli family.

15 Random Variables: Continuous Random Variables
The r.v. X is continuous if F(x) is continuous in x.
- In this case P(X = x) = 0 for all x ∈ R, so the representation (1) is unavailable. We instead represent the relative probabilities by the probability density function (pdf)
    f(x) = (d/dx) F(x).
- A function f(x) is a pdf iff f(x) ≥ 0 for all x ∈ R and ∫_{-∞}^{∞} f(x) dx = 1.
- The support of a continuous r.v. is {x ∈ R : f(x) > 0}.
By the fundamental theorem of calculus,
    F(x) = ∫_{-∞}^{x} f(u) du   and   P(a ≤ X ≤ b) = F(b) - F(a) = ∫_{a}^{b} f(u) du.

16 Random Variables: Examples of Continuous R.V.'s
A famous continuous r.v. is the normal r.v. [Figure here] The standard normal density is
    φ(x) = (1/√(2π)) exp(-x^2/2), -∞ < x < ∞.
- Notation: write X ~ N(0,1), and denote the standard normal cdf by Φ(x). Φ(x) has no closed-form expression. [Figure here]
- The normal distribution is often used to model human heights and weights, test scores, and county unemployment rates.
Another famous continuous r.v. is the standard uniform r.v., whose pdf is f(x) = 1(0 ≤ x ≤ 1), i.e., X can occur only on [0,1] and occurs uniformly. We denote X ~ U[0,1]. [Figure here]
- A generalization is X ~ U[a,b], a < b.
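The following sketch (my addition, assuming NumPy and SciPy are available) evaluates the standard normal pdf and cdf numerically; since Φ has no closed form, scipy.stats.norm.cdf is used for comparison with direct integration of φ.

```python
import numpy as np
from scipy.stats import norm, uniform
from scipy.integrate import quad

# Standard normal density phi(x) written out explicitly
phi = lambda x: np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

total, _ = quad(phi, -np.inf, np.inf)
print(total)                          # ~1.0: a valid pdf integrates to one

# P(a <= X <= b) = Phi(b) - Phi(a): numerical integration vs. the cdf
a, b = -1.0, 2.0
print(quad(phi, a, b)[0])             # ~0.8186
print(norm.cdf(b) - norm.cdf(a))      # same value via Phi

# Standard uniform: density is 1 on [0, 1]
print(uniform.pdf(0.3), uniform.cdf(0.3))   # 1.0, 0.3
```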

17 Random Variables: Carl F. Gauss (1777–1855), Göttingen
The normal distribution was invented by Carl F. Gauss (1777–1855), so it is also known as the Gaussian distribution.

18 Random Variables
Figure: PMF, PDF and CDF (p = 0.5 in the Bernoulli distribution); panels: Bernoulli distribution, standard normal distribution, uniform distribution.
A function F(x) is a cdf iff the following three properties hold: (i) lim_{x→-∞} F(x) = 0 and lim_{x→∞} F(x) = 1; (ii) F(x) is nondecreasing in x; (iii) F(x) is right-continuous, i.e., F(x) = lim_{t↓x} F(t).

19 Expectation

20 Expectation
For any real function g, we define the mean or expectation E[g(X)] as follows: if X is discrete,
    E[g(X)] = ∑_{j=1}^J g(x_j) p_j,
and if X is continuous,
    E[g(X)] = ∫ g(x) f(x) dx.
- The mean is a weighted average of all possible values of X, with the weights determined by its pmf or pdf.
To cover both the discrete and continuous cases (and even more general cases), we often write
    E[g(X)] = ∫ g(x) dF(x),
which is a Riemann-Stieltjes integral (why is this intuitively correct? see Chapter 1).
Since E[a + bX] = a + bE[X], we say that expectation is a linear operator.

21 Expectation: Moments
For m > 0, we define the m-th moment of X as E[X^m], the m-th central moment as E[(X - E[X])^m], the m-th absolute moment of X as E|X|^m, and the m-th absolute central moment as E|X - E[X]|^m.
Two special moments are the first moment, the mean µ = E[X], which is a measure of central tendency, and the second central moment, the variance σ^2 = E[(X - µ)^2] = E[X^2] - µ^2, which is a measure of variability.
- We call σ = √(σ^2) the standard deviation of X.
- We also write σ^2 = Var(X), which allows the convenient expression Var(a + bX) = b^2 Var(X).
The normal density has all moments finite.
- Since the standard normal is symmetric about zero, all its odd moments are zero.
- We can also show that E[X^2] = 1 and E[X^4] = 3 for the standard normal.
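A quick numerical check (not from the slides, SciPy assumed) of the standard normal moments claimed above, E[X^2] = 1 and E[X^4] = 3, by integrating x^m φ(x).

```python
import numpy as np
from scipy.integrate import quad

phi = lambda x: np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

def moment(m):
    """m-th moment of the standard normal, E[X^m] = integral of x^m * phi(x)."""
    return quad(lambda x: x**m * phi(x), -np.inf, np.inf)[0]

print(moment(1))   # ~0  (odd moments vanish by symmetry)
print(moment(2))   # ~1  (the variance)
print(moment(3))   # ~0
print(moment(4))   # ~3
```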

22 Multivariate Random Variables

23 Multivariate Random Variables: Bivariate Random Variables
A pair of bivariate r.v.'s (X,Y) is a function from the sample space into R^2.
The joint cdf of (X,Y) is
    F(x,y) = P(X ≤ x, Y ≤ y).
If F is continuous, the joint pdf is
    f(x,y) = ∂^2 F(x,y) / (∂x ∂y).
- For any set A ⊆ R^2,
    P((X,Y) ∈ A) = ∫∫_A f(x,y) dx dy.
- The discrete case can be discussed in parallel.
For any function g(x,y),
    E[g(X,Y)] = ∫∫ g(x,y) f(x,y) dx dy.

24 Multivariate Random Variables: Marginal Distributions
The marginal distribution of X is
    F_X(x) = P(X ≤ x) = lim_{y→∞} F(x,y) = ∫_{-∞}^{x} ∫_{-∞}^{∞} f(x,y) dy dx.
So by the fundamental theorem of calculus, the marginal density of X is
    f_X(x) = (d/dx) F_X(x) = ∫_{-∞}^{∞} f(x,y) dy.
Similarly, the marginal density of Y is
    f_Y(y) = ∫_{-∞}^{∞} f(x,y) dx.

25 Multivariate Random Variables: Independence of Two R.V.'s
The r.v.'s X and Y are defined to be independent if f(x,y) = f_X(x) f_Y(y).
- X and Y are independent iff there exist functions g(x) and h(y) such that f(x,y) = g(x)h(y). (Exercise)
If X and Y are independent, then
    E[g(X)h(Y)] = ∫∫ g(x)h(y) f(x,y) dx dy   (2)
                = ∫∫ g(x)h(y) f_X(x) f_Y(y) dx dy
                = ∫ g(x) f_X(x) dx · ∫ h(y) f_Y(y) dy
                = E[g(X)] E[h(Y)].
- For example, if X and Y are independent, then E[XY] = E[X]E[Y].

26 Multivariate Random Variables: Covariance and Correlation
The covariance between X and Y is
    Cov(X,Y) = σ_XY = E[(X - E[X])(Y - E[Y])] = E[XY] - E[X]E[Y].
The correlation between X and Y is
    Corr(X,Y) = ρ_XY = σ_XY / (σ_X σ_Y).
The Cauchy-Schwarz inequality implies that |ρ_XY| ≤ 1.
The correlation is a measure of linear dependence, free of units of measurement. [Figure here]
If X and Y are independent, then σ_XY = 0 and ρ_XY = 0. The reverse, however, is not true. For example, if E[X] = 0 and E[X^3] = 0, then Cov(X, X^2) = 0.
Var(X + Y) = Var(X) + Var(Y) + 2Cov(X,Y).
- If X and Y are independent, then Var(X + Y) = Var(X) + Var(Y): the variance of the sum is the sum of the variances.
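A simulation sketch (my addition, NumPy assumed) of the counterexample above: with X standard normal, so that E[X] = 0 and E[X^3] = 0, X and X^2 are uncorrelated yet clearly dependent; the seed and sample size are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)
y = x**2

print(np.cov(x, y)[0, 1])        # ~0: Cov(X, X^2) is zero
print(np.corrcoef(x, y)[0, 1])   # ~0: zero correlation
# ...but Y is a deterministic function of X, so they are not independent:
print(np.mean(y[np.abs(x) > 2]), np.mean(y[np.abs(x) < 0.5]))  # very different conditional means
```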

27 Multivariate Random Variables: More Than Two R.V.'s
A k × 1 random vector X = (X_1, ..., X_k)' is a function from Ω to R^k.
The vector X has the distribution and density functions
    F(x) = P(X ≤ x),   f(x) = ∂^k F(x) / (∂x_1 ⋯ ∂x_k),
where x = (x_1, ..., x_k)' denotes a vector in R^k.
For any function g: R^k → R^s, define the expectation
    E[g(X)] = ∫_{R^k} g(x) f(x) dx,
where the symbol dx denotes dx_1 ⋯ dx_k.
- The k × 1 multivariate mean is µ = E[X].
- The k × k covariance matrix is Σ = E[(X - µ)(X - µ)'] = E[XX'] - µµ'.
If the elements of X are mutually independent, then Σ is a diagonal matrix and
    Var(∑_{i=1}^k X_i) = ∑_{i=1}^k Var(X_i).

28 Conditional Distributions and Expectation

29 Conditional Distributions and Expectation
The conditional density of Y given X = x is defined as
    f_{Y|X}(y|x) = f(x,y) / f_X(x),
if f_X(x) > 0.
The conditional mean or conditional expectation is the function
    m(x) = E[Y|X = x] = ∫ y f_{Y|X}(y|x) dy.
The conditional mean m(x) is a function, meaning that when X equals x, the expected value of Y is m(x).
The conditional variance of Y given X = x is
    σ^2(x) = Var(Y|X = x) = E[(Y - m(x))^2 | X = x] = E[Y^2 | X = x] - m(x)^2.
If Y and X are independent, then E[Y|X = x] = E[Y] and Var(Y|X = x) = Var(Y). (why?)

30 Conditional Distributions and Expectation: Example
Suppose f_XY(x,y) = 8xy if 0 < x < y < 1, and 0 otherwise. Are X and Y independent? What are the marginal densities of X and Y? What is the conditional density of Y given X? What are E[Y|X = x] and Var(Y|X = x)?
Solution: It is tempting to claim that X and Y are independent since it seems that f(x,y) = g(x)h(y) with g(x) = 8x and h(y) = y. However, such an argument neglects the support of (X,Y), which is shown in the left panel of the following figure.
The marginal density of X is f_X(x) = ∫_x^1 8xy dy = 4x(1 - x^2), 0 < x < 1, and the marginal density of Y is f_Y(y) = ∫_0^y 8xy dx = 4y^3, 0 < y < 1.
Obviously,
    f_XY(x,y) = 8xy·1(0 < x < y < 1) ≠ 4x(1 - x^2)·1(0 < x < 1) · 4y^3·1(0 < y < 1) = f_X(x) f_Y(y),
so X and Y are not independent.

31 Conditional Distributions and Expectation: Solution [continued]
The conditional density of Y given X = x is
    f_{Y|X}(y|x) = f(x,y) / f_X(x) = 8xy / (4x(1 - x^2)) = 2y/(1 - x^2), if 0 < x < y < 1,
and 0 otherwise. So when 0 < x < 1,
    E[Y|X = x] = ∫_x^1 y · 2y/(1 - x^2) dy = 2(1 + x + x^2) / (3(1 + x)),
    E[Y^2|X = x] = ∫_x^1 y^2 · 2y/(1 - x^2) dy = (1 + x + x^2 + x^3) / (2(1 + x)),
and
    Var(Y|X = x) = E[Y^2|X = x] - (E[Y|X = x])^2 = (1 - x)^2 (1 + 4x + x^2) / (18(1 + x)^2).
The right panel of the figure shows E[Y|X = x] and Var(Y|X = x) as functions of x. E[Y|X = x] is increasing with E[Y|X = 0] = 2/3 and E[Y|X = 1] = 1, and Var(Y|X = x) is decreasing with Var(Y|X = 0) = 1/18 and Var(Y|X = 1) = 0. Since E[Y|X = x] and Var(Y|X = x) are not constant, X and Y are not independent.
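To double-check the algebra in this example, the sketch below (not part of the slides, SciPy assumed) evaluates E[Y|X = x] and Var(Y|X = x) at x = 0.5 by numerically integrating the conditional density 2y/(1 - x^2) over (x, 1), and compares with the closed forms above.

```python
import numpy as np
from scipy.integrate import quad

x = 0.5
f_cond = lambda y: 2 * y / (1 - x**2)           # conditional density of Y given X = x, on (x, 1)

m, _ = quad(lambda y: y * f_cond(y), x, 1)      # E[Y | X = x]
m2, _ = quad(lambda y: y**2 * f_cond(y), x, 1)  # E[Y^2 | X = x]
var = m2 - m**2

print(m, 2 * (1 + x + x**2) / (3 * (1 + x)))                      # both ~0.7778
print(var, (1 - x)**2 * (1 + 4*x + x**2) / (18 * (1 + x)**2))     # both ~0.0201
```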

32 Conditional Distributions and Expectation
Figure: supp(X,Y), E[Y|X = x] and Var(Y|X = x)

33 Transformations

34 Transformations: Transformations
Suppose that X ∈ R^k has continuous distribution function F_X(x) and density f_X(x). Let Y = g(X), where g(x): R^k → R^k is one-to-one, differentiable, and invertible. Let h(y) denote the inverse function of g(x). Recall that the Jacobian of h is defined as J(y) = det(∂h(y)/∂y'), where for a matrix A, det(A) is the determinant of A.
Consider the univariate case k = 1. If g(x) is a strictly increasing function, then g(X) ≤ y iff X ≤ h(y), so the distribution function of Y is
    F_Y(y) = P(g(X) ≤ y) = P(X ≤ h(y)) = F_X(h(y)).
Taking the derivative, the density of Y is
    f_Y(y) = (d/dy) F_Y(y) = f_X(h(y)) (d/dy) h(y).

35 Transformations: continued
If g(x) is a strictly decreasing function, then g(X) ≤ y iff X ≥ h(y), so
    F_Y(y) = P(g(X) ≤ y) = P(X ≥ h(y)) = 1 - F_X(h(y)),
and the density of Y is
    f_Y(y) = (d/dy) F_Y(y) = -f_X(h(y)) (d/dy) h(y).
We can write these two cases jointly as
    f_Y(y) = f_X(h(y)) |J(y)|,   (3)
where |J(y)| is the absolute value of J(y) (why? because f_Y(y) ≥ 0). This is known as the change-of-variables formula.
- The same formula (3) holds for k > 1.

36 Transformations: Examples
Example. Suppose X ~ U[0,1] and Y = -log(X). Here, g(x) = -log(x) and h(y) = exp(-y), so the Jacobian is J(y) = -exp(-y). As the range of X is [0,1], that for Y is [0, ∞). Since f_X(x) = 1 for 0 ≤ x ≤ 1, (3) shows that
    f_Y(y) = exp(-y), 0 ≤ y < ∞,
an exponential density.
Example. If Z is standard normal and X = µ + σZ, then h(x) = (x - µ)/σ and the Jacobian is 1/σ. Since Z has the density
    φ(z) = (1/√(2π)) exp(-z^2/2), -∞ < z < ∞,
X has density
    f(x) = (1/(√(2π)σ)) exp(-(x - µ)^2/(2σ^2)), -∞ < x < ∞,
which is the univariate normal density. The mean and variance of the distribution are µ and σ^2, and it is conventional to write X ~ N(µ, σ^2).
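A simulation sketch (my addition, NumPy assumed) of the first example: if U ~ U[0,1], then Y = -log(U) should follow the exponential density exp(-y); the empirical cdf is compared with F_Y(t) = 1 - exp(-t). The seed and sample size are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
u = rng.uniform(size=500_000)
y = -np.log(u)                      # the transformation Y = -log(X)

print(y.mean(), y.var())            # ~1, ~1: mean and variance of Exponential(1)
for t in [0.5, 1.0, 2.0]:
    print(np.mean(y <= t), 1 - np.exp(-t))   # empirical cdf vs F_Y(t) = 1 - exp(-t)
```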

37 The Normal and Related Distributions

38 The Normal and Related Distributions: The Normal Distribution
For x ∈ R^k, the multivariate normal density is
    f(x) = 1/((2π)^{k/2} det(Σ)^{1/2}) exp(-(x - µ)' Σ^{-1} (x - µ)/2),   x ∈ R^k.
- The mean and covariance matrix of the distribution are µ and Σ.
- When X has the multivariate normal density, we say that X follows the multivariate normal distribution and write X ~ N(µ, Σ).
If X ∈ R^k is multivariate normal and the elements of X are mutually uncorrelated, then Σ = diag{σ_j^2} is a diagonal matrix. In this case the density function can be written as
    f(x) = 1/((2π)^{k/2} σ_1 ⋯ σ_k) exp(-[(x_1 - µ_1)^2/σ_1^2 + ⋯ + (x_k - µ_k)^2/σ_k^2] / 2)
         = ∏_{j=1}^k 1/(√(2π) σ_j) exp(-(x_j - µ_j)^2 / (2σ_j^2)),
which is the product of marginal univariate normal densities. This shows that if X is multivariate normal with uncorrelated elements, then they are mutually independent.
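The sketch below (not in the slides, NumPy assumed) draws from a bivariate normal and checks that the sample mean and covariance match µ and Σ; the particular µ, Σ, seed and sample size are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)
mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.8],
                  [0.8, 1.0]])

x = rng.multivariate_normal(mu, Sigma, size=200_000)   # draws from N(mu, Sigma)

print(x.mean(axis=0))           # ~[1, -2]
print(np.cov(x, rowvar=False))  # ~Sigma
```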

39 The Normal and Related Distributions: The Chi-Square Distribution
If X ~ N(µ, Σ) is a k-dimensional multivariate normal vector and Y = a + BX, where B is an m × k matrix with full row rank, then Y ~ N(a + Bµ, BΣB') is an m-dimensional multivariate normal vector.
- An affine transformation of a multivariate normal vector is normally distributed.
Let X ~ N(0, I_r). Then Q = X'X is distributed chi-square with r degrees of freedom (or df for short), written as χ^2_r.
- Q is positive and asymmetric, with E[Q] = r and Var(Q) = 2r (why?). [Figure here]
If Z ~ N(0, A) with A > 0, q × q, then Z'A^{-1}Z ~ χ^2_q. (why?)
The t-distribution and F-distribution are also related to the normal distribution, but are used only when the sample size is small. In modern times, the sample size tends to be huge, so the chi-square distribution is the more suitable "large-sample" distribution.
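A simulation sketch (my addition, NumPy assumed) of the chi-square construction: Q = X'X with X ~ N(0, I_r) should have mean r and variance 2r; r = 4 and the sample size are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(3)
r = 4
x = rng.standard_normal((500_000, r))   # each row is a draw of X ~ N(0, I_r)
q = (x**2).sum(axis=1)                  # Q = X'X for each draw

print(q.mean(), r)        # ~4: E[Q] = r
print(q.var(), 2 * r)     # ~8: Var(Q) = 2r
```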

40 The Normal and Related Distributions
Figure: Chi-square density for r = 1, 2, 3, 4, 6, 9.
