
Random experiment

A random experiment is a process leading to an uncertain outcome, before the experiment is run. We usually assume that the experiment can be repeated indefinitely under essentially the same conditions. A basic outcome is a possible outcome of a random experiment.

The structure of a random experiment is characterized by three objects: the sample space S; the events set; the probability measure.

Examples of random experiment (a): BASIC EXPERIMENT
1. mix the tickets in the box;
2. randomly select ONE ticket;
3. read the value on the ticket.
OUTCOME: the number on the ticket.

Examples of random experiment (b): EXPERIMENT 1
1. run the basic experiment;
2. do NOT reinsert the ticket in the box;
3. run the basic experiment again.
OUTCOME: an ordered pair of numbers.

Examples of random experiment (c): EXPERIMENT 2
1. run the basic experiment;
2. reinsert the ticket in the box;
3. run the basic experiment again.
OUTCOME: an ordered pair of numbers.
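As a sketch of the three experiments (Python, not part of the original notes; the five-ticket box with three 0's and two 1's used in the probability calculations later in the notes is assumed):

```python
import random

BOX = [0, 0, 0, 1, 1]  # assumed composition: three tickets with 0, two with 1

def basic_experiment(box):
    """Mix the tickets, randomly select ONE ticket, read its value."""
    return random.choice(box)

def experiment_1(box=BOX):
    """Run the basic experiment twice WITHOUT reinserting the first ticket."""
    first, second = random.sample(box, 2)
    return (first, second)

def experiment_2(box=BOX):
    """Run the basic experiment twice, reinserting the ticket in between."""
    return (random.choice(box), random.choice(box))
```

Both two-draw experiments produce an ordered pair of numbers; they differ only in whether the first ticket is replaced.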

The sample space

The sample space S is the collection of all possible outcomes of a random experiment.
BASIC EXP: S = {0, 1}
EXP 1: S = {(0,0), (0,1), (1,0), (1,1)}
EXP 2: S = {(0,0), (0,1), (1,0), (1,1)}

The sample space: further examples
1. The tickets are drawn, with replacement, until a ticket with the number 1 is extracted. In this case the sample space S = {1, 01, 001, 0001, ...} is made up of a countably infinite number of outcomes.
2. Give a push to the hand of the wheel and record the number it points to. In this case the sample space is S = [0; 1), which is uncountably infinite.

Event of a random experiment
An event is a set of outcomes, that is, a subset of the sample space, to which a probability is assigned. Sometimes an event is described by means of a proposition; however, it is always possible to represent it formally by a set of outcomes. We denote an event by a capital letter, for instance E, and it holds that E ⊆ S. The tool used to deal with and describe the relationships existing between events is set theory. An event occurs if the random experiment results in one of its constituent basic outcomes.

Examples of events (for experiments 1 and 2)
A = "two tickets with the same value are extracted" = {(1,1), (0,0)}
B = "a ticket with the number 1 is obtained in the first extraction" = {(1,1), (1,0)}
C = "the product of the numbers on the extracted tickets is 0" = {(0,1), (1,0), (0,0)}
D = "a ticket with the number 2 is obtained in the first extraction" = ∅
E = "a ticket with a number smaller than 2 is obtained in the first extraction" = {(1,1), (1,0), (0,1), (0,0)} = S

The events set
The basic outcomes of an experiment are singleton sets and are also known as elementary events. The events set is the set of all possible events.

Union and intersection of events
If A and B are two events in a sample space S:
A ∪ B = "either A or B will occur", that is, the set of all outcomes in S that belong to either A or B;
A ∩ B = "both A and B will occur", that is, the set of all outcomes in S that belong to both A and B.
For instance, for experiments 1 and 2, if A = {(1,1), (0,0)} and B = {(1,1), (1,0)}, then
A ∪ B = {(1,1), (0,0), (1,0)} and A ∩ B = {(1,1)}.

Mutually exclusive and collectively exhaustive events
A and B are mutually exclusive events if they have no basic outcomes in common, that is, A ∩ B = ∅; mutually exclusive events are also called disjoint events. Let E1, E2, ..., Ek be k events of the sample space S. If such events completely cover the sample space, formally E1 ∪ E2 ∪ ... ∪ Ek = S, then they are called collectively exhaustive.

Three important events
Sure event: an event that always occurs, whatever the result of the experiment is. The sample space S is a sure event.
Impossible event: an event that never occurs, whatever the result of the experiment is. The empty set ∅ is an impossible event.
Complement: the complement of an event E, denoted by Ē, is the set of all basic outcomes in the sample space that do not belong to E. E will not occur if and only if Ē will occur. We can also write Ē = S\E, where \ is the set-difference operator.
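Set theory for events maps directly onto Python's built-in sets; a small sketch (not from the notes) with the events A and B of experiments 1 and 2:

```python
S = {(0, 0), (0, 1), (1, 0), (1, 1)}   # sample space of experiments 1 and 2
A = {(1, 1), (0, 0)}                   # two tickets with the same value
B = {(1, 1), (1, 0)}                   # first extraction gives a 1

A_or_B = A | B        # union: either A or B occurs
A_and_B = A & B       # intersection: both occur
not_A = S - A         # complement of A (set difference S \ A)

disjoint = (A & not_A == set())   # A and its complement are mutually exclusive
```

Since A ∪ Ā = S, the event and its complement are also collectively exhaustive.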

Assessing probability: basic experiment
Set of the events: {∅, {0}, {1}, {0,1}}
1. P(∅) = 0;
2. P({0}) = 3/5;
3. P({1}) = 2/5;
4. P({0,1}) = P(S) = 1.
Note that P(S) = P({0,1}) = P({0} ∪ {1}) = P({0}) + P({1}) = 3/5 + 2/5 = 1.

Assessing probability: experiment 1 (a)
The sample space S = {(0,0), (0,1), (1,0), (1,1)} is not made up of equally likely outcomes; write the sample space in a different way so as to have equally likely outcomes. Label the five tickets 0a, 0b, 0c, 1d, 1e; the 20 ordered pairs of distinct tickets are then equally likely:
(0a,0b) (0a,0c) (0a,1d) (0a,1e)
(0b,0a) (0b,0c) (0b,1d) (0b,1e)
(0c,0a) (0c,0b) (0c,1d) (0c,1e)
(1d,0a) (1d,0b) (1d,0c) (1d,1e)
(1e,0a) (1e,0b) (1e,0c) (1e,1d)

Assessing probability: experiment 1 (b)
Event of interest: {(0,0), (1,1)}. In the table above, 6 pairs give the value pair (0,0) and 2 pairs give (1,1), so P({(0,0), (1,1)}) = 8/20 = 2/5.

Number of possible orderings
The number of possible ways of arranging x objects in order is given by x! = x · (x−1) · (x−2) · ... · 2 · 1; x! is read "x factorial". Recall that 0! = 1.
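The equally-likely relabelling can be checked by enumeration; a sketch (assuming tickets relabelled 0a, 0b, 0c, 1d, 1e as in the notes):

```python
from fractions import Fraction
from itertools import permutations

tickets = ["0a", "0b", "0c", "1d", "1e"]
pairs = list(permutations(tickets, 2))   # 20 equally likely ordered pairs

# Event of interest: both extracted tickets carry the same value (0 or 1).
same_value = [(s, t) for s, t in pairs if s[0] == t[0]]
p_same = Fraction(len(same_value), len(pairs))   # classical probability
```

Counting favourable pairs over total pairs reproduces P({(0,0), (1,1)}) = 2/5.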

Permutations
The total number of permutations of x objects chosen from n, denoted by P^n_x, is the number of possible arrangements when x objects are to be selected from a total of n and arranged in order:
P^n_x = n · (n−1) · (n−2) · ... · (n−x+1)
Note that P^n_x = n!/(n−x)!.

Number of combinations
The number of combinations, denoted by C^n_x, of x objects chosen from n is the number of possible selections that can be made. This number is
C^n_x = n!/(x!(n−x)!)
Note that C^n_n = C^n_0 = 1; alternative notation: C^n_x = (n choose x).

Assessing probability (1)
Probability is the chance that an uncertain event will occur.

Assessing probability (2)
Classical probability: provided that all outcomes in the sample space are equally likely to occur, the probability of an event is the ratio between the number of outcomes that satisfy the event and the total number of outcomes in the sample space.
Relative frequency probability: when an experiment is performed, for any event only one of two possibilities can happen: it occurs or it does not occur. The relative frequency of occurrence of an event, in a number of repetitions of the experiment, is a measure of the probability of that event. More formally, frequentists see probability as the long-run expected frequency of occurrence.
Subjective probability: a probability derived from an individual's personal judgment about whether a specific outcome is likely to occur. Subjective probabilities involve no formal calculations and only reflect the subject's opinions and past experience.
BETTING APPROACH: find a specific amount to win or lose such that the decision maker is indifferent about which side of the bet to take.
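Python's standard library exposes these counting functions directly; a sketch with the five-ticket box (n = 5, x = 2):

```python
import math

n, x = 5, 2
perms = math.perm(n, x)   # P(n,x) = n!/(n-x)!: ordered arrangements
combs = math.comb(n, x)   # C(n,x) = n!/(x!(n-x)!): unordered selections
```

math.perm(5, 2) == 20 matches the 20 equally likely ordered pairs of experiment 1.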

Probability postulates
Probability is a function defined on the set of the events that associates to every event A a real number P(A) satisfying the following conditions:
1. P(A) ≥ 0
2. P(S) = 1
3. if A and B are disjoint (A ∩ B = ∅) then P(A ∪ B) = P(A) + P(B)

Probability rules
1. For every event A it holds that P(Ā) = 1 − P(A); this is called the complement rule.
2. P(∅) = 0.
3. For every event A it holds that P(A) ≤ 1.
4. If A = A1 ∪ A2 ∪ ... ∪ Ak with Ai ∩ Aj = ∅ for every i ≠ j, then P(A) = P(A1) + P(A2) + ... + P(Ak).
5. For every pair of events A and B it holds that P(A ∪ B) = P(A) + P(B) − P(A ∩ B); this is called the addition rule.

Conditional probability
For experiment 1 consider the following events:
A1 = "the result of the FIRST extraction is 0"
A2 = "the result of the SECOND extraction is 0"
A2|A1 = "the result of the SECOND extraction is 0 GIVEN that 0 is obtained in the FIRST extraction"
A1|A2 = "the result of the FIRST extraction is 0 GIVEN that 0 is obtained in the SECOND extraction"
P(A1) = 3/5, P(A2) = 3/5, P(A2|A1) = ?, P(A1|A2) = ?

Compute P(A2|A1)
Using the table of 20 equally likely ordered pairs of tickets:
P(A2|A1) = (# outcomes in A1 and A2) / (# outcomes in A1) = 6/12 = 1/2
= [(# outcomes in A1 and A2) / (# outcomes in S)] / [(# outcomes in A1) / (# outcomes in S)] = P(A2 ∩ A1) / P(A1)
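The conditional probability P(A2|A1) = 1/2 can be verified by counting over the 20 equally likely ticket pairs (a sketch, reusing the 0a...1e labels):

```python
from fractions import Fraction
from itertools import permutations

pairs = list(permutations(["0a", "0b", "0c", "1d", "1e"], 2))

A1 = [p for p in pairs if p[0].startswith("0")]        # first extraction is 0
both = [p for p in A1 if p[1].startswith("0")]         # first AND second are 0

p_A1 = Fraction(len(A1), len(pairs))                   # P(A1)
p_A2_given_A1 = Fraction(len(both), len(A1))           # conditional by counting
```

Dividing the joint relative frequency by P(A1) gives the same value, matching the ratio definition of conditional probability.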

Computing P(A1|A2)
From the same table of 20 equally likely pairs:
P(A1|A2) = (# outcomes in A1 and A2) / (# outcomes in A2) = 6/12 = 1/2
= [(# outcomes in A1 and A2) / (# outcomes in S)] / [(# outcomes in A2) / (# outcomes in S)] = P(A1 ∩ A2) / P(A2)

Multiplication rule
For every pair of events A and B the probability of A given B can be computed as
P(A|B) = P(A ∩ B) / P(B)
so that
P(A ∩ B) = P(A|B) · P(B)
or, equivalently, P(A ∩ B) = P(B|A) · P(A).

Independence
Two events A and B are said to be independent if
P(A|B) = P(A) or, equivalently, P(B|A) = P(B).
If two events A and B are independent, then the multiplication rule simplifies to
P(A ∩ B) = P(A) · P(B)
The reverse implication also holds true, that is, the factorization P(A ∩ B) = P(A) · P(B) is a sufficient condition to prove that A and B are independent.

Example with experiment 2
With replacement, the 25 ordered pairs of tickets (0a, 0b, 0c, 1d, 1e, repetitions allowed) are equally likely. In this case it holds that
P(A1) = 3/5, P(A2) = 3/5, P(A2|A1) = 3/5, P(A1|A2) = 3/5
so A1 and A2 are independent.

Law of total probability
If the events A1, A2, ..., Ak form a partition of the sample space, so that
1. S = A1 ∪ A2 ∪ ... ∪ Ak (collectively exhaustive);
2. Ai ∩ Aj = ∅ for every i ≠ j (mutually exclusive);
then for every event B it holds that
P(B) = P(B ∩ A1) + P(B ∩ A2) + ... + P(B ∩ Ak) = P(B|A1)P(A1) + P(B|A2)P(A2) + ... + P(B|Ak)P(Ak)

Bayes theorem
Bayes' formula provides an alternative way to compute conditional probabilities. For every pair of events A and B it holds that
P(A|B) = P(B|A)P(A) / P(B)
Typically the denominator can be computed by applying the law of total probability:
P(B) = P(B|A)P(A) + P(B|Ā)P(Ā)

Which experiment?
One of my friends carries out either experiment 1 or experiment 2; it is unknown which experiment has been carried out, and P(E1) = P(E2) = 1/2. The result of the experiment is (0,0).
QUESTION: which experiment is more likely to have been executed?
SOLUTION: it is necessary to compute P(E1|{(0,0)}) and P(E2|{(0,0)}).

The envelopes riddle
Suppose you're on a game show, and you're given the choice of three labeled envelopes: A, B, C. Two envelopes are empty and one contains the money prize. The host knows where the money is. You choose one of the three envelopes, say A; the host opens one of the remaining envelopes, say C, and shows that it is empty; now you are allowed to switch your envelope with the host's, that is, take B and hand A to the host.
QUESTION: is it better for you to switch, or better not to switch?
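For the "which experiment?" question, Bayes' theorem with the total-probability denominator gives the answer; a sketch assuming the five-ticket box (three 0's, two 1's):

```python
from fractions import Fraction

# Likelihood of observing (0,0) under each experiment.
lik_E1 = Fraction(3, 5) * Fraction(2, 4)   # without replacement: 3/5 * 2/4
lik_E2 = Fraction(3, 5) * Fraction(3, 5)   # with replacement:    3/5 * 3/5
prior = Fraction(1, 2)                     # P(E1) = P(E2) = 1/2

evidence = lik_E1 * prior + lik_E2 * prior   # law of total probability
post_E1 = lik_E1 * prior / evidence          # Bayes' theorem
post_E2 = lik_E2 * prior / evidence
```

The posterior favours experiment 2 (6/11 versus 5/11): repeated 0's are slightly more likely when the first 0 is put back in the box.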

The rare diseases problem (1)
The accuracy of a medical diagnostic test, in which a positive result indicates the presence of a disease, is often stated in terms of its sensitivity, the proportion of diseased people that test positive, and its specificity, the proportion of people without the disease who test negative.
D = "a person has the disease"; + = "a person's test result is POSITIVE"; − = "a person's test result is NEGATIVE".
SENSITIVITY: probability that the test result is positive for a person who has the disease, P(+|D);
SPECIFICITY: probability that the test result is negative for a person who does not have the disease, P(−|D̄).
For instance: P(D) = 1/1000, P(+|D) = 0.99, P(−|D̄) = 0.98.

The rare diseases problem (2)
QUESTION: a person's test result is positive. What is the probability that the person actually has the disease, P(D|+) = ?

Random variables
Aim: define tools that make it possible
1. to deal more easily and effectively with random experiments;
2. to develop a general theory that can be applied to all the random experiments that share a common probabilistic structure (even though apparently distinct from each other).

Example of random variable: a gambling game
Roughly speaking, a random variable is a numerical description of the outcome of an experiment. Consider the game: 3 draws with replacement; receive one euro for every 1 extracted; pay one euro for every 0 extracted.
outcome → gain
(0,0,0) → −3
(1,0,0), (0,1,0), (0,0,1) → −1
(1,1,0), (0,1,1), (1,0,1) → 1
(1,1,1) → 3
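P(D|+) follows from Bayes' theorem; a sketch in which the prevalence 1/1000 is an assumption (the exact figure is garbled in the source), while sensitivity 0.99 and specificity 0.98 are taken from the notes:

```python
p_D = 0.001          # ASSUMED prevalence P(D); illustrative value
sens = 0.99          # P(+|D), sensitivity
spec = 0.98          # P(-|not D), specificity

# Law of total probability: P(+) = P(+|D)P(D) + P(+|not D)P(not D)
p_pos = sens * p_D + (1 - spec) * (1 - p_D)

# Bayes' theorem: P(D|+) = P(+|D)P(D) / P(+)
p_D_given_pos = sens * p_D / p_pos
```

With these numbers the posterior is only about 4.7%: for a rare disease, most positive results are false positives even with an accurate test.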

Definition of random variable
DEFINITION: a random variable is a function from the sample space to the real line, i.e. a function that maps every element of the sample space onto a single real number: X(s) ∈ IR.
The value taken by a random variable depends on the outcome of the experiment, and it is not known before the experiment is performed; it is important to distinguish between a random variable and the possible values that it can take. Capital letters, such as X, are used to denote random variables; the corresponding lowercase letter, x, denotes a possible value. In the example of the gambling game, if X is the random variable corresponding to the gain, then X((0,0,0)) = −3, X((0,1,1)) = 1, etc.

Discrete vs continuous random variables
A random variable is said to be CONTINUOUS if it can take on any numerical value in an interval or collection of intervals; a random variable is said to be DISCRETE if it can take on either a finite number of values or a countable number of values. Every value of a discrete random variable can be associated with a probability value:
outcome | gain | probability
(0,0,0) | −3 | 1/8
(1,0,0), (0,1,0), (0,0,1) | −1 | 3/8
(1,1,0), (0,1,1), (1,0,1) | 1 | 3/8
(1,1,1) | 3 | 1/8

Probability distribution
A probability distribution is a function that describes the probability of a random variable taking certain values. For the gambling game:
gain | −3 | −1 | 1 | 3
prob | 1/8 | 3/8 | 3/8 | 1/8

Characterization
A discrete random variable X is characterized by its support, denoted by S_X and defined as the set of all possible values which the random variable can take on, and by its probability mass function or, shortly, its probability function; in general:
values of X | x1 | x2 | x3 | ...
P(X = x) | P(X = x1) | P(X = x2) | P(X = x3) | ...

The probability mass function (pmf)
DEFINITION: the probability mass function of a discrete random variable X is a function defined on S_X that gives the probability that X is exactly equal to x ∈ S_X; formally,
p(x) = P(X = x) for every x ∈ S_X
Properties of the probability mass function:
1. p(x) ≥ 0
2. Σ_{x ∈ S_X} p(x) = 1
Any function defined on S_X that fulfills the two properties above is a probability mass function for X.

The (cumulative) distribution function (cdf)
DEFINITION: for every real value x ∈ IR the cumulative distribution function of X is defined as
F(x) = P(X ≤ x) = Σ_{y ∈ S_X; y ≤ x} p(y)
Properties of the distribution function:
1. F(x) is (not necessarily strictly) non-decreasing;
2. lim_{x→−∞} F(x) = 0 and lim_{x→+∞} F(x) = 1;
3. F(x) is right-continuous, that is, lim_{x→x0⁺} F(x) = F(x0).
Every function that satisfies the three properties above is a distribution function.

Example of probability distribution function
For the gambling game, the distribution function is discontinuous at the points −3, −1, 1, 3 and constant in between. At the discontinuity points it takes the values 1/8, 4/8, 7/8 and 1.

Expected value of a discrete random variable
The expected value (or mean) of a discrete random variable X is the number
E(X) = μ_X = Σ_{x ∈ S_X} x · p(x)
The expected value is a measure of central tendency of the probability distribution; note the similarity with the mean of a population. The expected value can be thought of as the arithmetic mean of an infinite number of realizations of the random variable. For the gambling game example:
E(X) = (−3)·(1/8) + (−1)·(3/8) + 1·(3/8) + 3·(1/8) = 0
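The pmf, cdf and expected value of the gambling game can be tabulated directly (a sketch, not from the notes):

```python
from fractions import Fraction as F

pmf = {-3: F(1, 8), -1: F(3, 8), 1: F(3, 8), 3: F(1, 8)}   # gain -> probability

def cdf(x):
    """F(x) = P(X <= x), summing the pmf over the support."""
    return sum(p for value, p in pmf.items() if value <= x)

expected = sum(value * p for value, p in pmf.items())       # E(X)
```

The cdf is a step function: it stays constant between the support points and jumps by p(x) at each of them.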

The variance of a discrete random variable
The variance of a discrete random variable X is the number
Var(X) = σ²_X = E{[X − E(X)]²} = Σ_{x ∈ S_X} (x − μ)² p(x)
The variance is a measure of dispersion (around the expected value) of the probability distribution. Similarly to the result shown for the variance of a population, it holds that
Var(X) = E(X²) − E(X)² = Σ_{x ∈ S_X} x² p(x) − E(X)²

The standard deviation of a random variable
The variance is not expressed in the same units as the random variable, but in squared units. Therefore, it is necessary to transform its value by computing the square root to obtain the standard deviation of the variable:
SD(X) = √Var(X)
In the gambling game example
E(X²) = 9·(1/8) + 1·(3/8) + 1·(3/8) + 9·(1/8) = 3
hence the variance is Var(X) = 3 − 0² = 3 and the standard deviation is SD(X) = √3 ≈ 1.73.

The discrete uniform distribution
The discrete uniform random variable X takes on a finite number of values x1, ..., xK with constant probabilities all equal to 1/K, that is:
values | x1 | x2 | ... | xK
prob | 1/K | 1/K | ... | 1/K
Hence its probability function can be written as p(xi) = 1/K for i = 1, ..., K, whereas its cumulative distribution function is
F(x) = (number of xi ≤ x)/K for x ∈ IR

Example of discrete uniform distribution (1)
The random variable X relative to the roll of one die has discrete uniform distribution with values x1 = 1, x2 = 2, x3 = 3, x4 = 4, x5 = 5, x6 = 6 and p(xi) = 1/6. (Figure: probability mass function and distribution function.)

Example of discrete uniform distribution (2)
The expected value of the random variable X relative to the roll of a die is
E(X) = (1/6) Σ_{i=1}^{6} i = 3.5
Furthermore,
E(X²) = (1/6) Σ_{i=1}^{6} i² = 91/6 ≈ 15.17
Var(X) = 91/6 − 3.5² ≈ 2.92
SD(X) = √2.92 ≈ 1.71

Bernoulli distribution (1)
Experiment: box with r + s tickets, r with 1 and s with 0; the proportion of 1's in the box is π = r/(r + s); extract ONE ticket.
Random variable: Y = number on the extracted ticket.

Bernoulli distribution (2)
The support of Y is {0, 1} and its probability function is
p(y) = π^y (1 − π)^(1−y), that is, p(1) = π and p(0) = 1 − π.
The expected value of Y is E(Y) = π and the variance is Var(Y) = π(1 − π). (Figure: graphical representation of the probability distribution for several values of π.)

Binomial distribution (1)
Experiment: box with r + s tickets, r with 1 and s with 0; the proportion of tickets with 1 is π = r/(r + s); n tickets are extracted with replacement.
Random variable X = (two equivalent definitions)
1. sum of the values on the extracted tickets;
2. number of tickets with 1.
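The die-roll moments can be reproduced exactly with rational arithmetic (a sketch; 2.92 and 1.71 in the notes are rounded values of 35/12 and its square root):

```python
import math
from fractions import Fraction as F

faces = range(1, 7)
p = F(1, 6)                                    # discrete uniform on {1,...,6}

mean = sum(i * p for i in faces)               # E(X)
second_moment = sum(i * i * p for i in faces)  # E(X^2)
variance = second_moment - mean ** 2           # Var(X) = E(X^2) - E(X)^2
sd = math.sqrt(variance)                       # SD(X)
```

Working in fractions avoids the rounding that plain floats would introduce in the intermediate 91/6.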

Binomial distribution (2)
The support of X is {0, 1, ..., n} and its probability function is
p(x) = C(n,x) π^x (1 − π)^(n−x) for x = 0, 1, ..., n
The expected value of X is E(X) = n·π and the variance is Var(X) = n·π(1 − π). (Figure: probability distribution for several values of π with n = 10.)

Binomial distribution (3)
General formulation: a random experiment with two possible outcomes, coded as SUCCESS and FAILURE, with P(SUCCESS) = π; the experiment is repeated n times under the same conditions and independently; X = exact number of successes in the n trials. Then X follows a binomial distribution with parameters n and π.

Discrete random variables: notation
If the distribution of X is discrete uniform, then we write X ~ Ud{x1, ..., xK}; if the distribution of Y is Bernoulli with parameter π, then we write Y ~ Bernoulli(π) or, more compactly, Y ~ Be(π); if the distribution of X is binomial with parameters n and π, then we write X ~ Binomial(n, π) or, more compactly, X ~ Bin(n, π).

Continuous random variables
Experiment: give a push to the hand of the wheel; X = value pointed by the hand when it stops.
Examples of events:
X < 0.5 ↔ X ∈ [0; 0.5)
0.4 < X ≤ 0.7 ↔ X ∈ (0.4; 0.7]
X = 0.5 ↔ X ∈ {0.5}
X ≠ 0.5 ↔ X ∈ [0; 0.5) ∪ (0.5; 1)
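The binomial pmf and its moments can be checked numerically (a sketch; n = 10 and π = 2/5, the proportion of 1's in the five-ticket box, are illustrative choices):

```python
import math

def binom_pmf(x, n, pi):
    """p(x) = C(n,x) * pi^x * (1-pi)^(n-x)."""
    return math.comb(n, x) * pi**x * (1 - pi)**(n - x)

n, pi = 10, 0.4
probs = [binom_pmf(x, n, pi) for x in range(n + 1)]

mean = sum(x * p for x, p in enumerate(probs))
var = sum(x * x * p for x, p in enumerate(probs)) - mean**2
```

Summing over the support confirms that the probabilities add to 1 and that the moments match E(X) = nπ and Var(X) = nπ(1 − π).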

Assessing the probability of events
If all the points are equally likely (same probability of being pointed to):
P(X < 0.5) = 0.5
P(0.4 < X ≤ 0.7) = 0.7 − 0.4 = 0.3
P(X = 0.5) = ?
Consider the interval [0.5 − ε/2; 0.5 + ε/2], where ε > 0 is a small number; then
P(0.5 − ε/2 ≤ X ≤ 0.5 + ε/2) = ε
and for ε → 0 one obtains P(X = 0.5) = 0. Furthermore,
P(X ≠ 0.5) = 1 − P(X = 0.5) = 1.

Some comments
Even though we know for sure that X will take on some real number, the probability that it takes any fixed real number is equal to zero; as a consequence, it is not possible to describe the probabilistic structure of a continuous random variable by means of a probability mass function.
OBJECTIVE: identify an effective way to describe the probabilistic structure of a continuous random variable.

The cumulative distribution function
An event of a continuous random variable can always be represented by an interval or by the union of disjoint intervals; the probability of any event can be computed from the probability of the events corresponding to the following family of intervals:
(−∞; x] for x ∈ IR
or, equivalently, from the cumulative distribution function of X:
F(x) = P(X ≤ x) for x ∈ IR
In the continuous case, the cumulative distribution function is characterized by the same three properties as in the discrete case.

Example of cumulative distribution function
For the experiment of the wheel, for x ∈ [0; 1) it holds that
F(x) = P(X ≤ x) = x
so that
F(x) = 0 for x < 0; F(x) = x for 0 ≤ x < 1; F(x) = 1 for x ≥ 1.

Probability of an interval
How can P(a < X ≤ b) be computed? Since
F(b) = P(X ≤ b) = P(X ∈ (−∞; a] ∪ (a, b]) = P(X ∈ (−∞; a]) + P(X ∈ (a, b]) = P(X ≤ a) + P(a < X ≤ b) = F(a) + P(a < X ≤ b)
it follows that
P(a < X ≤ b) = F(b) − F(a)

The probability density function (pdf)
The probability density function of a continuous random variable X is defined as
f(x) = (d/dx) F(x)
By the fundamental theorem of integral calculus it holds that
∫_a^b f(x) dx = F(b) − F(a) = P(a < X ≤ b)
and, consequently,
1. f(x) ≥ 0 for every x ∈ IR
2. ∫_{−∞}^{+∞} f(x) dx = 1
Note that it is NOT required that f(x) ≤ 1, because the values of a probability density function are not probabilities.

Example of probability density function
In the experiment of the wheel
f(x) = (d/dx) F(x) = 1 for x ∈ [0; 1)
and zero otherwise. Hence
P(0.3 ≤ X ≤ 0.7) = ∫_{0.3}^{0.7} dx = 0.4
P(0.3 < X < 0.7) = 0.4
P(0.3 ≤ X < 0.7) = 0.4
P(0.3 < X ≤ 0.7) = 0.4

Interpretation of the density function
The values of the density function are not probabilities: f(a) ≠ P(X = a); note that
P(X = a) = ∫_a^a f(x) dx = 0
However, for a small ε > 0 it holds that
P(a − ε/2 ≤ X ≤ a + ε/2) = ∫_{a−ε/2}^{a+ε/2} f(x) dx ≈ ε·f(a)
and therefore the probability that the outcome of the experiment is a value close to a point with higher density is larger than the corresponding probability for a point with lower density.

From the density function to the cumulative distribution function
The cumulative distribution function can be computed from the density function as follows:
F(x) = ∫_{−∞}^{x} f(t) dt for x ∈ IR

Expected value and variance of a continuous random variable
The expected value (or mean) of a continuous random variable X is the number
E(X) = μ_X = ∫_{−∞}^{+∞} x f(x) dx
The variance is
Var(X) = σ²_X = E{[X − E(X)]²} = ∫_{−∞}^{+∞} (x − μ)² f(x) dx
and the standard deviation is SD(X) = σ_X = √Var(X).

The continuous uniform distribution
For an interval [a; b], the continuous uniform distribution has density function
f(x) = 1/(b − a) for x ∈ [a, b]
and zero outside the interval;
E(X) = (b + a)/2 and Var(X) = (b − a)²/12
We write X ~ U(a; b).

The exponential distribution
A random variable X has exponential distribution with parameter λ if its support is the interval [0, ∞) and it has
probability density function: f(x) = λe^{−λx}
cumulative distribution function: F(x) = 1 − e^{−λx}
We write X ~ Exp(λ).

Exp(1)
For λ = 1 the density function of the exponential distribution is f(x) = e^{−x}. For instance,
E(X) = ∫_0^∞ x e^{−x} dx = 1
Var(X) = ∫_0^∞ x² e^{−x} dx − E(X)² = 2 − 1 = 1

The memoryless property
Let X be the random variable associated with the arrival time of a given process; for instance,
1. the time it takes before your next telephone call;
2. the time until default (on payment to company debt holders) in reduced-form credit risk modeling;
3. the time until a radioactive particle decays, or the time between clicks of a geiger counter.
The memoryless property means that the future is independent of the past, i.e. the fact that an event hasn't happened yet tells us nothing about how much longer it will take before it does happen. This says that the conditional probability that we need to wait, for example, more than another 10 seconds before the first arrival, given that the first arrival has not yet happened after 30 seconds, is equal to the initial probability that we need to wait more than 10 seconds for the first arrival.

Mathematical formulation of the memoryless property
PROBLEM: we want to characterize the probability distribution of a random variable X describing the arrival time of a memoryless process. In mathematical terms:
P(X > x + y | X > x) = P(X > y) for every x, y > 0
That is:
P(X > x + y) = P(X > x + y | X > x) P(X > x) = P(X > y) P(X > x)
PROBLEM: P(X > x) must be a function G(·) such that
G(x + y) = G(x) G(y) for every x, y > 0
SOLUTION: G(x) = e^{Cx}, and G(x) = e^{Cx} is a probability for C < 0. Hence, if λ > 0 we can write
P(X > x) = e^{−λx}
Then X ~ Exp(λ) and F(x) = 1 − P(X > x) = 1 − e^{−λx}.
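The memoryless identity can be checked numerically on the exponential survival function (a sketch; λ = 0.5 and the 30/10-second waits are illustrative choices):

```python
import math

lam = 0.5                        # illustrative rate; any lam > 0 works

def survival(x):
    """P(X > x) = exp(-lam * x) for X ~ Exp(lam)."""
    return math.exp(-lam * x)

x, y = 30.0, 10.0
lhs = survival(x + y) / survival(x)   # P(X > x+y | X > x)
rhs = survival(y)                     # P(X > y)
```

The ratio of survival probabilities collapses to survival(y) because e^{−λ(x+y)}/e^{−λx} = e^{−λy}, which is exactly the functional equation G(x+y) = G(x)G(y).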

Transformations of random variables
Let X be a random variable (discrete or continuous) and Y = g(X) a function of X, for instance:
1. Y = 5X + 3 (linear transformation);
2. Y = X² + 1 (non-linear transformation).
The expected value of Y is E(Y) = E[g(X)] and, in general, E[g(X)] ≠ g(E[X]); but it holds that
E(Y) = Σ_{x ∈ S_X} g(x) p(x) and E(Y) = ∫_{−∞}^{+∞} g(x) f(x) dx
for the discrete and continuous case respectively.

Linear transformations
If Y is a linear transformation of X, that is, Y = aX + b, then
1. E(Y) = E(aX + b) = aE(X) + b
2. Var(Y) = Var(aX + b) = a² Var(X)

Example of linear transformation
For X ~ Exp(1) let Y = X/λ. Then
F_Y(y) = P(Y ≤ y) = P(X/λ ≤ y) = P(X ≤ λy) = F_X(λy) = 1 − e^{−λy}
RESULT: Y ~ Exp(λ)
APPLICATION: E(Y) = 1/λ and Var(Y) = 1/λ².

The standardization
A specially relevant linear transformation is called the STANDARDIZATION of a random variable X and is given by
Z = (X − E(X)) / SD(X)
It is easy to see that
1. E(Z) = 0;
2. Var(Z) = SD(Z) = 1.

Introduction to the normal (or Gaussian) distribution
The normal distribution is considered the most prominent probability distribution in statistics. There are several reasons for this:
1. the bell shape of the normal distribution makes it a convenient choice for modelling a large variety of random variables encountered in practice;
2. the normal distribution arises as the outcome of the central limit theorem, which states that under mild conditions the sum of a large number of random variables is distributed approximately normally;
3. the normal distribution is very tractable analytically, that is, a large number of results involving this distribution can be derived in explicit form.

The STANDARD normal distribution
A random variable Z has standard normal distribution if its density function is
f(z) = (1/√(2π)) exp{−z²/2}
Features of the standard normal distribution:
1. E(Z) = 0 and Var(Z) = SD(Z) = 1, and we write Z ~ N(0, 1);
2. it is symmetric around the mean and unimodal;
3. its pdf is strictly positive for every z ∈ IR, but the area under the curve outside the interval (−4; 4) is close to zero.

Probability of some relevant intervals
area between −1 and 1: 68.26%
area between −2 and 2: 95.44%
area between −1.96 and 1.96: 95%
area between −3 and 3: 99.73%

The normal distribution
Strictly speaking, it is not correct to talk about "the" normal distribution, since there are many normal distributions; normal distributions can differ in their means and in their standard deviations. The probability density function of a random variable X with (arbitrary) normal distribution is
f(x) = (1/(σ√(2π))) exp{−(x − μ)²/(2σ²)}
with E(X) = μ and Var(X) = σ²; we write X ~ N(μ, σ²).
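The tabulated central areas follow from the standard normal cdf, which the standard library can express via the error function (a sketch, not from the notes):

```python
import math

def Phi(z):
    """Standard normal cdf: Phi(z) = (1 + erf(z / sqrt(2))) / 2."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def central_area(z):
    """P(-z <= Z <= z) for Z ~ N(0,1), by symmetry of the density."""
    return Phi(z) - Phi(-z)
```

Evaluating central_area at 1, 2, 1.96 and 3 reproduces the 68.26%, 95.44%, 95% and 99.73% figures.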

Normal distribution and linear transformations
If Z ~ N(0, 1) then σZ + μ = X ~ N(μ, σ²); if X ~ N(μ, σ²) then (X − μ)/σ = Z ~ N(0, 1). Hence for the pdf of X it holds that:
1. the area between μ − σ and μ + σ is 68.26%;
2. the area between μ − 2σ and μ + 2σ is 95.44%;
3. the area between μ − 1.96σ and μ + 1.96σ is 95%;
4. the area between μ − 3σ and μ + 3σ is 99.73%.
The cumulative distribution function of Z is denoted by Φ(z).

Multiple random variables: the bivariate case
Let X1 and X2 be two random variables. X1 and X2 are INDEPENDENT if and only if
P({X1 ≤ x1} ∩ {X2 ≤ x2}) = P(X1 ≤ x1) · P(X2 ≤ x2)
for every pair x1 and x2. Two (or more) random variables are IDENTICALLY DISTRIBUTED if they have the same probability distribution; iid = independent and identically distributed.

Linear combination of two random variables
Let X1 and X2 be two random variables. A linear combination of X1 and X2 is a random variable defined as
Y = a1·X1 + a2·X2 + b
where a1, a2 and b are real constants.
EXAMPLE: let X1 and X2 be the results of two die rolls (a black and a white die, say). X1 and X2 are iid and Y = X1 + X2 is the linear combination corresponding to the sum of the two resulting values.

Expected value of a linear combination of two random variables
The expected value of Y = a1·X1 + a2·X2 + b is
E(Y) = a1·E(X1) + a2·E(X2) + b
EXAMPLE: if X1 and X2 are the results of the black and white die rolls, then E(X1 + X2) = 3.5 + 3.5 = 7 and E(X1 − X2) = 0.

Variance of a linear combination of two random variables
If X1 and X2 are INDEPENDENT, then the variance of Y = a1·X1 + a2·X2 + b is
Var(Y) = a1²·Var(X1) + a2²·Var(X2)
EXAMPLE: if X1 and X2 are the results of the black and white die rolls, then
Var(X1 + X2) = Var(X1) + Var(X2) = 2.92 + 2.92 = 5.84
Var(X1 − X2) = Var(X1) + Var(X2) = 2.92 + 2.92 = 5.84

Multiple random variables
Consider the sequence of random variables X1, X2, ..., Xn. These n random variables are MUTUALLY INDEPENDENT if and only if for every x1, ..., xn it holds that
P(X1 ≤ x1 ∩ X2 ≤ x2 ∩ ... ∩ Xn ≤ xn) = P(X1 ≤ x1) · P(X2 ≤ x2) · ... · P(Xn ≤ xn)

iid random variables: examples
E1: A box contains tickets labeled with either 0 or 1. Let π be the proportion of tickets with 1; n tickets are extracted with replacement from the box. For i = 1, ..., n let Xi denote the result of the ith extraction. The n random variables are iid with Xi ~ Be(π).
E2: The experiment of the wheel is repeated n times. For i = 1, ..., n let Xi denote the result of the ith repetition of the experiment. The n random variables are iid with Xi ~ U(0; 1).
E3: n married couples go to a dinner. Husbands sit on one side of the table, and wives on the opposite side. Everybody chooses her/his seat randomly. For i = 1, ..., n let Xi be equal to 1 if the ith couple is sitting opposite each other and 0 otherwise. These n random variables are identically distributed with Xi ~ Be(1/n) but NOT INDEPENDENT.

Linear combination of random variables
A linear combination of the random variables X1, ..., Xn is a random variable defined as
Y = a1·X1 + a2·X2 + ... + an·Xn + b
where a1, ..., an, b are real constants. Examples:
E1: the total number of tickets with 1 in the n extractions is Y = X1 + ... + Xn, and its distribution is Bin(n, π);
E2: the sum of the n results of the experiment is Y = X1 + ... + Xn;
E3: the number of married couples sitting opposite each other is Y = X1 + ... + Xn, but Y is NOT a binomial random variable.

Expected value of a linear combination of random variables
The expected value of Y = a1·X1 + a2·X2 + ... + an·Xn + b is
E(Y) = a1·E(X1) + a2·E(X2) + ... + an·E(Xn) + b
and, more specifically, if X1, ..., Xn are identically distributed with E(Xi) = μ and Y = X1 + X2 + ... + Xn, it holds that
E(Y) = n·μ
EXAMPLES:
E1: E(Xi) = π and therefore E(Y) = n·π;
E2: E(Xi) = 1/2 and therefore E(Y) = n/2;
E3: E(Xi) = 1/n and therefore E(Y) = 1.

Variance of a linear combination of random variables
If the random variables X1, ..., Xn are INDEPENDENT, the variance of Y = a1·X1 + a2·X2 + ... + an·Xn + b is equal to
Var(Y) = a1²·Var(X1) + a2²·Var(X2) + ... + an²·Var(Xn)
and, more specifically, if X1, ..., Xn are iid with Var(Xi) = σ² and Y = X1 + X2 + ... + Xn, it holds that
Var(Y) = n·σ²
EXAMPLES:
E1: Var(Xi) = π(1 − π) and therefore Var(Y) = nπ(1 − π);
E2: Var(Xi) = 1/12 and therefore Var(Y) = n/12;
E3: Var(Xi) = (1/n)(1 − 1/n); however, Var(Y) = ?: in this case independence does not hold.

Linear combination of normally distributed random variables
If X1, ..., Xn are INDEPENDENT and NORMALLY DISTRIBUTED, then the linear combination Y = a1·X1 + a2·X2 + ... + an·Xn + b has the following properties:
1. E(Y) = a1·E(X1) + a2·E(X2) + ... + an·E(Xn) + b;
2. Var(Y) = a1²·Var(X1) + a2²·Var(X2) + ... + an²·Var(Xn);
3. Y is normally distributed.
More specifically, if X1, ..., Xn are iid with E(Xi) = μ and Var(Xi) = σ² and, furthermore, Y = X1 + X2 + ... + Xn, then Y ~ N(n·μ; n·σ²).

The central limit theorem
If X1, ..., Xn are iid with E(Xi) = μ and Var(Xi) = σ², then the distribution of the random variable
Sn = X1 + X2 + ... + Xn
is approximately normal:
Sn ≈ N(n·μ; n·σ²)
The symbol ≈ means "approximately distributed as". The quality of the normal approximation increases with n, but also depends on the probability distribution of the Xi.

Central limit theorem: example

X1, ..., Xn are iid with density function f(x) = e^(−x) for x > 0. Hence, E(Xi) = 1 and Var(Xi) = 1. From the central limit theorem it follows that

X1 + X2 + ... + Xn = Sn ≈ N(n; n)

[Figure: the density of Sn compared with the N(n; n) density, for increasing values of n.]
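The improvement of the approximation with n can also be checked numerically (a sketch added here, not part of the original notes). For this density, Sn has a Gamma (Erlang) distribution whose CDF has a closed form, so the exact probability P(Sn ≤ n) can be compared with the value 0.5 given by the N(n; n) approximation at its mean:

```python
import math

def erlang_cdf(x, n):
    """Exact CDF of S_n = X_1 + ... + X_n for iid X_i with density e^(-x),
    x > 0:  P(S_n <= x) = 1 - e^(-x) * sum_{k=0}^{n-1} x^k / k!."""
    return 1.0 - math.exp(-x) * sum(x ** k / math.factorial(k)
                                    for k in range(n))

def normal_cdf(x, mu, var):
    """CDF of N(mu; var), written via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / math.sqrt(2.0 * var)))

# Error of the N(n; n) approximation at the mean x = n, for growing n.
for n in (4, 8, 20, 50):
    gap = abs(erlang_cdf(n, n) - normal_cdf(n, n, n))
    print(n, round(gap, 4))
```

The gap shrinks steadily as n grows, illustrating the statement above that the quality of the normal approximation increases with n.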


More information

Lecture 1: Probability Fundamentals

Lecture 1: Probability Fundamentals Lecture 1: Probability Fundamentals IB Paper 7: Probability and Statistics Carl Edward Rasmussen Department of Engineering, University of Cambridge January 22nd, 2008 Rasmussen (CUED) Lecture 1: Probability

More information

Discrete Random Variables

Discrete Random Variables CPSC 53 Systems Modeling and Simulation Discrete Random Variables Dr. Anirban Mahanti Department of Computer Science University of Calgary mahanti@cpsc.ucalgary.ca Random Variables A random variable is

More information

Math Camp. September 16, 2017 Unit 2. MSSM Program Columbia University Dr. Satyajit Bose

Math Camp. September 16, 2017 Unit 2. MSSM Program Columbia University Dr. Satyajit Bose Math Camp September 16, 2017 Unit 2 MSSM Program Columbia University Dr. Satyajit Bose Math Camp Interlude Partial Derivatives Recall:The derivative f'(x) of a function f(x) is the slope of f(x). Alternatively,

More information