Introduction to Probability and Stochastic Processes - Part I
Introduction to Probability and Stochastic Processes - Part I
Lecture 1
Henrik Vie Christensen, vie@control.auc.dk
Department of Control Engineering, Institute of Electronic Systems, Aalborg University, Denmark
Slides originally by: Line Ørtoft Endelt
(50 slides)
Set Definitions I
A set is a collection of elements. Sets are denoted by CAPITAL letters, and elements by small letters. The symbol ∈ means "is an element of", and ∉ means "is not an element of". The symbol ∅ denotes the empty set. The entire space is denoted by S (the Sample Space).
A set is countable if it is finite, or if its elements have a one-to-one correspondence with the integers.
A set A is contained in a set B, denoted A ⊂ B (or B ⊃ A), if every element of A is also in B.
The following three statements are always satisfied: A ⊂ S, ∅ ⊂ A and A ⊂ A.
Set Definitions II
A = B if and only if A ⊂ B and B ⊂ A.
The union of two sets A and B, denoted A ∪ B, is the set of all elements that belong to A or B or both.
The intersection of two sets A and B, denoted A ∩ B, is the set of elements that belong to both A and B.
Two sets A and B are mutually exclusive if A ∩ B = AB = ∅.
The complement, Ā, of a set A relative to a set S consists of the elements of S which are not in A.
Set Definitions III
The Commutative Laws:
A ∪ B = B ∪ A
A ∩ B = B ∩ A
The Associative Laws:
(A ∪ B) ∪ C = A ∪ (B ∪ C) = A ∪ B ∪ C
(A ∩ B) ∩ C = A ∩ (B ∩ C) = A ∩ B ∩ C
The Distributive Laws:
A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C)
A ∪ (B ∩ C) = (A ∪ B) ∩ (A ∪ C)
Set Definitions IV
De Morgan's Laws:
(A ∪ B)¯ = Ā ∩ B̄
(A ∩ B)¯ = Ā ∪ B̄
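The set identities above can be checked mechanically. As an illustration (not part of the original slides), here is a sketch using Python's built-in set type with small example sets:

```python
# Illustrative check of the set identities; S, A, B are example sets.
S = set(range(10))          # sample space
A = {1, 2, 3, 4}
B = {3, 4, 5, 6}

comp = lambda X: S - X      # complement relative to S

# De Morgan's laws
assert comp(A | B) == comp(A) & comp(B)
assert comp(A & B) == comp(A) | comp(B)

# A distributive law: A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C)
C = {7, 8}
assert A & (B | C) == (A & B) | (A & C)
print("all identities hold")
```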
The Sample Space
The Sample Space for an experiment is the set of all possible outcomes of the experiment. The Sample Space is denoted S. An event is a subset of S (including S itself).
A class 𝒮 of sets defined on S is called completely additive if
1. S ∈ 𝒮
2. If A_k ∈ 𝒮 for k = 1, 2, 3, ..., then ∪_{k=1}^n A_k ∈ 𝒮 for n = 1, 2, 3, ...
3. If A ∈ 𝒮, then Ā ∈ 𝒮, where Ā is the complement of A.
Probabilities of Random Events
Definition: A probability measure is a set function whose domain is a completely additive class 𝒮 of events defined on the sample space S, such that the measure satisfies the following conditions:
1. P(S) = 1
2. P(A) ≥ 0 for all A ∈ 𝒮
3. P(∪_{k=1}^N A_k) = Σ_{k=1}^N P(A_k) if A_i ∩ A_j = ∅ for i ≠ j, where N may be infinite
A random experiment is completely described by a sample space, a probability measure, and a class of sets forming the domain of the probability measure. The combination of these three items is called a probabilistic model.
Probabilities of Random Events
Relative Frequency Definition (probability based on experiment): n is the number of times the experiment is performed, and n_A is the number of times the outcome belongs to A ⊂ S.
P(A) = lim_{n→∞} n_A / n
Classical Definition: N is the total number of outcomes, and N_A is the number of outcomes that belong to A ⊂ S.
P(A) = N_A / N
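The relative-frequency definition suggests a simulation: estimate P(A) by n_A / n for large n. A minimal sketch (the event A = "even upface of a fair die" is an illustrative choice; the classical definition gives P(A) = 3/6 = 0.5):

```python
import random

# Relative-frequency estimate of P(A) for a fair die, A = {2, 4, 6}.
# n_A / n should approach the classical value 0.5 as n grows.
random.seed(0)  # fixed seed for reproducibility
n = 100_000
n_A = sum(1 for _ in range(n) if random.randint(1, 6) in {2, 4, 6})
print(f"relative frequency: {n_A / n:.3f}")   # close to 0.5
```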
Example where the classical definition fails
Willard H. Longcor's drilled-die experiment: a die with drilled pips was thrown over one million times, using a new die every 20,000 throws because the dice wore down.
[Table: total throws and relative frequency for each upface, compared with the classical probability 1/6; the drilled die's relative frequencies deviate from the classical values. Numeric values lost in transcription.]
Probabilities of Random Events I
The following laws can be shown using the definition of a probability measure:
1. P(∅) = 0
2. P(A) ≤ 1
3. P(Ā) = 1 − P(A)
4. If A ⊂ B then P(A) ≤ P(B)
5. P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
6. P(A ∪ B) ≤ P(A) + P(B)
Probabilities of Random Events II
7. If A_1 ∪ A_2 ∪ ... ∪ A_n = S and A_i ∩ A_j = ∅ if i ≠ j, then
P(A) = P(A ∩ S) = P(A ∩ (A_1 ∪ A_2 ∪ ... ∪ A_n))
     = P((A ∩ A_1) ∪ (A ∩ A_2) ∪ ... ∪ (A ∩ A_n))
     = P(A ∩ A_1) + P(A ∩ A_2) + ... + P(A ∩ A_n)
8. P(∪_{i=1}^n A_i) = P(A_1) + P(Ā_1 ∩ A_2) + P(Ā_1 ∩ Ā_2 ∩ A_3) + ... + P(A_n ∩ ∩_{i=1}^{n−1} Ā_i)
Joint and Marginal Probability I
Consider experiment E_1 having sample space S_1 consisting of outcomes a_1, a_2, ..., a_{n_1}, and experiment E_2 having sample space S_2 consisting of outcomes b_1, b_2, ..., b_{n_2}. The joint sample space of the experiments is defined as
S = S_1 × S_2 = {(a_i, b_j) : i = 1, 2, ..., n_1, j = 1, 2, ..., n_2}
The probability of A_i ∩ B_j is called the joint probability, and is denoted P(A_i ∩ B_j), often abbreviated P(A_i B_j).
Joint and Marginal Probability II
If the events A_1, ..., A_n of the experiment E_1 are mutually exclusive and exhaustive, then for the event B_j ⊂ S_2, and S = S_1 × S_2:
P(B_j) = P(B_j ∩ S) = P(B_j ∩ (A_1 ∪ A_2 ∪ ... ∪ A_n)) = Σ_{i=1}^n P(A_i B_j)
Since B_j is associated with subexperiment E_2, P(B_j) is called a marginal probability.
Conditional Probability
Using the following definitions of probability
P(AB) = N_AB / N,  P(A) = N_A / N
the probability that B will happen given that A has happened is
P(B|A) = N_AB / N_A = (N_AB / N) / (N_A / N)
and the conditional probability of B given A is defined as
P(B|A) = P(AB) / P(A)
Joint, Marginal and Conditional Probability
Relationships:
1. P(AB) = P(A|B)P(B) = P(B|A)P(A)
2. If A ∩ B = ∅, then P(A ∪ B | C) = P(A|C) + P(B|C)
3. P(ABC) = P(A)P(B|A)P(C|AB)
4. If B_1, B_2, ..., B_m are mutually exclusive and exhaustive, then
P(A) = Σ_{j=1}^m P(A|B_j)P(B_j)
5. Bayes' rule:
P(B_j|A) = P(A|B_j)P(B_j) / Σ_{j=1}^m P(A|B_j)P(B_j) = P(A|B_j)P(B_j) / P(A)
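Relationships 4 and 5 (total probability and Bayes' rule) can be sketched numerically. The priors and likelihoods below are illustrative numbers, not from the slides:

```python
# Total probability and Bayes' rule for three mutually exclusive,
# exhaustive causes B_j and an event A. Numbers are illustrative.
P_B = [0.5, 0.3, 0.2]             # priors P(B_j); must sum to 1
P_A_given_B = [0.10, 0.40, 0.70]  # likelihoods P(A|B_j)

# Law of total probability: P(A) = sum_j P(A|B_j) P(B_j)
P_A = sum(pa * pb for pa, pb in zip(P_A_given_B, P_B))

# Bayes' rule: P(B_j|A) = P(A|B_j) P(B_j) / P(A)
posterior = [pa * pb / P_A for pa, pb in zip(P_A_given_B, P_B)]

print(round(P_A, 3))   # 0.05 + 0.12 + 0.14 = 0.31
print([round(p, 3) for p in posterior])
assert abs(sum(posterior) - 1.0) < 1e-12   # posteriors sum to 1
```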
Example on joint and marginal probabilities
A number of components from manufacturers M_i, i = 1, ..., 4, is grouped into the classes of defects B_j, j = 1, ..., 5.
[Table: counts per manufacturer M_i and defect class B_j, with row totals, column totals and a grand total. Numeric values lost in transcription.]
a) Probability of being from manufacturer M_2 and having the defect B_1: the joint probability P(M_2 B_1), the count in cell (M_2, B_1) divided by the grand total.
b) Probability of having the defect B_2: the marginal probability P(B_2), the column total for B_2 divided by the grand total.
c) Probability of being from manufacturer M_1: the marginal probability P(M_1), the row total for M_1 divided by the grand total.
d) Probability of having defect B_2 given it is from M_2: P(B_2|M_2) = P(B_2 M_2) / P(M_2).
e) Probability of being from M_1 given it has defect B_2: P(M_1|B_2) = P(M_1 B_2) / P(B_2).
(The numeric answers are lost in transcription.)
Binary Communication Channel
Ones and zeros are transmitted. Let
A = a one is transmitted,  B = a one is received
with P(A) = 0.6, P(B|A) = 0.90, P(B|Ā) = 0.05.
Probability that a one is received:
P(B) = P(B|A)P(A) + P(B|Ā)P(Ā) = (0.90)(0.6) + (0.05)(0.4) = 0.56
Probability that a one was transmitted, given a one was received:
P(A|B) = P(B|A)P(A) / P(B) = (0.90)(0.6) / 0.56 = 0.54 / 0.56 ≈ 96.4%
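The binary-channel computation above, done numerically as a quick sketch:

```python
# A = "one transmitted", B = "one received"; numbers from the slide.
P_A = 0.6
P_B_given_A = 0.90       # P(B|A): a transmitted one is received as a one
P_B_given_notA = 0.05    # P(B|Ā): a transmitted zero is received as a one

# Total probability: P(B) = P(B|A)P(A) + P(B|Ā)P(Ā)
P_B = P_B_given_A * P_A + P_B_given_notA * (1 - P_A)

# Bayes' rule: P(A|B) = P(B|A)P(A) / P(B)
P_A_given_B = P_B_given_A * P_A / P_B

print(round(P_B, 2))          # 0.56
print(round(P_A_given_B, 4))  # 0.9643
```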
Statistical Independence
Two events A_i and B_j are statistically independent if
P(A_i B_j) = P(A_i)P(B_j), equivalently P(A_i|B_j) = P(A_i)
Statistical independence is not the same as mutually exclusive!
Example: Tossing a die, let A = {2, 4, 6} and B = {5, 6}. Then
P(AB) = 1/6 = P(A)P(B) and P(A|B) = 1/2 = P(A)
A and B are statistically independent but not mutually exclusive.
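The die example above can be verified by enumeration; a small sketch using exact rational arithmetic:

```python
from fractions import Fraction

# Die example: A = {2, 4, 6}, B = {5, 6} on a fair die.
S = set(range(1, 7))                        # sample space
A, B = {2, 4, 6}, {5, 6}

P = lambda E: Fraction(len(E & S), len(S))  # classical probability

assert P(A & B) == P(A) * P(B)   # statistically independent
assert A & B != set()            # but not mutually exclusive: A ∩ B = {6}
print(P(A), P(B), P(A & B))      # 1/2 1/3 1/6
```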
Random Variables
Definition: A random variable X is a function X(λ) : S → R, such that
1. The set {λ : X(λ) ≤ x} is an event for every x ∈ R.
2. P(X = ∞) = P(X = −∞) = 0.
Hence for every A ⊂ S there corresponds a set T ⊂ R called the image of A. For every set T ⊂ R there exists a set X⁻¹(T) ⊂ S, called the inverse image of T, which satisfies
X⁻¹(T) = {λ ∈ S : X(λ) ∈ T}
Notation: P(X = x) = P{λ : X(λ) = x}.
From Sample Space to Random Variable
Tossing a die, the random variable X maps the number of eyes on the upface of the die to R. Assuming the die is fair, P(X = x_i) = 1/6.
[Figure: mapping from the six die faces in the sample space to the values 1, ..., 6 on the real line. Lost in transcription.]
Distribution Function
Definition: The distribution function of the random variable X is given by F_X(x) = P(X ≤ x). A distribution function has the following properties:
1. F_X(−∞) = 0
2. F_X(∞) = 1
3. lim_{ε→0, ε>0} F_X(x + ε) = F_X(x)
4. F_X(x_1) ≤ F_X(x_2) if x_1 < x_2
5. P(x_1 < X ≤ x_2) = F_X(x_2) − F_X(x_1)
Distribution Function for a fair die
[Figure/table: F_X(x_i) for x_i = 1, ..., 6; a staircase that increases by 1/6 at each value, from 1/6 at x = 1 to 1 at x = 6. Numeric values lost in transcription.]
Joint Distribution Function
Definition: The joint distribution function for the two random variables X and Y is given by
F_{X,Y}(x, y) = P[(X ≤ x) ∩ (Y ≤ y)]
From this definition note that
F_{X,Y}(−∞, −∞) = 0, F_{X,Y}(−∞, y) = 0, F_{X,Y}(∞, y) = F_Y(y)
F_{X,Y}(x, −∞) = 0, F_{X,Y}(∞, ∞) = 1, F_{X,Y}(x, ∞) = F_X(x)
Discrete Random Variables
Definition: A discrete random variable only takes on a finite set of values. The probability P(X = x_i) for i = 1, 2, ..., n is called the probability mass function. A probability mass function has the following properties:
1. P(X = x_i) > 0 for i = 1, 2, ..., n
2. Σ_{i=1}^n P(X = x_i) = 1
3. P(X ≤ x) = F_X(x) = Σ_{x_i ≤ x} P(X = x_i)
4. P(X = x_i) = lim_{ε→0, ε>0} (F_X(x_i) − F_X(x_i − ε))
Probability Mass Function for a die
[Figures: P(X = x_i) vs. x_i for a fair die (constant 1/6) and for the drilled die. Values lost in transcription.]
Relationships for two Random Variables
1. P(X ≤ x, Y ≤ y) = Σ_{x_i ≤ x} Σ_{y_j ≤ y} P(X = x_i, Y = y_j)
2. P(X = x_i) = Σ_{j=1}^m P(X = x_i, Y = y_j) = Σ_{j=1}^m P(X = x_i | Y = y_j)P(Y = y_j)
3. P(X = x_i | Y = y_j) = P(X = x_i, Y = y_j) / P(Y = y_j)
   = P(Y = y_j | X = x_i)P(X = x_i) / Σ_{i=1}^n P(Y = y_j | X = x_i)P(X = x_i)
4. X and Y are statistically independent if for all i, j
P(X = x_i, Y = y_j) = P(X = x_i)P(Y = y_j)
Example: Joint distribution
Joint distribution for two fair dice:
X: number of eyes on the upface of die 1.
Y: number of eyes on the upface of die 2.
The joint probability mass function for X and Y is
P(X = i, Y = j) = 1/36 for i, j ∈ {1, 2, ..., 6}
and the joint distribution function is
F_{X,Y}(x, y) = Σ_{i=1}^x Σ_{j=1}^y 1/36 = xy/36 for x, y ∈ {1, 2, ..., 6}
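The two-dice joint pmf above can be built and checked by enumeration; a sketch with exact fractions:

```python
from fractions import Fraction

# Joint pmf of two fair dice: P(X = i, Y = j) = 1/36.
pmf = {(i, j): Fraction(1, 36) for i in range(1, 7) for j in range(1, 7)}

# Joint distribution function F(x, y) = sum of pmf over i <= x, j <= y.
F = lambda x, y: sum(p for (i, j), p in pmf.items() if i <= x and j <= y)

assert F(6, 6) == 1
assert F(2, 3) == Fraction(2 * 3, 36)   # matches F(x, y) = xy/36

# Marginal of X recovers the fair-die pmf:
P_X1 = sum(p for (i, j), p in pmf.items() if i == 1)
assert P_X1 == Fraction(1, 6)
print("joint pmf and marginals consistent")
```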
Expected values, mean and variance
The average or expected value of a function g of a discrete random variable X is
E{g(X)} = Σ_{i=1}^n g(x_i)P(X = x_i)
The mean is defined as the expected value of the variable itself:
E{X} = µ_X = Σ_{i=1}^n x_i P(X = x_i)
The variance of a discrete random variable is defined as
E{(X − µ_X)²} = σ²_X = Σ_{i=1}^n (x_i − µ_X)² P(X = x_i)
σ_X is called the standard deviation.
Die example
X: number of eyes on the upface of a fair die.
The mean value:
E{X} = µ_X = Σ_{i=1}^6 x_i P(X = x_i) = (1 + 2 + 3 + 4 + 5 + 6) · 1/6 = 3.5
The variance:
E{(X − µ_X)²} = σ²_X = Σ_{i=1}^6 (x_i − µ_X)² P(X = x_i) = 35/12 ≈ 2.92
If g(x_i) = x_i², then:
E{g(X)} = Σ_{i=1}^6 x_i² P(X = x_i) = 91/6 ≈ 15.17
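The die computations above as a short sketch, using exact fractions so the results come out as clean rationals:

```python
from fractions import Fraction

# Mean, variance and E{X^2} of a fair die.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

mean = sum(x * p for x, p in pmf.items())
var = sum((x - mean) ** 2 * p for x, p in pmf.items())
EX2 = sum(x ** 2 * p for x, p in pmf.items())

print(mean)   # 7/2
print(var)    # 35/12
print(EX2)    # 91/6
assert EX2 - mean ** 2 == var   # var = E{X^2} - mu^2
```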
Expected values, mean and variance I
If the probability mass function is not known, but the mean and the variance are known, Tchebycheff's inequality can be used to bound the probability of a random variable deviating from its mean:
P[|X − µ_X| > k] ≤ σ²_X / k²
The expected value of a function of two random variables is defined as
E{g(X,Y)} = Σ_{i=1}^n Σ_{j=1}^m g(x_i, y_j)P(X = x_i, Y = y_j)
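A sketch checking Tchebycheff's inequality on the fair die (the values of k are illustrative; the bound is valid for any k > 0 but is often loose):

```python
# Fair die: mu = 3.5, sigma^2 = 35/12. Check P(|X - mu| > k) <= sigma^2 / k^2.
pmf = {x: 1 / 6 for x in range(1, 7)}
mu = sum(x * p for x, p in pmf.items())
var = sum((x - mu) ** 2 * p for x, p in pmf.items())

for k in (1.0, 2.0, 2.5):
    p_tail = sum(p for x, p in pmf.items() if abs(x - mu) > k)
    assert p_tail <= var / k**2 + 1e-12
    print(f"k={k}: P(|X-mu|>k) = {p_tail:.3f} <= {var / k**2:.3f}")
```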
Expected values, mean and variance II
The correlation coefficient is defined as
ρ_XY = E{(X − µ_X)(Y − µ_Y)} / (σ_X σ_Y) = σ_XY / (σ_X σ_Y)
σ_XY is called the covariance. The correlation coefficient ρ_XY ∈ [−1, 1]; if X and Y are independent, then ρ_XY = 0, and if they are linearly dependent, then |ρ_XY| = 1.
Note: ρ_XY = 0 does NOT imply statistical independence!
Two random variables are said to be orthogonal if E{XY} = 0.
Expected values, mean and variance III
The conditional expected values are defined as
E{g(X,Y) | Y = y_j} = Σ_{i=1}^n g(x_i, y_j)P(X = x_i | Y = y_j)
E{g(X,Y) | X = x_i} = Σ_{j=1}^m g(x_i, y_j)P(Y = y_j | X = x_i)
It can be shown that
E{g(X,Y)} = E_XY{g(X,Y)} = E_X{E_{Y|X}[g(X,Y) | X]}
The conditional mean value is
E{X | Y = y_j} = µ_{X|Y=y_j} = Σ_i x_i P(X = x_i | Y = y_j)
The Uniform Probability Mass Function
The Uniform Probability Mass Function is given by
P(X = x_i) = 1/n, i = 1, 2, ..., n
Example: a fair die. [Figure: P(X = x_i) = 1/6 for x_i = 1, ..., 6. Lost in transcription.]
The Binomial Probability Mass Function
If P(A) = p, and the experiment is repeated n times, let X be a random variable representing the number of times A occurs. Then
P(X = k) = (n choose k) p^k (1 − p)^(n−k), k = 0, 1, 2, ..., n
where (n choose k) = n! / (k!(n − k)!).
The mean value and variance of a binomial random variable are
µ_X = np and σ²_X = np(1 − p)
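The binomial pmf and its moments can be checked directly; a sketch (n and p here are illustrative values):

```python
from math import comb

# Binomial pmf as defined above.
def binom_pmf(k, n, p):
    """P(X = k) = C(n, k) p^k (1-p)^(n-k)."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

n, p = 10, 0.5
pmf = [binom_pmf(k, n, p) for k in range(n + 1)]

assert abs(sum(pmf) - 1.0) < 1e-12              # pmf sums to 1
mean = sum(k * pk for k, pk in enumerate(pmf))
var = sum((k - mean) ** 2 * pk for k, pk in enumerate(pmf))
assert abs(mean - n * p) < 1e-9                 # mu = np
assert abs(var - n * p * (1 - p)) < 1e-9        # sigma^2 = np(1 - p)
print(round(mean, 3), round(var, 3))            # 5.0 2.5
```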
Example: The Binomial Probability Mass Function
[Figures: binomial pmfs for n = 10 with two different values of p; the p values and plots are lost in transcription.]
Poisson Probability Mass Function
Assume
1. The probability of an event occurring in a small time interval Δt tends to λ̄ Δt as Δt → 0.
2. The numbers of events occurring in non-overlapping time intervals are independent.
Then the number of events in a time interval T has a Poisson Probability Mass Function of the form
P(X = k) = (λ^k / k!) e^(−λ), k = 0, 1, 2, ...
where λ = λ̄ T.
The mean and the variance are µ_X = σ²_X = λ.
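A sketch checking that the Poisson pmf sums to one and has mean and variance equal to λ (the value of λ below is illustrative; the sum is truncated where the tail is negligible):

```python
from math import exp, factorial

# Poisson pmf as defined above.
def poisson_pmf(k, lam):
    """P(X = k) = lam^k / k! * e^(-lam)."""
    return lam**k / factorial(k) * exp(-lam)

lam = 3.0
pmf = [poisson_pmf(k, lam) for k in range(100)]   # tail beyond k=99 is negligible

assert abs(sum(pmf) - 1.0) < 1e-9
mean = sum(k * pk for k, pk in enumerate(pmf))
var = sum((k - mean) ** 2 * pk for k, pk in enumerate(pmf))
assert abs(mean - lam) < 1e-6 and abs(var - lam) < 1e-6   # mu = sigma^2 = lam
print(round(mean, 3), round(var, 3))
```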
Example: Poisson Probability Mass Function
[Figure: Poisson pmf for a particular value of λ; the value and plot are lost in transcription.]
Binary communication system Ia
X: input, 0 or 1, with P(X = 0) = 3/4 and P(X = 1) = 1/4.
Y: output; due to noise, P(Y = 1|X = 1) = 3/4 and P(Y = 0|X = 0) = 7/8.
Find P(Y = 1) and P(Y = 0):
P(Y = 1) = P(Y = 1|X = 0)P(X = 0) + P(Y = 1|X = 1)P(X = 1)
         = (1 − 7/8)(3/4) + (3/4)(1/4) = 3/32 + 6/32 = 9/32
P(Y = 0) = 1 − P(Y = 1) = 23/32
Binary communication system Ib
X: input, 0 or 1, with P(X = 0) = 3/4 and P(X = 1) = 1/4.
Y: output; due to noise, P(Y = 1|X = 1) = 3/4 and P(Y = 0|X = 0) = 7/8.
Find P(X = 1|Y = 1):
P(X = 1|Y = 1) = P(Y = 1|X = 1)P(X = 1) / P(Y = 1) = (3/4)(1/4) / (9/32) = 2/3
Binary communication system IIa
Binary data are sent in blocks of 16 digits over a noisy communication channel. p = 0.1 is the probability that a digit is in error (independent of whether other digits are in error).
X: number of errors per block. X has a binomial distribution:
P(X = k) = (16 choose k)(0.1)^k (0.9)^(16−k), k = 0, 1, 2, ..., 16
Find the average number of errors per block:
E{X} = np = (16)(0.1) = 1.6
Binary communication system IIb
X: number of errors per block. X has a binomial distribution:
P(X = k) = (16 choose k)(0.1)^k (0.9)^(16−k), k = 0, 1, 2, ..., 16
Find the variance of X:
σ²_X = np(1 − p) = (16)(0.1)(0.9) = 1.44
Find P(X ≥ 5):
P(X ≥ 5) = 1 − P(X ≤ 4) = 1 − Σ_{k=0}^4 (16 choose k)(0.1)^k (0.9)^(16−k)
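The block-error example above evaluated numerically:

```python
from math import comb

# Block of n = 16 digits, per-digit error probability p = 0.1.
n, p = 16, 0.1
pmf = lambda k: comb(n, k) * p**k * (1 - p) ** (n - k)

mean = n * p                                  # E{X} = np
var = n * p * (1 - p)                         # sigma^2 = np(1 - p)
P_ge_5 = 1 - sum(pmf(k) for k in range(5))    # 1 - P(X <= 4)

print(round(mean, 2), round(var, 2))          # 1.6 1.44
print(round(P_ge_5, 4))                       # small: 5+ errors are rare
```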
Continuous Random Variables
A continuous random variable can take any value in an interval of the real line. For a continuous random variable the probability density function (pdf) is defined by
f_X(x) = dF_X(x)/dx
Properties:
1. f_X(x) ≥ 0
2. ∫_{−∞}^{∞} f_X(x) dx = 1
3. P(X ≤ a) = F_X(a) = ∫_{−∞}^{a} f_X(x) dx
4. P(a ≤ X ≤ b) = ∫_{a}^{b} f_X(x) dx
5. P(X = a) = ∫_{a}^{a} f_X(x) dx = lim_{Δx→0} f_X(a)Δx = 0
Two Continuous Random Variables
The joint probability density function is
f_{X,Y}(x, y) = ∂²F_{X,Y}(x, y)/∂x∂y ≥ 0
The joint distribution function can be found as
F_{X,Y}(x, y) = ∫_{−∞}^{y} ∫_{−∞}^{x} f_{X,Y}(µ, η) dµ dη
Since F_{X,Y}(∞, ∞) = 1, then
∫_{−∞}^{∞} ∫_{−∞}^{∞} f_{X,Y}(µ, η) dµ dη = 1
Two Continuous Random Variables I
The marginal and conditional density functions are obtained as follows:
f_X(x) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dy
f_Y(y) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dx
f_{X|Y}(x|y) = f_{X,Y}(x, y) / f_Y(y), for f_Y(y) > 0
f_{Y|X}(y|x) = f_{X,Y}(x, y) / f_X(x) = f_{X|Y}(x|y) f_Y(y) / ∫_{−∞}^{∞} f_{X|Y}(x|λ) f_Y(λ) dλ
Two Continuous Random Variables II
Two random variables are statistically independent if
f_{X,Y}(x, y) = f_X(x) f_Y(y)
Expected values I
E{g(X,Y)} = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y) f_{X,Y}(x, y) dx dy
µ_X = E{X} = ∫_{−∞}^{∞} x f_X(x) dx
σ²_X = E{(X − µ_X)²} = ∫_{−∞}^{∞} (x − µ_X)² f_X(x) dx
σ_XY = E{(X − µ_X)(Y − µ_Y)} = ∫_{−∞}^{∞} ∫_{−∞}^{∞} (x − µ_X)(y − µ_Y) f_{X,Y}(x, y) dx dy
ρ_XY = E{(X − µ_X)(Y − µ_Y)} / (σ_X σ_Y)
Expected values II
Conditional expected values are defined as
E{g(X,Y) | Y = y} = ∫_{−∞}^{∞} g(x, y) f_{X|Y}(x|y) dx
If X and Y are independent, then
E{g(X)h(Y)} = E{g(X)}E{h(Y)}
Example
The joint density function of X and Y is
f_{X,Y}(x, y) = axy for 1 ≤ x ≤ 3, 2 ≤ y ≤ 4
f_{X,Y}(x, y) = 0 elsewhere
Find a:
1 = ∫_2^4 ∫_1^3 axy dx dy = a ∫_2^4 y [x²/2]_1^3 dy = a ∫_2^4 4y dy = 24a
so a = 1/24
Example (continued)
The marginal pdf of X:
f_X(x) = (1/24) ∫_2^4 xy dy = x/4 for 1 ≤ x ≤ 3
f_X(x) = 0 elsewhere
The distribution function of Y:
F_Y(y) = 0 for y ≤ 2
F_Y(y) = (1/24) ∫_2^y ∫_1^3 xv dx dv = (1/6) ∫_2^y v dv = (1/12)(y² − 4) for 2 ≤ y ≤ 4
F_Y(y) = 1 for y > 4
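The worked example can be double-checked by numerical integration; a minimal sketch using a midpoint rule (no external libraries, and the grid size is an arbitrary choice):

```python
# Numeric check of the example: f(x, y) = xy/24 on [1, 3] x [2, 4].
def integrate2d(f, ax, bx, ay, by, n=400):
    """Midpoint-rule approximation of the double integral of f."""
    hx, hy = (bx - ax) / n, (by - ay) / n
    return sum(
        f(ax + (i + 0.5) * hx, ay + (j + 0.5) * hy)
        for i in range(n) for j in range(n)
    ) * hx * hy

f = lambda x, y: x * y / 24

# Total probability is 1, confirming a = 1/24:
assert abs(integrate2d(f, 1, 3, 2, 4) - 1.0) < 1e-6

# F_Y(3) should equal (3^2 - 4)/12 = 5/12:
F_Y3 = integrate2d(f, 1, 3, 2, 3)
assert abs(F_Y3 - 5 / 12) < 1e-6
print(round(F_Y3, 4))   # 0.4167
```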
More informationWeek 2. Review of Probability, Random Variables and Univariate Distributions
Week 2 Review of Probability, Random Variables and Univariate Distributions Probability Probability Probability Motivation What use is Probability Theory? Probability models Basis for statistical inference
More informationStochastic Models of Manufacturing Systems
Stochastic Models of Manufacturing Systems Ivo Adan Organization 2/47 7 lectures (lecture of May 12 is canceled) Studyguide available (with notes, slides, assignments, references), see http://www.win.tue.nl/
More informationGeneral Random Variables
1/65 Chia-Ping Chen Professor Department of Computer Science and Engineering National Sun Yat-sen University Probability A general random variable is discrete, continuous, or mixed. A discrete random variable
More informationMath 3215 Intro. Probability & Statistics Summer 14. Homework 5: Due 7/3/14
Math 325 Intro. Probability & Statistics Summer Homework 5: Due 7/3/. Let X and Y be continuous random variables with joint/marginal p.d.f. s f(x, y) 2, x y, f (x) 2( x), x, f 2 (y) 2y, y. Find the conditional
More informationMotivation and Applications: Why Should I Study Probability?
Motivation and Applications: Why Should I Study Probability? As stated by Laplace, Probability is common sense reduced to calculation. You need to first learn the theory required to correctly do these
More information1: PROBABILITY REVIEW
1: PROBABILITY REVIEW Marek Rutkowski School of Mathematics and Statistics University of Sydney Semester 2, 2016 M. Rutkowski (USydney) Slides 1: Probability Review 1 / 56 Outline We will review the following
More informationIntroduction to Probability and Stochastic Processes I
Introduction to Probability and Stochastic Processes I Lecture 3 Henrik Vie Christensen vie@control.auc.dk Department of Control Engineering Institute of Electronic Systems Aalborg University Denmark Slides
More informationProbability Theory and Statistics. Peter Jochumzen
Probability Theory and Statistics Peter Jochumzen April 18, 2016 Contents 1 Probability Theory And Statistics 3 1.1 Experiment, Outcome and Event................................ 3 1.2 Probability............................................
More informationProbability. Paul Schrimpf. January 23, Definitions 2. 2 Properties 3
Probability Paul Schrimpf January 23, 2018 Contents 1 Definitions 2 2 Properties 3 3 Random variables 4 3.1 Discrete........................................... 4 3.2 Continuous.........................................
More informationA review of probability theory
1 A review of probability theory In this book we will study dynamical systems driven by noise. Noise is something that changes randomly with time, and quantities that do this are called stochastic processes.
More informationChapter 2 Random Variables
Stochastic Processes Chapter 2 Random Variables Prof. Jernan Juang Dept. of Engineering Science National Cheng Kung University Prof. Chun-Hung Liu Dept. of Electrical and Computer Eng. National Chiao Tung
More informationCME 106: Review Probability theory
: Probability theory Sven Schmit April 3, 2015 1 Overview In the first half of the course, we covered topics from probability theory. The difference between statistics and probability theory is the following:
More informationWhat is Probability? Probability. Sample Spaces and Events. Simple Event
What is Probability? Probability Peter Lo Probability is the numerical measure of likelihood that the event will occur. Simple Event Joint Event Compound Event Lies between 0 & 1 Sum of events is 1 1.5
More information18.440: Lecture 19 Normal random variables
18.440 Lecture 19 18.440: Lecture 19 Normal random variables Scott Sheffield MIT Outline Tossing coins Normal random variables Special case of central limit theorem Outline Tossing coins Normal random
More informationRandom Variables and Their Distributions
Chapter 3 Random Variables and Their Distributions A random variable (r.v.) is a function that assigns one and only one numerical value to each simple event in an experiment. We will denote r.vs by capital
More informationPart IA Probability. Definitions. Based on lectures by R. Weber Notes taken by Dexter Chua. Lent 2015
Part IA Probability Definitions Based on lectures by R. Weber Notes taken by Dexter Chua Lent 2015 These notes are not endorsed by the lecturers, and I have modified them (often significantly) after lectures.
More informationSTA 256: Statistics and Probability I
Al Nosedal. University of Toronto. Fall 2017 My momma always said: Life was like a box of chocolates. You never know what you re gonna get. Forrest Gump. There are situations where one might be interested
More informationExamples of random experiment (a) Random experiment BASIC EXPERIMENT
Random experiment A random experiment is a process leading to an uncertain outcome, before the experiment is run We usually assume that the experiment can be repeated indefinitely under essentially the
More informationDiscrete Random Variables
CPSC 53 Systems Modeling and Simulation Discrete Random Variables Dr. Anirban Mahanti Department of Computer Science University of Calgary mahanti@cpsc.ucalgary.ca Random Variables A random variable is
More informationChapter 1 Statistical Reasoning Why statistics? Section 1.1 Basics of Probability Theory
Chapter 1 Statistical Reasoning Why statistics? Uncertainty of nature (weather, earth movement, etc. ) Uncertainty in observation/sampling/measurement Variability of human operation/error imperfection
More informationSummary of basic probability theory Math 218, Mathematical Statistics D Joyce, Spring 2016
8. For any two events E and F, P (E) = P (E F ) + P (E F c ). Summary of basic probability theory Math 218, Mathematical Statistics D Joyce, Spring 2016 Sample space. A sample space consists of a underlying
More informationFormulas for probability theory and linear models SF2941
Formulas for probability theory and linear models SF2941 These pages + Appendix 2 of Gut) are permitted as assistance at the exam. 11 maj 2008 Selected formulae of probability Bivariate probability Transforms
More informationEE 178 Lecture Notes 0 Course Introduction. About EE178. About Probability. Course Goals. Course Topics. Lecture Notes EE 178
EE 178 Lecture Notes 0 Course Introduction About EE178 About Probability Course Goals Course Topics Lecture Notes EE 178: Course Introduction Page 0 1 EE 178 EE 178 provides an introduction to probabilistic
More informationMath 341: Probability Eighth Lecture (10/6/09)
Math 341: Probability Eighth Lecture (10/6/09) Steven J Miller Williams College Steven.J.Miller@williams.edu http://www.williams.edu/go/math/sjmiller/ public html/341/ Bronfman Science Center Williams
More informationSDS 321: Introduction to Probability and Statistics
SDS 321: Introduction to Probability and Statistics Lecture 17: Continuous random variables: conditional PDF Purnamrita Sarkar Department of Statistics and Data Science The University of Texas at Austin
More informationPerhaps the simplest way of modeling two (discrete) random variables is by means of a joint PMF, defined as follows.
Chapter 5 Two Random Variables In a practical engineering problem, there is almost always causal relationship between different events. Some relationships are determined by physical laws, e.g., voltage
More informationNorthwestern University Department of Electrical Engineering and Computer Science
Northwestern University Department of Electrical Engineering and Computer Science EECS 454: Modeling and Analysis of Communication Networks Spring 2008 Probability Review As discussed in Lecture 1, probability
More informationProbability theory. References:
Reasoning Under Uncertainty References: Probability theory Mathematical methods in artificial intelligence, Bender, Chapter 7. Expert systems: Principles and programming, g, Giarratano and Riley, pag.
More informationIntroduction to Probability Theory
Introduction to Probability Theory Ping Yu Department of Economics University of Hong Kong Ping Yu (HKU) Probability 1 / 39 Foundations 1 Foundations 2 Random Variables 3 Expectation 4 Multivariate Random
More informationRecap of Basic Probability Theory
02407 Stochastic Processes? Recap of Basic Probability Theory Uffe Høgsbro Thygesen Informatics and Mathematical Modelling Technical University of Denmark 2800 Kgs. Lyngby Denmark Email: uht@imm.dtu.dk
More informationProbability and Distributions
Probability and Distributions What is a statistical model? A statistical model is a set of assumptions by which the hypothetical population distribution of data is inferred. It is typically postulated
More informationStatistics for Managers Using Microsoft Excel (3 rd Edition)
Statistics for Managers Using Microsoft Excel (3 rd Edition) Chapter 4 Basic Probability and Discrete Probability Distributions 2002 Prentice-Hall, Inc. Chap 4-1 Chapter Topics Basic probability concepts
More informationProbability Review. Gonzalo Mateos
Probability Review Gonzalo Mateos Dept. of ECE and Goergen Institute for Data Science University of Rochester gmateosb@ece.rochester.edu http://www.ece.rochester.edu/~gmateosb/ September 11, 2018 Introduction
More informationMore than one variable
Chapter More than one variable.1 Bivariate discrete distributions Suppose that the r.v. s X and Y are discrete and take on the values x j and y j, j 1, respectively. Then the joint p.d.f. of X and Y, to
More informationJoint Probability Distributions, Correlations
Joint Probability Distributions, Correlations What we learned so far Events: Working with events as sets: union, intersection, etc. Some events are simple: Head vs Tails, Cancer vs Healthy Some are more
More information1.1 Review of Probability Theory
1.1 Review of Probability Theory Angela Peace Biomathemtics II MATH 5355 Spring 2017 Lecture notes follow: Allen, Linda JS. An introduction to stochastic processes with applications to biology. CRC Press,
More informationTopic 2: Probability & Distributions. Road Map Probability & Distributions. ECO220Y5Y: Quantitative Methods in Economics. Dr.
Topic 2: Probability & Distributions ECO220Y5Y: Quantitative Methods in Economics Dr. Nick Zammit University of Toronto Department of Economics Room KN3272 n.zammit utoronto.ca November 21, 2017 Dr. Nick
More information6.041/6.431 Fall 2010 Quiz 2 Solutions
6.04/6.43: Probabilistic Systems Analysis (Fall 200) 6.04/6.43 Fall 200 Quiz 2 Solutions Problem. (80 points) In this problem: (i) X is a (continuous) uniform random variable on [0, 4]. (ii) Y is an exponential
More informationMath-Stat-491-Fall2014-Notes-I
Math-Stat-491-Fall2014-Notes-I Hariharan Narayanan October 2, 2014 1 Introduction This writeup is intended to supplement material in the prescribed texts: Introduction to Probability Models, 10th Edition,
More informationProbability Theory Review Reading Assignments
Probability Theory Review Reading Assignments R. Duda, P. Hart, and D. Stork, Pattern Classification, John-Wiley, 2nd edition, 2001 (appendix A.4, hard-copy). "Everything I need to know about Probability"
More informationReview of Basic Probability Theory
Review of Basic Probability Theory James H. Steiger Department of Psychology and Human Development Vanderbilt University James H. Steiger (Vanderbilt University) 1 / 35 Review of Basic Probability Theory
More informationLecture 16. Lectures 1-15 Review
18.440: Lecture 16 Lectures 1-15 Review Scott Sheffield MIT 1 Outline Counting tricks and basic principles of probability Discrete random variables 2 Outline Counting tricks and basic principles of probability
More informationRecap of Basic Probability Theory
02407 Stochastic Processes Recap of Basic Probability Theory Uffe Høgsbro Thygesen Informatics and Mathematical Modelling Technical University of Denmark 2800 Kgs. Lyngby Denmark Email: uht@imm.dtu.dk
More informationNotes on Mathematics Groups
EPGY Singapore Quantum Mechanics: 2007 Notes on Mathematics Groups A group, G, is defined is a set of elements G and a binary operation on G; one of the elements of G has particularly special properties
More informationRandom Variables. Saravanan Vijayakumaran Department of Electrical Engineering Indian Institute of Technology Bombay
1 / 13 Random Variables Saravanan Vijayakumaran sarva@ee.iitb.ac.in Department of Electrical Engineering Indian Institute of Technology Bombay August 8, 2013 2 / 13 Random Variable Definition A real-valued
More informationRandom Signals and Systems. Chapter 3. Jitendra K Tugnait. Department of Electrical & Computer Engineering. Auburn University.
Random Signals and Systems Chapter 3 Jitendra K Tugnait Professor Department of Electrical & Computer Engineering Auburn University Two Random Variables Previously, we only dealt with one random variable
More informationEE514A Information Theory I Fall 2013
EE514A Information Theory I Fall 2013 K. Mohan, Prof. J. Bilmes University of Washington, Seattle Department of Electrical Engineering Fall Quarter, 2013 http://j.ee.washington.edu/~bilmes/classes/ee514a_fall_2013/
More informationOrigins of Probability Theory
1 16.584: INTRODUCTION Theory and Tools of Probability required to analyze and design systems subject to uncertain outcomes/unpredictability/randomness. Such systems more generally referred to as Experiments.
More informationMaking Hard Decision. Probability Basics. ENCE 627 Decision Analysis for Engineering
CHAPTER Duxbury Thomson Learning Making Hard Decision Probability asics Third Edition A. J. Clark School of Engineering Department of Civil and Environmental Engineering 7b FALL 003 y Dr. Ibrahim. Assakkaf
More informationLectures on Elementary Probability. William G. Faris
Lectures on Elementary Probability William G. Faris February 22, 2002 2 Contents 1 Combinatorics 5 1.1 Factorials and binomial coefficients................. 5 1.2 Sampling with replacement.....................
More informationStatistics for Economists Lectures 6 & 7. Asrat Temesgen Stockholm University
Statistics for Economists Lectures 6 & 7 Asrat Temesgen Stockholm University 1 Chapter 4- Bivariate Distributions 41 Distributions of two random variables Definition 41-1: Let X and Y be two random variables
More information3. Review of Probability and Statistics
3. Review of Probability and Statistics ECE 830, Spring 2014 Probabilistic models will be used throughout the course to represent noise, errors, and uncertainty in signal processing problems. This lecture
More information4th IIA-Penn State Astrostatistics School July, 2013 Vainu Bappu Observatory, Kavalur
4th IIA-Penn State Astrostatistics School July, 2013 Vainu Bappu Observatory, Kavalur Laws of Probability, Bayes theorem, and the Central Limit Theorem Rahul Roy Indian Statistical Institute, Delhi. Adapted
More informationReview: mostly probability and some statistics
Review: mostly probability and some statistics C2 1 Content robability (should know already) Axioms and properties Conditional probability and independence Law of Total probability and Bayes theorem Random
More informationExpectation of Random Variables
1 / 19 Expectation of Random Variables Saravanan Vijayakumaran sarva@ee.iitb.ac.in Department of Electrical Engineering Indian Institute of Technology Bombay February 13, 2015 2 / 19 Expectation of Discrete
More information1 Probability and Random Variables
1 Probability and Random Variables The models that you have seen thus far are deterministic models. For any time t, there is a unique solution X(t). On the other hand, stochastic models will result in
More informationProperties of Probability
Econ 325 Notes on Probability 1 By Hiro Kasahara Properties of Probability In statistics, we consider random experiments, experiments for which the outcome is random, i.e., cannot be predicted with certainty.
More informationRelationship between probability set function and random variable - 2 -
2.0 Random Variables A rat is selected at random from a cage and its sex is determined. The set of possible outcomes is female and male. Thus outcome space is S = {female, male} = {F, M}. If we let X be
More information