Review of Statistics I


1 Review of Statistics I Hüseyin Taştan, Department of Economics, Yildiz Technical University, April 17

2 Review of Distribution Theory Random variables, discrete vs. continuous; probability distribution functions; probability density functions; the expectation operator; the normal distribution and related distributions: t-distribution, chi-square distribution, F-distribution

7 Random Variables A random variable is a function from the sample space S to the real numbers; we may denote it X(S), an ordinary measurable function. The value of a random variable cannot be known before the event is realized (uncertainty). Notation: capital letters X, Y, Z (or any other letter) denote random variables; specific values are denoted x, y, z. Using this notation we may calculate P(X = x) for discrete random variables. Two types of rv: discrete vs. continuous.

13 Discrete Random Variable Can take a countable number of distinct possible values. Whenever all possible values a random variable can assume can be listed (or counted), the random variable is discrete. Example: the sum of the two numbers showing when two dice are rolled: x = 2, 3, ..., 12. Example: the number of sales made by a salesperson per week: x = 0, 1, 2, .... Example: the number of errors made by an accountant.

18 Continuous Random Variable Can take an uncountable number of values within an interval on the real line. Many variables from business and economics belong to this category. Example: average disposable income in a city. Example: closing value of a stock exchange index. Example: the inflation rate for a given month.

23 Probability Distributions for Discrete R.V. Probability distribution function: f(x) ≥ 0, f(x) = P(X = x), Σ_x f(x) = 1. Cumulative distribution function: F(x_0) = P(X ≤ x_0) = Σ_{x ≤ x_0} f(x). The probability distribution of a discrete random variable is a formula, graph, or table that specifies the probability associated with each possible value the random variable can assume.

24 Example Three coins are tossed. Let X denote the number of Heads. The sample space contains 8 points: (HHH), (HHT), (HTH), (THH), (HTT), (THT), (TTH), and (TTT). These 8 events are mutually exclusive and each has the same probability: 1/8. X can take four values: 0, 1, 2, 3.

28 Example continued Distribution of X:
Outcomes            x   f(x)
TTT                 0   1/8
TTH, THT, HTT       1   3/8
THH, HTH, HHT       2   3/8
HHH                 3   1/8

29 Example continued Distribution function of X: f(x) = P(X = x). Note that Σ_x f(x) = 1. Find P(X ≤ 1) = ? and P(1 ≤ X ≤ 3) = ?

30 Example continued Cumulative distribution function of X: F(x) = P(X ≤ x) = 0 for x < 0; 1/8 for 0 ≤ x < 1; 1/2 for 1 ≤ x < 2; 7/8 for 2 ≤ x < 3; 1 for x ≥ 3.
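As a quick check, the pmf and cdf above can be reproduced by enumerating the sample space; a minimal Python sketch (not part of the original slides):

```python
from itertools import product

# Enumerate the 8 equally likely outcomes of tossing three fair coins
outcomes = list(product("HT", repeat=3))

# pmf of X = number of Heads: f(x) = P(X = x)
f = {x: sum(1 for o in outcomes if o.count("H") == x) / 8 for x in range(4)}

# cdf: F(x) = P(X <= x), accumulated from the pmf
F = {x: sum(f[k] for k in range(x + 1)) for x in range(4)}
```

Each dictionary maps a value x to f(x) or F(x), matching the table and the piecewise cdf on the slides.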

31 Example continued [Figure: the probability function f(x) and the cumulative probability function F(x) of X]

32 Expectations of Discrete R.V. Expected value of a discrete rv X: E(X) = Σ_x x f(x). If g(X) is a function of X, then the expected value of g(X) is E(g(X)) = Σ_x g(x) f(x). If g(x) = x², then E(g(X)) = E(X²) = Σ_x x² f(x).

35 Example Let's find the expected value of X in the previous example: E(X) = 0·(1/8) + 1·(3/8) + 2·(3/8) + 3·(1/8) = 3/2. Find the expected value of g(X) = X²: E(X²) = 0·(1/8) + 1·(3/8) + 4·(3/8) + 9·(1/8) = 3. Find the expected value of g(X) = 2X + 3X².
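These expectations can be checked directly in Python; a small sketch using the coin-toss pmf from the slides (by linearity the last expectation should equal 2·(3/2) + 3·3 = 12):

```python
# pmf of X = number of Heads in three fair coin tosses (from the slides)
f = {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}

EX  = sum(x * p for x, p in f.items())               # E(X) = 3/2
EX2 = sum(x**2 * p for x, p in f.items())            # E(X^2) = 3
Eg  = sum((2*x + 3*x**2) * p for x, p in f.items())  # E(2X + 3X^2)
```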

38 Variance of a Discrete RV Definition: Var(X) = E[(X − E(X))²] = E[X² − 2X E(X) + (E(X))²] = E(X²) − 2E(X)E(X) + (E(X))² = E(X²) − (E(X))². Let µ_x = E(X); then the variance can be written as Var(X) = E(X²) − µ_x².

40 Moments of a Discrete RV Definition: the kth moment of a discrete rv is defined as µ_k = E(X^k) = Σ_x x^k f(x), k = 0, 1, 2, .... 1st moment: µ_1 = E(X) = population mean. 2nd moment: µ_2 = E(X²) = Var(X) + µ_1². 3rd moment: µ_3 = E(X³). 4th moment: µ_4 = E(X⁴).

41 Central Moments of a Discrete RV Definition: the kth central moment of a discrete rv is defined as m_k = E((X − µ_1)^k) = Σ_x (x − µ_1)^k f(x), k = 0, 1, 2, .... 1st central moment: m_1 = 0. 2nd central moment: m_2 = E((X − µ_1)²) = Var(X). 3rd central moment: m_3 = E((X − µ_1)³). 4th central moment: m_4 = E((X − µ_1)⁴).

42 Standard Moments of a Discrete RV Definition: the kth standard moment of a discrete rv is defined as γ_k = m_k / σ^k, k = 0, 1, 2, ..., where σ is the population standard deviation: σ = √Var(X) = √E[(X − µ_1)²]. 1st standard moment: γ_1 = 0. 2nd standard moment: γ_2 = 1 (why?). 3rd standard moment: γ_3 = m_3/σ³ (skewness). 4th standard moment: γ_4 = m_4/σ⁴ (kurtosis).
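The moment definitions above can be exercised on the coin-toss pmf; a sketch (not from the slides — for this symmetric pmf the skewness should be 0, and the kurtosis of Binomial(3, 1/2) is 7/3):

```python
# Standard moments of X = number of Heads in three fair coin tosses
f = {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}

mu1 = sum(x * p for x, p in f.items())  # first moment (mean)

def m(k):
    """k-th central moment E((X - mu1)^k)."""
    return sum((x - mu1) ** k * p for x, p in f.items())

sigma = m(2) ** 0.5
skewness = m(3) / sigma**3  # gamma_3
kurtosis = m(4) / sigma**4  # gamma_4
```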

43 Some Discrete Distributions Bernoulli, Binomial, Hypergeometric, Poisson

47 Bernoulli(p) Distribution f(x) = p if x = 1, and 1 − p if x = 0. Expected value: E(X) = Σ_x x f(x) = p·1 + (1 − p)·0 = p. Second moment: E(X²) = Σ_x x² f(x) = p·1 + (1 − p)·0 = p. Variance (second central moment): Var(X) = E(X²) − (E(X))² = p − p² = p(1 − p).

48 Binomial Distribution Let X be the number of successes in n independent trials of a Bernoulli random experiment. If each Y_i is distributed as Bernoulli(p), then X = Σ_{i=1}^n Y_i has a Binomial(n, p) distribution; X denotes the total number of successes. f(x) = C(n, x) p^x (1 − p)^{n−x} = [n!/(x!(n − x)!)] p^x (1 − p)^{n−x}, x = 0, 1, 2, ..., n. E(X) = np, Var(X) = np(1 − p).
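The binomial pmf and its mean and variance formulas can be verified numerically; a sketch with illustrative parameter values n = 10, p = 0.3 (any values would do):

```python
from math import comb

def binom_pmf(x, n, p):
    """Binomial pmf: C(n, x) p^x (1 - p)^(n - x)."""
    return comb(n, x) * p**x * (1 - p) ** (n - x)

n, p = 10, 0.3  # illustrative parameter values
pmf = [binom_pmf(x, n, p) for x in range(n + 1)]

# Mean and variance computed from the pmf; should match np and np(1 - p)
mean = sum(x * q for x, q in enumerate(pmf))
var = sum((x - mean) ** 2 * q for x, q in enumerate(pmf))
```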

49 Hypergeometric Distribution If the Bernoulli trials are not independent, then the total number of successes does not have a Binomial distribution but follows the hypergeometric distribution. In an N-element population containing B successes, the total number of successes X in a random sample of size n has the distribution f(x) = C(B, x) C(N − B, n − x) / C(N, n). Here x can take integer values between max(0, n − (N − B)) and min(n, B). E(X) = np, Var(X) = [(N − n)/(N − 1)] np(1 − p), where p = B/N.

50 Poisson Distribution Describes the distribution of the number of realizations of a certain event in a given period of time. Notation: X ~ Poisson(λ). Probability DF: f(x; λ) = λ^x e^{−λ}/x!, x = 0, 1, 2, .... E(X) = λ, Var(X) = λ, skewness = 1/√λ, excess kurtosis = 1/λ.
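The claim E(X) = Var(X) = λ can be checked from the pmf; a sketch with an illustrative rate λ = 4, truncating the infinite support (an assumption: the tail mass beyond x = 100 is negligible at this rate):

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    """Poisson pmf: lam^x e^(-lam) / x!"""
    return lam**x * exp(-lam) / factorial(x)

lam = 4.0  # illustrative rate
# Truncate the infinite support at x = 100; the remaining mass is negligible
pmf = [poisson_pmf(x, lam) for x in range(101)]

mean = sum(x * q for x, q in enumerate(pmf))
var = sum((x - mean) ** 2 * q for x, q in enumerate(pmf))
```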

51 Joint Distributions of Discrete RV Let X and Y be two discrete rvs. Then the joint pd is f(x, y) = P(X = x, Y = y). More generally, the joint pd of k discrete rvs X_1, X_2, ..., X_k is f(x_1, x_2, ..., x_k) = P(X_1 = x_1, X_2 = x_2, ..., X_k = x_k).

52 Example X: number of customers waiting in line for Counter 1 in a bank; Y: number of customers waiting in line for Counter 2. The joint pd of these two rvs is given in a table. [Table: joint probability distribution of X and Y, with row and column totals]

53 Marginal Distribution Function Marginal DF of X: f(x) = Σ_y f(x, y). Marginal DF of Y: f(y) = Σ_x f(x, y).

54 Conditional Distribution Function Conditional DF of X given Y = y: f(x|y) = f(x, y)/f(y). Conditional DF of Y given X = x: f(y|x) = f(x, y)/f(x).

55 Statistical Independence Discrete rvs X and Y are said to be independent if and only if f(x, y) = f(x)f(y), or, equivalently, f(x|y) = f(x) and f(y|x) = f(y).

56 Covariance between two discrete rv Let g(X, Y) be a function of X and Y. The expected value of this function is E[g(X, Y)] = Σ_x Σ_y g(x, y) f(x, y). Now let g(X, Y) = (X − µ_x)(Y − µ_y). The expected value of this function is called the covariance: Cov(X, Y) = Σ_x Σ_y (x − µ_x)(y − µ_y) f(x, y). It measures the linear association between two random variables. Statistical independence implies that the covariance is zero, but the reverse is not true.
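The covariance formula can be computed directly from a joint pmf; a sketch using hypothetical probabilities (they are not the bank-counter table from the slides, just numbers chosen to make X and Y dependent):

```python
# Joint pmf on {0,1} x {0,1}; hypothetical illustrative probabilities
joint = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}

# Marginal means mu_x = E(X), mu_y = E(Y)
mux = sum(x * p for (x, y), p in joint.items())
muy = sum(y * p for (x, y), p in joint.items())

# Cov(X, Y) = sum over (x, y) of (x - mu_x)(y - mu_y) f(x, y)
cov = sum((x - mux) * (y - muy) * p for (x, y), p in joint.items())
```

Here the covariance works out to 0.1 > 0, so X and Y cannot be independent.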

57 Probability Distributions for Continuous Random Variables A continuous rv can assume any value within some interval. The probability distribution of a continuous rv is a smooth function f(x) called the probability density function (pdf). The probability associated with any single value of X is 0. Properties of the pdf f(x): f(x) ≥ 0; ∫ f(x) dx = 1; P(a < X < b) = ∫_a^b f(x) dx.

63 A Probability Density Function [Figure: a pdf f(x) with the shaded area between a and b equal to P(a < X < b) = ∫_a^b f(x) dx]

64 Cumulative Distribution Function The cumulative distribution function (cdf), denoted F(x), shows the probability that a random variable does not exceed a given value x. Definition: F(x) = P(X ≤ x) = ∫_{−∞}^x f(t) dt. From this definition, the cdf and pdf are related by f(x) = dF(x)/dx.

67 Properties of CDF F(−∞) = 0, F(+∞) = 1. F(x) is a nondecreasing function of x: x_1 ≤ x_2 implies F(x_1) ≤ F(x_2). P(a < X < b) = F(b) − F(a) = ∫_a^b f(x) dx. To see this, write P(−∞ < X < +∞) = P(−∞ < X < a) + P(a < X < b) + P(b < X < +∞), i.e. ∫_{−∞}^{+∞} f(x) dx = ∫_{−∞}^a f(x) dx + ∫_a^b f(x) dx + ∫_b^{+∞} f(x) dx, so F(+∞) − F(−∞) = [F(a) − F(−∞)] + P(a < X < b) + [F(+∞) − F(b)], hence 1 = F(a) − 0 + P(a < X < b) + 1 − F(b), giving P(a < X < b) = F(b) − F(a).

71 CDF [Figure: the cdf F(x), showing F(b) − F(a) = ∫_a^b f(x) dx, with F(−∞) = 0 and F(+∞) = 1]

72 Expectations of Continuous Random Variables E(X) = µ_x = ∫ x f(x) dx. If g(X) is a continuous function of X, then E(g(X)) = ∫ g(x) f(x) dx. Var(X) = ∫ (x − µ_x)² f(x) dx.

73 Variance of Continuous RV Var(X) = E[(X − E(X))²] = ∫ (x − µ_x)² f(x) dx = ∫ x² f(x) dx − 2µ_x ∫ x f(x) dx + µ_x² ∫ f(x) dx = E(X²) − 2µ_x² + µ_x² = E(X²) − µ_x². Note that we used ∫ f(x) dx = 1 and ∫ x f(x) dx = E(X) = µ_x.

74 Moments of Continuous RV Definition: the kth moment of a continuous rv X is given by µ_k = E(X^k) = ∫ x^k f(x) dx, k = 0, 1, 2, .... 1st moment: µ_1 = E(X) = population mean. 2nd moment: µ_2 = E(X²) = Var(X) + µ_1². 3rd moment: µ_3 = E(X³). 4th moment: µ_4 = E(X⁴).

75 Central Moments of Continuous RV Definition: the kth central moment of X is m_k = E((X − µ_1)^k) = ∫ (x − µ_1)^k f(x) dx, k = 0, 1, 2, .... 1st central moment: m_1 = 0. 2nd central moment: m_2 = E((X − µ_1)²) = Var(X). 3rd central moment: m_3 = E((X − µ_1)³). 4th central moment: m_4 = E((X − µ_1)⁴).

76 Properties of Expectation Operator Linearity: let Y = a + bX; then the expected value of Y is E[Y] = E[a + bX] = a + bE(X). More generally, let X_1, X_2, ..., X_n be n continuous rvs and let Y be a linear combination of them: Y = b_1 X_1 + b_2 X_2 + ... + b_n X_n. The expected value of Y is given by E[Y] = b_1 E[X_1] + b_2 E[X_2] + ... + b_n E[X_n], or more compactly E(Y) = E(Σ_{i=1}^n b_i X_i) = Σ_{i=1}^n b_i E(X_i).

78 Properties of Expectation Operator For a nonlinear function h of X we have, in general, E[h(X)] ≠ h(E(X)). For example, E(X²) ≠ (E(X))² and E(ln X) ≠ ln(E(X)). For two continuous rvs X and Y, E(X/Y) ≠ E(X)/E(Y).

81 Properties of Variance Let c be any constant; then Var(c) = 0. Variance of Y = bX, where b is a constant: Var(Y) = Var(bX) = b² Var(X). Variance of Y = a + bX: Var(Y) = Var(a + bX) = b² Var(X). For two independent continuous rvs X and Y: Var(X + Y) = Var(X) + Var(Y) and Var(X − Y) = Var(X) + Var(Y). This can easily be generalized to n independent rvs.

85 Continuous Standard Uniform Distribution Notation: X ~ U(0, 1). pdf: f(x) = 1 if 0 < x < 1, and 0 otherwise. E(X) = 1/2, Var(X) = 1/12.

86 General Uniform Distribution Notation: X ~ U(a, b). pdf: f(x; a, b) = 1/(b − a) if a < x < b, and 0 otherwise. E(X) = (a + b)/2, median = (a + b)/2, Var(X) = (b − a)²/12, skewness = 0, excess kurtosis = −6/5.

87 General Uniform Distribution Expectation of X ~ U(a, b): E(X) = ∫_a^b x/(b − a) dx = (1/(b − a)) · (b² − a²)/2 = (b − a)(b + a)/(2(b − a)) = (a + b)/2.

88 General Uniform Distribution For X ~ U(a, b), the expected value of g(X) = X² is E[X²] = ∫_a^b x² · 1/(b − a) dx = (b³ − a³)/(3(b − a)) = (b − a)(b² + ab + a²)/(3(b − a)) = (a² + ab + b²)/3.

89 General Uniform Distribution Variance of X ~ U(a, b): Var(X) = E[(X − E(X))²] = E(X²) − [E(X)]² = (a² + ab + b²)/3 − (a + b)²/4 = (b − a)²/12.
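The uniform mean and variance formulas just derived can be cross-checked by numerical integration; a midpoint-rule sketch with arbitrary illustrative endpoints a = 2, b = 5:

```python
# Midpoint-rule check of E(X) = (a+b)/2 and Var(X) = (b-a)^2/12 for X ~ U(a, b)
a, b = 2.0, 5.0  # illustrative endpoints
n = 100_000
h = (b - a) / n
dens = 1.0 / (b - a)  # constant pdf on (a, b)

EX = sum((a + (i + 0.5) * h) * dens * h for i in range(n))
EX2 = sum((a + (i + 0.5) * h) ** 2 * dens * h for i in range(n))
var = EX2 - EX**2  # E(X^2) - [E(X)]^2
```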

90 General Uniform Distribution Cumulative distribution function (cdf) of U(a, b): F(x) = P(X ≤ x) = ∫_a^x 1/(b − a) dt = (x − a)/(b − a) for a ≤ x ≤ b. In full: F(x) = 0 for x < a; (x − a)/(b − a) for a ≤ x ≤ b; 1 for x > b.

91 Uniform Distribution [Figure: the pdf f(x) = 1/(b − a) on (a, b) and the cdf F(x) rising linearly from 0 at a to 1 at b]

92 Example f(x) = e^{−x} for 0 < x < ∞, and 0 otherwise. 1. Show that this function is a pdf. 2. Draw the graph of this function and mark the area for X > 1. 3. Calculate the probability P(X > 1). 4. Find the cdf.

96 Example continued Let us see if the conditions for a pdf are satisfied: (i) First of all, the condition f(x) ≥ 0 is satisfied for all values within the interval 0 < x < ∞. (ii) It should integrate to 1: ∫_0^∞ e^{−x} dx = [−e^{−x}]_0^∞ = 0 − (−e^0) = 0 + 1 = 1, using lim_{x→∞} e^{−x} = 0.

99 Example continued 1. P(X > 1) is the area shown in the figure. 2. P(X > 1) = ∫_1^∞ e^{−x} dx = [−e^{−x}]_1^∞ = e^{−1} ≈ 0.368. 3. F(x) = ∫_0^x e^{−t} dt = [−e^{−t}]_0^x = −e^{−x} + e^0 = 1 − e^{−x}.
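The closed-form answer P(X > 1) = e^{−1} can be cross-checked against a direct numerical integration of the density; a sketch (the truncation point 40 is an assumption: the omitted tail mass e^{−40} is negligible):

```python
from math import exp

# cdf derived on the slide: F(x) = 1 - e^(-x)
F = lambda x: 1.0 - exp(-x)

p_gt_1 = 1.0 - F(1.0)  # P(X > 1) = e^(-1) ≈ 0.368

# Numerical cross-check of the same tail probability by midpoint integration
n, upper = 100_000, 40.0  # truncate the integral at 40
h = (upper - 1.0) / n
tail = sum(exp(-(1.0 + (i + 0.5) * h)) * h for i in range(n))
```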

102 Example continued The cdf is F(x) = 0 for x < 0; 1 − e^{−x} for 0 ≤ x < ∞. [Figure: the pdf f(x) = e^{−x} with the area P(X > 1) = ∫_1^∞ e^{−x} dx marked, and the cdf F(x) = 1 − e^{−x}]

103 Joint Probability Density Function Let X and Y be two continuous random variables defined on −∞ < X < +∞ and −∞ < Y < +∞. The joint pdf of X and Y, denoted f(x, y), satisfies: f(x, y) ≥ 0; ∫∫ f(x, y) dx dy = 1; P(a < X < b, c < Y < d) = ∫_c^d ∫_a^b f(x, y) dx dy.

104 Joint Probability Density Function: Example [Figure: graph of the joint density f(x, y) = xy e^{−(x² + y²)}, x > 0, y > 0]

105 Example f(x, y) = k(x + y) for 0 < x < 1, 0 < y < 2, and 0 otherwise. Find the constant k. Find P(0 < X < 1/2, 1 < Y < 2).

106 Example continued First, for f(x, y) > 0 we must have k > 0. From the second condition we have ∫_0^2 ∫_0^1 k(x + y) dx dy = k ∫_0^2 (1/2 + y) dy = k [y/2 + y²/2]_0^2 = k(1 + 2) = 3k = 1. Thus k = 1/3. The joint pdf is f(x, y) = (1/3)(x + y) for 0 < x < 1, 0 < y < 2, and 0 otherwise.

107 Example continued The probability is found as a volume: P(0 < X < 1/2, 1 < Y < 2) = (1/3) ∫_1^2 ∫_0^{1/2} (x + y) dx dy = (1/3) ∫_1^2 (1/8 + y/2) dy = (1/3) [y/8 + y²/4]_1^2 = (1/3)(7/8) = 7/24.
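The volume computation above can be verified with a two-dimensional midpoint rule; a sketch (since the integrand is linear in each cell, the midpoint rule is essentially exact here):

```python
# Midpoint-rule check that P(0 < X < 1/2, 1 < Y < 2) = 7/24 for f(x, y) = (x + y)/3
n = 400
hx, hy = 0.5 / n, 1.0 / n
prob = 0.0
for i in range(n):
    x = (i + 0.5) * hx          # midpoint in the x direction
    for j in range(n):
        y = 1.0 + (j + 0.5) * hy  # midpoint in the y direction
        prob += (x + y) / 3.0 * hx * hy
```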

108 Marginal Density Function Definition: the marginal pdf of X is f(x) = ∫ f(x, y) dy, where the bounds of integration are the interval of Y. The marginal pdf of Y is f(y) = ∫ f(x, y) dx, where the bounds of integration are the interval of X.

109 Example Using the joint pdf from the previous example, find the marginal pdfs of X and Y. f(x) = ∫_0^2 (1/3)(x + y) dy = (1/3)[xy + y²/2]_0^2 = (2/3)(x + 1).

110 Example continued Marginal pdf of X: f(x) = (2/3)(x + 1) for 0 < x < 1, 0 otherwise. Marginal pdf of Y: g(y) = (1/3)(y + 1/2) for 0 < y < 2, 0 otherwise.

111 Conditional Density Function Given Y = y, the conditional density of X is defined as f(x|y) = f(x, y)/f(y). Similarly, given X = x, the conditional density of Y is defined as f(y|x) = f(x, y)/f(x).

112 Independence Recall that two events A and B are independent if and only if P(A ∩ B) = P(A)P(B). Similarly, X and Y are independent if and only if f(x, y) = f(x)f(y). In other words, if the joint density function can be written as the product of the marginal densities, then the two random variables are statistically independent.

113 Independence More generally, let X_1, X_2, ..., X_n be n continuous random variables. If the joint density can be written as the product of the marginal densities, then these random variables are independent: f(x_1, x_2, ..., x_n) = f_1(x_1) f_2(x_2) ··· f_n(x_n) = Π_{j=1}^n f_j(x_j). This property is useful for obtaining Maximum Likelihood estimators for certain parameters.

114 Independence: Example Determine whether the random variables in the previous example are independent. f(x)g(y) = (2/3)(x + 1) · (1/3)(y + 1/2) ≠ f(x, y). Thus X and Y are not independent.

115 Independence: Another Example Determine whether X and Y are independent using the joint pdf f(x, y) = 1/9 for 1 < x < 4, 1 < y < 4, and 0 otherwise. The marginal densities are f(x) = ∫_1^4 (1/9) dy = 1/3 and g(y) = ∫_1^4 (1/9) dx = 1/3. Thus f(x, y) = 1/9 = (1/3)(1/3) = f(x)g(y). Therefore X and Y are statistically independent.

116 Normal Distribution Notation: X ~ N(µ, σ²). The pdf is given by f(x; µ, σ²) = (1/(σ√(2π))) exp(−(x − µ)²/(2σ²)), −∞ < x < ∞. E(X) = µ, Var(X) = σ², skewness = 0, kurtosis = 3.

117 Normal Distribution, σ² = 1 [Figure: normal densities with σ² = 1 and different location parameters µ (e.g. −2, 0, 2, 5)]

118 Normal Distribution, µ = 0 [Figure: normal densities with µ = 0 and different scale parameters σ² = 1, 2, 3]

119 Standard Normal Distribution Define Z = (X − µ)/σ. The pdf is given by φ(z) = (1/√(2π)) exp(−z²/2), −∞ < z < ∞. E(Z) = 0, Var(Z) = 1. The cdf is Φ(z) = P(Z ≤ z) = ∫_{−∞}^z (1/√(2π)) exp(−t²/2) dt.

120 Standard Normal Distribution [Figure: the standard normal pdf φ(z) and cdf Φ(z)]

121 Calculating Normal Distribution Probabilities Let X ~ N(µ, σ²). We want to calculate the probability P(a < X < b) = ∫_a^b (1/(σ√(2π))) exp(−(x − µ)²/(2σ²)) dx. There is no closed-form solution to this integral; it can only be calculated using numerical methods. This would require computational methods each time we need to evaluate a probability. Instead of this cumbersome approach we can use normal distribution tables. Let us write the desired probability as follows: P((a − µ)/σ < (X − µ)/σ < (b − µ)/σ) = P((a − µ)/σ < Z < (b − µ)/σ).

122 Calculating Normal Distribution Probabilities P((a − µ)/σ < Z < (b − µ)/σ) = Φ((b − µ)/σ) − Φ((a − µ)/σ). Here Φ(z) = P(Z ≤ z) is the value of the standard normal cdf at z. This can easily be found using standard normal tables.

123 Calculating Normal Distribution Probabilities Since the standard normal distribution is symmetric, only positive values are listed in the tables. Using the symmetry property one can find the probabilities involving negative values: Φ(−z) = P(Z ≤ −z) = P(Z ≥ z) = 1 − P(Z ≤ z) = 1 − Φ(z). E.g.: P(Z ≤ −1.25) = Φ(−1.25) = 1 − Φ(1.25) = 1 − 0.8944 = 0.1056.

124 Calculating Normal Distribution Probabilities
[Excerpt from the standard normal cdf table omitted.]

125 Calculating Normal Distribution Probabilities
P(−1.25 < Z < 1.25) = Φ(1.25) − Φ(−1.25) = Φ(1.25) − (1 − Φ(1.25)) = 2Φ(1.25) − 1 = 2(0.8944) − 1 = 0.7888

126 CENTRAL LIMIT THEOREM - CLT
Let X₁, X₂, ..., Xₙ be a random sample of n iid random variables, each having the same mean µ and variance σ². More compactly: Xᵢ ~ i.i.d.(µ, σ²), i = 1, 2, ..., n. Here iid means identically and independently distributed. Note that we do not specify the name of their distribution. The expected value and variance of the sum of these n iid variables are
E[X₁ + X₂ + ... + Xₙ] = E[X₁] + E[X₂] + ... + E[Xₙ] = nµ
Var[X₁ + X₂ + ... + Xₙ] = Var[X₁] + Var[X₂] + ... + Var[Xₙ] = nσ²

127 CENTRAL LIMIT THEOREM - CLT
Let X denote the sum of these rvs: X = X₁ + X₂ + ... + Xₙ. Then
Z = (X − E(X))/√Var(X) = (X − nµ)/√(nσ²) = (X/n − µ)/(n^{1/2}σ/n) = (X̄ − µ)/(σ/√n)
According to the CLT, as n → ∞ the expression above converges to the standard normal distribution: Z ~ N(0, 1).
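A small simulation illustrates the theorem: standardized means of iid Uniform(0, 1) draws (µ = 1/2, σ² = 1/12) behave approximately like N(0, 1) draws. A sketch; the sample size, replication count, and seed are arbitrary choices:

```python
import math
import random

# CLT sketch: standardized means of iid Uniform(0, 1) draws should be
# approximately standard normal for moderate n.
random.seed(42)

def standardized_mean(n):
    xs = [random.random() for _ in range(n)]
    xbar = sum(xs) / n
    return (xbar - 0.5) / (math.sqrt(1.0 / 12.0) / math.sqrt(n))

zs = [standardized_mean(30) for _ in range(20000)]
# If Z is approximately N(0, 1), the fraction of draws below zero is close to 0.5.
prop_below_zero = sum(z <= 0.0 for z in zs) / len(zs)
print(abs(prop_below_zero - 0.5) < 0.02)   # -> True
```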

128 CENTRAL LIMIT THEOREM - CLT
[Figure: histograms of the standardized sum for increasing sample sizes n; as n grows, the sampling distribution approaches the standard normal density.]

129 Law of Large Numbers - LLN
According to the LLN, the sample mean of n iid random variables converges to the population mean as the sample size n increases. Let X̄ₙ = (1/n)(X₁ + X₂ + ... + Xₙ) be the sample mean; then the LLN says that as n → ∞, X̄ₙ → µ. In other words, for any positive number ε, no matter how small, we can write:
lim_{n→∞} P(|X̄ₙ − µ| < ε) = 1
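The statement can be seen in a few lines of simulation. A sketch with an arbitrary success probability p, sample size, and seed:

```python
import random

# LLN sketch: the sample mean of iid Bernoulli(0.3) draws settles near p = 0.3
# as the number of draws grows.
random.seed(1)
n = 200000
successes = sum(1 for _ in range(n) if random.random() < 0.3)
xbar = successes / n
print(abs(xbar - 0.3) < 0.01)   # -> True
```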

130 CLT - Example
Let X₁, X₂, ..., X₁₂ be an iid random sample, each distributed as Uniform(0, b), b > 0. Using the CLT, show that the probability P(b/4 < X̄ < 3b/4) is approximately 0.9973.
Answer: These 12 iid rvs come from the same population, so we first need the population mean and variance. For the Uniform(a, b) distribution these quantities are
µ_x = (a + b)/2,  σ²_x = (b − a)²/12
Hence in our example µ_x = b/2, σ²_x = b²/12.

132 CLT - Example, cont'd
Var(X̄) = σ²_x/n = (b²/12)/12 = b²/144. Using the CLT:
P(b/4 < X̄ < 3b/4) = P((b/4 − b/2)/(b/12) < (X̄ − µ_x)/√(σ²_x/n) < (3b/4 − b/2)/(b/12)) = P(−3 < Z < 3) = Φ(3) − (1 − Φ(3)) = 2(0.99865) − 1 = 0.9973
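The worked answer can be checked by simulation. Setting b = 1 for concreteness (the probability does not depend on b), the mean of 12 iid Uniform(0, 1) draws should fall in (1/4, 3/4) about 99.7% of the time. A sketch with an arbitrary seed and replication count:

```python
import random

# Simulation check of the CLT example with b = 1: the mean of 12 iid
# Uniform(0, 1) draws should lie in (1/4, 3/4) with probability near 0.9973.
random.seed(7)
reps = 100000
hits = sum(1 for _ in range(reps)
           if 0.25 < sum(random.random() for _ in range(12)) / 12 < 0.75)
print(abs(hits / reps - 0.9973) < 0.003)   # -> True
```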

133 Chi-square Distribution
The square of a standard normal random variable is distributed as chi-square with 1 degree of freedom: if Z ~ N(0, 1) then Z² ~ χ²₁.
The sum of squares of n independent standard normal variables is distributed as chi-square with n degrees of freedom: if Zᵢ ~ i.i.d. N(0, 1), i = 1, 2, ..., n, then Σᵢ₌₁ⁿ Zᵢ² ~ χ²ₙ.
The chi-square distribution has one parameter: ν (nu), the degrees of freedom. For χ²_ν the pdf is
f(x) = (1/(2^{ν/2} Γ(ν/2))) x^{ν/2 − 1} e^{−x/2},  x > 0, ν > 0
where Γ is the gamma function: Γ(α) = ∫₀^∞ x^{α−1} e^{−x} dx, α > 0.
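The defining property, that a sum of n squared standard normals is χ²ₙ, is easy to check by simulation against the chi-square moments E(χ²_ν) = ν and Var(χ²_ν) = 2ν. A sketch with an arbitrary n, replication count, and seed:

```python
import random

# Simulation sketch: a sum of n squared iid N(0, 1) draws is chi-square(n),
# so its sample mean and variance should be close to n and 2n respectively.
random.seed(3)
n, reps = 5, 50000
draws = [sum(random.gauss(0.0, 1.0) ** 2 for _ in range(n)) for _ in range(reps)]
mean = sum(draws) / reps
var = sum((d - mean) ** 2 for d in draws) / reps
print(abs(mean - n) < 0.1)      # -> True
print(abs(var - 2 * n) < 0.5)   # -> True
```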

136 Chi-square Distribution
Let χ²_ν be a chi-square random variable with ν degrees of freedom. Then the expected value and variance are E(χ²_ν) = ν and Var(χ²_ν) = 2ν.
If the population is normal, the ratio of the sum of squared deviations from the sample mean to the population variance has a chi-square distribution with n − 1 degrees of freedom:
(n − 1)s²/σ² = Σᵢ₌₁ⁿ (Xᵢ − X̄)²/σ² ~ χ²_{n−1}
Meaning of dof: the n quantities (Xᵢ − X̄) contain only n − 1 mathematically independent components. Since we first estimate the sample mean from the observations, there are n − 1 mathematically independent terms.

139 Chi-square Distribution
Skewed to the right, i.e., the right tail is longer than the left, which means the third standardized moment is positive. As the degrees-of-freedom parameter ν gets bigger, the distribution becomes more symmetric; in the limit it converges to the normal distribution. Chi-square probabilities can easily be calculated using the appropriate tables.

142 Chi-square Distribution
[Figure: chi-square densities for several degrees of freedom ν; the density becomes flatter and more symmetric as ν grows.]

143 Chi-square Distribution
[Figure: pdf of the chi-square distribution with ν = 10 degrees of freedom; the shaded right tail shows P(χ²₁₀ > 15.99) = 0.10.]

144 Student t Distribution
Let Z and Y be two independent rvs with Z ~ N(0, 1) and Y ~ χ²_ν. The random variable
t_ν = Z/√(Y/ν)
follows a t distribution with ν degrees of freedom, where ν is the dof of the chi-square in the denominator. The pdf is
f(t) = (Γ((ν + 1)/2)/(Γ(ν/2)√(πν))) (1 + t²/ν)^{−(ν+1)/2},  −∞ < t < ∞
It has one parameter (ν) and a symmetric shape. E(t_ν) = 0 and, for ν ≥ 3, Var(t_ν) = ν/(ν − 2). As ν → ∞, t_ν → N(0, 1).
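The t density can be evaluated directly with the standard library's gamma function. A sketch; `t_pdf` is an illustrative name:

```python
import math

# Direct evaluation of the Student t pdf using math.gamma.
def t_pdf(t, nu):
    c = math.gamma((nu + 1) / 2.0) / (math.gamma(nu / 2.0) * math.sqrt(math.pi * nu))
    return c * (1.0 + t * t / nu) ** (-(nu + 1) / 2.0)

# For nu = 1 the peak height is 1/pi; as nu grows it approaches the
# standard normal peak 1/sqrt(2 pi) = 0.3989.
print(round(t_pdf(0.0, 1), 4))     # -> 0.3183
print(round(t_pdf(0.0, 200), 4))   # -> 0.3984
```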

147 Student t Distribution
[Figure, built up over slides 147–154: t densities for ν = 1, 2, 3, 4, 5, 10, and larger, overlaid on the standard normal density; as ν increases, the t density approaches the standard normal.]

155 Calculating Student t Probabilities
[Figure: pdf of the t distribution with 5 degrees of freedom; the shaded right tail shows P(t₅ > 2.015) = α = 0.05.]

156 Calculating Student t Probabilities
[Figure: the t density compared with the standard normal density; the t distribution has heavier tails.]

157 Calculating Student t Probabilities
P(−2.228 < t₁₀ < 2.228) = 1 − α = 0.95, with α = 0.05 split equally between the two tails.
[Figure: t₁₀ density with the central 95% region between −2.228 and 2.228.]

158 F Distribution
Let X₁ ~ χ²_{k₁} and X₂ ~ χ²_{k₂} be two independent chi-square random variables. Then the following random variable has an F distribution:
F = (X₁/k₁)/(X₂/k₂) ~ F(k₁, k₂)
The F distribution has two parameters: k₁, the numerator degrees of freedom, and k₂, the denominator degrees of freedom.
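The construction of the F ratio is straightforward to simulate by building each chi-square as a sum of squared standard normals. A sketch with arbitrary degrees of freedom, replication count, and seed; for k₂ > 2, E(F) = k₂/(k₂ − 2), which the simulated mean should approximate:

```python
import random

# Simulation sketch of the F ratio: with X1 ~ chi2(k1) and X2 ~ chi2(k2)
# independent, F = (X1/k1) / (X2/k2). For k2 > 2, E(F) = k2 / (k2 - 2).
random.seed(11)

def chi2_draw(k):
    return sum(random.gauss(0.0, 1.0) ** 2 for _ in range(k))

k1, k2, reps = 6, 20, 40000
fs = [(chi2_draw(k1) / k1) / (chi2_draw(k2) / k2) for _ in range(reps)]
mean = sum(fs) / reps
print(abs(mean - k2 / (k2 - 2)) < 0.05)   # -> True
```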

163 F Distribution
[Figure: F densities for (k₁, k₂) = (2, 8), (6, 8), (6, 20), and (10, 30).]


More information

CSE 312, 2017 Winter, W.L. Ruzzo. 7. continuous random variables

CSE 312, 2017 Winter, W.L. Ruzzo. 7. continuous random variables CSE 312, 2017 Winter, W.L. Ruzzo 7. continuous random variables The new bit continuous random variables Discrete random variable: values in a finite or countable set, e.g. X {1,2,..., 6} with equal probability

More information

III - MULTIVARIATE RANDOM VARIABLES

III - MULTIVARIATE RANDOM VARIABLES Computational Methods and advanced Statistics Tools III - MULTIVARIATE RANDOM VARIABLES A random vector, or multivariate random variable, is a vector of n scalar random variables. The random vector is

More information

2 (Statistics) Random variables

2 (Statistics) Random variables 2 (Statistics) Random variables References: DeGroot and Schervish, chapters 3, 4 and 5; Stirzaker, chapters 4, 5 and 6 We will now study the main tools use for modeling experiments with unknown outcomes

More information

Chapter 3. Chapter 3 sections

Chapter 3. Chapter 3 sections sections 3.1 Random Variables and Discrete Distributions 3.2 Continuous Distributions 3.4 Bivariate Distributions 3.5 Marginal Distributions 3.6 Conditional Distributions 3.7 Multivariate Distributions

More information

Sampling Distributions of Statistics Corresponds to Chapter 5 of Tamhane and Dunlop

Sampling Distributions of Statistics Corresponds to Chapter 5 of Tamhane and Dunlop Sampling Distributions of Statistics Corresponds to Chapter 5 of Tamhane and Dunlop Slides prepared by Elizabeth Newton (MIT), with some slides by Jacqueline Telford (Johns Hopkins University) 1 Sampling

More information

Probability (10A) Young Won Lim 6/12/17

Probability (10A) Young Won Lim 6/12/17 Probability (10A) Copyright (c) 2017 Young W. Lim. Permission is granted to copy, distribute and/or modify this document under the terms of the GNU Free Documentation License, Version 1.2 or any later

More information

Lecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable

Lecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable Lecture Notes 1 Probability and Random Variables Probability Spaces Conditional Probability and Independence Random Variables Functions of a Random Variable Generation of a Random Variable Jointly Distributed

More information

Probability. Paul Schrimpf. January 23, UBC Economics 326. Probability. Paul Schrimpf. Definitions. Properties. Random variables.

Probability. Paul Schrimpf. January 23, UBC Economics 326. Probability. Paul Schrimpf. Definitions. Properties. Random variables. Probability UBC Economics 326 January 23, 2018 1 2 3 Wooldridge (2013) appendix B Stock and Watson (2009) chapter 2 Linton (2017) chapters 1-5 Abbring (2001) sections 2.1-2.3 Diez, Barr, and Cetinkaya-Rundel

More information

SOME SPECIFIC PROBABILITY DISTRIBUTIONS. 1 2πσ. 2 e 1 2 ( x µ

SOME SPECIFIC PROBABILITY DISTRIBUTIONS. 1 2πσ. 2 e 1 2 ( x µ SOME SPECIFIC PROBABILITY DISTRIBUTIONS. Normal random variables.. Probability Density Function. The random variable is said to be normally distributed with mean µ and variance abbreviated by x N[µ, ]

More information

Discrete Random Variables

Discrete Random Variables CPSC 53 Systems Modeling and Simulation Discrete Random Variables Dr. Anirban Mahanti Department of Computer Science University of Calgary mahanti@cpsc.ucalgary.ca Random Variables A random variable is

More information

Examples of random experiment (a) Random experiment BASIC EXPERIMENT

Examples of random experiment (a) Random experiment BASIC EXPERIMENT Random experiment A random experiment is a process leading to an uncertain outcome, before the experiment is run We usually assume that the experiment can be repeated indefinitely under essentially the

More information

SDS 321: Introduction to Probability and Statistics

SDS 321: Introduction to Probability and Statistics SDS 321: Introduction to Probability and Statistics Lecture 14: Continuous random variables Purnamrita Sarkar Department of Statistics and Data Science The University of Texas at Austin www.cs.cmu.edu/

More information

STAT 512 sp 2018 Summary Sheet

STAT 512 sp 2018 Summary Sheet STAT 5 sp 08 Summary Sheet Karl B. Gregory Spring 08. Transformations of a random variable Let X be a rv with support X and let g be a function mapping X to Y with inverse mapping g (A = {x X : g(x A}

More information

6 The normal distribution, the central limit theorem and random samples

6 The normal distribution, the central limit theorem and random samples 6 The normal distribution, the central limit theorem and random samples 6.1 The normal distribution We mentioned the normal (or Gaussian) distribution in Chapter 4. It has density f X (x) = 1 σ 1 2π e

More information

Continuous Random Variables

Continuous Random Variables 1 / 24 Continuous Random Variables Saravanan Vijayakumaran sarva@ee.iitb.ac.in Department of Electrical Engineering Indian Institute of Technology Bombay February 27, 2013 2 / 24 Continuous Random Variables

More information

Statistics for Economists Lectures 6 & 7. Asrat Temesgen Stockholm University

Statistics for Economists Lectures 6 & 7. Asrat Temesgen Stockholm University Statistics for Economists Lectures 6 & 7 Asrat Temesgen Stockholm University 1 Chapter 4- Bivariate Distributions 41 Distributions of two random variables Definition 41-1: Let X and Y be two random variables

More information

Review of Probability Theory

Review of Probability Theory Review of Probability Theory Arian Maleki and Tom Do Stanford University Probability theory is the study of uncertainty Through this class, we will be relying on concepts from probability theory for deriving

More information

SUMMARY OF PROBABILITY CONCEPTS SO FAR (SUPPLEMENT FOR MA416)

SUMMARY OF PROBABILITY CONCEPTS SO FAR (SUPPLEMENT FOR MA416) SUMMARY OF PROBABILITY CONCEPTS SO FAR (SUPPLEMENT FOR MA416) D. ARAPURA This is a summary of the essential material covered so far. The final will be cumulative. I ve also included some review problems

More information

Things to remember when learning probability distributions:

Things to remember when learning probability distributions: SPECIAL DISTRIBUTIONS Some distributions are special because they are useful They include: Poisson, exponential, Normal (Gaussian), Gamma, geometric, negative binomial, Binomial and hypergeometric distributions

More information

Introduction to Probability and Stocastic Processes - Part I

Introduction to Probability and Stocastic Processes - Part I Introduction to Probability and Stocastic Processes - Part I Lecture 1 Henrik Vie Christensen vie@control.auc.dk Department of Control Engineering Institute of Electronic Systems Aalborg University Denmark

More information

Lecture 1: August 28

Lecture 1: August 28 36-705: Intermediate Statistics Fall 2017 Lecturer: Siva Balakrishnan Lecture 1: August 28 Our broad goal for the first few lectures is to try to understand the behaviour of sums of independent random

More information

Statistics 1B. Statistics 1B 1 (1 1)

Statistics 1B. Statistics 1B 1 (1 1) 0. Statistics 1B Statistics 1B 1 (1 1) 0. Lecture 1. Introduction and probability review Lecture 1. Introduction and probability review 2 (1 1) 1. Introduction and probability review 1.1. What is Statistics?

More information

Lecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable

Lecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable Lecture Notes 1 Probability and Random Variables Probability Spaces Conditional Probability and Independence Random Variables Functions of a Random Variable Generation of a Random Variable Jointly Distributed

More information

THE QUEEN S UNIVERSITY OF BELFAST

THE QUEEN S UNIVERSITY OF BELFAST THE QUEEN S UNIVERSITY OF BELFAST 0SOR20 Level 2 Examination Statistics and Operational Research 20 Probability and Distribution Theory Wednesday 4 August 2002 2.30 pm 5.30 pm Examiners { Professor R M

More information

Chapter 5. Chapter 5 sections

Chapter 5. Chapter 5 sections 1 / 43 sections Discrete univariate distributions: 5.2 Bernoulli and Binomial distributions Just skim 5.3 Hypergeometric distributions 5.4 Poisson distributions Just skim 5.5 Negative Binomial distributions

More information

II. The Normal Distribution

II. The Normal Distribution II. The Normal Distribution The normal distribution (a.k.a., a the Gaussian distribution or bell curve ) is the by far the best known random distribution. It s discovery has had such a far-reaching impact

More information

Conditional Probability

Conditional Probability Conditional Probability Conditional Probability The Law of Total Probability Let A 1, A 2,..., A k be mutually exclusive and exhaustive events. Then for any other event B, P(B) = P(B A 1 ) P(A 1 ) + P(B

More information

To understand and analyze this test, we need to have the right model for the events. We need to identify an event and its probability.

To understand and analyze this test, we need to have the right model for the events. We need to identify an event and its probability. Probabilistic Models Example #1 A production lot of 10,000 parts is tested for defects. It is expected that a defective part occurs once in every 1,000 parts. A sample of 500 is tested, with 2 defective

More information

Statistics, Data Analysis, and Simulation SS 2013

Statistics, Data Analysis, and Simulation SS 2013 Statistics, Data Analysis, and Simulation SS 213 8.128.73 Statistik, Datenanalyse und Simulation Dr. Michael O. Distler Mainz, 23. April 213 What we ve learned so far Fundamental

More information

Probability, Random Processes and Inference

Probability, Random Processes and Inference INSTITUTO POLITÉCNICO NACIONAL CENTRO DE INVESTIGACION EN COMPUTACION Laboratorio de Ciberseguridad Probability, Random Processes and Inference Dr. Ponciano Jorge Escamilla Ambrosio pescamilla@cic.ipn.mx

More information

1 Review of Probability

1 Review of Probability 1 Review of Probability Random variables are denoted by X, Y, Z, etc. The cumulative distribution function (c.d.f.) of a random variable X is denoted by F (x) = P (X x), < x

More information

Recitation 2: Probability

Recitation 2: Probability Recitation 2: Probability Colin White, Kenny Marino January 23, 2018 Outline Facts about sets Definitions and facts about probability Random Variables and Joint Distributions Characteristics of distributions

More information