Discrete Random Variables

Discrete Random Variables An Undergraduate Introduction to Financial Mathematics J. Robert Buchanan

Introduction The markets can be thought of as a complex interaction of a large number of random processes, out of which emerges a relatively simple model. We must understand: random variables, elementary rules of probability, expected value, and variance.

Basic Definitions Definition An experiment is any activity that generates an observable outcome. An event is an outcome or set of outcomes with a specified property (generally denoted with letters A, B,... ). The probability of an event is a real number measuring the likelihood of the occurrence of the event (generally denoted P (A), P (B),... ).

Discrete Events For the time being we will assume the outcomes of experiments are discrete in the sense that the outcomes will be from a set whose members are isolated from each other by gaps. Examples: (1) a coin flip, (2) a roll of the dice, (3) drawing a card from a standard deck.

Properties of Probability (1) 0 <= P(A) <= 1. (2) If P(A) = 0, then A is said to be an impossible event. If P(A) = 1, then A is said to be a certain event. To determine P(A) we can: take the empirical approach and conduct (or at least simulate) the experiment N times, count the number of times x that event A occurred, and estimate P(A) = x/N; or take the classical approach and determine the number of outcomes of the experiment (call this number M), assume the outcomes are equally likely, and determine the number of outcomes y among the M in which event A occurs. Then P(A) = y/M.
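
The two approaches can be compared with a short simulation. The following Python sketch (an illustration, not part of the original slides) estimates the probability that a fair die shows an even number both empirically and classically:

    import random

    N = 100_000                      # number of simulated experiments
    outcomes = range(1, 7)           # sample space of a fair die

    # Empirical approach: run the experiment N times and count occurrences of A.
    x = sum(1 for _ in range(N) if random.choice(outcomes) % 2 == 0)
    print("empirical estimate:", x / N)          # close to 0.5

    # Classical approach: count favorable outcomes among equally likely ones.
    M = len(outcomes)
    y = sum(1 for o in outcomes if o % 2 == 0)
    print("classical value:   ", y / M)          # exactly 0.5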

Addition Rule (1 of 2) Notation: P(A ∪ B) denotes the probability that event A or event B occurs. P(A ∪ B) depends on P(A) and P(B), but the exact formula also depends on the relationship between events A and B. Definition Two events are mutually exclusive if they cannot occur together.

Addition Rule (2 of 2) Theorem (Addition Rule) P(A ∪ B) = P(A) + P(B) - P(A ∩ B), where P(A ∩ B) is the probability that events A and B occur together. Corollary If A and B are mutually exclusive then P(A ∪ B) = P(A) + P(B).

Example Example Find the probability that when a pair of fair dice are thrown the total of the dice is less than 6 or odd. Solution:

    P(X < 6 ∪ X odd) = P(X < 6) + P(X odd) - P(X < 6 ∩ X odd)
                     = 10/36 + 18/36 - 6/36 = 22/36 = 11/18.
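
A brute-force enumeration over the 36 equally likely rolls confirms these counts (a sketch, not from the original slides):

    from itertools import product

    rolls = list(product(range(1, 7), repeat=2))     # all 36 outcomes
    less_than_6 = [r for r in rolls if sum(r) < 6]
    odd         = [r for r in rolls if sum(r) % 2 == 1]
    both        = [r for r in rolls if sum(r) < 6 and sum(r) % 2 == 1]
    union       = [r for r in rolls if sum(r) < 6 or sum(r) % 2 == 1]

    print(len(less_than_6), len(odd), len(both))     # 10 18 6
    print(len(union) / len(rolls))                   # 22/36 = 0.6111... = 11/18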

Monty Hall Problem A game show host hides a prize behind one of three doors. A contestant must guess which door hides the prize. First, the contestant announces the door they have chosen. The host then opens one of the two doors not chosen, revealing that the prize is not behind it. The host then tells the contestant they may keep their original choice or switch to the other unopened door. Should the contestant switch doors?

Conditional Probability Definition The probability that one event occurs given that another event has occurred is called conditional probability. The probability that event A occurs given that event B has occurred is denoted P(A | B). What is the contestant's probability of winning if they do not switch doors? Should they switch?
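
One way to answer the question is to simulate the game. The sketch below (an illustration, not taken from the slides) plays many rounds under both strategies; staying wins about 1/3 of the time and switching about 2/3:

    import random

    def play(switch, trials=100_000):
        wins = 0
        for _ in range(trials):
            prize = random.randrange(3)       # door hiding the prize
            choice = random.randrange(3)      # contestant's first pick
            # Host opens a door that is neither the contestant's pick nor the prize.
            opened = next(d for d in range(3) if d != choice and d != prize)
            if switch:
                choice = next(d for d in range(3) if d != choice and d != opened)
            wins += (choice == prize)
        return wins / trials

    print("stay:  ", play(switch=False))   # about 1/3
    print("switch:", play(switch=True))    # about 2/3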

Multiplication Rule Theorem (Multiplication Rule) For events A and B, the probability of A and B occurring is P(A ∩ B) = P(A) · P(B | A), or equivalently P(B | A) = P(A ∩ B) / P(A), provided P(A) > 0.

Roulette Example One type of roulette wheel, known as the American type, has 38 potential outcomes represented by the integers 1 through 36 and two special outcomes 0 and 00. The positive integers are placed on alternating red and black backgrounds while 0 and 00 are on green backgrounds. What is the probability that the outcome is less than 10 and more than 3 given that the outcome is an even number?

    P(3 < X < 10 | X even) = P(3 < X < 10 ∩ X even) / P(X even) = (3/38) / (20/38) = 3/20.
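
Counting outcomes directly reproduces the ratio; note that the slide's count of 20 even outcomes treats 0 and 00 as even, and the sketch below (not from the slides) follows that convention:

    outcomes = ["0", "00"] + [str(n) for n in range(1, 37)]     # 38 outcomes

    def is_even(o):
        return o in ("0", "00") or int(o) % 2 == 0              # slide's convention

    even = [o for o in outcomes if is_even(o)]
    both = [o for o in even if o not in ("0", "00") and 3 < int(o) < 10]

    print(len(both), "/", len(even))      # 3 / 20
    print(len(both) / len(even))          # 0.15 = 3/20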

Independent Events Definition If the occurrence of event A does not affect the occurrence of event B, we say that the events are independent: P(B | A) = P(B). Corollary If events A and B are independent then P(A ∩ B) = P(A) · P(B).

Example Example What is the probability of two red outcomes on successive spins of the roulette wheel? Since the outcomes of spins of the roulette wheel are independent,

    P(1st spin red ∩ 2nd spin red) = P(1st spin red) · P(2nd spin red) = (18/38)(18/38) = 81/361 ≈ 0.224377.
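
A quick numeric check (a sketch, not from the slides):

    from fractions import Fraction

    p_red = Fraction(18, 38)
    print(p_red * p_red, float(p_red * p_red))   # 81/361 ≈ 0.224377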

Random Variables Definition A random variable is a function X : S → R where S is the space of outcomes of an experiment. Example Two candidates are running for office, call them Alice and Bob. Suppose N people may vote in this fair election. We can think of a random variable X as the number of votes cast for Alice. What are the possible values for X? X ∈ {0, 1, ..., N}. Remark: the value of X is unknown until the experiment is conducted.

Probability Functions Definition A probability function (or probability distribution) is a function assigning a probability to each element in the sample space S. If S = {x_1, x_2, ..., x_N} and f is a probability function then: (1) 0 <= f(x_i) <= 1 for i = 1, 2, ..., N, and (2) Σ_{i=1}^{N} f(x_i) = 1.

Bernoulli and Binomial Random Variables (1 of 2) Definition A Bernoulli random variable can be thought of as having sample space S = {0, 1} and probability function f(1) = p and f(0) = 1 - p. Definition A binomial random variable is the number of successes out of n independent Bernoulli experiments. The number of trials n is fixed and the Bernoulli probabilities remain fixed between trials.

Binomial Random Variables (2 of 2) Binomial RV sample space: S = {0, 1, 2, ..., n}. If the number of successes is x, there are C(n, x) = n! / (x!(n - x)!) ways to choose x items out of a collection of n items. Thus P(x) is given by

    P(x) = C(n, x) p^x (1 - p)^(n-x) = [n! / (x!(n - x)!)] p^x (1 - p)^(n-x).
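
The probability function translates directly into code; a minimal Python sketch (not part of the slides) using math.comb:

    from math import comb

    def binomial_pmf(x, n, p):
        """P(X = x) for a binomial random variable with n trials and success probability p."""
        return comb(n, x) * p**x * (1 - p)**(n - x)

    # The probabilities over the whole sample space {0, 1, ..., n} sum to 1.
    print(sum(binomial_pmf(x, 10, 0.3) for x in range(11)))   # 1.0 (up to rounding)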

Example (1 of 2) Example What is the probability that in a family of four children, exactly three of them are male? Assuming the genders of the children are independent and equally likely to be male or female, the probability of the desired outcome is the binomial probability

    P(exactly 3 of 4 children male) = C(4, 3) (1/2)^3 (1/2) = 4/16 = 1/4.

Example (2 of 2) Example The probability that a computer memory chip is defective is 0.02. A SIMM (single in-line memory module) contains 16 chips for data storage and a 17th chip for error correction. The SIMM can operate correctly if one chip is defective, but not if two or more are defective. What is the probability that the SIMM functions correctly? The SIMM will function correctly if there is at most one malfunctioning memory chip.

    P(no bad chips ∪ one bad chip) = P(no bad chips) + P(one bad chip)
                                   = (1 - 0.02)^17 + C(17, 1)(0.02)(1 - 0.02)^16 ≈ 0.955413.
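
With the binomial pmf above, the SIMM calculation is a short check (a sketch, not part of the original slides):

    from math import comb

    p_bad, n = 0.02, 17
    p_ok = comb(n, 0) * (1 - p_bad)**n + comb(n, 1) * p_bad * (1 - p_bad)**(n - 1)
    print(p_ok)    # approximately 0.955413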

Expected Value of a Random Variable Definition If X is a discrete random variable with probability distribution P(X) then the expected value of X is denoted E[X] and defined as E[X] = Σ_X X · P(X). It is understood that the summation is taken over all values that X may assume.

Example Example What is the expected value of the number of female children in a family of 5 children? If X represents the number of female children in a family with five children then

    E[X] = Σ_{X=0}^{5} X · P(X) = Σ_{i=0}^{5} i C(5, i) (1/2)^i (1/2)^(5-i) = 2.5.

Expected Value of a Function of a Random Variable Definition If F is a function applied to X, then the expected value of F is E[F(X)] = Σ_X F(X) · P(X).

Example Example What is the expected value of the square of the number of female children in a family of 5 children? If X represents the number of female children in a family with five children then

    E[X^2] = Σ_{X=0}^{5} X^2 · P(X) = Σ_{i=0}^{5} i^2 C(5, i) (1/2)^i (1/2)^(5-i) = 7.5.
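
Both expectations can be checked by summing over the distribution (a sketch, not from the slides):

    from math import comb

    pmf = {x: comb(5, x) * 0.5**5 for x in range(6)}      # Binomial(5, 1/2)

    E_X  = sum(x * p for x, p in pmf.items())
    E_X2 = sum(x**2 * p for x, p in pmf.items())
    print(E_X, E_X2)     # 2.5 7.5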

Linearity of Expected Value Theorem If X is a random variable and a is a constant, then E[aX] = a E[X]. Proof.

    E[aX] = Σ_X (aX) · P(X) = a Σ_X X · P(X) = a E[X].

Joint Probability Function Definition If X and Y are random variables, the joint probability function of X and Y is denoted P(X, Y), where P(X, Y) = P(X ∩ Y). A joint probability function has the properties: (1) 0 <= P(X, Y) <= 1, and (2) Σ_X Σ_Y P(X, Y) = Σ_Y Σ_X P(X, Y) = 1.

Marginal Probability Definition If the joint probability function of X and Y is P(X, Y), then the quantity Σ_Y P(X, Y) is called the marginal probability of X. We will then write P(X) = Σ_Y P(X, Y). We may define a marginal probability for Y similarly.

Expected Value of a Sum (1 of 2) Theorem If X_1, X_2, ..., X_k are random variables then E[X_1 + X_2 + ... + X_k] = E[X_1] + E[X_2] + ... + E[X_k].

Expected Value of a Sum (2 of 2) Proof. If k = 1 then the proposition is certainly true. If k = 2 then

    E[X_1 + X_2] = Σ_{X_1} Σ_{X_2} (X_1 + X_2) P(X_1, X_2)
                 = Σ_{X_1} Σ_{X_2} X_1 P(X_1, X_2) + Σ_{X_2} Σ_{X_1} X_2 P(X_1, X_2)
                 = Σ_{X_1} X_1 Σ_{X_2} P(X_1, X_2) + Σ_{X_2} X_2 Σ_{X_1} P(X_1, X_2)
                 = Σ_{X_1} X_1 P(X_1) + Σ_{X_2} X_2 P(X_2)
                 = E[X_1] + E[X_2].

The general case then holds by induction on k.

Expected Value of a Sum of Functions Corollary Let X_1, X_2, ..., X_k be random variables and let F_i be a function defined on X_i for i = 1, 2, ..., k; then E[F_1(X_1) + ... + F_k(X_k)] = E[F_1(X_1)] + ... + E[F_k(X_k)].

Expected Value of Binomial Random Variable If X_i is a Bernoulli random variable then E[X_i] = (1)(p) + (0)(1 - p) = p. If the binomial random variable X = X_1 + ... + X_n, where each X_i is a Bernoulli random variable, then E[X] = E[X_1] + ... + E[X_n] = np.
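
A direct check that the pmf-weighted sum agrees with np (a sketch, with arbitrarily chosen n and p):

    from math import comb

    n, p = 12, 0.3
    mean = sum(x * comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1))
    print(mean, n * p)     # both 3.6 (up to floating-point rounding)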

Independent Random Variables (1 of 2) If X and Y are independent random variables then P(X, Y) = P(X) · P(Y), where P(X) is the probability function for X and P(Y) is the probability function for Y. Theorem Let X_1, X_2, ..., X_k be pairwise independent random variables; then E[X_1 X_2 ... X_k] = E[X_1] E[X_2] ... E[X_k].

Independent Random Variables (2 of 2) Proof. When k = 1 the theorem is true. Now let X_1 and X_2 be independent random variables with joint probability distribution P(X_1, X_2).

    E[X_1 X_2] = Σ_{X_1, X_2} X_1 X_2 P(X_1, X_2)
               = Σ_{X_1} Σ_{X_2} X_1 X_2 P(X_1) P(X_2)
               = (Σ_{X_1} X_1 P(X_1)) (Σ_{X_2} X_2 P(X_2))
               = E[X_1] E[X_2].

The general case holds by induction on k.

Variance (1 of 2) Definition If X is a random variable, the variance of X is denoted Var(X) and Var(X) = E[(X - E[X])^2]. The standard deviation of X is denoted σ(X) = √Var(X).

Variance (2 of 2) Theorem Let X be a random variable; then the variance of X is Var(X) = E[X^2] - (E[X])^2. Proof.

    Var(X) = E[(X - E[X])^2]
           = E[X^2] - E[2X E[X]] + E[(E[X])^2]
           = E[X^2] - 2 E[X] E[X] + (E[X])^2
           = E[X^2] - (E[X])^2.

Example (1 of 2) Example What is the variance in the number of female children in a family of 5 children? If X represents the number of female children in a family of five children, then

    Var(X) = E[X^2] - (E[X])^2 = 7.5 - (2.5)^2 = 1.25.

Example (2 of 2) Example Find the variance of a Bernoulli random variable for which the probability of success is p. If X represents the outcome of a Bernoulli trial, then

    Var(X) = E[X^2] - (E[X])^2 = [1^2 (p) + 0^2 (1 - p)] - [1(p) + 0(1 - p)]^2 = p - p^2 = p(1 - p).

Variance of a Sum (1 of 3) Theorem Let X_1, X_2, ..., X_k be pairwise independent random variables; then Var(X_1 + X_2 + ... + X_k) = Var(X_1) + Var(X_2) + ... + Var(X_k).

Variance of a Sum (2 of 3) If k = 1 then the result is trivially true. Take the case when k = 2. By the definition of variance,

    Var(X_1 + X_2) = E[((X_1 + X_2) - E[X_1 + X_2])^2]
                   = E[((X_1 - E[X_1]) + (X_2 - E[X_2]))^2]
                   = E[(X_1 - E[X_1])^2] + E[(X_2 - E[X_2])^2] + 2 E[(X_1 - E[X_1])(X_2 - E[X_2])]
                   = Var(X_1) + Var(X_2) + 2 E[(X_1 - E[X_1])(X_2 - E[X_2])].

Variance of a Sum (3 of 3) Since we are assuming that random variables X_1 and X_2 are independent,

    E[(X_1 - E[X_1])(X_2 - E[X_2])] = E[X_1 - E[X_1]] · E[X_2 - E[X_2]] = (E[X_1] - E[X_1])(E[X_2] - E[X_2]) = 0,

and thus Var(X_1 + X_2) = Var(X_1) + Var(X_2). The result can be extended to any finite value of k by induction.

Example Example Find the variance of a binomial random variable with n trials when the probability of success on a single trial is p. Each trial of a binomial experiment is a Bernoulli experiment with probability of success p. Since the n trials of a binomial experiment are independent, Var(X) = Var(X_1) + ... + Var(X_n) = np(1 - p).
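
The formula can be confirmed either from the pmf or by simulation; a sketch of both for n = 5 and p = 1/2 (not part of the original slides):

    from math import comb
    import random

    n, p = 5, 0.5
    pmf = {x: comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)}
    E_X  = sum(x * q for x, q in pmf.items())
    E_X2 = sum(x * x * q for x, q in pmf.items())
    print(E_X2 - E_X**2, n * p * (1 - p))          # both 1.25

    # Monte Carlo check: sample variance of simulated binomial draws.
    draws = [sum(random.random() < p for _ in range(n)) for _ in range(100_000)]
    m = sum(draws) / len(draws)
    print(sum((d - m)**2 for d in draws) / len(draws))   # close to 1.25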

Variance of a Product (1 of 2) Theorem Let X_1, X_2, ..., X_k be pairwise independent random variables; then Var(X_1 X_2 ... X_k) = E[X_1^2] E[X_2^2] ... E[X_k^2] - (E[X_1] E[X_2] ... E[X_k])^2.

Variance of a Product (2 of 2) Proof.

    Var(X_1 X_2 ... X_k) = E[(X_1 X_2 ... X_k)^2] - (E[X_1 X_2 ... X_k])^2
                         = E[X_1^2 X_2^2 ... X_k^2] - (E[X_1] E[X_2] ... E[X_k])^2
                         = E[X_1^2] E[X_2^2] ... E[X_k^2] - (E[X_1] E[X_2] ... E[X_k])^2,

where the last equality uses the independence of the X_i.