
Introductory Probability
Joint Probability with Independence; Binomial Distributions
Nicholas Nguyen (nicholas.nguyen@uky.edu)
Department of Mathematics, UK

Agenda
Comparing Two Variables with Joint Random Variables and Double Integrals
Binomial Distributions
Review

Announcements: The sixth homework is available and due next Monday. The seventh homework is available and due next Wednesday. The next quiz is on Friday.

Comparing Two Variables

Let X and Y be independent continuous random variables in [0, 1] with density functions f_X(x) = 2x and f_Y(y) = 3y^2 for 0 ≤ x, y ≤ 1. Let us find the probability that Y ≤ X^2.

Let's find the joint density function first. Since X and Y are independent, their joint density function is the product of their individual density functions:

f(x, y) = f_X(x) f_Y(y) = 2x · 3y^2.

We integrate the joint density function over the region in the square [0, 1] × [0, 1] that satisfies y ≤ x^2:

P(Y ≤ X^2) = ∬_{y ≤ x^2} 2x · 3y^2 dy dx.

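Before evaluating the integral, the probability can be sanity-checked by simulation. The sketch below is an illustration only (not part of the slides) and assumes NumPy is available; it samples X and Y by inverse transform, using the CDFs F_X(x) = x^2 and F_Y(y) = y^3 implied by the given densities.

```python
import numpy as np

# Monte Carlo estimate of P(Y <= X^2) for independent X, Y on [0, 1]
# with densities f_X(x) = 2x and f_Y(y) = 3y^2 (inverse-transform sampling).
rng = np.random.default_rng(0)
n = 1_000_000
x = np.sqrt(rng.uniform(size=n))      # F_X(x) = x^2  =>  X = sqrt(U)
y = rng.uniform(size=n) ** (1 / 3)    # F_Y(y) = y^3  =>  Y = V^(1/3)

print((y <= x**2).mean())             # should be close to the exact value 1/4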

Region Y ≤ X^2

[Figure: the unit square [0, 1] × [0, 1] with the curve y = x^2; the region y ≤ x^2 lies below the curve.]

End Limits

The x-coordinate can go from 0 to 1 (the left and right sides of the square). For a fixed x-value, the y-coordinate can range from 0 (the bottom edge) to x^2 (on the graph of y = x^2).

Evaluating the Integral

Thus, the probability that Y ≤ X^2 is

∫_0^1 ∫_0^{x^2} 2x · 3y^2 dy dx = ∫_0^1 2x [y^3]_0^{x^2} dx = ∫_0^1 2x · x^6 dx = ∫_0^1 2x^7 dx = 2 · (1/8) x^8 |_0^1 = 1/4.

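For reference, the same iterated integral can also be evaluated symbolically. The short sketch below is not from the slides; it assumes SymPy is installed and simply reproduces the value 1/4.

```python
import sympy as sp

x, y = sp.symbols('x y', nonnegative=True)

# P(Y <= X^2): integrate 2x * 3y^2 over 0 <= y <= x^2, then 0 <= x <= 1
prob = sp.integrate(2*x * 3*y**2, (y, 0, x**2), (x, 0, 1))
print(prob)   # 1/4
```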

Independent Trials Process (Continuous)

A sequence of continuous random variables X_1, ..., X_n that are mutually independent and have the same density function f_X(x) is called an independent trials process. If X = (X_1, ..., X_n), then for any point (x_1, ..., x_n), the density function of X is the product of the individual density functions:

f(x_1, ..., x_n) = f_X(x_1) ··· f_X(x_n).

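As a small numerical illustration (not from the slides; the function names are ours), the product formula can be evaluated for a few independent draws with the density f_X(x) = 2x used above:

```python
def f_X(x):
    """Common density f_X(x) = 2x on [0, 1]."""
    return 2 * x if 0 <= x <= 1 else 0.0

def joint_density(point):
    """Joint density of an independent trials process: product of f_X values."""
    result = 1.0
    for xi in point:
        result *= f_X(xi)
    return result

print(joint_density((0.5, 0.2, 0.9)))   # 2(0.5) * 2(0.2) * 2(0.9) = 0.72
```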

Independent Trials Process (Discrete)

A sequence of discrete random variables X_1, ..., X_n that are mutually independent and have the same distribution function m_X is called an independent trials process. If X = (X_1, ..., X_n) and ω = (ω_1, ..., ω_n) is a sequence of outcomes, then the distribution function m of X is

m(ω) = m_X(ω_1) ··· m_X(ω_n).

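Analogously for the discrete case, here is an illustrative sketch (not from the slides) with m_X taken to be the distribution function of a fair die roll:

```python
from fractions import Fraction

def m_X(outcome):
    """Distribution function of a single fair die roll."""
    return Fraction(1, 6) if outcome in range(1, 7) else Fraction(0)

def m(omega):
    """Distribution of a sequence of independent rolls: product of m_X values."""
    prob = Fraction(1)
    for outcome in omega:
        prob *= m_X(outcome)
    return prob

print(m((3, 3, 6)))   # (1/6)^3 = 1/216
```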

Bernoulli Trials Processes

An important example of a discrete independent trials process is a Bernoulli trials process: a sequence of independent trials, each with 2 outcomes (success or failure). For each trial,

m(success) = p,  m(failure) = 1 − p = q.

For example, with 5 trials, the sequence (S, S, F, F, F) has probability m(success)^2 m(failure)^3 = p^2 q^3.

Let X record the number of successes in n trials. Then for any whole number k ≤ n,

P(X = k) = b(n, p, k) = C(n, k) p^k q^{n−k},

where C(n, k) = n!/(k!(n − k)!) is the binomial coefficient. The distribution function b(n, p, k) (with n and p fixed) is called the binomial distribution function.
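The probabilities b(n, p, k) can be computed directly from the formula. The sketch below is our illustration (not part of the slides); it uses only Python's standard library, with math.comb for the binomial coefficient.

```python
from math import comb

def b(n, p, k):
    """Binomial distribution: P(X = k) = C(n, k) * p^k * (1 - p)^(n - k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Probability of exactly 2 successes in 5 Bernoulli trials with p = 0.3
print(b(5, 0.3, 2))   # C(5, 2) * 0.3^2 * 0.7^3 = 0.3087
```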

Example

We have a coin that lands heads with probability 1/5 and toss it 3 times. Then the probability of getting at least one head is

1 − P(X = 0) = 1 − b(3, 1/5, 0) = 1 − C(3, 0) (1/5)^0 (4/5)^3 = 1 − 1 · 1 · (64/125) = 61/125.

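The arithmetic in this example can be checked with the same formula. The short sketch below (illustrative only) uses exact fractions and reproduces 61/125.

```python
from fractions import Fraction
from math import comb

p = Fraction(1, 5)
# P(at least one head in 3 tosses) = 1 - b(3, 1/5, 0)
prob = 1 - comb(3, 0) * p**0 * (1 - p)**3
print(prob)   # 61/125
```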

Bernoulli Trials Processes and Distributions

For any Bernoulli trials process, we can record:
The number of successes in a fixed number of trials (binomial distribution)
The number of trials up to and including the first success (geometric distribution)
The number of trials up to and including the kth success, for k fixed (negative binomial distribution)

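As a rough illustration (not from the slides; it assumes NumPy and that the simulated run contains at least k successes), one simulated Bernoulli trials process lets us read off all three quantities:

```python
import numpy as np

rng = np.random.default_rng(1)
p, n = 0.3, 50
trials = rng.random(n) < p              # one Bernoulli trials process (True = success)

successes = trials.sum()                           # binomial: successes in n trials
first_success = np.argmax(trials) + 1              # geometric: trials up to 1st success
k = 3
kth_success = np.flatnonzero(trials)[k - 1] + 1    # negative binomial: trials up to kth success

print(successes, first_success, kth_success)
```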

Next Time

Please read Section 5.1 (you can skip the historical remarks). We will study another distribution associated with Bernoulli trials: the geometric distribution.

Homework 6 is due next Monday. Homework 7 is due next Wednesday. The next quiz is this Friday.