Hypergeometric, Poisson & Joint Distributions
1 Hypergeometric, Poisson & Joint Distributions
Cathy Poliak, Ph.D. (cathy@math.uh.edu)
Office in Fleming 11c
Department of Mathematics, University of Houston
Lecture 6
2 Outline
1. Hypergeometric Distribution
2. Poisson Distribution
3. Joint Distributions
3 Beginning Example
Each of 12 refrigerators of a certain type has been returned to a distributor because of an audible, high-pitched, oscillating noise when the refrigerator is running. Suppose that 7 of these refrigerators have a defective compressor and the other 5 have less serious problems. The technician examines 6 refrigerators; what is the probability that exactly 5 have a defective compressor?
4 Conditions for a Hypergeometric Distribution
1. The population or set to be sampled consists of N individuals, objects, or elements (a finite population).
2. Each individual can be characterized as a "success" or "failure." There are m successes and n failures in the population. Notice: m + n = N.
3. A sample of k individuals is selected without replacement in such a way that each subset of size k is equally likely to be chosen.
The parameters of a hypergeometric distribution are m, n, k. We write X ~ Hyper(m, n, k). The probability mass function for a hypergeometric random variable is:
f_X(x) = P(X = x) = C(m, x) C(n, k − x) / C(m + n, k),
where C(a, b) denotes the binomial coefficient "a choose b".
6 Using R
R commands: P(X = x) = dhyper(x, m, n, k) and P(X ≤ x) = phyper(x, m, n, k).
Example: going back to the refrigerator example, m = 7, n = 5, k = 6.
P(X = 5)
> dhyper(5, 7, 5, 6)
[1] 0.1136364
P(X ≤ 4)
> phyper(4, 7, 5, 6)
[1] 0.8787879
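The slides use R's dhyper/phyper; the same numbers can be cross-checked with a stdlib-only Python sketch of the pmf above (the function name here is my own, not part of the course code):

```python
from math import comb

def hyper_pmf(x, m, n, k):
    """P(X = x) for X ~ Hyper(m, n, k): choose x of the m successes
    and k - x of the n failures, out of all k-subsets of m + n items."""
    return comb(m, x) * comb(n, k - x) / comb(m + n, k)

# Refrigerator example: m = 7 defective, n = 5 sound, sample k = 6.
p5 = hyper_pmf(5, 7, 5, 6)                             # like dhyper(5, 7, 5, 6)
p_le4 = sum(hyper_pmf(x, 7, 5, 6) for x in range(5))   # like phyper(4, 7, 5, 6)

print(round(p5, 7))     # 0.1136364
print(round(p_le4, 7))  # 0.8787879
```

Note that math.comb returns 0 when the lower index exceeds the upper one, so impossible values of x (here x = 0, which would need 6 of the 5 sound units) contribute zero mass automatically.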
7 Mean and Variance of a Hypergeometric Distribution
Let Y have a hypergeometric distribution with parameters m, n, and k, and let p = m/(m + n). The mean of Y is:
µ_Y = E(Y) = k · m/(m + n) = kp.
The variance of Y is:
σ²_Y = Var(Y) = kp(1 − p) · (1 − (k − 1)/(m + n − 1)).
The factor 1 − (k − 1)/(m + n − 1) is called the finite population correction factor. As the population increases, this factor gets closer to 1.
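As a sanity check on these closed forms (my own sketch, using the refrigerator numbers), the formula mean and variance agree with the values computed directly from the pmf:

```python
from math import comb

m, n, k = 7, 5, 6          # refrigerator example
N, p = m + n, m / (m + n)

# Direct mean and variance from the pmf.
pmf = {x: comb(m, x) * comb(n, k - x) / comb(N, k) for x in range(k + 1)}
mean = sum(x * pr for x, pr in pmf.items())
var = sum((x - mean) ** 2 * pr for x, pr in pmf.items())

# Closed forms from the slide, with the finite population correction factor.
mean_formula = k * p
var_formula = k * p * (1 - p) * (1 - (k - 1) / (N - 1))

print(round(mean, 6), mean_formula)
print(round(var, 6), round(var_formula, 6))
```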
8 Digital Cameras
A certain type of digital camera comes in either a 3-megapixel version or a 4-megapixel version. A camera store has received a shipment of 15 of these cameras, of which 6 have 3-megapixel resolution. Suppose that 5 of these cameras are randomly selected to be stored behind the counter; the other 10 are placed in a storeroom. Let X = the number of 3-megapixel cameras among the 5 selected for behind-the-counter storage.
1. What is the probability that exactly 2 of the 3-megapixel cameras are stored behind the counter?
2. What is the probability that at least one of the 3-megapixel cameras is stored behind the counter?
3. Calculate the mean and standard deviation of X.
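One hedged way to work the camera questions (this sketch and its answers are mine, not the slides'): X ~ Hyper(m = 6, n = 9, k = 5).

```python
from math import comb, sqrt

m, n, k = 6, 9, 5            # 6 three-MP "successes", 9 four-MP, pick 5
N, p = m + n, m / (m + n)

def pmf(x):
    """Hypergeometric P(X = x) for the camera example."""
    return comb(m, x) * comb(n, k - x) / comb(N, k)

p_exactly_2 = pmf(2)         # question 1
p_at_least_1 = 1 - pmf(0)    # question 2: complement of "none selected"
mean = k * p                 # question 3
sd = sqrt(k * p * (1 - p) * (1 - (k - 1) / (N - 1)))

print(round(p_exactly_2, 4), round(p_at_least_1, 4), mean, round(sd, 4))
```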
10 Example
Suppose that the average number of e-mails received by a particular student at UH is five e-mails per hour. We want to know the probability that a particular student will get X e-mails in any given hour. Suppose we want to know P(X = 3), the probability of getting exactly 3 e-mails in one hour. What about in two hours?
11 The Poisson Distribution
The Poisson distribution is appropriate under the following conditions.
1. Let X be the number of successes occurring per unit of measure. X = 0, 1, 2, 3, ...
2. Let µ be the mean number of successes occurring per unit of measure.
3. The numbers of successes that occur in two nonoverlapping units of measure are independent.
4. The probability that a success will occur in a unit of measure is the same for all units of equal size and is proportional to the size of the unit.
5. The probability that more than one event occurs in a unit of measure is negligible for very small units. In other words, the events occur one at a time.
12 The Probability Function of the Poisson Distribution
A random variable X with nonnegative integer values has a Poisson distribution if its frequency function is:
f(x) = P(X = x) = e^(−µ) µ^x / x!
for x = 0, 1, 2, ..., where µ > 0 is a constant. If X has a Poisson distribution with parameter µ, we write X ~ Pois(µ).
13 The Mean and Variance of the Poisson Distribution
Let X ~ Pois(µ).
The mean of X is µ per unit of measure, by the conditions of the Poisson distribution.
The variance of X is also µ per unit of measure.
The standard deviation of X is √µ.
14 Example
The number of people arriving for treatment at an emergency room can be modeled by a Poisson process with a mean of five people per hour.
1. What is the probability that exactly four arrivals occur during a particular hour?
2. What is the probability that at least four people arrive during a particular hour?
3. How many people do you expect to arrive during a 45-minute period?
15 Using R to Compute Probabilities of the Poisson Distribution
R commands: P(X = x) = dpois(x, µ) and P(X ≤ x) = ppois(x, µ).
P(X = 4)
> dpois(4, 5)
[1] 0.1754674
P(X ≥ 4) = 1 − P(X ≤ 3)
> 1 - ppois(3, 5)
[1] 0.7349741
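The same emergency-room numbers can be reproduced in stdlib Python (my cross-check sketch, not the course's code), including the 45-minute expectation, since a Poisson mean scales with the length of the interval:

```python
from math import exp, factorial

def pois_pmf(x, mu):
    """P(X = x) for X ~ Pois(mu)."""
    return exp(-mu) * mu ** x / factorial(x)

p4 = pois_pmf(4, 5)                                 # like dpois(4, 5)
p_ge4 = 1 - sum(pois_pmf(x, 5) for x in range(4))   # like 1 - ppois(3, 5)
expected_45min = 5 * 45 / 60                        # rate scales with interval

print(round(p4, 7))          # 0.1754674
print(round(p_ge4, 7))       # 0.7349741
print(expected_45min)        # 3.75
```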
16 Types of Probability Distributions
For each of the following, indicate the type of distribution: a. Binomial b. Poisson c. Hypergeometric d. Neither
1. A company makes sports bikes. 90% pass final inspection (and 10% fail and need to be fixed). We randomly select 10 bikes and want to know the probability that 7 will pass.
2. A company makes sports bikes. 90% pass final inspection (and 10% fail and need to be fixed). What is the probability that the 10th bike will not pass final inspection?
3. A small company made 100 bikes in a month. 10 of these bikes will need to be fixed. Suppose we select 4 bikes; what is the probability that at least one needs to be fixed?
4. A small company makes bicycles. The average number that this company makes in a month is 100 bikes. What is the probability that the company will make fewer than 30 bikes in a week?
17 Beginning Example
Suppose two random variables X and Y have a joint function f(x, y) = (2x − y)/5 for X = 0.75 and 1 and Y = 0 and 1. What are the possible values of f(x, y)?
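A quick enumeration (my sketch; I read the garbled joint function as f(x, y) = (2x − y)/5, the reading under which the four values sum to 1) lists the possible values:

```python
from itertools import product

f = lambda x, y: (2 * x - y) / 5
values = {(x, y): f(x, y) for x, y in product((0.75, 1), (0, 1))}
# Four probabilities: 0.3, 0.1, 0.4, 0.2 -- a valid pmf, summing to 1.
total = sum(values.values())
print(values, total)
```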
18 Joint Probability Distribution
If X and Y are two discrete random variables, the probability distribution for their simultaneous occurrence can be represented by a function with values f(x, y) for any pair of values (x, y) within the range of the random variables X and Y. It is customary to refer to this function as the joint probability function of X and Y. When X and Y are discrete, the probability function is:
f(x, y) = P(X = x and Y = y)
The marginal frequency functions of X and Y are simply their individual frequency functions:
f_X(x) = P(X = x), f_Y(y) = P(Y = y)
These functions are best represented in a joint probability table (contingency table).
19 Theorem 4.7
Let X and Y have the joint frequency function f(x, y). Then
1. f_Y(y) = Σ_x f(x, y) for all y.
2. f_X(x) = Σ_y f(x, y) for all x.
3. f_{Y|X}(y|x) = f(x, y)/f_X(x) if f_X(x) > 0. This is the conditional frequency function.
4. X and Y are independent if and only if f(x, y) = f_X(x) f_Y(y) for all x, y.
5. Σ_x Σ_y f(x, y) = 1.
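The parts of Theorem 4.7 can be illustrated on a tiny table (this example, its numbers, and its names are my own, not from the slides; the joint pmf is built to be independent so part 4 holds by construction):

```python
# Hypothetical marginals for X in {0, 1} and Y in {0, 1}.
fx = {0: 0.4, 1: 0.6}
fy = {0: 0.25, 1: 0.75}
# Independent joint pmf: f(x, y) = f_X(x) f_Y(y)  (part 4).
joint = {(x, y): fx[x] * fy[y] for x in fx for y in fy}

# Parts 1-2: summing out one variable recovers the other's marginal.
marg_x = {x: sum(joint[(x, y)] for y in fy) for x in fx}
# Part 3: conditional frequency function f_{Y|X}(y|x) = f(x, y) / f_X(x).
cond_y_given_0 = {y: joint[(0, y)] / marg_x[0] for y in fy}
# Part 5: total mass is 1.
total = sum(joint.values())
print(marg_x, cond_y_given_0, total)
```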
20 Popper 07 Questions
Suppose that the following table represents the joint pmf of X and Y.
Y\X    1      2
1      2/k    3/k
2      3/k    4/k
3      4/k    5/k
1. What should be the value of k? a) 6 b) 21 c) 5 d) 100
2. Determine P(X = 2, Y ≤ 2). a) 4/7 b) 3/7 c) 1/3 d) 4/21
3. Determine P(X = 2 | Y = 2). a) 4/7 b) 3/7 c) 1/3 d) 4/21
21 Rule 2 for Means
If X and Y are two different random variables, then the mean of the sum of the pairs of random variables is the sum of their means:
E[X + Y] = E[X] + E[Y].
This is called the addition rule for means. The mean of the difference of the pairs of random variables is the difference of their means:
E[X − Y] = E[X] − E[Y].
22 Rule 2 for Variances
If X and Y are independent random variables:
σ²_{X+Y} = Var[X + Y] = Var[X] + Var[Y].
σ²_{X−Y} = Var[X − Y] = Var[X] + Var[Y].
23 Example of Rule 2
Tamara and Derek are sales associates in a large electronics and appliance store. The following table shows the mean and standard deviation of their daily sales. Assume that daily sales among the sales associates are independent.
Sales associate    Mean            Standard deviation
Tamara (X)         E[X] = $1100    σ_X = $100
Derek (Y)          E[Y] = $1000    σ_Y = $80
1. Determine the mean of the total of Tamara's and Derek's daily sales, E[X + Y].
2. Determine the variance of the total of Tamara's and Derek's daily sales, σ²_{X+Y}.
3. Determine the standard deviation of the total of Tamara's and Derek's daily sales, σ_{X+Y}.
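Applying Rule 2 to these numbers (a sketch of the arithmetic, under the stated independence assumption):

```python
from math import sqrt

mean_x, sd_x = 1100, 100   # Tamara
mean_y, sd_y = 1000, 80    # Derek

mean_total = mean_x + mean_y        # E[X + Y] = E[X] + E[Y]
var_total = sd_x**2 + sd_y**2       # independence: variances add
sd_total = sqrt(var_total)          # standard deviations do NOT add

print(mean_total, var_total, round(sd_total, 2))  # 2100 16400 128.06
```

Note the design of the rule: variances add under independence, but standard deviations do not ($100 + $80 = $180 would be wrong; the correct value is about $128.06).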
24 Example
25 Expected Values
Let X and Y be two random variables. Then:
E(X + Y) = E(X) + E(Y)
E(XY) = Σ_i Σ_j x_i y_j p_ij = x_1 y_1 p_11 + x_1 y_2 p_12 + ⋯ + x_n y_n p_nn
For more than two random variables X_1, X_2, X_3, ..., X_n:
E(X_1 + X_2 + ⋯ + X_n) = E(X_1) + E(X_2) + ⋯ + E(X_n)
26 Covariance of X and Y
Let X and Y be jointly distributed random variables with respective means µ_X and µ_Y and standard deviations σ_X, σ_Y. The covariance of X and Y is
cov(X, Y) = E((X − µ_X)(Y − µ_Y)),
or, for an easier calculation,
cov(X, Y) = E(XY) − E(X)E(Y).
The correlation of X and Y is:
cor(X, Y) = cov(X, Y) / (σ_X σ_Y).
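Both formulas can be exercised on a small joint table (the table and its numbers are my own hypothetical example, not the slides'):

```python
from math import sqrt

# Hypothetical joint pmf over (x, y) pairs.
joint = {(0, 0): 0.2, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.6}

ex = sum(x * p for (x, y), p in joint.items())
ey = sum(y * p for (x, y), p in joint.items())
exy = sum(x * y * p for (x, y), p in joint.items())
cov = exy - ex * ey                       # cov(X, Y) = E(XY) - E(X)E(Y)

var_x = sum((x - ex) ** 2 * p for (x, y), p in joint.items())
var_y = sum((y - ey) ** 2 * p for (x, y), p in joint.items())
cor = cov / (sqrt(var_x) * sqrt(var_y))   # cor(X, Y) = cov / (sigma_X sigma_Y)
print(round(cov, 4), round(cor, 4))
```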
27 Properties of Covariance
1. cov(X, Y) = cov(Y, X)
2. cov(X, X) = var(X)
3. If X, Y, and Z are jointly distributed and a and b are constants, cov(X, aY + bZ) = a·cov(X, Y) + b·cov(X, Z).
4. If X and Y are jointly distributed,
var(X + Y) = var(X) + var(Y) + 2·cor(X, Y)·sd(X)·sd(Y)
var(X − Y) = var(X) + var(Y) − 2·cor(X, Y)·sd(X)·sd(Y)
5. If X and Y are independent, cov(X, Y) = 0.
6. If jointly distributed random variables X_1, X_2, ..., X_n are pairwise uncorrelated, then var(X_1 + X_2 + ⋯ + X_n) = var(X_1) + var(X_2) + ⋯ + var(X_n).
28 Example of a Joint Probability Distribution
The following table is a joint probability table of X = number of sports involved in and Y = age of high school students.
1. Determine E(X).
2. Determine E(Y).
3. Determine E(XY).
4. Determine cov(X, Y).
X = Number of sports involved; Y = Age
29 R code
I put the values in an Excel .csv file named "jointprob."
> library(readr)
> jointprob <- read_csv("f:/lectures uh/math 3339/jointprob
> View(jointprob)
> sum(jointprob$x*jointprob$y*jointprob$pxy)
[1]
> 1.35*16.1
[1] 21.735
> sum(jointprob$x*jointprob$pxy)
[1] 1.35
> sum(jointprob$y*jointprob$pxy)
[1] 16.1
> sum(jointprob$x*jointprob$y*jointprob$pxy)
[1]
30 Example 2
A class has 10 mathematics majors, 6 computer science majors, and 4 statistics majors. A committee of two is selected at random to work on a problem. Let X be the number of mathematics majors and let Y be the number of computer science majors chosen.
1. Determine all possible (X, Y) pairs for randomly selecting two students.
2. Determine f_{X,Y}(0, 0), that is, the probability that the two picked are neither mathematics majors nor computer science majors.
3. Determine f_X(0), that is, the probability that we do not pick any mathematics majors.
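Parts 2 and 3 reduce to counting committees (a hedged worked sketch; the answers here are my own, not the slides'): there are 20 students in all, and C(20, 2) equally likely committees.

```python
from math import comb

total = comb(20, 2)          # 10 math + 6 CS + 4 stat majors, choose 2

# f_{X,Y}(0, 0): both members come from the 4 statistics majors.
f_00 = comb(4, 2) / total
# f_X(0): no mathematics majors, i.e. both from the other 10 students.
f_x0 = comb(10, 2) / total

print(round(f_00, 4), round(f_x0, 4))  # 0.0316 0.2368
```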
31 Joint Probability Table
Y\X ... Total
1. Determine E(X).
2. Determine E(Y).
3. Determine E(XY).
4. Determine cov(X, Y).
5. Determine cor(X, Y).
More informationExpectation of Random Variables
1 / 19 Expectation of Random Variables Saravanan Vijayakumaran sarva@ee.iitb.ac.in Department of Electrical Engineering Indian Institute of Technology Bombay February 13, 2015 2 / 19 Expectation of Discrete
More informationMathematical Statistics 1 Math A 6330
Mathematical Statistics 1 Math A 6330 Chapter 3 Common Families of Distributions Mohamed I. Riffi Department of Mathematics Islamic University of Gaza September 28, 2015 Outline 1 Subjects of Lecture 04
More informationRandom Variables. P(x) = P[X(e)] = P(e). (1)
Random Variables Random variable (discrete or continuous) is used to derive the output statistical properties of a system whose input is a random variable or random in nature. Definition Consider an experiment
More informationCMPSCI 240: Reasoning Under Uncertainty
CMPSCI 240: Reasoning Under Uncertainty Lecture 8 Prof. Hanna Wallach wallach@cs.umass.edu February 16, 2012 Reminders Check the course website: http://www.cs.umass.edu/ ~wallach/courses/s12/cmpsci240/
More informationLecture 16 : Independence, Covariance and Correlation of Discrete Random Variables
Lecture 6 : Independence, Covariance and Correlation of Discrete Random Variables 0/ 3 Definition Two discrete random variables X and Y defined on the same sample space are said to be independent if for
More informationMATHEMATICS 154, SPRING 2009 PROBABILITY THEORY Outline #11 (Tail-Sum Theorem, Conditional distribution and expectation)
MATHEMATICS 154, SPRING 2009 PROBABILITY THEORY Outline #11 (Tail-Sum Theorem, Conditional distribution and expectation) Last modified: March 7, 2009 Reference: PRP, Sections 3.6 and 3.7. 1. Tail-Sum Theorem
More informationMassachusetts Institute of Technology Department of Electrical Engineering & Computer Science 6.041/6.431: Probabilistic Systems Analysis
6.04/6.43: Probabilistic Systems Analysis Question : Multiple choice questions. CLEARLY circle the best answer for each question below. Each question is worth 4 points each, with no partial credit given.
More informationSTAT 430/510: Lecture 16
STAT 430/510: Lecture 16 James Piette June 24, 2010 Updates HW4 is up on my website. It is due next Mon. (June 28th). Starting today back at section 6.7 and will begin Ch. 7. Joint Distribution of Functions
More informationRandom Variables. Cumulative Distribution Function (CDF) Amappingthattransformstheeventstotherealline.
Random Variables Amappingthattransformstheeventstotherealline. Example 1. Toss a fair coin. Define a random variable X where X is 1 if head appears and X is if tail appears. P (X =)=1/2 P (X =1)=1/2 Example
More informationFunctions of two random variables. Conditional pairs
Handout 10 Functions of two random variables. Conditional pairs "Science is a wonderful thing if one does not have to earn a living at it. One should earn one's living by work of which one is sure one
More informationReview of Probability. CS1538: Introduction to Simulations
Review of Probability CS1538: Introduction to Simulations Probability and Statistics in Simulation Why do we need probability and statistics in simulation? Needed to validate the simulation model Needed
More informationOutline Properties of Covariance Quantifying Dependence Models for Joint Distributions Lab 4. Week 8 Jointly Distributed Random Variables Part II
Week 8 Jointly Distributed Random Variables Part II Week 8 Objectives 1 The connection between the covariance of two variables and the nature of their dependence is given. 2 Pearson s correlation coefficient
More informationAppendix A : Introduction to Probability and stochastic processes
A-1 Mathematical methods in communication July 5th, 2009 Appendix A : Introduction to Probability and stochastic processes Lecturer: Haim Permuter Scribe: Shai Shapira and Uri Livnat The probability of
More informationRandom Variables. Lecture 6: E(X ), Var(X ), & Cov(X, Y ) Random Variables - Vocabulary. Random Variables, cont.
Lecture 6: E(X ), Var(X ), & Cov(X, Y ) Sta230/Mth230 Colin Rundel February 5, 2014 We have been using them for a while now in a variety of forms but it is good to explicitly define what we mean Random
More informationMAS113 Introduction to Probability and Statistics. Proofs of theorems
MAS113 Introduction to Probability and Statistics Proofs of theorems Theorem 1 De Morgan s Laws) See MAS110 Theorem 2 M1 By definition, B and A \ B are disjoint, and their union is A So, because m is a
More informationSUMMARY OF PROBABILITY CONCEPTS SO FAR (SUPPLEMENT FOR MA416)
SUMMARY OF PROBABILITY CONCEPTS SO FAR (SUPPLEMENT FOR MA416) D. ARAPURA This is a summary of the essential material covered so far. The final will be cumulative. I ve also included some review problems
More informationSTAT 515 MIDTERM 2 EXAM November 14, 2018
STAT 55 MIDTERM 2 EXAM November 4, 28 NAME: Section Number: Instructor: In problems that require reasoning, algebraic calculation, or the use of your graphing calculator, it is not sufficient just to write
More informationBivariate Distributions. Discrete Bivariate Distribution Example
Spring 7 Geog C: Phaedon C. Kyriakidis Bivariate Distributions Definition: class of multivariate probability distributions describing joint variation of outcomes of two random variables (discrete or continuous),
More informationClosed book and notes. 60 minutes. Cover page and four pages of exam. No calculators.
IE 230 Seat # Closed book and notes. 60 minutes. Cover page and four pages of exam. No calculators. Score Exam #3a, Spring 2002 Schmeiser Closed book and notes. 60 minutes. 1. True or false. (for each,
More informationMATH 151, FINAL EXAM Winter Quarter, 21 March, 2014
Time: 3 hours, 8:3-11:3 Instructions: MATH 151, FINAL EXAM Winter Quarter, 21 March, 214 (1) Write your name in blue-book provided and sign that you agree to abide by the honor code. (2) The exam consists
More informationStatistics for Economists. Lectures 3 & 4
Statistics for Economists Lectures 3 & 4 Asrat Temesgen Stockholm University 1 CHAPTER 2- Discrete Distributions 2.1. Random variables of the Discrete Type Definition 2.1.1: Given a random experiment with
More informationDiscrete Distributions
Discrete Distributions STA 281 Fall 2011 1 Introduction Previously we defined a random variable to be an experiment with numerical outcomes. Often different random variables are related in that they have
More informationAlgorithms for Uncertainty Quantification
Algorithms for Uncertainty Quantification Tobias Neckel, Ionuț-Gabriel Farcaș Lehrstuhl Informatik V Summer Semester 2017 Lecture 2: Repetition of probability theory and statistics Example: coin flip Example
More informationJointly Distributed Variables
Jointly Distributed Variables Sec 2.6, 9.1 & 9.2 Cathy Poliak, Ph.D. cathy@math.uh.edu Office in Fleming 11c Department of Mathematics University of Houston Lecture 7-3339 Cathy Poliak, Ph.D. cathy@math.uh.edu
More informationHomework 10 (due December 2, 2009)
Homework (due December, 9) Problem. Let X and Y be independent binomial random variables with parameters (n, p) and (n, p) respectively. Prove that X + Y is a binomial random variable with parameters (n
More informationTom Salisbury
MATH 2030 3.00MW Elementary Probability Course Notes Part V: Independence of Random Variables, Law of Large Numbers, Central Limit Theorem, Poisson distribution Geometric & Exponential distributions Tom
More informationDiscrete Probability Distribution
Shapes of binomial distributions Discrete Probability Distribution Week 11 For this activity you will use a web applet. Go to http://socr.stat.ucla.edu/htmls/socr_eperiments.html and choose Binomial coin
More informationChapter 3 Discrete Random Variables
MICHIGAN STATE UNIVERSITY STT 351 SECTION 2 FALL 2008 LECTURE NOTES Chapter 3 Discrete Random Variables Nao Mimoto Contents 1 Random Variables 2 2 Probability Distributions for Discrete Variables 3 3 Expected
More informationExpectations. Definition Let X be a discrete rv with set of possible values D and pmf p(x). The expected value or mean value of X, denoted by E(X ) or
Expectations Expectations Definition Let X be a discrete rv with set of possible values D and pmf p(x). The expected value or mean value of X, denoted by E(X ) or µ X, is E(X ) = µ X = x D x p(x) Expectations
More informationSample Problems for the Final Exam
Sample Problems for the Final Exam 1. Hydraulic landing assemblies coming from an aircraft rework facility are each inspected for defects. Historical records indicate that 8% have defects in shafts only,
More informationContinuous Random Variables
1 / 24 Continuous Random Variables Saravanan Vijayakumaran sarva@ee.iitb.ac.in Department of Electrical Engineering Indian Institute of Technology Bombay February 27, 2013 2 / 24 Continuous Random Variables
More information