[Chapter 6. Functions of Random Variables]

[Chapter 6. Functions of Random Variables]

6.1 Introduction
6.2 Finding the probability distribution of a function of random variables
6.3 The method of distribution functions
6.5 The method of moment-generating functions

6.1 Introduction

The objective of statistics is to make inferences about a population based on information contained in a sample taken from that population. All quantities used to make inferences about a population are functions of the n random observations that appear in a sample.

Consider the problem of estimating a population mean µ. One draws a random sample of n observations, y_1, y_2, ..., y_n, from the population and employs the sample mean

ȳ = (y_1 + y_2 + ... + y_n)/n = (1/n) Σ_{i=1}^n y_i

as an estimate of µ. How good is this sample mean as an estimate of µ? The answer depends on the behavior of the random variables Y_1, Y_2, ..., Y_n and their effect on the distribution of the random variable Ȳ = (1/n) Σ_{i=1}^n Y_i.
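
To make the behavior of Ȳ concrete, here is a minimal simulation sketch (not from the text): it assumes, purely for illustration, an exponential population with mean 10 and uses NumPy to draw many samples of size n = 25 and record each sample mean.

    import numpy as np

    rng = np.random.default_rng(0)     # fixed seed so the run is reproducible
    n, replications = 25, 10_000       # sample size and number of repeated samples
    mu = 10.0                          # hypothetical population mean

    # Draw `replications` samples of size n and compute the sample mean of each one.
    samples = rng.exponential(scale=mu, size=(replications, n))
    ybar = samples.mean(axis=1)

    print("average of the sample means:", ybar.mean())   # close to mu = 10
    print("variance of the sample means:", ybar.var())   # close to mu**2 / n = 4

The spread of these simulated means is exactly what the sampling distribution of Ȳ describes.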

To determine the probability distribution of a function of n random variables Y_1, Y_2, ..., Y_n (say Ȳ), one must know the joint probability function of the random variables themselves, p(y_1, ..., y_n), or their joint density function f(y_1, ..., y_n).

The assumption that we will make is that Y_1, Y_2, ..., Y_n is a random sample from a population with probability function p(y) or probability density function f(y); that is, the random variables Y_1, Y_2, ..., Y_n are independent with common probability function p(y) or common density function f(y):

Y_1, ..., Y_n ~ iid p(y) or f(y).

6.2 Finding the probability distribution of a function of random variables

We will study three methods for finding the probability distribution of a function of random variables. Consider random variables Y_1, Y_2, ..., Y_n and a function U(Y_1, Y_2, ..., Y_n), denoted simply as U, e.g. U = (Y_1 + Y_2 + ... + Y_n)/n. The three methods for finding the probability distribution of U are as follows:

1. The method of distribution functions
2. The method of transformations
3. The method of moment-generating functions

6.3 The method of distribution functions

Suppose that we have random variables Y_1, ..., Y_n with joint density function f(y_1, ..., y_n), and let U = U(Y_1, ..., Y_n) be a function of Y_1, Y_2, ..., Y_n.

1. Sketch the region in (y_1, y_2, ..., y_n) space over which f(y_1, ..., y_n) is positive, and find the region in that space for which U = u.
2. Find F_U(u) = P(U ≤ u) by integrating f(y_1, y_2, ..., y_n) over the region for which U ≤ u.
3. Find the density function f_U(u) by differentiating F_U(u); thus, f_U(u) = dF_U(u)/du.

(Example 6.1) Suppose Y has density function

f(y) = 2y, 0 ≤ y ≤ 1; 0, elsewhere.

Let U be a new random variable defined by U = 3Y − 1. Find the probability density function for U.
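
As a sketch of how the three steps apply to Example 6.1 (written here in LaTeX notation; Y takes values in [0, 1], so U = 3Y − 1 takes values in [−1, 2]):

\[
F_U(u) = P(U \le u) = P(3Y - 1 \le u) = P\!\left(Y \le \frac{u+1}{3}\right)
       = \int_0^{(u+1)/3} 2y \, dy = \left(\frac{u+1}{3}\right)^2, \qquad -1 \le u \le 2,
\]
\[
f_U(u) = \frac{dF_U(u)}{du} = \frac{2(u+1)}{9}, \qquad -1 \le u \le 2,
\]
and \(f_U(u) = 0\) elsewhere.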

(Example 6.2) Suppose Y_1 and Y_2 have the joint density function

f(y_1, y_2) = 3y_1, 0 ≤ y_2 ≤ y_1 ≤ 1; 0, elsewhere.

Let U be a new random variable defined by U = Y_1 − Y_2. Find the probability density function for U.

(Example 6.3) Let (Y_1, Y_2) denote a random sample of size n = 2 from the uniform distribution on the interval (0, 1). Find the probability density function for U = Y_1 + Y_2.

(Example 6.4) Suppose Y has density function

f(y) = (y + 1)/2, −1 ≤ y ≤ 1; 0, elsewhere.

Let U be a new random variable defined by U = Y^2. Find the probability density function for U.

(Example 6.5) Let U be a uniform random variable on the interval (0, 1). Find a transformation G(U) such that G(U) possesses an exponential distribution with mean β.
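
A small simulation sketch of the transformation that answers Example 6.5: G(u) = −β ln(1 − u) is the inverse of the exponential distribution function F(y) = 1 − e^{−y/β}, so G(U) should behave like an exponential variable with mean β. (NumPy assumed; β = 2 is an arbitrary choice.)

    import numpy as np

    rng = np.random.default_rng(1)
    beta = 2.0                                # target exponential mean (arbitrary)
    u = rng.uniform(0.0, 1.0, size=100_000)   # U ~ Uniform(0, 1)

    g_u = -beta * np.log(1.0 - u)             # G(U) = -beta * ln(1 - U)

    print("sample mean of G(U):", g_u.mean())       # close to beta = 2
    print("sample variance of G(U):", g_u.var())    # close to beta**2 = 4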

[Method of Transformations]

The method of distribution functions leads to a simple way of writing down the density function of U = h(Y), provided h(y) is either increasing or decreasing. Steps to implement the transformation method: suppose Y has probability density function f_Y(y), and let U = h(Y), where h(y) is either increasing or decreasing for all y such that f_Y(y) > 0.

1. Find the inverse function y = h^{-1}(u).
2. Evaluate dh^{-1}/du = d[h^{-1}(u)]/du.
3. Find f_U(u) by f_U(u) = f_Y(h^{-1}(u)) |dh^{-1}/du|.
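
As a quick illustration, applying these three steps to Example 6.1 (f_Y(y) = 2y on 0 ≤ y ≤ 1, and U = h(Y) = 3Y − 1) reproduces the answer obtained with the distribution-function method:

\[
h^{-1}(u) = \frac{u+1}{3}, \qquad \frac{dh^{-1}}{du} = \frac{1}{3},
\]
\[
f_U(u) = f_Y\!\left(\frac{u+1}{3}\right)\left|\frac{1}{3}\right| = 2\cdot\frac{u+1}{3}\cdot\frac{1}{3} = \frac{2(u+1)}{9}, \qquad -1 \le u \le 2.
\]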

Why?

1. Let Y have probability density function f_Y(y). If h(y) is either increasing or decreasing for all y such that f_Y(y) > 0, then U = h(Y) has density function

f_U(u) = f_Y(h^{-1}(u)) |dh^{-1}/du|, where dh^{-1}/du = d[h^{-1}(u)]/du.

(Derivation)
i) Suppose h(y) is an increasing function of y.
ii) Suppose h(y) is a decreasing function of y.
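
A sketch of case i); the decreasing case ii) is analogous, with the sign change absorbed by the absolute value. If h is increasing, then h(Y) ≤ u exactly when Y ≤ h^{-1}(u), so

\[
F_U(u) = P(h(Y) \le u) = P\!\left(Y \le h^{-1}(u)\right) = F_Y\!\left(h^{-1}(u)\right),
\]
and differentiating with the chain rule gives
\[
f_U(u) = f_Y\!\left(h^{-1}(u)\right)\frac{dh^{-1}(u)}{du},
\]
where \(dh^{-1}/du > 0\) because \(h^{-1}\) is also increasing.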

2. Note that:

1) It is not necessary that h(y) be increasing or decreasing for all values of y; h(·) need only be increasing or decreasing for the values of y such that f_Y(y) > 0. The set of points {y : f_Y(y) > 0} is called the support of the density f_Y(y). If y = h^{-1}(u) is not in the support of the density, then f_Y(h^{-1}(u)) = 0.

2) Direct application of this method requires us to check that the function h(y) is either increasing or decreasing for all y such that f_Y(y) > 0. If it is not, this method cannot be used.

(Example 6.7 (p. 297)) Suppose Y has density function

f(y) = 2y, 0 ≤ y ≤ 1; 0, elsewhere.

Let U be a new random variable defined by U = 4Y + 3. Find the probability density function for U.

(Example 6.8 (p. 298)) Suppose Y_1 and Y_2 have a joint density function given by

f(y_1, y_2) = e^{-(y_1 + y_2)}, 0 ≤ y_1, 0 ≤ y_2; 0, elsewhere.

Find the density function for U = Y_1 + Y_2.

(Example 6.8 (p. 298)) Suppose Y_1 and Y_2 have a joint density function given by

f(y_1, y_2) = 2(1 − y_1), 0 ≤ y_1 ≤ 1, 0 ≤ y_2 ≤ 1; 0, elsewhere.

Find the density function for U = Y_1 Y_2.
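
A minimal Monte Carlo check of Example 6.7 as stated above (not from the text): with U = 4Y + 3, the transformation steps give f_U(u) = (u − 3)/8 on 3 ≤ u ≤ 7, whose mean is 17/3; the sketch below compares that with simulation. (NumPy assumed; if V ~ Uniform(0, 1), then Y = √V has density 2y on (0, 1).)

    import numpy as np

    rng = np.random.default_rng(2)
    y = np.sqrt(rng.uniform(size=200_000))   # Y = sqrt(V) has density f_Y(y) = 2y on (0, 1)
    u = 4.0 * y + 3.0                        # transformed variable U = 4Y + 3

    # The density (u - 3)/8 on (3, 7) has mean  int_3^7 u (u - 3)/8 du = 17/3.
    print("empirical E[U]:  ", u.mean())
    print("theoretical E[U]:", 17.0 / 3.0)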

6.5 The method of moment-generating functions

This method is based on a uniqueness theorem for moment-generating functions, which states that if two random variables have identical moment-generating functions, then the two random variables possess the same probability distribution. Let U be a function of the random variables Y_1, Y_2, ..., Y_n.

1. Find the moment-generating function for U, m_U(t).
2. Compare m_U(t) with other well-known moment-generating functions. If m_U(t) = m_V(t) for all values of t, then U and V have identical distributions (by the uniqueness theorem).

(Theorem 6.1) [Uniqueness Theorem] Let m_X(t) and m_Y(t) denote the moment-generating functions of random variables X and Y, respectively. If both moment-generating functions exist and m_X(t) = m_Y(t) for all values of t, then X and Y have the same probability distribution.

(Example 6.11) Let Z be a normally distributed random variable with mean 0 and variance 1. Use the method of moment-generating functions to find the probability distribution of Z^2.
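
A sketch of the computation behind Example 6.11 (combining the exponents and recognizing a normal kernel with variance 1/(1 − 2t)):

\[
m_{Z^2}(t) = E\!\left[e^{tZ^2}\right] = \int_{-\infty}^{\infty} e^{tz^2}\,\frac{1}{\sqrt{2\pi}}e^{-z^2/2}\,dz
           = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}e^{-z^2(1-2t)/2}\,dz = (1-2t)^{-1/2}, \qquad t < \tfrac{1}{2},
\]
which is the moment-generating function of a gamma distribution with \(\alpha = 1/2\) and \(\beta = 2\); that is, \(Z^2\) has a \(\chi^2\) distribution with 1 degree of freedom.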

The moment-generating function method is often very useful for finding the distributions of sums of independent random variables.

(Theorem 6.2 (p. 304)) Let Y_1, Y_2, ..., Y_n be independent random variables with moment-generating functions m_{Y_1}(t), m_{Y_2}(t), ..., m_{Y_n}(t), respectively. If U = Y_1 + Y_2 + ... + Y_n, then

m_U(t) = m_{Y_1}(t) × m_{Y_2}(t) × ... × m_{Y_n}(t).
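
The argument behind Theorem 6.2 is one line, using the fact that the expectation of a product of independent random variables is the product of the expectations:

\[
m_U(t) = E\!\left[e^{t(Y_1 + Y_2 + \cdots + Y_n)}\right]
       = E\!\left[e^{tY_1}e^{tY_2}\cdots e^{tY_n}\right]
       = E\!\left[e^{tY_1}\right]E\!\left[e^{tY_2}\right]\cdots E\!\left[e^{tY_n}\right]
       = m_{Y_1}(t)\,m_{Y_2}(t)\cdots m_{Y_n}(t).
\]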

(Example 6.12) The number of customer arrivals at a checkout counter in a given interval of time possesses approximately a Poisson probability distribution. If Y_1 denotes the time until the first arrival, Y_2 denotes the time between the first and second arrivals, ..., and Y_n denotes the time between the (n − 1)st and nth arrivals, then it can be shown that Y_1, Y_2, ..., Y_n are independent random variables, with the density function for Y_i given by

f_{Y_i}(y_i) = (1/θ) e^{-y_i/θ}, y_i > 0; 0, elsewhere.

Here θ is the average time between arrivals. Find the probability density function for the waiting time from the opening of the counter until the nth customer arrives.
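
A sketch of how Theorem 6.2 settles Example 6.12: each Y_i is exponential with mean θ, whose moment-generating function is (1 − θt)^{−1}, so for the total waiting time U = Y_1 + ⋯ + Y_n,

\[
m_U(t) = \prod_{i=1}^{n} m_{Y_i}(t) = (1-\theta t)^{-n}, \qquad t < \tfrac{1}{\theta},
\]
which is the moment-generating function of a gamma distribution with \(\alpha = n\) and \(\beta = \theta\); hence
\[
f_U(u) = \frac{u^{\,n-1}e^{-u/\theta}}{\theta^{\,n}(n-1)!}, \qquad u > 0.
\]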

The m.g.f. method can be used to establish some interesting and useful results about the distributions of functions of normally distributed random variables.

(Theorem 6.3) Let Y_1, Y_2, ..., Y_n be independent normally distributed random variables with E(Y_i) = µ_i and V(Y_i) = σ_i^2, for i = 1, 2, ..., n, and let a_1, a_2, ..., a_n be constants. If U = Σ_{i=1}^n a_i Y_i, then U is a normally distributed random variable with

E(U) = Σ_{i=1}^n a_i µ_i and V(U) = Σ_{i=1}^n a_i^2 σ_i^2.

(Proof) (Exercise 6.35)
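
A quick numerical illustration of Theorem 6.3 (not a proof; the constants, means, and variances below are arbitrary choices, and NumPy is assumed):

    import numpy as np

    rng = np.random.default_rng(3)
    a = np.array([2.0, -1.0])        # constants a_1, a_2 (arbitrary)
    mu = np.array([1.0, 3.0])        # means mu_1, mu_2 (arbitrary)
    sigma = np.array([2.0, 3.0])     # standard deviations sigma_1, sigma_2 (arbitrary)

    y = rng.normal(loc=mu, scale=sigma, size=(100_000, 2))   # independent normals Y_1, Y_2
    u = y @ a                                                # U = a_1*Y_1 + a_2*Y_2

    print("empirical E(U):", u.mean(), " theoretical:", a @ mu)            # 2(1) - 1(3) = -1
    print("empirical V(U):", u.var(),  " theoretical:", a**2 @ sigma**2)   # 4(4) + 1(9) = 25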

(Theorem 6.4) Let Y_1, Y_2, ..., Y_n be independent normally distributed random variables with E(Y_i) = µ_i and V(Y_i) = σ_i^2, for i = 1, 2, ..., n, and define Z_i by

Z_i = (Y_i − µ_i)/σ_i, i = 1, 2, ..., n.

Then Σ_{i=1}^n Z_i^2 has a χ^2 distribution with n degrees of freedom.

(Proof) (Exercise 6.34)
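
A sketch of the argument asked for in Exercise 6.34, combining the two earlier results: each Z_i is standard normal, so by Example 6.11 its square has moment-generating function (1 − 2t)^{−1/2}, and by Theorem 6.2,

\[
m_{\sum Z_i^2}(t) = \prod_{i=1}^{n} (1-2t)^{-1/2} = (1-2t)^{-n/2}, \qquad t < \tfrac{1}{2},
\]
which is the moment-generating function of a \(\chi^2\) distribution with \(n\) degrees of freedom; by the uniqueness theorem, \(\sum_{i=1}^n Z_i^2 \sim \chi^2(n)\).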

(Exercise 6.43) (Exercise 6.44)

Homework: Read Chapter 6. Homework (due Dec 4th): 6.1, 6.8, 6.16, 6.19, 6.24, 6.31, 6.37, 6.40, 6.46,
