Summary. Ancillary Statistics and Complete Statistics: What is an ancillary statistic for θ? Can an ancillary statistic be a sufficient statistic?


Biostatistics 602 - Statistical Inference, Lecture 05
Hyun Min Kang
January 24th, 2013

Key Questions
1. What is an ancillary statistic for θ?
2. Can an ancillary statistic be a sufficient statistic?
3. What are the location parameter and the scale parameter?
4. In which cases are ancillary statistics helpful?

Last Lecture: Definition 6.2.16
A statistic S(X) is an ancillary statistic if its distribution does not depend on θ.

Examples, for X_1, ..., X_n iid N(µ, σ²) where σ² is known:
- s²_X = (1/(n−1)) Σ_{i=1}^n (X_i − X̄)² is an ancillary statistic
- X_1 − X_2 ~ N(0, 2σ²) is ancillary
- (X_1 + X_2)/2 − X_3 ~ N(0, 1.5σ²) is ancillary
- (n−1)s²_X / σ² ~ χ²_{n−1} is ancillary

Example: Uniform
Let X_1, ..., X_n iid Uniform(θ, θ+1). Show that R = X_(n) − X_(1) is an ancillary statistic.

Possible strategies:
- Method 1: Obtain the distribution of R and show that it does not depend on θ.
- Method 2: Represent R as a function of statistics whose joint distribution does not depend on θ.
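Before working through Method 2, ancillarity of R is easy to see numerically: writing X_i = θ + Y_i with Y_i ~ Uniform(0, 1), the range is the same function of the Y's for every θ. A minimal sketch (the sample size, replication count, and θ values below are arbitrary choices, not from the notes):

```python
import numpy as np

# Shared Uniform(0,1) draws: X_i = theta + Y_i, so theta only shifts the sample
rng = np.random.default_rng(0)
n, reps = 10, 5000
Y = rng.uniform(0.0, 1.0, size=(reps, n))

def range_stat(theta):
    X = theta + Y                       # X_i ~ Uniform(theta, theta + 1)
    return X.max(axis=1) - X.min(axis=1)

R0, R5 = range_stat(0.0), range_stat(5.0)
print(np.allclose(R0, R5))              # the shift theta cancels in the range
```

Using the same underlying uniforms for both θ values makes the cancellation exact, which is precisely the representation Method 2 exploits.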

Method 2: A Simpler Proof
f_X(x|θ) = I(θ < x < θ+1) = I(0 < x − θ < 1).
Let Y_i = X_i − θ ~ Uniform(0, 1). Then X_i = Y_i + θ and |dx/dy| = 1 holds, so
f_Y(y) = I(0 < y + θ − θ < 1) |dx/dy| = I(0 < y < 1).
The range statistic R can then be rewritten as
R = X_(n) − X_(1) = (Y_(n) + θ) − (Y_(1) + θ) = Y_(n) − Y_(1).
Since Y_(n) − Y_(1) is a function of Y_1, ..., Y_n alone, and the joint distribution of Y_1, ..., Y_n does not depend on θ, R is an ancillary statistic.

Location-Scale Family and Parameters
Definition 3.5.5: Let f(x) be any pdf. Then for any µ, −∞ < µ < ∞, and any σ > 0, the family of pdfs f((x − µ)/σ)/σ, indexed by the parameter (µ, σ), is called the location-scale family with standard pdf f(x); µ is called the location parameter and σ is called the scale parameter for the family.

Example: f(x) = (1/√(2π)) e^{−x²/2}, i.e., N(0, 1). Then
f((x − µ)/σ)/σ = (1/√(2πσ²)) e^{−(x − µ)²/(2σ²)}, i.e., N(µ, σ²).

Complete Statistics
Definition: Let T = {f_T(t|θ), θ ∈ Ω} be a family of pdfs or pmfs for a statistic T(X). The family of probability distributions is called complete if E[g(T)|θ] = 0 for all θ implies Pr[g(T) = 0 | θ] = 1 for all θ. In other words, g(T) = 0 almost surely. Equivalently, T(X) is called a complete statistic.

Example: T(X) ~ N(0, 1).
- g_1(T(X)) = 0 gives Pr[g_1(T(X)) = 0] = 1.
- g_2(T(X)) = I(T(X) = 0) gives Pr[g_2(T(X)) = 0] = 1 − Pr[T(X) = 0] = 1. In this case, g_2(T(X)) = 0 almost surely.

Notes on Completeness
Notice that completeness is a property of a family of probability distributions, not of a particular distribution. For example, X ~ N(0, 1) and g(x) = x makes E[g(X)] = E[X] = 0, but Pr(g(X) = 0) = 0 instead of 1. This example involves only a particular distribution, not a family of distributions. If X ~ N(θ, 1), −∞ < θ < ∞, then no function of X except g(x) = 0 satisfies E[g(X)|θ] = 0 for all θ. Therefore, the family of N(θ, 1) distributions, −∞ < θ < ∞, is complete.
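The location-scale construction in Definition 3.5.5 can be sanity-checked numerically: build f((x − µ)/σ)/σ from the standard normal pdf and compare it to the N(µ, σ²) density written out directly. The particular values of µ, σ, and x below are arbitrary:

```python
import math

def std_normal_pdf(x):
    # standard pdf f(x) of N(0, 1)
    return math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)

def location_scale(f, mu, sigma):
    # Definition 3.5.5: x -> f((x - mu) / sigma) / sigma
    return lambda x: f((x - mu) / sigma) / sigma

mu, sigma, x = 1.5, 2.0, 0.7
pdf_mu_sigma = location_scale(std_normal_pdf, mu, sigma)

# N(mu, sigma^2) density written out directly
direct = math.exp(-(x - mu) ** 2 / (2.0 * sigma ** 2)) / math.sqrt(2.0 * math.pi * sigma ** 2)
print(abs(pdf_mu_sigma(x) - direct) < 1e-12)  # the two forms agree
```

The same `location_scale` helper works for any standard pdf f, which is the point of the definition: one standard shape generates the whole (µ, σ) family.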

Why "Complete" Statistics?
Stigler (1972), Am Stat 26(2):28-29: "While a student encountering completeness for the first time is very likely to appreciate its usefulness, he is just as likely to be puzzled by its name, and wonder what connection (if any) there is between the statistical use of the term complete, and the dictionary definition: lacking none of the parts, whole, entire."

Requiring g(t) to satisfy E[g(T)|θ] = 0 for all θ puts a restriction on g. The larger the family of pdfs/pmfs, the greater the restriction on g. When the family is augmented to the point that E[g(T)|θ] = 0 for all θ rules out all g except the trivial g(t) = 0, the family is said to be complete.

Example: Poisson distribution
Suppose T = { f_T : f_T(t|λ) = λ^t e^{−λ}/t! for t ∈ {0, 1, 2, ...} }, with λ ∈ Ω = {1, 2}. Show that this family is NOT complete.

Proof strategy: We need to find a counterexample, that is, a function g such that E[g(T)|λ] = 0 for λ = 1, 2 but g(t) ≢ 0.

Proof: The function g must satisfy, for λ ∈ {1, 2},
E[g(T)|λ] = Σ_{t=0}^∞ g(t) λ^t e^{−λ}/t! = 0.
Thus,
E[g(T)|λ=1] = Σ_{t=0}^∞ g(t) (1^t/t!) e^{−1} = 0
E[g(T)|λ=2] = Σ_{t=0}^∞ g(t) (2^t/t!) e^{−2} = 0.
The above equations can be rewritten as
Σ_{t=0}^∞ g(t)/t! = 0 and Σ_{t=0}^∞ 2^t g(t)/t! = 0.

Proof (cont'd): Define g(t) as
g(t) = 2 if t = 0, −3 if t = 1, 2 if t = 2, 0 otherwise.
Then
Σ_{t=0}^∞ g(t)/t! = g(0)/0! + g(1)/1! + g(2)/2! = 2 − 3 + 2/2 = 0
Σ_{t=0}^∞ 2^t g(t)/t! = g(0)/0! + 2 g(1)/1! + 4 g(2)/2! = 2 − 6 + 4 = 0.
There exists a non-zero function g that satisfies E[g(T)|λ] = 0 for all λ ∈ Ω. Therefore this family is NOT complete.
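The Poisson counterexample can be verified directly. A short sketch (truncating the Poisson sum at t = 2 is harmless because g vanishes beyond that; the extra check at λ = 3 is an illustration of why a larger family rules this g out, not part of the proof):

```python
import math

def g(t):
    # nonzero counterexample: g(0) = 2, g(1) = -3, g(2) = 2, 0 otherwise
    return {0: 2, 1: -3, 2: 2}.get(t, 0)

def E_g(lam):
    # E[g(T) | lambda] for T ~ Poisson(lambda); only t = 0, 1, 2 contribute
    return sum(g(t) * lam ** t * math.exp(-lam) / math.factorial(t) for t in range(3))

print(abs(E_g(1.0)) < 1e-12, abs(E_g(2.0)) < 1e-12)  # zero on Omega = {1, 2}
print(E_g(3.0))   # nonzero: lambda = 3 would expose g, so enlarging Omega restricts g
```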

Another Example with the Poisson Distribution
Let X_1, ..., X_n iid Poisson(λ), λ > 0. Show that T(X) = Σ_{i=1}^n X_i is a complete statistic.

Proof strategy:
- Find the distribution of T(X).
- Show that there is no non-zero function g such that E[g(T)|λ] = 0 for all λ.

Proof: Finding the moment-generating function of X
M_X(s) = E[e^{sX}] = Σ_{x=0}^∞ e^{sx} e^{−λ} λ^x/x!
= e^{−λ} e^{e^s λ} Σ_{x=0}^∞ (e^s λ)^x e^{−e^s λ}/x!
= e^{−λ} e^{e^s λ} Σ_{x=0}^∞ f_Poisson(x | e^s λ)
= e^{λ(e^s − 1)}.

Proof: Finding the MGF of T(X) = Σ_{i=1}^n X_i
M_T(s) = E[e^{s Σ X_i}] = E[ Π_{i=1}^n e^{s X_i} ] = Π_{i=1}^n E[e^{s X_i}] = [E(e^{s X_i})]^n = [e^{λ(e^s − 1)}]^n = e^{nλ(e^s − 1)}.

Theorem 2.3.11(b): Let F_X(x) and F_Y(y) be two cdfs all of whose moments exist. If the moment generating functions exist and M_X(t) = M_Y(t) for all t in some neighborhood of 0, then F_X(u) = F_Y(u) for all u.

By Theorem 2.3.11, T(X) ~ Poisson(nλ).

Proof: Showing E[g(T)|λ] = 0 implies Pr[g(T) = 0] = 1
Suppose that there exists a g(t) such that E[g(T)|λ] = 0 for all λ > 0. Then
E[g(T)|λ] = Σ_{t=0}^∞ g(t) e^{−nλ} (nλ)^t/t! = e^{−nλ} Σ_{t=0}^∞ g(t) n^t λ^t/t! = 0,
which is equivalent to
Σ_{t=0}^∞ (g(t) n^t/t!) λ^t = 0 for all λ > 0.
Because the left-hand side is a power series in λ that vanishes identically, every coefficient must be zero: g(t) n^t/t! = 0 for all t, and hence g(t) = 0 for all t. Therefore T(X) = Σ_{i=1}^n X_i is a complete statistic.

Example: Uniform Distribution
Let X_1, ..., X_n iid Uniform(0, θ), θ > 0, Ω = (0, ∞). Show that X_(n) is complete.

Proof: We first obtain the distribution of T(X) = X_(n). With f_X(x) = (1/θ) I(0 < x < θ), the cdf is F_X(x) = (x/θ) I(0 < x < θ) + I(x ≥ θ). Hence
f_T(t|θ) = [n!/(n−1)!] f_X(t) [F_X(t)]^{n−1} = (n/θ)(t/θ)^{n−1} I(0 < t < θ) = n θ^{−n} t^{n−1} I(0 < t < θ).

Consider a function g(t) such that E[g(T)|θ] = 0 for all θ > 0:
E[g(T)|θ] = ∫_0^θ g(t) n θ^{−n} t^{n−1} dt = n θ^{−n} ∫_0^θ g(t) t^{n−1} dt = 0.
Taking the derivative of both sides with respect to θ,
0 = −n² θ^{−n−1} ∫_0^θ g(t) t^{n−1} dt + n θ^{−n} g(θ) θ^{n−1} = −(n/θ) E[g(T)|θ] + (n/θ) g(θ).
Since E[g(T)|θ] = 0, this gives (n/θ) g(θ) = 0, so g(θ) = 0 for all θ > 0. Therefore T(X) = X_(n) is a complete statistic.

A Simpler Proof (how it was solved in class)
Consider a function g(t) such that E[g(T)|θ] = 0 for all θ > 0. Since
E[g(T)|θ] = n θ^{−n} ∫_0^θ g(t) t^{n−1} dt = 0 and n θ^{−n} > 0,
we have ∫_0^θ g(t) t^{n−1} dt = 0 for all θ > 0. Taking the derivative of both sides gives g(θ) θ^{n−1} = 0, so g(θ) = 0 for all θ > 0. Therefore T(X) = X_(n) is a complete statistic.

Another Example with the Uniform Distribution
Let X_1, ..., X_n iid Uniform(θ, θ+1), θ ∈ R. We have previously shown that T(X) = (X_(1), X_(n)) is a minimal sufficient statistic for θ. Show that T(X) is not a complete statistic.

Proof, using the range statistic: Define R = X_(n) − X_(1). We have previously shown that
f_R(r|θ) = n(n−1) r^{n−2} (1 − r), 0 < r < 1.
Then R ~ Beta(n−1, 2), and E[R|θ] = (n−1)/(n+1).
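The stated mean of the range can be double-checked numerically by integrating r against the density f_R; below is a midpoint-rule sketch with an arbitrarily chosen n:

```python
def f_R(r, n):
    # density of the range of n iid Uniform(theta, theta + 1) draws
    return n * (n - 1) * r ** (n - 2) * (1.0 - r)

n, steps = 5, 200000
dr = 1.0 / steps
# midpoint-rule approximation of E[R] = integral_0^1 r f_R(r) dr
ER = sum((k + 0.5) * dr * f_R((k + 0.5) * dr, n) * dr for k in range(steps))
print(abs(ER - (n - 1) / (n + 1)) < 1e-6)  # matches the Beta(n-1, 2) mean
```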

Proof (cont'd)
Define g(T(X)) = X_(n) − X_(1) − (n−1)/(n+1). Then
E[g(T)|θ] = E[X_(n) − X_(1) | θ] − (n−1)/(n+1) = (n−1)/(n+1) − (n−1)/(n+1) = 0.
Therefore, there exists a g(T) ≢ 0 with E[g(T)|θ] = 0 but Pr[g(T) = 0 | θ] < 1 for all θ, so T(X) = (X_(1), X_(n)) is not a complete statistic.

Even Simpler Proof
We know that R = X_(n) − X_(1) is an ancillary statistic, so its distribution, and in particular E(R), does not depend on θ. Define g(T) = X_(n) − X_(1) − E(R). Then E[g(T)|θ] = E(R) − E(R) = 0 even though g ≢ 0, so T is not a complete statistic.

Example from Stigler (1972), Am Stat
Let X be a single uniform draw from {1, ..., θ}, where θ ∈ Ω = N. Is T(X) = X a complete statistic?

Solution: Consider a function g such that E[g(T)|θ] = 0 for all θ ∈ N. Note that f_X(x) = (1/θ) I(x ∈ {1, ..., θ}). Then
E[g(X)|θ] = (1/θ) Σ_{x=1}^θ g(x) = 0 for all θ ∈ N, which implies:
- if θ = 1: Σ_{x=1}^θ g(x) = g(1) = 0
- if θ = 2: Σ_{x=1}^θ g(x) = g(1) + g(2) = 0, so g(2) = 0
- ...
- if θ = k: Σ_{x=1}^θ g(x) = g(1) + ... + g(k−1) + g(k) = 0, so g(k) = 0.
Therefore g(x) = 0 for all x ∈ N, and T(X) = X is a complete statistic for θ ∈ Ω = N.
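The induction in the Stigler example (θ = 1 forces g(1) = 0, θ = 2 then forces g(2) = 0, and so on) can be mimicked mechanically: solve each constraint for the newest value of g. A small sketch with an arbitrary cutoff K:

```python
def forced_g(K):
    # E[g(X) | theta] = 0 for theta = 1, ..., K means Sum_{x=1}^{theta} g(x) = 0,
    # so each new constraint gives g(k) = -(g(1) + ... + g(k - 1))
    g = {}
    for k in range(1, K + 1):
        g[k] = -sum(g[j] for j in range(1, k))
    return g

print(all(v == 0 for v in forced_g(20).values()))  # only the trivial g survives
```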

Is the Previous Example "Barely" Complete?
Modified example: Let X be a single uniform draw from {1, ..., θ}, where θ ∈ Ω = N \ {n} for some fixed n. Is T(X) = X a complete statistic?

Solution: Define a nonzero g(x) as
g(x) = 1 if x = n, −1 if x = n + 1, 0 otherwise.
Then
E[g(T)|θ] = (1/θ) Σ_{x=1}^θ g(x) = 1/θ if θ = n, and 0 otherwise.
Because Ω does not include n, E[g(T)|θ] = 0 for all θ ∈ Ω = N \ {n} even though g ≢ 0, so T(X) = X is not a complete statistic.

Summary
Today: examples of complete statistics
- Two Poisson distribution examples
- Two Uniform distribution examples
- An example of a "barely" complete statistic
Next lecture: Basu's Theorem
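Returning to the modified "barely complete" example above, the claimed behavior of g can be checked by brute force over a range of θ; the particular excluded value n0 and the range of θ scanned are arbitrary:

```python
def g(x, n0):
    # nonzero g from the modified example: +1 at n0, -1 at n0 + 1, 0 otherwise
    return 1 if x == n0 else (-1 if x == n0 + 1 else 0)

def E_g(theta, n0):
    # E[g(X) | theta] for X uniform on {1, ..., theta}
    return sum(g(x, n0) for x in range(1, theta + 1)) / theta

n0 = 4
on_omega = [E_g(th, n0) for th in range(1, 21) if th != n0]
print(all(v == 0 for v in on_omega))  # expectation vanishes on Omega = N \ {n0}
print(E_g(n0, n0))                    # only theta = n0 would expose this g
```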


STAT 512 sp 2018 Summary Sheet

STAT 512 sp 2018 Summary Sheet STAT 5 sp 08 Summary Sheet Karl B. Gregory Spring 08. Transformations of a random variable Let X be a rv with support X and let g be a function mapping X to Y with inverse mapping g (A = {x X : g(x A}

More information

Multivariate Random Variable

Multivariate Random Variable Multivariate Random Variable Author: Author: Andrés Hincapié and Linyi Cao This Version: August 7, 2016 Multivariate Random Variable 3 Now we consider models with more than one r.v. These are called multivariate

More information

Exponential Distribution and Poisson Process

Exponential Distribution and Poisson Process Exponential Distribution and Poisson Process Stochastic Processes - Lecture Notes Fatih Cavdur to accompany Introduction to Probability Models by Sheldon M. Ross Fall 215 Outline Introduction Exponential

More information

Lecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable

Lecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable Lecture Notes 1 Probability and Random Variables Probability Spaces Conditional Probability and Independence Random Variables Functions of a Random Variable Generation of a Random Variable Jointly Distributed

More information

SUFFICIENT STATISTICS

SUFFICIENT STATISTICS SUFFICIENT STATISTICS. Introduction Let X (X,..., X n ) be a random sample from f θ, where θ Θ is unknown. We are interested using X to estimate θ. In the simple case where X i Bern(p), we found that the

More information

Lecture 4: UMVUE and unbiased estimators of 0

Lecture 4: UMVUE and unbiased estimators of 0 Lecture 4: UMVUE and unbiased estimators of Problems of approach 1. Approach 1 has the following two main shortcomings. Even if Theorem 7.3.9 or its extension is applicable, there is no guarantee that

More information

A Very Brief Summary of Statistical Inference, and Examples

A Very Brief Summary of Statistical Inference, and Examples A Very Brief Summary of Statistical Inference, and Examples Trinity Term 2009 Prof. Gesine Reinert Our standard situation is that we have data x = x 1, x 2,..., x n, which we view as realisations of random

More information

Limiting Distributions

Limiting Distributions We introduce the mode of convergence for a sequence of random variables, and discuss the convergence in probability and in distribution. The concept of convergence leads us to the two fundamental results

More information

BMIR Lecture Series on Probability and Statistics Fall 2015 Discrete RVs

BMIR Lecture Series on Probability and Statistics Fall 2015 Discrete RVs Lecture #7 BMIR Lecture Series on Probability and Statistics Fall 2015 Department of Biomedical Engineering and Environmental Sciences National Tsing Hua University 7.1 Function of Single Variable Theorem

More information

Continuous Random Variables

Continuous Random Variables 1 / 24 Continuous Random Variables Saravanan Vijayakumaran sarva@ee.iitb.ac.in Department of Electrical Engineering Indian Institute of Technology Bombay February 27, 2013 2 / 24 Continuous Random Variables

More information

MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems

MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems Review of Basic Probability The fundamentals, random variables, probability distributions Probability mass/density functions

More information

STAT 430/510: Lecture 16

STAT 430/510: Lecture 16 STAT 430/510: Lecture 16 James Piette June 24, 2010 Updates HW4 is up on my website. It is due next Mon. (June 28th). Starting today back at section 6.7 and will begin Ch. 7. Joint Distribution of Functions

More information

Statistics for scientists and engineers

Statistics for scientists and engineers Statistics for scientists and engineers February 0, 006 Contents Introduction. Motivation - why study statistics?................................... Examples..................................................3

More information

1 Random Variable: Topics

1 Random Variable: Topics Note: Handouts DO NOT replace the book. In most cases, they only provide a guideline on topics and an intuitive feel. 1 Random Variable: Topics Chap 2, 2.1-2.4 and Chap 3, 3.1-3.3 What is a random variable?

More information

Recitation 2: Probability

Recitation 2: Probability Recitation 2: Probability Colin White, Kenny Marino January 23, 2018 Outline Facts about sets Definitions and facts about probability Random Variables and Joint Distributions Characteristics of distributions

More information

5 Operations on Multiple Random Variables

5 Operations on Multiple Random Variables EE360 Random Signal analysis Chapter 5: Operations on Multiple Random Variables 5 Operations on Multiple Random Variables Expected value of a function of r.v. s Two r.v. s: ḡ = E[g(X, Y )] = g(x, y)f X,Y

More information

Probability Models. 4. What is the definition of the expectation of a discrete random variable?

Probability Models. 4. What is the definition of the expectation of a discrete random variable? 1 Probability Models The list of questions below is provided in order to help you to prepare for the test and exam. It reflects only the theoretical part of the course. You should expect the questions

More information

Statistics STAT:5100 (22S:193), Fall Sample Final Exam B

Statistics STAT:5100 (22S:193), Fall Sample Final Exam B Statistics STAT:5 (22S:93), Fall 25 Sample Final Exam B Please write your answers in the exam books provided.. Let X, Y, and Y 2 be independent random variables with X N(µ X, σ 2 X ) and Y i N(µ Y, σ 2

More information

Review of Probability Theory II

Review of Probability Theory II Review of Probability Theory II January 9-3, 008 Exectation If the samle sace Ω = {ω, ω,...} is countable and g is a real-valued function, then we define the exected value or the exectation of a function

More information

Random Variables. Random variables. A numerically valued map X of an outcome ω from a sample space Ω to the real line R

Random Variables. Random variables. A numerically valued map X of an outcome ω from a sample space Ω to the real line R In probabilistic models, a random variable is a variable whose possible values are numerical outcomes of a random phenomenon. As a function or a map, it maps from an element (or an outcome) of a sample

More information

, find P(X = 2 or 3) et) 5. )px (1 p) n x x = 0, 1, 2,..., n. 0 elsewhere = 40

, find P(X = 2 or 3) et) 5. )px (1 p) n x x = 0, 1, 2,..., n. 0 elsewhere = 40 Assignment 4 Fall 07. Exercise 3.. on Page 46: If the mgf of a rom variable X is ( 3 + 3 et) 5, find P(X or 3). Since the M(t) of X is ( 3 + 3 et) 5, X has a binomial distribution with n 5, p 3. The probability

More information

Stat 512 Homework key 2

Stat 512 Homework key 2 Stat 51 Homework key October 4, 015 REGULAR PROBLEMS 1 Suppose continuous random variable X belongs to the family of all distributions having a linear probability density function (pdf) over the interval

More information

Fundamentals of Statistics

Fundamentals of Statistics Chapter 2 Fundamentals of Statistics This chapter discusses some fundamental concepts of mathematical statistics. These concepts are essential for the material in later chapters. 2.1 Populations, Samples,

More information

Random Variables. Cumulative Distribution Function (CDF) Amappingthattransformstheeventstotherealline.

Random Variables. Cumulative Distribution Function (CDF) Amappingthattransformstheeventstotherealline. Random Variables Amappingthattransformstheeventstotherealline. Example 1. Toss a fair coin. Define a random variable X where X is 1 if head appears and X is if tail appears. P (X =)=1/2 P (X =1)=1/2 Example

More information

Chapter 8.8.1: A factorization theorem

Chapter 8.8.1: A factorization theorem LECTURE 14 Chapter 8.8.1: A factorization theorem The characterization of a sufficient statistic in terms of the conditional distribution of the data given the statistic can be difficult to work with.

More information

Brief Review of Probability

Brief Review of Probability Maura Department of Economics and Finance Università Tor Vergata Outline 1 Distribution Functions Quantiles and Modes of a Distribution 2 Example 3 Example 4 Distributions Outline Distribution Functions

More information

Expectation, variance and moments

Expectation, variance and moments Expectation, variance and moments John Appleby Contents Expectation and variance Examples 3 Moments and the moment generating function 4 4 Examples of moment generating functions 5 5 Concluding remarks

More information

ACM 116: Lectures 3 4

ACM 116: Lectures 3 4 1 ACM 116: Lectures 3 4 Joint distributions The multivariate normal distribution Conditional distributions Independent random variables Conditional distributions and Monte Carlo: Rejection sampling Variance

More information

p. 6-1 Continuous Random Variables p. 6-2

p. 6-1 Continuous Random Variables p. 6-2 Continuous Random Variables Recall: For discrete random variables, only a finite or countably infinite number of possible values with positive probability (>). Often, there is interest in random variables

More information

MATH2715: Statistical Methods

MATH2715: Statistical Methods MATH2715: Statistical Methods Exercises V (based on lectures 9-10, work week 6, hand in lecture Mon 7 Nov) ALL questions count towards the continuous assessment for this module. Q1. If X gamma(α,λ), write

More information

conditional cdf, conditional pdf, total probability theorem?

conditional cdf, conditional pdf, total probability theorem? 6 Multiple Random Variables 6.0 INTRODUCTION scalar vs. random variable cdf, pdf transformation of a random variable conditional cdf, conditional pdf, total probability theorem expectation of a random

More information