Pivotal Quantities. Mathematics 47: Lecture 16. Dan Sloughter. Furman University. March 30, 2006


Pivotal quantities

Definition. Suppose X_1, X_2, ..., X_n is a random sample from a distribution with parameter θ. If Y = g(X_1, X_2, ..., X_n, θ) is a random variable whose distribution does not depend on θ, then we call Y a pivotal quantity for θ.

Example. In finding confidence intervals for µ given a random sample X_1, X_2, ..., X_n from N(µ, σ²), we used the fact that

(X̄ − µ) / (S/√n)

is a pivotal quantity for µ.
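Since the lecture later mentions computing quantiles with R, here is a small R sketch (not part of the lecture; the sample size and parameter values are arbitrary) illustrating the definition: the simulated quantiles of (X̄ − µ)/(S/√n) are the same for two very different choices of (µ, σ), and both agree with the t(n − 1) quantiles.

# Simulate the pivotal quantity (xbar - mu) / (s / sqrt(n)) for two
# different parameter choices and compare quantiles with t(n - 1).
set.seed(1)
n <- 10
pivot <- function(mu, sigma, reps = 10000) {
  replicate(reps, {
    x <- rnorm(n, mean = mu, sd = sigma)
    (mean(x) - mu) / (sd(x) / sqrt(n))
  })
}
probs <- c(0.025, 0.5, 0.975)
quantile(pivot(0, 1), probs)     # roughly -2.26, 0, 2.26
quantile(pivot(50, 12), probs)   # roughly the same
qt(probs, df = n - 1)            # t(9) quantiles: -2.262, 0, 2.262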

Example

Suppose X_1, X_2, ..., X_n is a random sample from N(µ, σ²). Then (n − 1)S²/σ² is χ²(n − 1), and so is a pivotal quantity for σ². Let χ²_{n,α} be the α quantile of χ²(n). Then

P( χ²_{n−1, α/2} ≤ (n − 1)S²/σ² ≤ χ²_{n−1, 1−α/2} ) = 1 − α.

Example (cont'd)

It follows that

P( (n − 1)S²/χ²_{n−1, 1−α/2} ≤ σ² ≤ (n − 1)S²/χ²_{n−1, α/2} ) = 1 − α.

Hence

( (n − 1)S²/χ²_{n−1, 1−α/2}, (n − 1)S²/χ²_{n−1, α/2} )

is a 100(1 − α)% confidence interval for σ².
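As a sketch (not from the lecture), the interval can be computed directly in R; the hypothetical helper var_ci below simply evaluates the two endpoints above, using qchisq for the chi-square quantiles.

# 100(1 - alpha)% confidence interval for sigma^2 based on the
# chi-square pivotal quantity (n - 1) S^2 / sigma^2.
var_ci <- function(x, alpha = 0.05) {
  n  <- length(x)
  s2 <- var(x)
  lower <- (n - 1) * s2 / qchisq(1 - alpha / 2, df = n - 1)
  upper <- (n - 1) * s2 / qchisq(alpha / 2, df = n - 1)
  c(lower = lower, upper = upper)
}
var_ci(rnorm(19, mean = 300, sd = 27))   # e.g., applied to simulated data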

Example

The estimated ages of 19 mineral samples from the Black Forest of Germany (in millions of years) give a sample variance of s² = 733.4. We find (either with R, or with Table Va) that χ²_{18,.025} = 8.23 and χ²_{18,.975} ≈ 31.5.

Example (cont'd)

So, assuming the sample is from N(µ, σ²), we see that

( (18)(733.4)/31.5, (18)(733.4)/8.23 ) = (419.1, 1604.0)

is a 95% confidence interval for σ².

Note: taking square roots, it follows that (20.47, 40.05) is a 95% confidence interval for σ.
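A quick R check of this arithmetic (using qchisq in place of Table Va; the results agree with the interval above up to rounding of the quantiles):

n  <- 19
s2 <- 733.4
q  <- qchisq(c(0.975, 0.025), df = n - 1)   # about 31.53 and 8.23
(n - 1) * s2 / q                            # about (418.7, 1604), interval for sigma^2
sqrt((n - 1) * s2 / q)                      # about (20.5, 40.0), interval for sigma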

Example

Let X_1, X_2, ..., X_n be a random sample from a uniform distribution on (0, θ). Recall: the density of T = X_(n) is

f_T(t) = (n/θ^n) t^(n−1) if 0 < t < θ, and f_T(t) = 0 otherwise.

If we let Y = T/θ, then Y has density

f_Y(y) = θ f_T(θy) = n y^(n−1) if 0 < y < 1, and f_Y(y) = 0 otherwise.

Hence Y is a pivotal quantity for θ.
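A minimal R simulation (not part of the lecture; n and the two values of θ are arbitrary) supports this: the simulated quantiles of Y = X_(n)/θ are essentially the same whether θ = 1 or θ = 10.

# Simulate Y = X_(n) / theta for two values of theta; the quantiles
# agree, so the distribution of Y does not depend on theta.
set.seed(2)
n <- 5
max_ratio <- function(theta, reps = 10000) {
  replicate(reps, max(runif(n, min = 0, max = theta)) / theta)
}
probs <- c(0.25, 0.5, 0.75, 0.95)
quantile(max_ratio(1), probs)
quantile(max_ratio(10), probs)   # essentially the same quantiles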

Example (cont'd)

Now the distribution function for Y is

F_Y(y) = 0 if y ≤ 0, F_Y(y) = y^n if 0 < y ≤ 1, and F_Y(y) = 1 if y > 1.

So, in particular,

P( Y ≤ (0.95)^(1/n) ) = F_Y( (0.95)^(1/n) ) = 0.95.

Example (cont'd)

Thus

P( X_(n)/(0.95)^(1/n) ≤ θ ) = 0.95.

And so

( X_(n)/(0.95)^(1/n), ∞ )

is a 95% confidence interval for θ.
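As a sanity check (a sketch with an arbitrary value of θ, not from the lecture), simulation shows that this interval covers θ close to 95% of the time:

# Estimate the coverage probability of the lower confidence bound
# X_(n) / (0.95)^(1/n) for an arbitrary theta.
set.seed(3)
n <- 20
theta <- 7                      # arbitrary true value, used only for the simulation
lower <- replicate(10000, max(runif(n, 0, theta)) / 0.95^(1 / n))
mean(lower <= theta)            # close to 0.95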

One-sided confidence intervals

Note: the interval in the previous example is an example of a one-sided confidence interval, as opposed to the two-sided confidence intervals of our previous examples. In particular, we call

X_(n)/(0.95)^(1/n)

a lower confidence bound for θ.

Example

Suppose in a sample of size 20 from a uniform distribution on the interval (0, θ) we observe a maximum of x_(20). Then (0.95)^(1/20) ≈ 0.9974, so

( x_(20)/0.9974, ∞ )

is a 95% confidence interval for θ.
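For a concrete computation in R (the observed maximum used below is a hypothetical value, not the one from the lecture):

n <- 20
x_max <- 25.3                 # hypothetical observed maximum x_(20)
0.95^(1 / n)                  # about 0.9974
x_max / 0.95^(1 / n)          # lower confidence bound for theta, about 25.4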
