Stat 421-SP2012 Interval Estimation Section


Stat 421-SP2012 Interval Estimation, Sections 11.1-11.2

We now understand (Chapter 10) how to find point estimators of an unknown parameter θ.

o However, a point estimate does not provide any information about the uncertainty (possible size of error) associated with the estimate, e.g., Var(θ̂) for an unbiased estimator, or the M.S.E. E[(θ̂ − θ)²], or the amount of information the estimate is based on. Remember that the amount of information increases with sample size.

One way to assess the uncertainty of an estimator is to define its estimate via an interval (θ̂_1, θ̂_2), where θ̂_1 < θ̂_2 are the lower and upper limits of a suitably defined interval based on the point estimator.

o The random interval (θ̂_1, θ̂_2), obtained via the sampling distribution of the point estimator, covers the unknown parameter θ with a specified coverage probability (1 − α); i.e., P(θ̂_1 < θ < θ̂_2) = 1 − α.

What do the underlined phrases mean? How does one interpret the above probability statement?

The interval (θ̂_1, θ̂_2) varies from sample to sample. For a given random sample, the statement θ ∈ (θ̂_1, θ̂_2), i.e., "the observed interval (θ̂_1(x_1, …, x_n), θ̂_2(x_1, …, x_n)) contains (covers) the true value of θ," may be True or False. [Binary outcome] Since the value of θ is unknown, we have no way of knowing whether or not this statement is true for a given sample. But among all future realizations of this interval, approximately 100(1 − α)% will contain the true value of θ. Thus, if (1 − α) is fairly large, the statement will be true for most samples. In this sense, one has a high degree of confidence that the true value of θ is in the observed interval. Of course, there is no guarantee that a given interval covers the true value.

o Definition: For a specified value of (1 − α), the interval (θ̂_1(x_1, …, x_n), θ̂_2(x_1, …, x_n)) is called a 100(1 − α)% confidence interval estimate for θ.
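The repeated-sampling interpretation above is easy to check by simulation. The sketch below is not part of the original notes; it jumps ahead to the known-variance normal interval constructed later in this handout and simply counts how often the random interval covers the true mean. All parameter values (mu, sigma, n, alpha) are illustrative assumptions.

```python
# Minimal coverage simulation (illustrative sketch, not from the notes):
# draw many samples from N(mu, sigma^2), form a 95% z-interval from each,
# and record the fraction of intervals that cover the true mu.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
mu, sigma, n, alpha, reps = 10.0, 2.0, 25, 0.05, 10_000
z = stats.norm.ppf(1 - alpha / 2)           # z_{alpha/2}

covered = 0
for _ in range(reps):
    x = rng.normal(mu, sigma, size=n)
    xbar = x.mean()
    half_width = z * sigma / np.sqrt(n)     # known-variance interval
    if xbar - half_width < mu < xbar + half_width:
        covered += 1

print(f"empirical coverage: {covered / reps:.3f}  (nominal: {1 - alpha})")
```

The empirical coverage should land close to the nominal 0.95, which is exactly the "among all future realizations" statement made above.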

o Definition: The value (1 − α) is called the degree of confidence.

o Definition: The endpoints (θ̂_1, θ̂_2) are called the lower and upper confidence limits.

Since there are many point estimators for a given parameter, any one of them can be used to obtain an interval estimator with a given degree of confidence.

o Which one should be used?
o Choice based on desirable statistical properties.
o Comment: For a real-valued parameter θ, the interval (−∞, ∞) contains the true value with degree of confidence 1.0. But is this information meaningful? Why not? Would you want to use an interval based on only one observation? Why not?

Desirable statistical properties of an interval estimator:

o It should use all the information in the data. How do we achieve this goal? Use sufficient statistics!
o For a given degree of confidence (1 − α), the shortest-length interval may be useful. The interval length θ̂_2 − θ̂_1 may be random! Shortest in what sense? Shortest expected length.
o Given two unbiased estimators, which one should be used to obtain an interval estimator? Any suggestions?

Key idea: Use the sampling distribution of the point estimator (or its large-sample approximation) to define the interval estimator with the specified confidence coefficient:

Example: Interval estimator of the mean of a Normal population with known variance, X ~ N(μ, σ²).

What point estimator would you choose to develop the interval estimator?

o For this problem, X̄ is the MVUE, the MLE, and the MOM estimator — a good choice.
o From Theorem 8.4, the exact sampling distribution of X̄ is X̄ ~ N(μ, σ²/n).
o Therefore, Z = (X̄ − μ)/(σ/√n) ~ N(0, 1). [Standardize X̄ → Z]
o For a fixed α, let z_{α/2} be the value such that P[Z < z_{α/2}] = 1 − α/2. Therefore, the area under the standard normal density in the interval (z_{α/2}, ∞) is α/2.
  How does one obtain this value for a given α? Table III, page 574, provides the area under the standard normal pdf from 0 to z, for z = 0(.01)3.9, 4, 5, 6.
  How does the value of z_{α/2} change as α increases?
o Note that the following inequalities are all equivalent:
  −z_{α/2} < Z < z_{α/2}  ⇔  −z_{α/2} < (X̄ − μ)/(σ/√n) < z_{α/2}  ⇔  |X̄ − μ| < z_{α/2} σ/√n.
o Therefore, P[−z_{α/2} < Z < z_{α/2}] = P[|X̄ − μ| < z_{α/2} σ/√n] = 1 − α. This statement provides a probabilistic assessment of the size of the maximum error (X̄ − μ) with a specified probability, leading to the following result.
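In place of Table III, z_{α/2} can be read from a software quantile function. The snippet below is an illustrative sketch, not part of the original notes.

```python
# Illustrative sketch: obtain z_{alpha/2} from the standard normal quantile
# function instead of Table III, for a few common confidence levels.
from scipy.stats import norm

for alpha in (0.10, 0.05, 0.01):
    z = norm.ppf(1 - alpha / 2)     # P[Z < z_{alpha/2}] = 1 - alpha/2
    print(f"alpha = {alpha:4.2f}  ->  z_{{alpha/2}} = {z:.4f}")
# alpha = 0.10 -> 1.6449, alpha = 0.05 -> 1.9600, alpha = 0.01 -> 2.5758
```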

o Theorem 11.1. Given a random sample of size n from a Normal population X ~ N(μ, σ²) with known variance, if one uses the mean X̄ as an estimator of the population mean μ, then with probability (1 − α) the absolute error |X̄ − μ| will be less than z_{α/2} σ/√n.

Examples (values of n and σ); a numerical sketch follows below:
o (i) n = 4, σ = 16
o (ii) n = 64, σ = 16
o (iii) n = 1600, σ = 16
o (iv) n = 64, σ = 1

What patterns do you see?
o Effect of increasing n
o Effect of decreasing σ
o Effect of increasing variance
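The following sketch (not from the notes) evaluates the error bound of Theorem 11.1 for the four example configurations. It assumes the listed pairs are (n, σ) and takes α = 0.05, which the notes leave unspecified.

```python
# Sketch of Theorem 11.1's bound z_{alpha/2} * sigma / sqrt(n) for the four
# example configurations; alpha = 0.05 is an assumption made here.
import math
from scipy.stats import norm

alpha = 0.05
z = norm.ppf(1 - alpha / 2)
for n, sigma in [(4, 16), (64, 16), (1600, 16), (64, 1)]:
    bound = z * sigma / math.sqrt(n)
    print(f"n = {n:5d}, sigma = {sigma:2d}:  max error < {bound:.3f}")
```

The printed bounds make the patterns explicit: the bound shrinks like 1/√n as n grows and scales linearly with σ.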

To obtain an interval estimator, convert the above probability statement into an interval estimate of μ:

  P[|X̄ − μ| < z_{α/2} σ/√n] = P[X̄ − z_{α/2} σ/√n < μ < X̄ + z_{α/2} σ/√n] = 1 − α.

The interval (X̄ − z_{α/2} σ/√n, X̄ + z_{α/2} σ/√n) has degree of confidence (1 − α); i.e., it is a 100(1 − α)% Confidence Interval (CI).

Examples:
o (i) n = 4, σ = 16
o (ii) n = 64, σ = 16
o (iii) n = 1600, σ = 16
o (iv) n = 64, σ = 1

Extensions:

o What to do if the underlying population is non-normal? Need more data! The sample size n must be large (n > 30). [Hint: Use the Central Limit Theorem.]
  Fact: For large n, (X̄ − μ)/(σ/√n) is approximately N(0, 1).
  Nothing new, except now the sampling distribution of X̄ is only approximately normal.
o What if the variance σ² is unknown? Replace σ by the sample standard deviation s.
  Fact: For large n, (X̄ − μ)/(s/√n) is approximately N(0, 1). [Even for non-normal populations]
  For large n, the interval (X̄ − z_{α/2} s/√n, X̄ + z_{α/2} s/√n) is an approximate 100(1 − α)% CI for the population mean.
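A minimal sketch (not from the notes) of the large-sample extension just described: σ is replaced by the sample standard deviation s while the normal quantile is kept. The data and parameter values are illustrative assumptions.

```python
# Approximate large-sample CI for the mean with unknown variance:
# replace sigma by s and still use z_{alpha/2}.
import numpy as np
from scipy.stats import norm

def large_sample_ci(x, alpha=0.05):
    """Approximate 100(1-alpha)% CI for the mean, valid for large n."""
    x = np.asarray(x, dtype=float)
    n, xbar, s = x.size, x.mean(), x.std(ddof=1)
    z = norm.ppf(1 - alpha / 2)
    half_width = z * s / np.sqrt(n)
    return xbar - half_width, xbar + half_width

# Example with simulated non-normal data, n = 200 (large enough for the CLT):
rng = np.random.default_rng(1)
print(large_sample_ci(rng.exponential(scale=3.0, size=200)))
```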

Situation not covered yet: small n (n ≤ 30) with unknown variance. Assume a Normal distribution for the underlying population.

Note that (X̄ − μ)/(s/√n) = [(X̄ − μ)/(σ/√n)] / (s/σ). This is a ratio of two random variables:

o Z in the numerator, and s/σ in the denominator.

Chapter 8, Section 8.5: The t-distribution.

Theorem 8.12. If Y and Z are independent random variables, with Y ~ χ²_ν and Z ~ N(0, 1), then the distribution of T = Z / √(Y/ν) is given by

  f(u) = [Γ((ν+1)/2) / (√(νπ) Γ(ν/2))] (1 + u²/ν)^(−(ν+1)/2),  −∞ < u < ∞.

It is called the t-distribution with ν d.f.

Proof (pages 78-79): uses the change-of-variables technique from Section 7.3. Start with the joint distribution of (y, z), transform to the joint distribution of (y, t) by finding the Jacobian of the transformation, and then integrate out y.

Note: Those interested in reviewing the change-of-variables topic may want to complete the proof.

A more interesting historical fact: a special case of this theorem, discussed in Theorem 8.13 below, was originally derived by W. S. Gosset, an employee of the Guinness Brewing Co., which did not allow employees to publish research. He published under the pseudonym "Student."

o It is called Student's t distribution. (S was his middle initial.)
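The sketch below (not part of the notes) evaluates the density written above and compares it with a library implementation, and also shows the t-density approaching the standard normal as the degrees of freedom grow. The evaluation point u = 1.5 is arbitrary.

```python
# Check the t-density formula of Theorem 8.12 against scipy's t.pdf, and
# watch it approach the standard normal density as nu grows.
import math
from scipy.stats import t, norm

def t_pdf(u, nu):
    """Density of the t-distribution with nu d.f., as written above."""
    c = math.gamma((nu + 1) / 2) / (math.sqrt(nu * math.pi) * math.gamma(nu / 2))
    return c * (1 + u * u / nu) ** (-(nu + 1) / 2)

u = 1.5
for nu in (1, 5, 30, 200):
    print(f"nu = {nu:3d}: formula = {t_pdf(u, nu):.5f},  scipy = {t.pdf(u, nu):.5f}")
print(f"standard normal at u = {u}: {norm.pdf(u):.5f}")
```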

Theorem 8.13. If X̄ and S² are the sample mean and variance of a random sample of size n from a normal population N(μ, σ²), then the sampling distribution of T = (X̄ − μ)/(S/√n) is the t-distribution with (n − 1) d.f.

Proof: We know that Z = (X̄ − μ)/(σ/√n) ~ N(0, 1) (from Theorem 8.4).

o Theorem 8.11 proves that Y = (n − 1)S²/σ² ~ χ²_{n−1}, as well as the fact that Y and Z are independently distributed.
o Now use Theorem 8.12.

Fact: The t-density is symmetric around 0 and goes to zero as u → ±∞, but much more slowly than the Normal density.

o For ν = 1, it is the Cauchy density, and as ν → ∞, it approaches the normal density.
o For ν ≥ 30, just use Table III for the normal density.
o The t-distribution is used a lot in practice, so tables of the area under the curve were made available for small ν. [Table IV, page 575] provides t_{α,ν}.

The interval (X̄ − t_{α/2} s/√n, X̄ + t_{α/2} s/√n) is an exact 100(1 − α)% CI for μ.

o The difference from the large-n case: in the small-sample case, instead of z-tables, we use t-tables with the given (n − 1) d.f.

Examples: Exercise 11.31
  Exponential: f(x) = (1/θ) e^(−x/θ), x > 0.
  Uniform: f(x) = 1/θ, 0 < x < θ.

The general method of constructing CIs is called the pivot method.

o Find a function of the data and the parameter whose distribution does not depend on the population parameter.
o For example, Z or T for means from normal populations, or for large n.
o What about the Exponential distribution? The Uniform distribution?
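A minimal sketch (not from the notes) of the exact small-sample interval above, computed with a library t quantile in place of Table IV. The sample data and parameter values are illustrative assumptions.

```python
# Exact small-sample t-interval for mu, assuming a normal population
# (Theorem 8.13 and the CI stated above).
import numpy as np
from scipy.stats import t

def t_ci(x, alpha=0.05):
    """Exact 100(1-alpha)% CI for mu, assuming a normal population."""
    x = np.asarray(x, dtype=float)
    n, xbar, s = x.size, x.mean(), x.std(ddof=1)
    t_crit = t.ppf(1 - alpha / 2, df=n - 1)   # t_{alpha/2, n-1}
    half_width = t_crit * s / np.sqrt(n)
    return xbar - half_width, xbar + half_width

# Small sample (n = 12) drawn from a normal population:
rng = np.random.default_rng(2)
print(t_ci(rng.normal(loc=5.0, scale=2.0, size=12)))
```

For a given sample, this interval is wider than the corresponding z-interval because t_{α/2, n−1} > z_{α/2}, reflecting the extra uncertainty from estimating σ with s.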