Parameter, Statistic and Random Samples

Parameter, Statistic and Random Samples

A parameter is a number that describes the population. It is a fixed number, but in practice we do not know its value. A statistic is a function of the sample data, i.e., it is a quantity whose value can be calculated from the sample data. It is a random variable with a distribution function. Statistics are used to make inference about unknown population parameters.

The random variables $X_1, X_2, \ldots, X_n$ are said to form a (simple) random sample of size $n$ if the $X_i$'s are independent random variables and each $X_i$ has the same probability distribution. We say that the $X_i$'s are iid.

Example: Sample Mean and Variance

Suppose $X_1, X_2, \ldots, X_n$ is a random sample of size $n$ from a population with mean $\mu$ and variance $\sigma^2$. The sample mean is defined as
$$\bar{X} = \frac{1}{n} \sum_{i=1}^{n} X_i.$$
The sample variance is defined as
$$S^2 = \frac{1}{n-1} \sum_{i=1}^{n} \left(X_i - \bar{X}\right)^2.$$
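The two definitions translate directly into code. A minimal sketch in Python (not part of the original slides); the population, sample size, and seed below are made up for illustration:

```python
import numpy as np

# A made-up sample: n = 50 draws from a N(10, 2^2) population.
rng = np.random.default_rng(0)
x = rng.normal(loc=10.0, scale=2.0, size=50)

n = len(x)
xbar = x.sum() / n                      # sample mean, X-bar
s2 = ((x - xbar) ** 2).sum() / (n - 1)  # sample variance S^2 (divisor n - 1)

# Cross-check against numpy's built-ins.
assert np.isclose(xbar, np.mean(x)) and np.isclose(s2, np.var(x, ddof=1))
print(xbar, s2)
```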

Goals of Statistics

Estimate unknown parameters $\mu$ and $\sigma^2$. Measure the errors of these estimates. Test whether the sample gives evidence that the parameters are (or are not) equal to a certain value.

Sampling Distribution of a Statistic

The sampling distribution of a statistic is the distribution of values taken by the statistic in all possible samples of the same size from the same population. The distribution function of a statistic is NOT the same as the distribution of the original population that generated the original sample. The form of the theoretical sampling distribution of a statistic will depend upon the distribution of the observable random variables in the sample.
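As a concrete illustration (not from the slides), the sampling distribution of the sample mean can be approximated by simulation; the population, sample size, and number of replicates below are arbitrary choices:

```python
import numpy as np

# Approximate the sampling distribution of the sample mean by drawing many
# samples of the same size from the same population.
rng = np.random.default_rng(1)
n, n_samples = 25, 10_000

# Population: Exponential with mean 1 (so the population variance is also 1).
samples = rng.exponential(scale=1.0, size=(n_samples, n))
means = samples.mean(axis=1)            # one sample mean per simulated sample

print(means.mean())                     # close to the population mean, 1
print(means.var(ddof=1))                # close to sigma^2 / n = 1 / 25 = 0.04
```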

Sampling from a Normal Population

Often we assume the random sample $X_1, X_2, \ldots, X_n$ is from a normal population with unknown mean $\mu$ and variance $\sigma^2$. Suppose we are interested in estimating $\mu$ and testing whether it is equal to a certain value. For this we need to know the probability distribution of the estimator of $\mu$.

Claim

Suppose $X_1, X_2, \ldots, X_n$ are i.i.d. normal random variables with unknown mean $\mu$ and variance $\sigma^2$. Then
$$\bar{X} \sim N\!\left(\mu, \frac{\sigma^2}{n}\right).$$
Proof:
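The proof is left blank on the slide. A standard moment generating function argument (a sketch, not necessarily the one given in lecture) uses $M_{X_i}(t) = \exp(\mu t + \sigma^2 t^2/2)$ and independence:
$$M_{\bar{X}}(t) = E\!\left[e^{t\bar{X}}\right] = \prod_{i=1}^{n} M_{X_i}\!\left(\frac{t}{n}\right) = \left[\exp\!\left(\frac{\mu t}{n} + \frac{\sigma^2 t^2}{2n^2}\right)\right]^{n} = \exp\!\left(\mu t + \frac{\sigma^2/n}{2}\, t^2\right),$$
which is the MGF of $N(\mu, \sigma^2/n)$, so $\bar{X} \sim N(\mu, \sigma^2/n)$ by uniqueness of MGFs.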

Recall: The Chi-Square Distribution

If $Z \sim N(0,1)$ then $X = Z^2$ has a chi-square distribution with parameter 1, i.e., $X \sim \chi^2(1)$. One can prove this using the change-of-variable theorem for univariate random variables. The moment generating function of $X$ is
$$m_X(t) = (1 - 2t)^{-1/2}, \quad t < \tfrac{1}{2}.$$
If $X_1 \sim \chi^2(\nu_1),\ X_2 \sim \chi^2(\nu_2),\ \ldots,\ X_k \sim \chi^2(\nu_k)$, all independent, then
$$T = \sum_{i=1}^{k} X_i \sim \chi^2\!\left(\sum_{i=1}^{k} \nu_i\right).$$
Proof:
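This proof is also left blank. A hedged MGF sketch: since the $X_i$ are independent with $M_{X_i}(t) = (1-2t)^{-\nu_i/2}$,
$$M_T(t) = \prod_{i=1}^{k} M_{X_i}(t) = (1-2t)^{-\frac{1}{2}\sum_{i=1}^{k}\nu_i}, \quad t < \tfrac{1}{2},$$
which is the MGF of $\chi^2\!\left(\sum_{i=1}^{k} \nu_i\right)$, and uniqueness of MGFs gives the claim.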

Claim

Suppose $X_1, X_2, \ldots, X_n$ are i.i.d. normal random variables with mean $\mu$ and variance $\sigma^2$. Then the $Z_i = \dfrac{X_i - \mu}{\sigma}$ are independent standard normal variables, for $i = 1, 2, \ldots, n$, and
$$\sum_{i=1}^{n} Z_i^2 = \sum_{i=1}^{n} \left(\frac{X_i - \mu}{\sigma}\right)^2 \sim \chi^2(n).$$
Proof:
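A quick simulation check of the second part of the claim (a hypothetical illustration, not from the slides; the values of n, mu, and sigma are made up):

```python
import numpy as np
from scipy import stats

# The sum of n squared standardized normals should behave like chi^2(n).
rng = np.random.default_rng(2)
n, mu, sigma, reps = 5, 10.0, 2.0, 100_000

x = rng.normal(mu, sigma, size=(reps, n))
t = (((x - mu) / sigma) ** 2).sum(axis=1)

print(t.mean(), t.var(ddof=1))                    # chi^2(n): mean n, variance 2n
print(stats.kstest(t, "chi2", args=(n,)).pvalue)  # large p-value expected
```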

t Distribution

Suppose $Z \sim N(0,1)$ independent of $X \sim \chi^2(\nu)$. Then
$$T = \frac{Z}{\sqrt{X/\nu}} \sim t(\nu).$$
Proof:

Claim

Suppose $X_1, X_2, \ldots, X_n$ are i.i.d. normal random variables with mean $\mu$ and variance $\sigma^2$. Then
$$\frac{\bar{X} - \mu}{S/\sqrt{n}} \sim t(n-1).$$
Proof:
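One way to see this (a sketch, using the standard facts, not stated on these slides, that $(n-1)S^2/\sigma^2 \sim \chi^2(n-1)$ and that $\bar{X}$ and $S^2$ are independent for a normal sample):
$$\frac{\bar{X} - \mu}{S/\sqrt{n}} = \frac{(\bar{X} - \mu)\big/(\sigma/\sqrt{n})}{\sqrt{\dfrac{(n-1)S^2/\sigma^2}{n-1}}} = \frac{Z}{\sqrt{X/\nu}}, \quad Z \sim N(0,1),\ X \sim \chi^2(n-1),\ \nu = n-1,$$
so the definition of the t distribution on the previous slide gives the $t(n-1)$ law.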

F Distribution

Suppose $X \sim \chi^2(n)$ independent of $Y \sim \chi^2(m)$. Then
$$\frac{X/n}{Y/m} \sim F(n, m).$$

Properties of the F Distribution

The F-distribution is a right-skewed distribution. Since $1/F(n,m) \sim F(m,n)$, we have
$$P\big(F(n,m) < a\big) = P\!\left(\frac{1}{F(n,m)} > \frac{1}{a}\right) = P\!\left(F(m,n) > \frac{1}{a}\right).$$
Can use Table 7 on page 796 to find percentiles of the F-distribution.
Example
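The worked example is not transcribed; as a stand-in, here is a hedged numerical check of the reciprocal identity using scipy in place of Table 7 (the degrees of freedom and the value a are made up):

```python
from scipy import stats

# Numerical check of P(F(n, m) < a) = P(F(m, n) > 1/a), with made-up n, m, a.
n, m, a = 5, 10, 2.5

lhs = stats.f.cdf(a, dfn=n, dfd=m)       # P(F(n, m) < a)
rhs = stats.f.sf(1.0 / a, dfn=m, dfd=n)  # P(F(m, n) > 1/a)
print(lhs, rhs)                          # the two values agree
```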

The Central Limit Theorem

Let $X_1, X_2, \ldots$ be a sequence of i.i.d. random variables with $E(X_i) = \mu < \infty$ and $Var(X_i) = \sigma^2 < \infty$. Let $S_n = \sum_{i=1}^{n} X_i$. Then
$$\lim_{n \to \infty} P\!\left(\frac{S_n - n\mu}{\sigma\sqrt{n}} \le z\right) = P(Z \le z) = \Phi(z) \quad \text{for } -\infty < z < \infty,$$
where $Z$ is a standard normal random variable and $\Phi(z)$ is the cdf of the standard normal distribution. This is equivalent to saying that $Z_n = \dfrac{S_n - n\mu}{\sigma\sqrt{n}}$ converges in distribution to $Z \sim N(0,1)$. Also,
$$\lim_{n \to \infty} P\!\left(\frac{\bar{X} - \mu}{\sigma/\sqrt{n}} \le x\right) = \Phi(x),$$
i.e., $Z_n = \dfrac{\bar{X} - \mu}{\sigma/\sqrt{n}}$ converges in distribution to $Z \sim N(0,1)$.
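A simulation sketch of the theorem (not from the slides; the Exponential(1) population and the sample sizes are arbitrary choices):

```python
import numpy as np
from scipy import stats

# Standardized sums of i.i.d. Exponential(1) variables (mu = sigma = 1) should
# look increasingly standard normal as n grows.
rng = np.random.default_rng(3)
mu, sigma, reps = 1.0, 1.0, 50_000

for n in (5, 30, 200):
    s = rng.exponential(scale=1.0, size=(reps, n)).sum(axis=1)
    z = (s - n * mu) / (sigma * np.sqrt(n))
    # Empirical P(Z_n <= 1) versus Phi(1) = 0.8413...
    print(n, (z <= 1.0).mean(), stats.norm.cdf(1.0))
```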

Example

Suppose $X_1, X_2, \ldots$ are i.i.d. random variables and each has the Poisson(3) distribution, so $E(X_i) = V(X_i) = 3$. The CLT says that
$$P\big(X_1 + \cdots + X_n \le 3n + x\sqrt{3n}\big) \to \Phi(x) \quad \text{as } n \to \infty.$$

Examples

A very common application of the CLT is the normal approximation to the binomial distribution. Suppose $X_1, X_2, \ldots$ are i.i.d. random variables and each has the Bernoulli($p$) distribution, so $E(X_i) = p$ and $V(X_i) = p(1-p)$. The CLT says that
$$P\big(X_1 + \cdots + X_n \le np + x\sqrt{np(1-p)}\big) \to \Phi(x) \quad \text{as } n \to \infty.$$
Let $Y = X_1 + \cdots + X_n$; then $Y$ has a Binomial($n, p$) distribution. So for large $n$,
$$P(Y \le y) = P\!\left(\frac{Y - np}{\sqrt{np(1-p)}} \le \frac{y - np}{\sqrt{np(1-p)}}\right) \approx \Phi\!\left(\frac{y - np}{\sqrt{np(1-p)}}\right).$$
Suppose we flip a biased coin 1000 times and the probability of heads on any one toss is 0.6. Find the probability of getting at least 550 heads. Suppose we toss a coin 100 times and observe 60 heads. Is the coin fair?
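One way to carry out the two computations with the normal approximation (a hedged sketch, not the slides' own solution; no continuity correction is applied, and the two-sided p-value in the second part is one framing of "is the coin fair?"):

```python
from math import sqrt
from scipy import stats

# 1) n = 1000 tosses with p = 0.6: P(at least 550 heads).
n, p = 1000, 0.6
mean, sd = n * p, sqrt(n * p * (1 - p))
print(stats.norm.sf((550 - mean) / sd))   # approx. 0.9994

# 2) n = 100 tosses of a supposedly fair coin, 60 heads observed.
#    Under p = 0.5, how unusual is a count at least this far from 50?
n, p = 100, 0.5
mean, sd = n * p, sqrt(n * p * (1 - p))
z = (60 - mean) / sd                      # z = 2.0
print(2 * stats.norm.sf(z))               # two-sided p-value, approx. 0.046
```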