Parameter Estimation

Samples from a probability distribution F(x) are x = [x_1, x_2, \ldots, x_n]^T. The probability distribution has a parameter vector \theta = [\theta_1, \theta_2, \ldots, \theta_m]^T.

Estimator: Statistic used to estimate the unknown \theta.
Estimate: Observed value of the estimator.

Maximum Likelihood Estimator

The likelihood for n independent samples is defined as

    L(x; \theta) = \prod_{i=1}^{n} f(x_i; \theta)

The maximum likelihood estimator is defined as

    \hat{\theta}_{ML} = \arg\max_{\theta} L(x; \theta)

To estimate the value of \theta that maximizes L, or equivalently \ln L, we set

    \frac{\partial \ln L}{\partial \theta_i} = 0, \quad \text{for } i = 1, 2, \ldots, m

Example 1. For the Bernoulli distribution,

    P(X = x) = p^x (1-p)^{1-x}

Hence, among n observations, the likelihood is defined as

    L(x; p) = \prod_{i=1}^{n} p^{x_i} (1-p)^{1-x_i} = p^{\sum_i x_i} (1-p)^{n - \sum_i x_i}

    \ln L = \Big(\sum_i x_i\Big) \ln p + \Big(n - \sum_i x_i\Big) \ln(1-p)

Taking the derivative with respect to the parameter p,

    \frac{d \ln L}{dp} = \frac{\sum_i x_i}{p} - \frac{n - \sum_i x_i}{1-p} = 0
    \Rightarrow (1-p)\sum_i x_i - \Big(n - \sum_i x_i\Big)p = 0

Hence, the ML estimator is

    \hat{p} = \frac{1}{n}\sum_i x_i = \bar{x}

Example 2. For the Poisson distribution,

    P(X = x) = \frac{\lambda^x}{x!} e^{-\lambda}

Hence, among n observations, the likelihood is defined as

    L(x; \lambda) = \prod_{i=1}^{n} \frac{\lambda^{x_i}}{x_i!} \exp(-\lambda) = \frac{\lambda^{\sum_i x_i}}{\prod_i x_i!} \exp(-n\lambda)

    \ln L = \Big(\sum_i x_i\Big) \ln\lambda - n\lambda - \sum_i \ln(x_i!)

Taking the derivative with respect to the parameter \lambda,

    \frac{d \ln L}{d\lambda} = \frac{\sum_i x_i}{\lambda} - n = 0 \Rightarrow \hat{\lambda} = \frac{1}{n}\sum_i x_i

Hence, the ML estimator is \hat{\lambda} = \bar{x}.

Example 3. For the Gaussian distribution,

    f(x) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\left[-\frac{(x-\mu)^2}{2\sigma^2}\right]

Hence, among n observations, the likelihood is defined as

    L(x; \mu, \sigma^2) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi}\,\sigma} \exp\left[-\frac{(x_i-\mu)^2}{2\sigma^2}\right] = \frac{1}{(2\pi\sigma^2)^{n/2}} \exp\left[-\frac{1}{2\sigma^2}\sum_i (x_i-\mu)^2\right]

Subhayan De
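The Bernoulli result \hat{p} = \bar{x} can be sanity-checked numerically. The sketch below (Python, with a small hypothetical sample) compares the closed-form MLE against a brute-force grid maximization of \ln L:

```python
import math

def bernoulli_loglik(p, xs):
    # ln L = (sum x_i) ln p + (n - sum x_i) ln(1 - p)
    s, n = sum(xs), len(xs)
    return s * math.log(p) + (n - s) * math.log(1 - p)

xs = [1, 0, 1, 1, 0, 1, 0, 1]     # hypothetical Bernoulli sample
p_hat = sum(xs) / len(xs)         # closed-form MLE: the sample mean

# brute-force grid search over (0, 1) finds the same maximizer
grid = [k / 1000 for k in range(1, 1000)]
p_best = max(grid, key=lambda p: bernoulli_loglik(p, xs))
print(p_hat, p_best)              # both 0.625
```

Because \ln L is concave in p, the grid maximizer coincides with the analytic stationary point.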

    \ln L = -\frac{n}{2}\ln(2\pi) - \frac{n}{2}\ln(\sigma^2) - \frac{1}{2\sigma^2}\sum_i (x_i-\mu)^2

Taking the derivative with respect to the parameter \mu,

    \frac{\partial \ln L}{\partial \mu} = \frac{1}{\sigma^2}\sum_i (x_i-\mu) = 0 \Rightarrow \hat{\mu} = \frac{1}{n}\sum_i x_i

Hence, the ML estimator is \hat{\mu} = \bar{x}. Taking the derivative with respect to the parameter \sigma^2,

    \frac{\partial \ln L}{\partial (\sigma^2)} = -\frac{n}{2\sigma^2} + \frac{1}{2\sigma^4}\sum_i (x_i-\mu)^2 = 0
    \Rightarrow \hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n} (x_i-\hat{\mu})^2

Hence, the ML estimator is \hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n} (x_i-\hat{\mu})^2.

Example 4. For the Gamma distribution,

    f(x) = \frac{\lambda^\alpha}{\Gamma(\alpha)} x^{\alpha-1} e^{-\lambda x}

Hence, among n observations, the likelihood is defined as

    L(x; \alpha, \lambda) = \prod_{i=1}^{n} \frac{\lambda^\alpha}{\Gamma(\alpha)} x_i^{\alpha-1} e^{-\lambda x_i} = \frac{\lambda^{n\alpha}}{\Gamma(\alpha)^n} \Big(\prod_i x_i\Big)^{\alpha-1} \exp\Big(-\lambda \sum_i x_i\Big)

    \ln L = n\alpha \ln\lambda - n \ln\Gamma(\alpha) + (\alpha-1)\sum_i \ln x_i - \lambda \sum_i x_i

Taking the derivative with respect to the parameter \lambda,

    \frac{\partial \ln L}{\partial \lambda} = \frac{n\alpha}{\lambda} - \sum_i x_i = 0 \Rightarrow \hat{\lambda} = \frac{n\hat{\alpha}}{\sum_i x_i}
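The Gaussian estimators \hat{\mu} = \bar{x} and \hat{\sigma}^2 = \frac{1}{n}\sum_i (x_i - \hat{\mu})^2 can likewise be checked numerically. A minimal sketch (hypothetical data) confirms that perturbing either parameter away from its MLE can only lower \ln L:

```python
import math

xs = [2.1, 1.7, 3.0, 2.4, 1.9, 2.6]   # hypothetical sample

n = len(xs)
mu_hat = sum(xs) / n                               # MLE of mu: the sample mean
var_hat = sum((x - mu_hat) ** 2 for x in xs) / n   # MLE of sigma^2: divide by n, not n - 1

def loglik(mu, var):
    # ln L = -(n/2) ln(2 pi) - (n/2) ln(var) - sum (x_i - mu)^2 / (2 var)
    return (-0.5 * n * math.log(2 * math.pi) - 0.5 * n * math.log(var)
            - sum((x - mu) ** 2 for x in xs) / (2 * var))

# nudging either parameter away from the MLE can only decrease ln L
best = loglik(mu_hat, var_hat)
assert all(loglik(mu_hat + d, var_hat) <= best for d in (-0.01, 0.01))
assert all(loglik(mu_hat, var_hat + d) <= best for d in (-0.01, 0.01))
print(round(mu_hat, 4), round(var_hat, 4))
```

Note the division by n rather than n - 1: the MLE of the variance is biased, which is why the sample variance s^2 used later divides by n - 1 instead.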

Hence, the ML estimator is \hat{\lambda} = n\hat{\alpha} / \sum_i x_i. Taking the derivative with respect to the parameter \alpha,

    \frac{\partial \ln L}{\partial \alpha} = n \ln\lambda + \sum_i \ln x_i - n\,\frac{\Gamma'(\alpha)}{\Gamma(\alpha)} = 0
    \Rightarrow \ln\hat{\lambda} = \frac{\Gamma'(\hat{\alpha})}{\Gamma(\hat{\alpha})} - \frac{1}{n}\sum_i \ln x_i

This is a nonlinear equation that needs to be solved to get \hat{\alpha}.

Example 5. If the observations {0.3, 0.2, 0.5, 0.8, 0.9} are obtained from a distribution with f(x) = \theta x^{\theta-1}, 0 \le x \le 1, estimate the value of \theta using the maximum likelihood method.

The likelihood is defined as

    L(x; \theta) = \prod_{i=1}^{5} \theta x_i^{\theta-1}

The log-likelihood is

    \ln L = 5\ln\theta + (\theta-1)\sum_{i=1}^{5} \ln x_i

Taking the derivative of \ln L with respect to \theta,

    \frac{\partial \ln L}{\partial \theta} = \frac{5}{\theta} + \sum_{i=1}^{5} \ln x_i = 0 \Rightarrow \hat{\theta} = -\frac{5}{\sum_{i=1}^{5} \ln x_i} = 1.3038

Example 6. For the Uniform distribution on (0, \theta),

    f(x) = \frac{1}{\theta}, \quad 0 < x < \theta

Hence, among n observations, the likelihood is defined as

    L(x; \theta) = \frac{1}{\theta^n}, \qquad \ln L = -n \ln\theta

This is maximized when \theta is minimum, but \theta \ge \max(x_1, x_2, \ldots, x_n). Hence, the ML estimator is \hat{\theta} = \max(x_1, x_2, \ldots, x_n).
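Example 5's closed form is easy to verify. In the sketch below, the second observation (garbled in the source) is read as 0.2, an assumption that reproduces the stated estimate 1.3038:

```python
import math

# Example 5 data; the second value 0.2 is an assumed reading of the
# garbled source, chosen because it reproduces the stated 1.3038
xs = [0.3, 0.2, 0.5, 0.8, 0.9]

# closed-form MLE for f(x) = theta * x**(theta - 1) on (0, 1]:
# theta_hat = -n / sum(ln x_i)
theta_hat = -len(xs) / sum(math.log(x) for x in xs)
print(round(theta_hat, 4))   # 1.3038
```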

Interval Estimate

Let X_1, X_2, \ldots, X_n be samples from a Gaussian distribution with mean \mu and variance \sigma^2. The point estimator \bar{X} is Gaussian with mean \mu and variance \sigma^2/n. Hence,

    P\left(-1.96 < \frac{\bar{X}-\mu}{\sigma/\sqrt{n}} < 1.96\right) = 0.95
    P\left(\bar{X} - 1.96\frac{\sigma}{\sqrt{n}} < \mu < \bar{X} + 1.96\frac{\sigma}{\sqrt{n}}\right) = 0.95

Based on the observations, with 95% confidence we can say that the population mean \mu lies within the interval

    \left(\bar{x} - 1.96\frac{\sigma}{\sqrt{n}},\; \bar{x} + 1.96\frac{\sigma}{\sqrt{n}}\right)

known as the 95 percent confidence interval estimate of \mu.

In general, the 100(1-\alpha) percent two-sided confidence interval for \mu is (\bar{x} - z_{\alpha/2}\,\sigma/\sqrt{n},\; \bar{x} + z_{\alpha/2}\,\sigma/\sqrt{n}).
The one-sided upper confidence interval for \mu is (\bar{x} - z_{\alpha}\,\sigma/\sqrt{n},\; +\infty).
The one-sided lower confidence interval for \mu is (-\infty,\; \bar{x} + z_{\alpha}\,\sigma/\sqrt{n}).

Sample size: If we want the 100(1-\alpha) percent two-sided confidence interval for \mu to have half-width \ell (i.e., to be \bar{x} \pm \ell), we need a sample size n \ge (z_{\alpha/2}\,\sigma/\ell)^2.

Quick reference for the 100(1-\alpha)% two-sided confidence interval:
    90% confidence: \alpha = 0.10, z_{\alpha/2} = 1.65
    95% confidence: \alpha = 0.05, z_{\alpha/2} = 1.96
    98% confidence: \alpha = 0.02, z_{\alpha/2} = 2.33
    99% confidence: \alpha = 0.01, z_{\alpha/2} = 2.58

Similarly, the following table shows a variety of cases for samples from a normal population. Note that s^2 = \sum_i (x_i - \bar{x})^2/(n-1).

Table 6.3: Different cases.

    Case 1: \sigma^2 known, parameter \mu.
        Two-sided interval: \bar{x} \pm z_{\alpha/2}\,\sigma/\sqrt{n}
        Lower interval: (-\infty,\; \bar{x} + z_{\alpha}\,\sigma/\sqrt{n})
        Upper interval: (\bar{x} - z_{\alpha}\,\sigma/\sqrt{n},\; +\infty)
    Case 2: \sigma^2 unknown, parameter \mu.
        Two-sided interval: \bar{x} \pm t_{\alpha/2,\,n-1}\, s/\sqrt{n}
        Lower interval: (-\infty,\; \bar{x} + t_{\alpha,\,n-1}\, s/\sqrt{n})
        Upper interval: (\bar{x} - t_{\alpha,\,n-1}\, s/\sqrt{n},\; +\infty)
    Case 3: \mu unknown, parameter \sigma^2.
        Two-sided interval: \left( (n-1)s^2/\chi^2_{\alpha/2,\,n-1},\; (n-1)s^2/\chi^2_{1-\alpha/2,\,n-1} \right)
        Lower interval: \left( 0,\; (n-1)s^2/\chi^2_{1-\alpha,\,n-1} \right)
        Upper interval: \left( (n-1)s^2/\chi^2_{\alpha,\,n-1},\; +\infty \right)

Example 6. Estimate the sample size needed for the mean to be within \pm 0.5 with \sigma = 8 and a confidence level of 95%.

    n = \left(\frac{z_{\alpha/2}\,\sigma}{\ell}\right)^2 = \left(\frac{1.96 \times 8}{0.5}\right)^2 \approx 984

Example 7. The lifetimes X of light bulbs are exponentially distributed. Based on observation of 81 light bulbs, we obtain that their average lifetime is 200 hours. Estimate the 95% confidence interval for the mean lifetime.

For an exponentially distributed random variable X,

    f(x) = \lambda e^{-\lambda x}

The mean of X is 1/\lambda and the variance is 1/\lambda^2. For a large number of samples n, the sample mean \bar{X} is Gaussian with mean 1/\lambda and variance 1/(n\lambda^2). Hence, we can write

    P\left(-1.96 < \frac{\bar{X} - 1/\lambda}{1/(\lambda\sqrt{n})} < 1.96\right) = 0.95
    P\left(\frac{1}{\lambda}\left(1 - \frac{1.96}{\sqrt{n}}\right) < \bar{X} < \frac{1}{\lambda}\left(1 + \frac{1.96}{\sqrt{n}}\right)\right) = 0.95
    P\left(\frac{\bar{X}}{1 + 1.96/\sqrt{n}} < \frac{1}{\lambda} < \frac{\bar{X}}{1 - 1.96/\sqrt{n}}\right) = 0.95

Hence, the 95% confidence interval for the mean lifetime of the bulbs is

    \frac{200}{1 + 1.96/\sqrt{81}} < \frac{1}{\lambda} < \frac{200}{1 - 1.96/\sqrt{81}}, \quad \text{or} \quad 164 < \frac{1}{\lambda} < 256.

Example 8. For a Poisson distributed random variable, obtain the 100(1-\alpha) percent confidence interval. The p.m.f. is given by

    P(X = i) = \frac{\lambda^i}{i!} e^{-\lambda}

The mean E[X] = \lambda = \mathrm{Var}(X). Hence, for large n, \bar{X} is approximately Gaussian with mean \lambda
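Both numerical answers above can be reproduced in a few lines. A minimal sketch (Python; \sigma = 8 in Example 6 is an assumed reading of the garbled source, consistent with the stated n = 984):

```python
import math

z = 1.96   # z_{alpha/2} for 95% confidence

# Example 6 (assumed sigma = 8): n >= (z * sigma / half_width)^2
sigma, half_width = 8.0, 0.5
n_needed = math.ceil((z * sigma / half_width) ** 2)

# Example 7: 95% CI for the exponential mean 1/lambda,
# with n = 81 bulbs and sample mean 200 hours
n, xbar = 81, 200.0
lower = xbar / (1 + z / math.sqrt(n))
upper = xbar / (1 - z / math.sqrt(n))
print(n_needed, round(lower), round(upper))   # 984 164 256
```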

and variance \lambda/n. This helps in writing

    P\left(\bar{X} - 1.96\sqrt{\frac{\lambda}{n}} < \lambda < \bar{X} + 1.96\sqrt{\frac{\lambda}{n}}\right) = 0.95
    P\left(\left|\bar{X} - \lambda\right| < 1.96\sqrt{\frac{\lambda}{n}}\right) = 0.95
    P\left((\bar{X} - \lambda)^2 < (1.96)^2\,\frac{\lambda}{n}\right) = 0.95

Therefore, the confidence interval is given by the two solutions \lambda of the following quadratic equation:

    (\lambda - \bar{x})^2 = (1.96)^2\,\frac{\lambda}{n}
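Expanding the quadratic gives \lambda^2 - (2\bar{x} + z^2/n)\lambda + \bar{x}^2 = 0, which can be solved directly. A sketch with hypothetical summary values (n = 50, \bar{x} = 4.2 are illustrative, not from the notes):

```python
import math

# hypothetical Poisson summary: n samples with sample mean xbar
n, xbar, z = 50, 4.2, 1.96

# (lam - xbar)^2 = z^2 * lam / n  <=>  lam^2 - (2 xbar + z^2/n) lam + xbar^2 = 0
b = 2 * xbar + z ** 2 / n
disc = math.sqrt(b ** 2 - 4 * xbar ** 2)
lower, upper = (b - disc) / 2, (b + disc) / 2

# each endpoint satisfies the original equation, and the interval brackets xbar
assert abs((lower - xbar) ** 2 - z ** 2 * lower / n) < 1e-9
assert lower < xbar < upper
print(round(lower, 3), round(upper, 3))
```

Note that, unlike the Gaussian case, the resulting interval is not symmetric about \bar{x}, because the variance of the estimator itself depends on \lambda.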