Maximum Likelihood Estimation

Marquette University: Maximum Likelihood Estimation. Daniel B. Rowe, Ph.D., Professor, Department of Mathematics, Statistics, and Computer Science. Copyright 2018 by D. B. Rowe. (slide 1)

Marquette University: Maximum Likelihood Estimation. We have been saying that y ~ N(μ, σ²), when what we actually mean is that y = μ + ε. That is, y has some true underlying value μ, but there is additive measurement error (noise) ε ~ N(0, σ²). We know that if ε ~ N(0, σ²), then from a linear transformation of variables we get y ~ N(μ, σ²). (slide 2)

Marquette University: Maximum Likelihood Estimation - Mean. If we have a random sample of size n with y_i = μ + ε_i, where ε_i ~ N(0, σ²), then we have y_i ~ N(μ, σ²) for i = 1, ..., n. Since these are independent observations, the joint distribution is f(y_1, ..., y_n | μ, σ²) = (2πσ²)^(-1/2) exp[-(y_1 - μ)²/(2σ²)] × ... × (2πσ²)^(-1/2) exp[-(y_n - μ)²/(2σ²)]. (slide 3)

Marquette University: Maximum Likelihood Estimation - Mean. If we have a random sample of size n with y_i = μ + ε_i, where ε_i ~ N(0, σ²), then we have y_i ~ N(μ, σ²) for i = 1, ..., n. Since these are independent observations, the joint distribution is f(y_1, ..., y_n | μ, σ²) = (2πσ²)^(-n/2) exp[-(1/(2σ²)) Σ_i (y_i - μ)²] = L(μ, σ²). (slide 4)

Marquette University: Maximum Likelihood Estimation - Mean. L(μ, σ²) is called the likelihood function. What we want to do is find the values of (μ, σ²) that maximize L(μ, σ²). The value of μ that maximizes L(μ, σ²) is the value that minimizes Σ_i (y_i - μ)². The value of σ² that maximizes L(μ, σ²) is then obtained from the minimized deviations d_i = y_i - μ̂. (slide 5)

Marquette University: Maximum Likelihood Estimation - Mean. L(μ, σ²) is called the likelihood function. What we do is differentiate L(μ, σ²) with respect to μ and σ², set the derivatives equal to 0, and solve. That is, ∂L(μ, σ²)/∂μ = 0 and ∂L(μ, σ²)/∂σ² = 0. The values μ̂ and σ̂² that maximize L(μ, σ²) are the maximum likelihood estimators (MLEs). (slide 6)

Marquette University: Maximum Likelihood Estimation - Mean. However, this is messy, so we can instead maximize LL(μ, σ²) = ln(L(μ, σ²)), solving ∂LL(μ, σ²)/∂μ = 0 and ∂LL(μ, σ²)/∂σ² = 0, because the logarithm is a monotonic function, to obtain the MLEs μ̂ and σ̂². (slide 7)

Marquette University: Maximum Likelihood Estimation - Mean. With y_i = μ + ε_i and ε_i ~ N(0, σ²) independent, f(y_1, ..., y_n | μ, σ²) = (2πσ²)^(-n/2) exp[-(1/(2σ²)) Σ_i (y_i - μ)²], so LL(μ, σ²) = -(n/2) log(2π) - (n/2) log(σ²) - (1/(2σ²)) Σ_i (y_i - μ)², and ∂LL(μ, σ²)/∂μ = (1/σ²) Σ_i (y_i - μ)(1) = 0 at (μ̂, σ̂²). (slide 8)

Marquette University: Maximum Likelihood Estimation - Mean. With y_i = μ + ε_i and ε_i ~ N(0, σ²) independent, f(y_1, ..., y_n | μ, σ²) = (2πσ²)^(-n/2) exp[-(1/(2σ²)) Σ_i (y_i - μ)²], so LL(μ, σ²) = -(n/2) log(2π) - (n/2) log(σ²) - (1/(2σ²)) Σ_i (y_i - μ)², and ∂LL(μ, σ²)/∂σ² = -(n/2)(1/σ²) + (1/(2σ⁴)) Σ_i (y_i - μ)² = 0 at (μ̂, σ̂²). (slide 9)

Marquette University: Maximum Likelihood Estimation - Mean. Solving for μ and σ² yields μ̂ = (1/n) Σ_i y_i = ȳ and σ̂² = (1/n) Σ_i (y_i - ȳ)². These are MLEs, the most probable or modal values. Note that the denominator is n and not n - 1. σ̂² is a biased estimator of σ², E(σ̂²) ≠ σ²: since n σ̂²/σ² ~ χ²(n - 1), E(σ̂²) = ((n - 1)/n) σ², whereas (n - 1) s²/σ² ~ χ²(n - 1) for s² = (1/(n - 1)) Σ_i (y_i - ȳ)², so E(s²) = σ². This is why we use a denominator of n - 1. (slide 10)
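
As a quick numerical check of these two estimators, here is a minimal MATLAB/Octave sketch (not from the slides); the true values mu = 5 and sigma = 2 and the sample size n = 10 are assumptions chosen to match the simulation settings on the following slides.
% Sketch: MLE of the mean and variance for one simulated sample.
n = 10; mu = 5; sigma = 2;              % assumed true values
y = mu + sigma*randn(n,1);              % y_i = mu + eps_i, eps_i ~ N(0,sigma^2)
muhat     = mean(y);                    % MLE of mu
sigma2hat = sum((y - muhat).^2)/n;      % MLE of sigma^2 (denominator n, biased)
s2        = sum((y - muhat).^2)/(n-1);  % unbiased estimator s^2 (denominator n-1)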

Marquette University: Maximum Likelihood Estimation - Mean. ȳ ~ N(μ, σ²/n). MATLAB:
n=10; mu=5; sigma=2;
y=sigma*randn(10^6,n)+mu;
ybar=mean(y,2);
figure(1)
hist(ybar,(0:.1:10)')
axis([0 10 0 70000])
mean(ybar), var(ybar)
Theory: μ = 5 and σ²/n = 4/10 = 0.4. Simulation: mean(ȳ) = 5.0003, s²(ȳ) = 0.3987. [Figure: histogram of the 10^6 sample means over 0 to 10.] (slide 11)

Marquette University: Maximum Likelihood Estimation - Mean. n σ̂²/σ² ~ χ²(n - 1). MATLAB:
sigmahat=var(y',1)';
figure(2)
hist(sigmahat,(0:.2:20)')
axis([0 20 0 55000])
mean(sigmahat), var(sigmahat)
Theory: E(σ̂²) = ((n - 1)/n) σ² = (9/10)(4) = 3.6 and Var(σ̂²) = 2(n - 1) σ⁴/n² = 2.88. Simulation: mean ≈ 3.600, s² ≈ 2.8805. [Figure: histogram of the 10^6 values of σ̂² over 0 to 20; toggle with the next slide to compare the horizontal-axis scale.] (slide 12)

Marquette University: Maximum Likelihood Estimation - Mean. n σ̂²/σ² ~ χ²(n - 1). MATLAB:
chi=n*sigmahat/sigma^2;
figure(3)
hist(chi,(0:.5:50)')
axis([0 50 0 55000])
mean(chi), var(chi)
Theory: E(χ²(n - 1)) = n - 1 = 9 and Var(χ²(n - 1)) = 2(n - 1) = 18. Simulation: mean ≈ 9.000, s² ≈ 18.003. [Figure: histogram of n σ̂²/σ² over 0 to 50; toggle with the previous slide to compare the horizontal-axis scale.] (slide 13)

Marquette University: Maximum Likelihood Estimation - Linear. This technique can be generalized to linear regression. Let y_i = a + bx_i + ε_i, where ε_1, ..., ε_n ~ N(0, σ²) are independent. [Figure: data points around the true line y = a + bx, with measurement errors d_1, ..., d_5 from the line to each point.] (slide 14)

Marquette University: Maximum Likelihood Estimation - Linear. This technique can be generalized to linear regression. Let y_i = a + bx_i + ε_i, where ε_1, ..., ε_n ~ N(0, σ²) are independent. [Figure: the same data with the true line y = a + bx and the measurement errors d_i = y_i - (a + bx_i) labeled.] (slide 15)

Marquette University: Maximum Likelihood Estimation - Linear. This technique can be generalized to linear regression. Let y_i = a + bx_i + ε_i, where ε_1, ..., ε_n ~ N(0, σ²) are independent. Then, the likelihood is f(y_1, ..., y_n | a, b, σ²) = (2πσ²)^(-1/2) exp[-(y_1 - a - bx_1)²/(2σ²)] × ... × (2πσ²)^(-1/2) exp[-(y_n - a - bx_n)²/(2σ²)]. (slide 16)

Marquette University: Maximum Likelihood Estimation - Linear. This technique can be generalized to linear regression. Let y_i = a + bx_i + ε_i, where ε_1, ..., ε_n ~ N(0, σ²) are independent. Then, the likelihood is f(y_1, ..., y_n | a, b, σ²) = (2πσ²)^(-n/2) exp[-(1/(2σ²)) Σ_i (y_i - a - bx_i)²] and the log likelihood is LL(a, b, σ²) = -(n/2) log(2π) - (n/2) log(σ²) - (1/(2σ²)) Σ_i (y_i - a - bx_i)². (The first two terms contain no a or b; only the last term involves a or b.) (slide 17)

Marquette University: Maximum Likelihood Estimation - Linear. L(a, b, σ²) is again called the likelihood function. What we want to do is find the values of (a, b, σ²) that maximize L(a, b, σ²). The values (a, b) that maximize L(a, b, σ²) are the values (â, b̂) that minimize Σ_i (y_i - a - bx_i)², i.e. that minimize the d_i = y_i - a - bx_i with respect to a and b. The value of σ² that maximizes L(a, b, σ²) is then obtained from the minimized Σ_i (y_i - â - b̂x_i)². (slide 18)

Marquette University: Maximum Likelihood Estimation - Linear. Differentiate LL(a, b, σ²) with respect to a, b, and σ², then set the derivatives equal to 0:
LL(a, b, σ²) = -(n/2) log(2π) - (n/2) log(σ²) - (1/(2σ²)) Σ_i (y_i - a - bx_i)²,
∂LL(a, b, σ²)/∂a |(â, b̂, σ̂²) = (1/σ²) Σ_i (y_i - a - bx_i)(1) = 0,
∂LL(a, b, σ²)/∂b |(â, b̂, σ̂²) = (1/σ²) Σ_i (y_i - a - bx_i)(x_i) = 0,
∂LL(a, b, σ²)/∂σ² |(â, b̂, σ̂²) = -(n/2)(1/σ²) + (1/(2σ⁴)) Σ_i (y_i - a - bx_i)² = 0. (slide 19)

Marquette University: Maximum Likelihood Estimation - Linear. Solving for the estimated parameters yields
b̂ = [n Σ_i x_i y_i - (Σ_i x_i)(Σ_i y_i)] / [n Σ_i x_i² - (Σ_i x_i)²],
â = ȳ - b̂ x̄,
σ̂² = (1/n) Σ_i (y_i - â - b̂ x_i)².
[Figure: the fitted line ŷ = â + b̂x with residuals d_1, ..., d_5.] (slide 20)
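
A minimal MATLAB/Octave sketch of these closed-form estimators (not from the slides), assuming x and y are column vectors of the same length:
n    = length(x);
bhat = (n*sum(x.*y) - sum(x)*sum(y)) / (n*sum(x.^2) - sum(x)^2);  % slope MLE
ahat = mean(y) - bhat*mean(x);                                    % intercept MLE
sigma2hat = sum((y - ahat - bhat*x).^2)/n;                        % residual variance MLE (denominator n)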

Marquette University: Maximum Likelihood Estimation - Linear. The regression model y_i = a + bx_i + ε_i, where ε_i ~ N(0, σ²), i = 1, ..., n, that we presented can be equivalently written as y = Xβ + ε, where y = (y_1, ..., y_n)' is the measured data, X is the n×2 design matrix with rows (1, x_i), β = (a, b)' holds the regression coefficients, ε = (ε_1, ..., ε_n)' is the measurement error, and ε ~ N(0, σ² I_n). I_n is an n-dimensional identity matrix. (slide 21)

Marquette University: Maximum Likelihood Estimation - Linear. The regression model y = Xβ + ε, where ε ~ N(0, σ² I_n), written out:
[y_1; y_2; ...; y_n] = [1 x_1; 1 x_2; ...; 1 x_n][a; b] + [ε_1; ε_2; ...; ε_n], i.e. y_i = a + bx_i + ε_i. (slide 22)

Marquette University: Maximum Likelihood Estimation - Linear. With y = Xβ + ε and ε ~ N(0, σ² I_n), the likelihood is f(y_1, ..., y_n | a, b, σ²) = (2πσ²)^(-n/2) exp[-(1/(2σ²)) (y - Xβ)'(y - Xβ)]; compare to f(y_1, ..., y_n | a, b, σ²) = (2πσ²)^(-n/2) exp[-(1/(2σ²)) Σ_i (y_i - a - bx_i)²]. The log likelihood is LL(a, b, σ²) = -(n/2) log(2π) - (n/2) log(σ²) - (1/(2σ²)) (y - Xβ)'(y - Xβ). (slide 23)

Marquette University: Maximum Likelihood Estimation - Linear. L(β, σ²) is again called the likelihood function. What we want to do is find the values of (β, σ²) that maximize L(β, σ²). The value of β that maximizes L(β, σ²) is the value that minimizes (y - Xβ)'(y - Xβ), so we minimize (y - Xβ)'(y - Xβ) with respect to β; we need to find β̂. The value of σ² that maximizes L(β, σ²) is then obtained from the minimized (y - Xβ̂)'(y - Xβ̂). (slide 24)

Marquette University: Maximum Likelihood Estimation - Linear. We don't need to take the derivative of L(β, σ²) with respect to β (although we could). With algebra (add and subtract Xβ̂) we can write
(y - Xβ)'(y - Xβ) = (y - Xβ̂)'(y - Xβ̂) + (β - β̂)'(X'X)(β - β̂),
where β̂ = (X'X)^(-1) X'y, X'X is invertible, and the first term does not depend on β. It can be seen that β̂ maximizes LL(β, σ²) = -(n/2) log(2π) - (n/2) log(σ²) - (1/(2σ²)) [(y - Xβ̂)'(y - Xβ̂) + (β - β̂)'(X'X)(β - β̂)] because it makes (y - Xβ)'(y - Xβ) smallest. (slide 25)
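
A minimal MATLAB/Octave numerical check of this decomposition (not from the slides), assuming y and X are already defined; for any trial value of beta the two sides agree up to rounding error:
betahat = (X'*X)\(X'*y);                 % same as inv(X'*X)*X'*y, solved stably
beta    = randn(size(betahat));          % an arbitrary trial value of beta
lhs     = (y - X*beta)'*(y - X*beta);
rhs     = (y - X*betahat)'*(y - X*betahat) + (beta - betahat)'*(X'*X)*(beta - betahat);
% lhs - rhs is zero up to floating-point rounding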

Marquette University: Maximum Likelihood Estimation - Linear. More generally, we can have a multiple regression model y = Xβ + ε, where ε ~ N(0, σ² I_n):
y = (y_1, ..., y_n)' is the n×1 measured data,
X = [1 x_11 ... x_1q; ...; 1 x_n1 ... x_nq] is the n×(q+1) design matrix,
β = (β_0, β_1, ..., β_q)' is the (q+1)×1 vector of regression coefficients,
ε = (ε_1, ..., ε_n)' is the n×1 measurement error. (slide 26)

Marquette University: Maximum Likelihood Estimation - Linear. The MLEs are the same,
β̂ = (X'X)^(-1) X'y ((q+1)×1) and σ̂² = (1/n) (y - Xβ̂)'(y - Xβ̂).
In addition, β̂ ~ N(β, σ² (X'X)^(-1)) (a (q+1)×(q+1) covariance matrix) and n σ̂²/σ² ~ χ²(n - q - 1), and β̂ and σ̂² are independent. (As before, one could add and subtract Xβ̂ to write (y - Xβ)'(y - Xβ) = (y - Xβ̂)'(y - Xβ̂) + (β - β̂)'(X'X)(β - β̂).) Since E(σ̂²) = ((n - q - 1)/n) σ², we should use a denominator of n - q - 1 for an unbiased estimator of σ². (slide 27)
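
A minimal MATLAB/Octave sketch contrasting the biased MLE with the unbiased estimator of σ² (not from the slides), assuming y (n×1) and X (n×(q+1)) are already defined:
[n, p]    = size(X);                    % p = q + 1 columns
betahat   = (X'*X)\(X'*y);
RSS       = (y - X*betahat)'*(y - X*betahat);
sigma2mle = RSS/n;                      % MLE, denominator n (biased)
s2        = RSS/(n - p);                % unbiased, denominator n - q - 1
covbeta   = s2*inv(X'*X);               % estimated covariance of betahat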

Marquette University: Maximum Likelihood Estimation - Linear. Let β = (a, b)' and X = (1_n, x); then β̂ ~ N(β, σ² (X'X)^(-1)). MATLAB:
num=10^6; a=1.8; b=0.5; sigma=2;
x=[1,2,3,4,5]'; n=length(x);
mu=a+b*x';                              % row of settings a + b*x_i
X=[ones(n,1),x];
y=sigma*randn(num,n)+ones(num,1)*mu;    % each row is one simulated sample
betahat=inv(X'*X)*X'*y';
figure(1), hist(betahat(1,:),(-10:.1:10)')
figure(2), hist(betahat(2,:),(-5:.1:5)')
betabar=mean(betahat,2); varbetahat=var(betahat,1,2);
Theory: a = 1.8, b = 0.5, and σ²(X'X)^(-1) = [4.4, -1.2; -1.2, 0.4], so Var(â) = 4.4, Var(b̂) = 0.4, cov(â, b̂) = -1.2, corr(â, b̂) = -0.9045. Simulation: mean(â) = 1.7997, mean(b̂) = 0.5005, s²(â) = 4.4009, s²(b̂) = 0.3997. [Figures: histograms of the 10^6 values of â and of b̂.] (slide 28)

Marquette University: Maximum Likelihood Estimation - Linear. n σ̂²/σ² ~ χ²(n - q - 1), with σ̂² = (1/n) Σ_i (y_i - â - b̂x_i)². MATLAB:
resid=y-(X*betahat)';
sigmahat=var(resid',1)';
chi=n*sigmahat/sigma^2;
figure(3)
hist(chi,(0:.5:30)')
xlim([0 30])
mean(chi), var(chi)
Theory: n - q - 1 = 3, so the mean is 3 and the variance is 2(n - q - 1) = 6. Simulation: mean ≈ 3.0006, s² ≈ 6.0038. [Figure: histogram of n σ̂²/σ² over 0 to 30.] (slide 29)

Method 1. Marquette University: Example. Given observed data (1, 1.4), (2, 2.3), (3, 1.7), (4, 3.0), (5, 3.4), estimate the slope, y-intercept, and residual variance.
y = [1.4; 2.3; 1.7; 3.0; 3.4], X = [1 1; 1 2; 1 3; 1 4; 1 5],
X'X = [5 15; 15 55], (X'X)^(-1) = [1.1 -0.3; -0.3 0.1],
(X'X)^(-1)X' = [0.8 0.5 0.2 -0.1 -0.4; -0.2 -0.1 0.0 0.1 0.2],
β̂ = (X'X)^(-1)X'y = [0.95; 0.47], σ̂² = (1/n) Σ_i (y_i - â - b̂x_i)² = 0.1286. (slide 30)
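
The Method 1 numbers above can be reproduced with a few MATLAB/Octave lines (a sketch, not part of the slides):
x = [1 2 3 4 5]';  y = [1.4 2.3 1.7 3.0 3.4]';
X = [ones(5,1), x];
betahat   = (X'*X)\(X'*y);               % returns [0.95; 0.47]
sigma2hat = sum((y - X*betahat).^2)/5;   % returns 0.1286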

Marquette University: Example. Given observed data (1, 1.4), (2, 2.3), (3, 1.7), (4, 3.0), (5, 3.4), estimate the slope, y-intercept, and residual variance. Because the likelihood L(a, b, σ²) = (2πσ²)^(-n/2) exp[-(1/(2σ²)) Σ_i (y_i - a - bx_i)²] is maximized when we select (a, b) to minimize Σ_i (y_i - a - bx_i)², we can set up a score function Q = Σ_i (y_i - a - bx_i)² (i.e. n σ²) and try (a, b) combinations to see which makes Q smallest. (slide 31)

Method 2. Marquette University: Example. Given observed data (1, 1.4), (2, 2.3), (3, 1.7), (4, 3.0), (5, 3.4), numerically get the slope, y-intercept, and residual variance for ŷ = â + b̂x. Select a_min, a_max, b_min, and b_max values and step sizes Δa and Δb (which we can make smaller). Perform an exhaustive brute-force grid search: compute σ² = (1/n) Σ_i (y_i - a - bx_i)² for each (a, b) combination and find the (a, b) combination that makes σ² the smallest. The a and b that minimize σ² are â and b̂, and that smallest σ² is σ̂². (slide 32)

Method 2. Marquette University: Example. Given observed data (1, 1.4), (2, 2.3), (3, 1.7), (4, 3.0), (5, 3.4), numerically get the slope, y-intercept, and residual variance for ŷ = â + b̂x. Select a_min, a_max, b_min, and b_max, compute σ² for each (a, b) combination, and make a surface plot of σ² over (a, b). Result: â = 0.95, b̂ = 0.47, σ̂² = 0.1286. [Figure: surface of σ² over the (a, b) grid.] (slide 33)
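
A minimal MATLAB/Octave sketch of the Method 2 grid search (not from the slides); the grid limits and the step size 0.01 are assumptions:
x = [1 2 3 4 5]';  y = [1.4 2.3 1.7 3.0 3.4]';
avals = -2:0.01:2;  bvals = -1:0.01:1.5;            % assumed search range and step
sig2 = zeros(length(avals), length(bvals));
for i = 1:length(avals)
  for j = 1:length(bvals)
    sig2(i,j) = sum((y - avals(i) - bvals(j)*x).^2)/length(y);
  end
end
[sig2min, idx] = min(sig2(:));
[i, j] = ind2sub(size(sig2), idx);
ahat = avals(i); bhat = bvals(j); sigma2hat = sig2min;   % 0.95, 0.47, 0.1286
surf(bvals, avals, sig2)                                 % surface of sigma^2 over (a,b)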

Method 3. Marquette University: Example. Given observed data (1, 1.4), (2, 2.3), (3, 1.7), (4, 3.0), (5, 3.4), use gradient descent to iteratively find the (â, b̂) that minimize Q = Σ_i (y_i - a - bx_i)²:
dQ/da = -2 (Σ_i y_i - n a - b Σ_i x_i),
dQ/db = -2 (Σ_i x_i y_i - a Σ_i x_i - b Σ_i x_i²),
where S_y = Σ_i y_i, S_x = Σ_i x_i, S_xx = Σ_i x_i², S_xy = Σ_i x_i y_i. (slide 34)

Method 3. Marquette University: Example. Given observed data (1, 1.4), (2, 2.3), (3, 1.7), (4, 3.0), (5, 3.4), use gradient descent to iteratively find the (â, b̂) that minimize Q = Σ_i (y_i - a - bx_i)²:
dQ/da = -2 (S_y - n a - b S_x),
dQ/db = -2 (S_xy - a S_x - b S_xx),
∇Q = [dQ/da; dQ/db] = -2 ([S_y; S_xy] - [n S_x; S_x S_xx][a; b]). (slide 35)

Method 3. Marquette University: Example. Given observed data (1, 1.4), (2, 2.3), (3, 1.7), (4, 3.0), (5, 3.4), use gradient descent to iteratively find the (â, b̂) that minimize Q(a, b) = Σ_i (y_i - a - bx_i)², with ∇Q(β) = -2 ([S_y; S_xy] - [n S_x; S_x S_xx] β). Start with an initial β^(0) = (a^(0), b^(0))'. Calculate the new β^(1) = β^(0) - λ ∇Q(β^(0)), where λ is the step size. Calculate the new β^(2) = β^(1) - λ ∇Q(β^(1)). Continue with β^(k) = β^(k-1) - λ ∇Q(β^(k-1)) until convergence at k = L. (slide 36)

Method 3. Marquette University: Example. Given observed data (1, 1.4), (2, 2.3), (3, 1.7), (4, 3.0), (5, 3.4), use gradient descent to iteratively find the (â, b̂) that minimize Q = Σ_i (y_i - a - bx_i)². The MLE is the last value β^(L) = (a^(L), b^(L))' = (â, b̂); just set L to a large fixed value, here L = 5×10^5. Result: â = 0.95, b̂ = 0.47, σ̂² = 0.1286. (slide 37)
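
A minimal MATLAB/Octave sketch of the Method 3 gradient-descent iteration (not from the slides); the zero starting values, the step size lambda = 0.001, and the iteration count are assumptions:
x = [1 2 3 4 5]';  y = [1.4 2.3 1.7 3.0 3.4]';
n = length(x);
Sy = sum(y); Sx = sum(x); Sxy = sum(x.*y); Sxx = sum(x.^2);
beta   = [0; 0];                       % initial (a, b)
lambda = 0.001;                        % assumed step size
for k = 1:5e5                          % assumed number of iterations L
  grad = -2*([Sy; Sxy] - [n Sx; Sx Sxx]*beta);
  beta = beta - lambda*grad;
end
ahat = beta(1); bhat = beta(2);
sigma2hat = sum((y - ahat - bhat*x).^2)/n;   % converges to about 0.95, 0.47, 0.1286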

Marquette University: Maximum Likelihood Estimation - Exponential. This is a more general method than just for linear functions. Let y_i = a e^(b x_i) + ε_i, where ε_1, ..., ε_n ~ N(0, σ²) are independent. [Figure: data points around the true curve y = a e^(bx), with measurement errors d_1, ..., d_5 from the curve to each point.] (slide 38)

Marquette University: Maximum Likelihood Estimation - Exponential. This is a more general method than just for linear functions. Let y_i = a e^(b x_i) + ε_i, where ε_1, ..., ε_n ~ N(0, σ²) are independent. Then, the likelihood is f(y_1, ..., y_n | a, b, σ²) = (2πσ²)^(-n/2) exp[-(1/(2σ²)) Σ_i (y_i - a e^(b x_i))²] and the log likelihood is LL(a, b, σ²) = -(n/2) log(2π) - (n/2) log(σ²) - (1/(2σ²)) Σ_i (y_i - a e^(b x_i))². (slide 39)

Marquette University: Maximum Likelihood Estimation - Exponential. L(a, b, σ²) is again called the likelihood function. What we want to do is find the values of (a, b, σ²) that maximize L(a, b, σ²). The values (a, b) that maximize L(a, b, σ²) are the values (â, b̂) that minimize Σ_i (y_i - a e^(b x_i))², i.e. that minimize the d_i = y_i - a e^(b x_i) with respect to a and b. The value of σ² that maximizes L(a, b, σ²) is then obtained from the minimized Σ_i (y_i - â e^(b̂ x_i))². (slide 40)

Marquette University: Maximum Likelihood Estimation - Exponential. Differentiate LL(a, b, σ²) with respect to a, b, and σ², then set the derivatives equal to 0:
LL(a, b, σ²) = -(n/2) log(2π) - (n/2) log(σ²) - (1/(2σ²)) Σ_i (y_i - a e^(b x_i))²,
∂LL(a, b, σ²)/∂a |(â, b̂, σ̂²) = (1/σ²) Σ_i (y_i - a e^(b x_i))(e^(b x_i)) = 0,
∂LL(a, b, σ²)/∂b |(â, b̂, σ̂²) = (1/σ²) Σ_i (y_i - a e^(b x_i))(a x_i e^(b x_i)) = 0,
∂LL(a, b, σ²)/∂σ² |(â, b̂, σ̂²) = -(n/2)(1/σ²) + (1/(2σ⁴)) Σ_i (y_i - a e^(b x_i))² = 0. (slide 41)

Marquette University: Maximum Likelihood Estimation - Exponential. Solving for the estimated parameters yields the equations â = Σ_i y_i e^(b̂ x_i) / Σ_i e^(2 b̂ x_i) and Σ_i y_i x_i e^(b̂ x_i) = â Σ_i x_i e^(2 b̂ x_i), with σ̂² = (1/n) Σ_i (y_i - â e^(b̂ x_i))². There is no analytic solution; we need a numerical solution. [Figure: the fitted curve ŷ = â e^(b̂x) with residuals d_1, ..., d_5.] (slide 42)

Marquette University: Maximum Likelihood Estimation - Exponential. Since we had to numerically maximize the likelihood, we do not have nice formulas for the mean and variance of (â, b̂, σ̂²); â and b̂ are simply the values that minimize Σ_i (y_i - a e^(b x_i))². [Figure: data points, the fitted curve ŷ = â e^(b̂x), and the residuals d_1, ..., d_5.] (slide 43)
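
A minimal MATLAB/Octave sketch of one way to do this numerically (not from the slides): minimize the sum of squares with the built-in Nelder-Mead search fminsearch; the starting guess p0 is an assumption, and x and y are assumed to be column vectors of data.
Q    = @(p) sum((y - p(1)*exp(p(2)*x)).^2);   % p = [a; b]
p0   = [1; 0];                                % assumed starting guess
phat = fminsearch(Q, p0);
ahat = phat(1); bhat = phat(2);
sigma2hat = Q(phat)/length(y);                % MLE of sigma^2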

Marquette University: Homework. 1) Prove: a) (y - Xβ)'(y - Xβ) = (y - Xβ̂)'(y - Xβ̂) + (β - β̂)'(X'X)(β - β̂); b) that the MLEs for a, b, and σ² from the summation form,
b̂ = [n Σ_i x_i y_i - (Σ_i x_i)(Σ_i y_i)] / [n Σ_i x_i² - (Σ_i x_i)²], â = ȳ - b̂ x̄, σ̂² = (1/n) Σ_i (y_i - â - b̂ x_i)²,
are the same as those from the matrix form, β̂ = (â, b̂)' = (X'X)^(-1) X'y and σ̂² = (1/n)(y - Xβ̂)'(y - Xβ̂). (slide 44)

Method 1. Marquette University: Homework. 2) Given observed data points (,), (3,), (,3), (4,4). a) Plot the points. b) Analytically estimate the regression slope and y-intercept, i.e. find ŷ = â + b̂x by estimating â and b̂; use (â, b̂)' = (X'X)^(-1) X'y, where X is the 4×2 design matrix with a column of ones and the x values, and y is the 4×1 vector of y values. Estimate the residual variance σ̂² = (1/n) Σ_i (y_i - â - b̂ x_i)² using the estimated â and b̂. (slide 45)

Method 2. Marquette University: Homework. 2) Given observed data points (,), (3,), (,3), (4,4). c) Numerically fit a regression line ŷ = â + b̂x to the points. Select a_min, a_max, b_min, and b_max values and step sizes Δa and Δb (which you can make smaller). Perform an exhaustive brute-force grid search: compute σ² = (1/n) Σ_i (y_i - a - bx_i)² for each (a, b) combination and find the (a, b) combination that makes σ² the smallest. The a and b that minimize σ² are â and b̂, and that smallest σ² is σ̂². (slide 46)

Method 3. Marquette University: Homework. 2) Given observed data points (,), (3,), (,3), (4,4). d) Use gradient descent to iteratively find the (â, b̂) that minimize Q = Σ_i (y_i - a - bx_i)²:
dQ/da = d/da Σ_i (y_i - a - bx_i)² = -2 (S_y - n a - b S_x),
dQ/db = d/db Σ_i (y_i - a - bx_i)² = -2 (S_xy - a S_x - b S_xx),
with S_y = Σ_i y_i, S_x = Σ_i x_i, S_xx = Σ_i x_i², S_xy = Σ_i x_i y_i. Iterate β^(k) = β^(k-1) - λ ∇Q(β^(k-1)), where ∇Q = -2 ([S_y; S_xy] - [n S_x; S_x S_xx][a; b]). (slide 47)

Marquette University: Homework. 2) Given observed data points (,), (3,), (,3), (4,4), with (â, b̂)' = (X'X)^(-1) X'y, ŷ = â + b̂x, and σ̂² = (1/n) Σ_i (y_i - â - b̂ x_i)². e) Plot the two (three) fitted lines on the same graph as the points. f) Plot the surface of (a, b, σ²) values from c), with the estimated points from b) and c) (and d)). g) Comment. (slide 48)

Marquette University: Homework. 3) Given observed data points (/, 3.), (,.8), (,.86), (3,.0), (4,.06), (5,.40). a) Plot the points. b) Numerically fit a single exponential regression to the points, i.e. find ŷ = â e^(b̂x). Set up an interval of possible a and b values and select step sizes Δa and Δb. Compute σ² = (1/n) Σ_i (y_i - a e^(b x_i))² for each (a, b) combination and find the a and b that make σ² smallest; the a and b that minimize σ² are â and b̂, and that smallest σ² is σ̂². c) Plot the curve ŷ = â e^(b̂x) on the same graph as the points. d) Plot the surface of (a, b, σ²) values from b). e) Comment. (slide 49)

Marquette University: Homework. 4) Given the same observed data points as in 3): (/, 3.), (,.8), (,.86), (3,.0), (4,.06), (5,.40). a) Take the natural log of each point, z_i = log(y_i). b) Plot the points (z_i against the old x_i) and guess where the best-fit line to the data is. c) Analytically fit a linear regression line to the points, i.e. find ẑ = ĉ + d̂x, where c = log(a) and d = b. d) Plot the curve ŷ = e^ĉ e^(d̂x) on the same graph as the points and the previously fitted curve from 3). e) Compute ŷ_i from ŷ_i = exp(ẑ_i) and ŷ = e^ĉ e^(d̂x). f) Comment. (slide 50)

Marquette University: Homework. 5) Let x_1, ..., x_n be an independent sample from each of the following PDFs. In each case find the MLE of θ (take f(x|θ) = 0 where not defined).
a) f(x|θ) = θ^x e^(-θ)/x!, x = 0, 1, 2, ...; 0 ≤ θ < ∞ (with f(x|0) = I(x = 0)).
b) f(x|θ) = θ x^(θ-1), 0 ≤ x ≤ 1, 0 < θ < ∞.
c) f(x|θ) = (1/θ) e^(-x/θ), 0 ≤ x < ∞, 0 < θ < ∞.
d) f(x|θ) = ... e^(...), -∞ < x < ∞, -∞ < θ < ∞.
e) f(x|θ) = e^(-(x-θ)), θ ≤ x < ∞, -∞ < θ < ∞. (slide 51)

Marquette University: Homework. 6) Generate a random sample x_1, ..., x_n from each of the PDFs in 5); you choose an appropriate θ value for each f(x|θ), with n = 5. Repeat the samples so you have a total of 10^6 samples from each f(x|θ). Calculate the MLE θ̂ from each sample so that you have 10^6 θ̂'s. Calculate the mean and variance of the MLEs and make a histogram of them. *How do the MLEs of θ compare to the means, modes, and medians of f(x|θ)? (slide 52)