Parameter Estimation


Parameter Estimation. Chapters 13-15, Stat 477 - Loss Models. Brian Hartman - BYU.

Methods for parameter estimation

Methods for estimating parameters in a parametric model:
- method of moments
- matching of quantiles (or percentiles)
- maximum likelihood estimation
  - full/complete, individual data
  - complete, grouped data
  - truncated or censored data
- Bayes estimation

Method of moments

Assume we observe values of the random sample $X_1, \ldots, X_n$ from a population with common distribution function $F(x \mid \theta)$, where $\theta = (\theta_1, \ldots, \theta_p)$ is a vector of $p$ parameters. The observed sample values will be denoted by $x_1, \ldots, x_n$.

Denote the $k$-th raw moment by $\mu_k(\theta) = E(X^k \mid \theta)$, and the empirical (sample) moments by $\hat{\mu}_k = \frac{1}{n}\sum_{j=1}^{n} x_j^k$.

The method of moments is fairly straightforward: equate the first $p$ raw moments to the corresponding $p$ empirical moments and solve the resulting system of simultaneous equations:
$$\mu_k(\hat{\theta}) = \frac{1}{n}\sum_{j=1}^{n} x_j^k, \qquad k = 1, \ldots, p.$$
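A minimal numeric sketch of this recipe (not from the slides), using the Gamma(shape $\alpha$, scale $\theta$) case where the first two moment equations have a closed-form solution; the loss data are made up for illustration:

```python
import numpy as np

def gamma_method_of_moments(x):
    """Method-of-moments estimates for a Gamma(shape=alpha, scale=theta).

    Equates the first two moments:
        alpha * theta   = mean(x)
        alpha * theta^2 = second central moment of x
    """
    x = np.asarray(x, dtype=float)
    m1 = x.mean()
    m2 = (x ** 2).mean()
    var = m2 - m1 ** 2            # biased (1/n) sample variance
    theta_hat = var / m1          # scale
    alpha_hat = m1 / theta_hat    # shape
    return alpha_hat, theta_hat

# illustrative (made-up) loss data
losses = [120, 180, 200, 270, 300, 450, 600]
print(gamma_method_of_moments(losses))
```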

Method of moments: Illustrations

Example 1: For a Normal distribution, derive expressions for the method of moments estimators of the parameters $\mu$ and $\sigma^2$.

Example 2: For a Gamma distribution with shape parameter $\alpha$ and scale parameter $\theta$, derive expressions for their method of moments estimators.

Example 3: For a Pareto distribution, derive expressions for the method of moments estimators of the parameters $\alpha$ and $\theta$.
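One way Example 3 works out (a sketch, assuming the two-parameter Pareto with $E(X) = \theta/(\alpha-1)$ for $\alpha > 1$ and $E(X^2) = 2\theta^2/[(\alpha-1)(\alpha-2)]$ for $\alpha > 2$):
$$\bar{x} = \frac{\theta}{\alpha-1}, \qquad \hat{\mu}_2 = \frac{1}{n}\sum_j x_j^2 = \frac{2\theta^2}{(\alpha-1)(\alpha-2)}.$$
Dividing the second equation by the square of the first eliminates $\theta$:
$$\frac{\hat{\mu}_2}{\bar{x}^2} = \frac{2(\alpha-1)}{\alpha-2} \quad\Longrightarrow\quad \hat{\alpha} = \frac{2(\hat{\mu}_2 - \bar{x}^2)}{\hat{\mu}_2 - 2\bar{x}^2}, \qquad \hat{\theta} = \bar{x}\,(\hat{\alpha} - 1).$$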

Method of moments: SOA Example #6

For a sample of dental claims $x_1, x_2, \ldots, x_{10}$ you are given:
$\sum x_i = 3860$ and $\sum x_i^2 = 4{,}574{,}802$.
Claims are assumed to follow a lognormal distribution with parameters $\mu$ and $\sigma$. $\mu$ and $\sigma$ are estimated using the method of moments.
Calculate $E[X \wedge 500]$ for the fitted distribution. [259]
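A quick numeric check of this example (a sketch, not part of the slides); it uses the lognormal limited expected value formula $E[X \wedge d] = e^{\mu+\sigma^2/2}\,\Phi\!\left(\frac{\ln d - \mu - \sigma^2}{\sigma}\right) + d\left[1 - \Phi\!\left(\frac{\ln d - \mu}{\sigma}\right)\right]$ and reproduces the published answer of about 259:

```python
import numpy as np
from scipy.stats import norm

n, sum_x, sum_x2 = 10, 3860.0, 4_574_802.0
m1, m2 = sum_x / n, sum_x2 / n           # sample raw moments

# Lognormal raw moments: E[X] = exp(mu + s^2/2), E[X^2] = exp(2*mu + 2*s^2)
sigma2 = np.log(m2) - 2 * np.log(m1)
mu = np.log(m1) - sigma2 / 2
sigma = np.sqrt(sigma2)

d = 500.0                                 # limit of interest
lev = (np.exp(mu + sigma2 / 2) * norm.cdf((np.log(d) - mu - sigma2) / sigma)
       + d * (1 - norm.cdf((np.log(d) - mu) / sigma)))
print(round(lev, 1))                      # approximately 259
```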

Matching of quantiles

Denote the $100q$-th quantile (percentile) of the distribution by $\pi_q(\theta)$, which in the case of a continuous distribution is the solution to $F(\pi_q(\theta) \mid \theta) = q$.

Denote the smoothed empirical estimate by $\hat{\pi}_q$. This is usually found as
$$\hat{\pi}_q = (1-h)\,x_{(j)} + h\,x_{(j+1)},$$
where $j = \lfloor (n+1)q \rfloor$ and $h = (n+1)q - j$. Here $x_{(1)} \le \cdots \le x_{(n)}$ denote the order statistics of the sample.

The quantile matching estimate of $\theta$ is any solution to the $p$ equations
$$\pi_{q_k}(\hat{\theta}) = \hat{\pi}_{q_k}, \qquad k = 1, 2, \ldots, p,$$
where the $q_k$'s are $p$ arbitrarily chosen quantiles.
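A small helper (a sketch, not from the slides) implementing the smoothed empirical percentile above; applied to the Weibull data on the next slide it returns 23 for the median and 32 for the 75th percentile:

```python
import math

def smoothed_quantile(data, q):
    """Smoothed empirical 100q-th percentile:
    pi_q = (1 - h)*x_(j) + h*x_(j+1), with j = floor((n+1)q) and h = (n+1)q - j."""
    x = sorted(data)
    n = len(x)
    pos = (n + 1) * q
    j = math.floor(pos)
    h = pos - j
    if j < 1 or j >= n:
        raise ValueError("(n+1)q must lie strictly between 1 and n")
    return (1 - h) * x[j - 1] + h * x[j]   # x[j-1] is the j-th order statistic

sample = [12, 15, 18, 21, 21, 23, 28, 32, 32, 32, 58]
print(smoothed_quantile(sample, 0.50), smoothed_quantile(sample, 0.75))  # 23.0 32.0
```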

Matching of quantiles: Example

Suppose the following observed sample comes from a Weibull distribution with
$$F(x \mid \lambda, \gamma) = 1 - e^{-\lambda x^{\gamma}}:$$
12, 15, 18, 21, 21, 23, 28, 32, 32, 32, 58.
Calculate the quantile matching estimates of $\lambda$ and $\gamma$ by matching the median and the 75th percentile.
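A sketch of how the matching works out here, using the smoothed percentiles $\hat{\pi}_{0.5} = 23$ and $\hat{\pi}_{0.75} = 32$ from the formula on the previous slide:
$$1 - e^{-\lambda\, 23^{\gamma}} = 0.50 \quad\Longrightarrow\quad \lambda\, 23^{\gamma} = \ln 2,$$
$$1 - e^{-\lambda\, 32^{\gamma}} = 0.75 \quad\Longrightarrow\quad \lambda\, 32^{\gamma} = \ln 4.$$
Dividing the second equation by the first eliminates $\lambda$:
$$\left(\frac{32}{23}\right)^{\hat{\gamma}} = 2 \quad\Longrightarrow\quad \hat{\gamma} = \frac{\ln 2}{\ln(32/23)} \approx 2.10, \qquad \hat{\lambda} = \frac{\ln 2}{23^{\hat{\gamma}}} \approx 0.00096.$$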

Matching of quantiles: SOA Example #155

You are given the following data:
0.49  0.51  0.66  1.82  3.71  5.20  7.62  12.66  35.24
You use the method of percentile matching at the 40th and 80th percentiles to fit an inverse Weibull distribution to these data.
Calculate the estimate of $\theta$. [1.614]

Maximum likelihood estimation: The method of maximum likelihood

The maximum likelihood estimate of the parameter vector $\theta$ is obtained by maximizing the likelihood function. The likelihood contribution of an observation is the probability (or density) of observing that data point. In many cases, it is more straightforward to maximize the logarithm of the likelihood function.

The form of the likelihood function depends on whether the data are completely observed or, if not, possibly truncated and/or censored.

Likelihood function: Complete, individual data

No truncation and no censoring. In the case where we have no truncation and no censoring and each observation $X_i$, for $i = 1, \ldots, n$, is recorded, the likelihood function is
$$L(\theta) = \prod_{j=1}^{n} f_{X_j}(x_j \mid \theta).$$
The corresponding log-likelihood function is
$$\ell(\theta) = \log L(\theta) = \sum_{j=1}^{n} \log f_{X_j}(x_j \mid \theta).$$
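A minimal numerical sketch (not from the slides) of maximizing a log-likelihood for complete, individual data, using exponential losses; the optimizer recovers the closed-form MLE $\hat{\lambda} = 1/\bar{x}$ derived on the next slide:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# illustrative (made-up) complete, individual loss data
x = np.array([1.2, 0.7, 3.4, 2.1, 0.9, 1.5])

def neg_loglik(lam):
    """Negative log-likelihood for Exponential(rate=lam): f(x) = lam * exp(-lam * x)."""
    if lam <= 0:
        return np.inf
    return -(len(x) * np.log(lam) - lam * x.sum())

res = minimize_scalar(neg_loglik, bounds=(1e-6, 50), method="bounded")
print(res.x, 1 / x.mean())   # numerical MLE vs closed-form 1/xbar -- they agree
```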

Likelihood function: Illustrations

Assume we have $n$ observations, completely observed. Derive expressions for the maximum likelihood estimates for the following distributions:
- Exponential: $f_X(x) = \lambda e^{-\lambda x}$, for $x > 0$.
- Uniform: $f_X(x) = 1/\theta$, for $0 < x < \theta$.
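Sketches of the two standard derivations, for reference. Exponential: $\ell(\lambda) = n\ln\lambda - \lambda\sum_j x_j$, so
$$\ell'(\lambda) = \frac{n}{\lambda} - \sum_{j=1}^{n} x_j = 0 \quad\Longrightarrow\quad \hat{\lambda} = \frac{1}{\bar{x}}.$$
Uniform: $L(\theta) = \theta^{-n}$ for $\theta \ge \max_j x_j$ and $0$ otherwise; $L$ is decreasing in $\theta$ on that region, so the maximum is at the boundary,
$$\hat{\theta} = x_{(n)} = \max(x_1, \ldots, x_n).$$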

Likelihood function: SOA Example #26

You are given:
- Low-hazard risks have an exponential claim size distribution with mean $\theta$.
- Medium-hazard risks have an exponential claim size distribution with mean $2\theta$.
- High-hazard risks have an exponential claim size distribution with mean $3\theta$.
- No claims from low-hazard risks are observed.
- Three claims from medium-hazard risks are observed, of sizes 1, 2 and 3.
- One claim from a high-hazard risk is observed, of size 15.
Calculate the maximum likelihood estimate of $\theta$. [2]

Likelihood function: Complete, grouped data

Start with a set of boundary points $c_0 < c_1 < \cdots < c_k$ and suppose that, from the sample, we have $n_j$ observed values in the interval $(c_{j-1}, c_j]$. For such data, the likelihood function is
$$L(\theta) = \prod_{j=1}^{k} \left[ F(c_j \mid \theta) - F(c_{j-1} \mid \theta) \right]^{n_j}.$$

Likelihood function: Example

Suppose you are given the following observations for a loss random variable $X$:

    Interval    Number of Observations
    (0, 2]       4
    (2, 4]       7
    (4, 6]      10
    (6, 8]       6
    (8, ∞)       3
    Total       30

Determine the log-likelihood function of the sample if $X$ has a Pareto distribution with parameters $\alpha$ and $\theta$. If it is possible to maximize this log-likelihood and solve explicitly, determine the MLE of the parameters.

Likelihood function: SOA Example #44

You are given:
- Losses follow an exponential distribution with mean $\theta$.
- A random sample of 20 losses is distributed as follows:

    Loss Range      Frequency
    [0, 1000]        7
    (1000, 2000]     6
    (2000, ∞)        7

Calculate the maximum likelihood estimate of $\theta$. [1996.90]
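A numerical check of this example (a sketch, not part of the slides), maximizing the grouped-data likelihood from the previous slide; it reproduces $\hat{\theta} \approx 1996.90$:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# interval boundaries and observed counts (np.inf marks the open upper interval)
bounds = np.array([0.0, 1000.0, 2000.0, np.inf])
counts = np.array([7, 6, 7])

def cdf_exp(x, theta):
    """Exponential(mean=theta) cdf; F(inf) = 1."""
    return 1.0 - np.exp(-x / theta)

def neg_loglik(theta):
    if theta <= 0:
        return np.inf
    probs = cdf_exp(bounds[1:], theta) - cdf_exp(bounds[:-1], theta)
    return -np.sum(counts * np.log(probs))

res = minimize_scalar(neg_loglik, bounds=(1.0, 1e5), method="bounded")
print(round(res.x, 2))   # about 1996.90
```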

Likelihood function: Truncated and/or censored data

Consider the case where we have a left truncated and right censored observation. It is usually best to represent this observation as $(t_i, x_i, \delta_i)$, where: $t_i$ is the left truncation point; $x_i$ is the value that produced the data point; and $\delta_i$ is an indicator (1 or 0) of whether the observation is right censored.

When the observation is right censored ($\delta_i = 1$), its contribution to the likelihood is
$$\left[ \frac{1 - F(x_i)}{1 - F(t_i)} \right]^{\delta_i}.$$
When it is not right censored ($\delta_i = 0$), its contribution to the likelihood is
$$\left[ \frac{f(x_i)}{1 - F(t_i)} \right]^{1 - \delta_i}.$$
Note that the policy limit, if reached, would be equal to $x_i - t_i$.
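A small sketch (not from the slides) of assembling this likelihood for exponential ground-up losses with a deductible acting as the truncation point; the records and deductible are made up for illustration:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# each record: (t_i, x_i, delta_i) = (truncation point, observed value, 1 if right censored)
records = [(100, 220, 0), (100, 380, 0), (100, 1100, 1), (100, 560, 0), (100, 1100, 1)]

def neg_loglik(theta):
    """Negative log-likelihood for Exponential(mean=theta) ground-up losses."""
    if theta <= 0:
        return np.inf
    ll = 0.0
    for t, x, delta in records:
        log_surv_t = -t / theta                      # log[1 - F(t)]
        if delta == 1:                               # censored: [1 - F(x)] / [1 - F(t)]
            ll += (-x / theta) - log_surv_t
        else:                                        # uncensored: f(x) / [1 - F(t)]
            ll += (-np.log(theta) - x / theta) - log_surv_t
    return -ll

res = minimize_scalar(neg_loglik, bounds=(1.0, 1e5), method="bounded")
print(res.x)
```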

Likelihood function: SOA Example #4

You are given:
- Losses follow a single-parameter Pareto distribution with density function
$$f(x) = \frac{\alpha}{x^{\alpha+1}}, \qquad x > 1, \quad 0 < \alpha < \infty.$$
- A random sample of size five produced three losses with values 3, 6 and 14, and two losses exceeding 25.
Calculate the maximum likelihood estimate of $\alpha$. [0.2507]

Likelihood function: Illustrative example

An insurance company records the claim amounts from a portfolio of policies with a current deductible of 100 and a policy limit of 1100. Losses less than 100 are not reported to the company, and losses above the limit are recorded as claim amounts of 1000. The recorded claim amounts are 120, 180, 200, 270, 300, 1000, 1000. Assume ground-up losses follow a Pareto distribution with parameters $\alpha$ and $\theta = 400$. Use the maximum likelihood estimate of $\alpha$ to estimate the Loss Elimination Ratio for a policy with twice the current deductible.

Bayesian estimation: Definitions and notation

Assume that, conditionally on $\theta$, the observable random variables $X_1, \ldots, X_n$ are i.i.d. with conditional density $f_{X|\theta}(x \mid \theta)$. Denote the random vector $X = (X_1, \ldots, X_n)$.

Prior distribution: the probability distribution of $\theta$, with density $\pi(\theta)$.

The joint distribution of $(X, \theta)$ is given by
$$f_{X,\theta}(x, \theta) = f_{X|\theta}(x \mid \theta)\,\pi(\theta) = f_{X|\theta}(x_1 \mid \theta) \cdots f_{X|\theta}(x_n \mid \theta)\,\pi(\theta).$$

Bayesian estimation: Posterior distribution

The conditional probability distribution of $\theta$, given the observed data $X_1, \ldots, X_n$, is called the posterior distribution. It is denoted by $\pi(\theta \mid X)$ and is computed as
$$\pi_{\theta|X}(\theta \mid x) = \frac{f_{X,\theta}(x, \theta)}{\int f_{X,\theta}(x, \theta)\,d\theta} = \frac{f_{X|\theta}(x \mid \theta)\,\pi(\theta)}{\int f_{X|\theta}(x \mid \theta)\,\pi(\theta)\,d\theta}.$$
We can conveniently write $\pi_{\theta|X}(\theta \mid x) = c\, f_{X|\theta}(x_1, \ldots, x_n \mid \theta)\,\pi(\theta)$, where $c$ is the normalizing constant. More often, we simply write
$$\pi_{\theta|X}(\theta \mid x) \propto f_{X|\theta}(x_1, \ldots, x_n \mid \theta)\,\pi(\theta).$$
The mean of the posterior, $E(\theta \mid X)$, is called the Bayes estimator of $\theta$ (under squared-error loss).
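A small sketch (not from the slides) of computing a posterior numerically from the proportionality above, using exponential losses with a gamma prior on the rate and a simple grid for the normalizing integral; the data and prior are made up for illustration:

```python
import numpy as np
from scipy.stats import gamma

x = np.array([1.4, 0.8, 2.3, 0.5])            # made-up observed losses
lam = np.linspace(1e-3, 5, 2000)               # grid over the rate parameter

prior = gamma.pdf(lam, a=2.0, scale=1.0)       # Gamma(shape=2, scale=1) prior density
loglik = len(x) * np.log(lam) - lam * x.sum()  # exponential log-likelihood
unnorm = prior * np.exp(loglik)                # f(x|lam) * pi(lam), up to a constant

posterior = unnorm / np.trapz(unnorm, lam)     # normalize by the integral over lam
print(np.trapz(lam * posterior, lam))          # posterior mean; matches the conjugate
                                               # Gamma(shape=6, rate=6) mean of 1.0
```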

Bayesian estimation: Poisson with a Gamma prior

Suppose that, conditionally on $\lambda$, claims on an insurance policy $X_1, \ldots, X_n$ are distributed as Poisson with mean $\lambda$. Let the prior distribution of $\lambda$ be a Gamma with parameters $\alpha$ and $\theta$ (as in the textbook). Show that the posterior distribution is also Gamma distributed, and find expressions for its parameters. Show that the resulting Bayes estimator can be expressed as a weighted combination of the sample mean and the prior mean.
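A sketch of the derivation, assuming the textbook's Gamma($\alpha$, scale $\theta$) parameterization with prior mean $\alpha\theta$. The likelihood is proportional to $\lambda^{\sum_j x_j} e^{-n\lambda}$ and the prior to $\lambda^{\alpha-1} e^{-\lambda/\theta}$, so
$$\pi(\lambda \mid x) \propto \lambda^{\alpha + \sum_j x_j - 1}\, e^{-\lambda\left(n + 1/\theta\right)},$$
a Gamma with shape $\alpha^{*} = \alpha + \sum_j x_j$ and scale $\theta^{*} = \theta/(1 + n\theta)$. The Bayes estimate (posterior mean) is
$$\alpha^{*}\theta^{*} = \frac{n\theta}{1 + n\theta}\,\bar{x} + \frac{1}{1 + n\theta}\,\alpha\theta,$$
a weighted combination of the sample mean $\bar{x}$ and the prior mean $\alpha\theta$.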

Bayesian estimation: SOA Exam Question

You are given:
- Conditionally on $\beta$, an individual loss $X$ has an Exponential distribution with density
$$f(x \mid \beta) = \frac{1}{\beta} e^{-x/\beta}, \qquad x > 0.$$
- The prior distribution of $\beta$ is Inverse Gamma with density
$$\pi(\beta) = \frac{c^2}{\beta^3} e^{-c/\beta}, \qquad \beta > 0.$$
Hint: $\displaystyle\int_0^{\infty} \frac{1}{y^n}\, e^{-a/y}\, dy = \frac{(n-2)!}{a^{n-1}}$, for $n = 2, 3, \ldots$
Given that the observed loss is $x$, calculate the mean of the posterior distribution of $\beta$.

Bayesian estimation: SOA Example #5

You are given:
- The annual number of claims for a policyholder has a binomial distribution with probability function
$$p(x \mid q) = \binom{2}{x} q^x (1-q)^{2-x}, \qquad x = 0, 1, 2.$$
- The prior distribution is $p(q) = 4q^3$, $0 < q < 1$.
This policyholder had one claim in each of Years 1 and 2. Calculate the Bayesian estimate of the number of claims in Year 3. [4/3]
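A sketch of the solution; note the prior $p(q) = 4q^3$ is a Beta(4, 1) density, so conjugacy applies. With one claim in each of two years, the likelihood is proportional to $q(1-q)\cdot q(1-q) = q^2(1-q)^2$, so
$$\pi(q \mid \text{data}) \propto q^2(1-q)^2 \cdot q^3 = q^5(1-q)^2,$$
a Beta(6, 3) density with mean $E(q \mid \text{data}) = 6/9 = 2/3$. The expected number of claims in Year 3 is then $E(2q \mid \text{data}) = 2 \cdot \tfrac{2}{3} = \tfrac{4}{3}$.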