Bootstrap Methods and their Application

Bootstrap Methods and their Application. Anthony Davison, © 2006. A short course based on the book Bootstrap Methods and their Application, by A. C. Davison and D. V. Hinkley, © Cambridge University Press, 1997.

Summary

- Bootstrap: simulation methods for frequentist inference.
- Useful when standard assumptions are invalid (n small, data not normal, ...); a standard problem has a non-standard twist; a complex problem has no (reliable) theory; or (almost) anywhere else.
- Aim: to describe basic ideas; confidence intervals; tests; some approaches for regression.

Motivation

Course outline: Motivation; Basic notions; Confidence intervals; Several samples; Variance estimation; Tests; Regression.

Motivation: AIDS data

- UK AIDS diagnoses, 1988-1992. Reporting delay of up to several years!
- Problem: predict the state of the epidemic at the end of 1992, with a realistic statement of uncertainty.
- Simple model: the number of reports in row j and column k is Poisson with mean $\mu_{jk} = \exp(\alpha_j + \beta_k)$.
- Unreported diagnoses in period j are then Poisson with mean $\sum_{k\ \mathrm{unobs}} \mu_{jk} = \exp(\alpha_j) \sum_{k\ \mathrm{unobs}} \exp(\beta_k)$; estimate the total unreported diagnoses from period j by replacing $\alpha_j$ and $\beta_k$ by their MLEs.
- How reliable are these predictions? How sensible is the Poisson model?

Motivation

Table: UK AIDS diagnoses by diagnosis period (rows) and reporting-delay interval in quarters (columns); the final column gives total reports to the end of 1992. Delay columns between 6 and >=14 quarters are omitted, as on the slide.

Year Quarter |  0    1   2   3   4   5   6  ... >=14 | Total
1988   1     | 31   80  16   9   3   2   8  ...   6  |  174
1988   2     | 26   99  27   9   8  11   3  ...   3  |  211
1988   3     | 31   95  35  13  18   4   6  ...   3  |  224
1988   4     | 36   77  20  26  11   3   8  ...   2  |  205
1989   1     | 32   92  32  10  12  19  12  ...   2  |  224
1989   2     | 15   92  14  27  22  21  12  ...   1  |  219
1989   3     | 34  104  29  31  18   8   6  ...      |  253
1989   4     | 38  101  34  18   9  15   6  ...      |  233
1990   1     | 31  124  47  24  11  15   8  ...      |  281
1990   2     | 32  132  36  10   9   7   6  ...      |  245
1990   3     | 49  107  51  17  15   8   9  ...      |  260
1990   4     | 44  153  41  16  11   6   5  ...      |  285
1991   1     | 41  137  29  33   7  11   6  ...      |  271
1991   2     | 56  124  39  14  12   7  10  ...      |  263
1991   3     | 53  175  35  17  13  11      ...      |  306
1991   4     | 63  135  24  23  12          ...      |  258
1992   1     | 71  161  48  25                       |  310
1992   2     | 95  178  39                           |  318
1992   3     | 76  181                               |  273
1992   4     | 67                                    |  133

Motivation: AIDS data

Data (+), fits of the simple model (solid) and a complex model (dots). Variance formulae could be derived (painful!), but would they be useful? Effects of overdispersion, complex model, ...?

[Figure: quarterly AIDS diagnoses, 1984-1992, with fits of the simple and complex models.]

Motivation: Goal

Find a reliable assessment of uncertainty when:
- the estimator is complex
- the data are complex
- the sample size is small
- the model is non-standard

Basic notions

Basic notions: Handedness data

Table: Data from a study of handedness; hand is an integer measure of handedness, and dnan a genetic measure. Data due to Dr Gordon Claridge, University of Oxford.

     dnan hand       dnan hand       dnan hand       dnan hand
 1    13   1    11    28   1    21    29   2    31    31   1
 2    18   1    12    28   2    22    29   1    32    31   2
 3    20   3    13    28   1    23    29   1    33    33   6
 4    21   1    14    28   4    24    30   1    34    33   1
 5    21   1    15    28   1    25    30   1    35    34   1
 6    24   1    16    28   1    26    30   2    36    41   4
 7    24   1    17    29   1    27    30   1    37    44   8
 8    27   1    18    29   1    28    31   1
 9    28   1    19    29   1    29    31   1
10    28   2    20    29   2    30    31   1

Basic notions: Handedness data

[Figure: Scatter plot of the handedness data, hand against dnan; the numbers show the multiplicities of the observations.]

Basic notions: Handedness data

Is there dependence between dnan and hand for these n = 37 individuals? The sample product-moment correlation coefficient is θ̂ = 0.509. The standard confidence interval (based on a bivariate normal population) gives a 95% CI of (0.221, 0.715). But the data are not bivariate normal! What is the status of the interval? Can we do better?

Basic notions: Frequentist inference

Estimator θ̂ for an unknown parameter θ. Statistical model: data y_1, ..., y_n iid ~ F, unknown.

Handedness data: y = (dnan, hand); F puts probability mass on a subset of R²; θ is the correlation coefficient.

Key issue: what is the variability of θ̂ when samples are repeatedly taken from F? If we imagine F known, we could answer this question by analytical (mathematical) calculation, or by simulation.

Basic notions: Simulation with F known

For r = 1, ..., R:
- generate a random sample y*_1, ..., y*_n iid ~ F;
- compute θ̂*_r using y*_1, ..., y*_n.

Output after R iterations: θ̂*_1, θ̂*_2, ..., θ̂*_R.

- Use θ̂*_1, ..., θ̂*_R to estimate the sampling distribution of θ̂ (histogram, density estimate, ...).
- If R → ∞, we get a perfect match to the theoretical calculation (if available): the Monte Carlo error disappears completely.
- In practice R is finite, so some error remains.
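To make the recipe concrete, here is a minimal Python sketch of this parametric simulation for the correlation coefficient, using numpy; the bivariate normal parameters are the handedness-data estimates quoted below, and all names are illustrative rather than taken from the course materials.

```python
import numpy as np

rng = np.random.default_rng(1)

# A known model F: bivariate normal, here with the fitted handedness parameters
mu = np.array([28.5, 1.7])
s1, s2, rho = 5.4, 1.5, 0.509
cov = np.array([[s1**2, rho * s1 * s2],
                [rho * s1 * s2, s2**2]])

n, R = 37, 10_000
theta_star = np.empty(R)
for r in range(R):
    y = rng.multivariate_normal(mu, cov, size=n)          # y*_1, ..., y*_n iid ~ F
    theta_star[r] = np.corrcoef(y[:, 0], y[:, 1])[0, 1]   # theta-hat*_r

# theta_star now estimates the sampling distribution of theta-hat under F
print(theta_star.mean(), theta_star.std(ddof=1))
```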

Basic notions: Handedness data, fitted bivariate normal model

[Figure: Contours of the bivariate normal distribution fitted to the handedness data; the parameter estimates are μ̂₁ = 28.5, μ̂₂ = 1.7, σ̂₁ = 5.4, σ̂₂ = 1.5, ρ̂ = 0.509. The data are also shown.]

Basic notions: Handedness data, parametric bootstrap samples

[Figure: Left, original data (correlation 0.509), with jittered vertical values. Centre and right, two samples generated from the fitted bivariate normal distribution, with correlations 0.753 and 0.533.]

Basic notions: Handedness data, correlation coefficient

[Figure: Bootstrap distributions with R = 10,000. Left: simulation from the fitted bivariate normal distribution. Right: simulation from the data by bootstrap resampling. The lines show the theoretical probability density function of the correlation coefficient under sampling from a fitted bivariate normal distribution.]

Basic notions: F unknown

Replace the unknown F by an estimate F̂, obtained either
- parametrically, e.g. by maximum likelihood or a robust fit of a distribution F(y) = F(y; ψ) (normal, exponential, bivariate normal, ...), or
- nonparametrically, using the empirical distribution function (EDF) of the original data y_1, ..., y_n, which puts mass 1/n on each of the y_j.

Algorithm: for r = 1, ..., R:
- generate a random sample y*_1, ..., y*_n iid ~ F̂;
- compute θ̂*_r using y*_1, ..., y*_n.

Output after R iterations: θ̂*_1, θ̂*_2, ..., θ̂*_R.

Basic notions: Nonparametric bootstrap

Bootstrap (re)sample y*_1, ..., y*_n iid ~ F̂, where F̂ is the EDF of y_1, ..., y_n. Repetitions will occur! Compute the bootstrap estimate θ̂* using y*_1, ..., y*_n.

For the handedness data, take n = 37 pairs y = (dnan, hand), sampled with equal probabilities 1/37 and with replacement from the original pairs. Repeat this R times to get θ̂*_1, ..., θ̂*_R; see the picture, and the sketch below. The results are quite different from the parametric simulation: why?
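A matching sketch of the nonparametric version: resampling rows of the data with replacement implements sampling from the EDF. The array y stands in for the 37 (dnan, hand) pairs; names are illustrative.

```python
import numpy as np

def nonparametric_bootstrap(y, stat, R, rng):
    """Resample rows of y with replacement and apply stat to each resample."""
    n = len(y)
    reps = np.empty(R)
    for r in range(R):
        idx = rng.integers(0, n, size=n)   # n indices drawn with replacement
        reps[r] = stat(y[idx])
    return reps

corr = lambda y: np.corrcoef(y[:, 0], y[:, 1])[0, 1]

rng = np.random.default_rng(1)
y = rng.normal(size=(37, 2))               # stand-in for the (dnan, hand) pairs
theta_star = nonparametric_bootstrap(y, corr, R=10_000, rng=rng)
```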

Basic notions: Handedness data, bootstrap samples

[Figure: Left, original data (correlation 0.509), with jittered vertical values. Centre and right, two bootstrap samples, with jittered vertical values and correlations 0.733 and 0.491.]

Basic notions: Using the θ̂*_r

The bootstrap replicates θ̂*_r are used to estimate properties of θ̂. Write θ = θ(F) to emphasize the dependence on F.

The bias of θ̂ as an estimator of θ is
$$\beta(F) = E(\hat\theta \mid y_1, \dots, y_n \sim F) - \theta(F),$$
estimated by replacing the unknown F with the known estimate F̂:
$$\beta(\hat F) = E(\hat\theta^* \mid y_1^*, \dots, y_n^* \sim \hat F) - \theta(\hat F).$$
Replace the theoretical expectation E(·) by an empirical average:
$$\beta(\hat F) \approx b = \bar{\hat\theta}{}^* - \hat\theta = R^{-1}\sum_{r=1}^R \hat\theta_r^* - \hat\theta.$$

Basic notions

Estimate the variance ν(F) = var(θ̂ | F) by
$$v = \frac{1}{R-1}\sum_{r=1}^R \bigl(\hat\theta_r^* - \bar{\hat\theta}{}^*\bigr)^2.$$
Estimate quantiles of θ̂ by taking the empirical quantiles of θ̂*_1, ..., θ̂*_R.

For the handedness data, the 10,000 replicates shown earlier give b = −0.046 and v = 0.043 ≈ 0.205².
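Given the replicates, the bias, variance and quantile estimates above are one-liners; a minimal sketch (illustrative names):

```python
import numpy as np

def bootstrap_summaries(theta_hat, theta_star):
    """Bias, variance and quantile estimates from bootstrap replicates."""
    b = theta_star.mean() - theta_hat            # bias estimate b
    v = theta_star.var(ddof=1)                   # variance estimate v
    q = np.quantile(theta_star, [0.025, 0.975])  # empirical quantiles
    return b, v, q
```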

Basic notions: Handedness data

[Figure: Summaries of the θ̂*_r. Left: histogram, with a vertical line showing θ̂. Right: normal Q-Q plot of the θ̂*_r.]

Basic notions: How many bootstraps?

- Must estimate moments and quantiles of θ̂ and derived quantities.
- Nowadays it is often feasible to take R ≥ 5000.
- Need R ≥ 100 to estimate bias, variance, etc.
- Need R ≥ 100, and prefer R ≥ 1000, to estimate the quantiles needed for 95% confidence intervals.

[Figure: normal Q-Q plots of z* for R = 199 and R = 999.]

Basic notions: Key points

- The estimator is an algorithm: applied to the original data y_1, ..., y_n it gives the original θ̂; applied to simulated data y*_1, ..., y*_n it gives θ̂*. θ̂ can be of (almost) any complexity, though for the more sophisticated ideas (later) to work, θ̂ must often be a smooth function of the data.
- The sample is used to estimate F: the assumption F̂ ≈ F is heroic.
- Simulation replaces theoretical calculation: it removes the need for mathematical skill, but it does not remove the need for thought. Check code very carefully: garbage in, garbage out!
- Two sources of error: statistical (F̂ ≠ F), reduced by thought; simulation (R < ∞), reduced by taking R large (enough).

Confidence intervals

Confidence intervals: Normal confidence intervals

If θ̂ is approximately normal, then θ̂ ∼ N(θ + β, ν), where θ̂ has bias β = β(F) and variance ν = ν(F). If β and ν were known, a (1 − 2α) confidence interval for θ would be (D1)
$$\hat\theta - \beta \pm z_\alpha\, \nu^{1/2}, \qquad \text{where } \Phi(z_\alpha) = \alpha.$$
Replace β, ν by estimates,
$$\beta(F) \approx \beta(\hat F) = b = \bar{\hat\theta}{}^* - \hat\theta, \qquad \nu(F) \approx \nu(\hat F) = v = \frac{1}{R-1}\sum_r \bigl(\hat\theta_r^* - \bar{\hat\theta}{}^*\bigr)^2,$$
giving the (1 − 2α) interval θ̂ − b ± z_α v^{1/2}.

Handedness data: R = 10,000, b = −0.046, v = 0.205², α = 0.025 and normal quantile 1.96, so the 95% CI is (0.147, 0.963).

Confidence intervals: Normal confidence intervals

Is the normal approximation reliable? Is a transformation needed? Here are plots for ψ̂ = ½ log{(1 + θ̂)/(1 − θ̂)}:

[Figure: histogram and normal Q-Q plot of the transformed correlation coefficient.]

Confidence intervals: Normal confidence intervals

Correlation coefficient: try Fisher's z transformation,
$$\hat\psi = \psi(\hat\theta) = \tfrac12 \log\{(1 + \hat\theta)/(1 - \hat\theta)\},$$
for which we compute
$$b_\psi = R^{-1}\sum_{r=1}^R \hat\psi_r^* - \hat\psi, \qquad v_\psi = \frac{1}{R-1}\sum_{r=1}^R \bigl(\hat\psi_r^* - \bar{\hat\psi}{}^*\bigr)^2.$$
The (1 − 2α) confidence interval for θ is
$$\Bigl(\psi^{-1}\bigl\{\hat\psi - b_\psi - z_{1-\alpha}\, v_\psi^{1/2}\bigr\},\ \psi^{-1}\bigl\{\hat\psi - b_\psi - z_\alpha\, v_\psi^{1/2}\bigr\}\Bigr).$$
For the handedness data this gives (0.074, 0.804). But how do we choose a transformation in general?
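A sketch of both intervals, assuming scipy is available for the normal quantile; np.arctanh is exactly Fisher's z transformation, and tanh maps the interval back to the θ scale. Names are illustrative.

```python
import numpy as np
from scipy.stats import norm

def normal_ci(theta_hat, theta_star, alpha=0.025):
    """Normal-approximation interval: theta-hat - b -/+ z * v**0.5."""
    b = theta_star.mean() - theta_hat
    se = theta_star.std(ddof=1)
    z = norm.ppf(1 - alpha)
    return theta_hat - b - z * se, theta_hat - b + z * se

def fisher_z_ci(theta_hat, theta_star, alpha=0.025):
    """The same interval computed on Fisher's z scale, then mapped back."""
    lo, hi = normal_ci(np.arctanh(theta_hat), np.arctanh(theta_star), alpha)
    return np.tanh(lo), np.tanh(hi)
```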

Confidence intervals: Pivots

- Hope that the properties of θ̂*_1, ..., θ̂*_R mimic the effect of sampling from the original model.
- This amounts to faith in the 'substitution principle': we may replace the unknown F with the known F̂. This is false in general, but often more nearly true for pivots.
- A pivot is a combination of data and parameter whose distribution is independent of the underlying model. Canonical example: Y_1, ..., Y_n iid N(µ, σ²). Then
$$Z = \frac{\bar Y - \mu}{(S^2/n)^{1/2}} \sim t_{n-1} \quad \text{for all } \mu, \sigma^2,$$
so Z is independent of the underlying distribution, provided this is normal.
- An exact pivot is generally unavailable in the nonparametric case.

Confidence intervals: Studentized statistic

Idea: generalize the Student t statistic to the bootstrap setting. This requires a variance estimate V for θ̂, computed from y_1, ..., y_n. The analogue of the Student t statistic is
$$Z = \frac{\hat\theta - \theta}{V^{1/2}}.$$
If the quantiles z_α of Z were known, then
$$\Pr(z_\alpha \le Z \le z_{1-\alpha}) = \Pr\!\left(z_\alpha \le \frac{\hat\theta - \theta}{V^{1/2}} \le z_{1-\alpha}\right) = 1 - 2\alpha$$
(z_α no longer denotes a normal quantile!), which implies that
$$\Pr\!\left(\hat\theta - V^{1/2} z_{1-\alpha} \le \theta \le \hat\theta - V^{1/2} z_\alpha\right) = 1 - 2\alpha,$$
so the (1 − 2α) confidence interval is (θ̂ − V^{1/2} z_{1−α}, θ̂ − V^{1/2} z_α).

Confidence intervals

Each bootstrap sample gives a pair (θ̂*, V*), and hence R bootstrap copies
$$(\hat\theta_1^*, V_1^*),\ (\hat\theta_2^*, V_2^*),\ \dots,\ (\hat\theta_R^*, V_R^*),$$
with corresponding
$$z_r^* = \frac{\hat\theta_r^* - \hat\theta}{V_r^{*1/2}}, \qquad r = 1, \dots, R.$$
Use z*_1, ..., z*_R to estimate the distribution of Z; for example, the order statistics z*_(1) < ... < z*_(R) are used to estimate its quantiles. This gives the (1 − 2α) confidence interval
$$\Bigl(\hat\theta - V^{1/2} z^*_{((1-\alpha)(R+1))},\ \hat\theta - V^{1/2} z^*_{(\alpha(R+1))}\Bigr).$$
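A sketch of the studentized interval; var_stat must return a variance estimate V for the statistic (for example from the delta method or double bootstrap, both discussed later). Names are illustrative.

```python
import numpy as np

def studentized_ci(y, stat, var_stat, R, alpha, rng):
    """Studentized bootstrap interval from R resamples of the rows of y."""
    n = len(y)
    t, v = stat(y), var_stat(y)
    z = np.empty(R)
    for r in range(R):
        yb = y[rng.integers(0, n, size=n)]
        z[r] = (stat(yb) - t) / var_stat(yb) ** 0.5        # z*_r
    z.sort()
    lo = t - v ** 0.5 * z[int((1 - alpha) * (R + 1)) - 1]  # uses z*_((1-a)(R+1))
    hi = t - v ** 0.5 * z[int(alpha * (R + 1)) - 1]        # uses z*_(a(R+1))
    return lo, hi
```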

Confidence intervals: Why studentize?

Studentize, so that Z → N(0, 1) in distribution as n → ∞. Edgeworth series:
$$\Pr(Z \le z \mid F) = \Phi(z) + n^{-1/2} a(z)\varphi(z) + O(n^{-1}),$$
where a(·) is an even quadratic polynomial. The corresponding expansion for Z* is
$$\Pr(Z^* \le z \mid \hat F) = \Phi(z) + n^{-1/2} \hat a(z)\varphi(z) + O_p(n^{-1}).$$
Typically â(z) = a(z) + O_p(n^{−1/2}), so
$$\Pr(Z^* \le z \mid \hat F) - \Pr(Z \le z \mid F) = O_p(n^{-1}).$$

Confidence intervals

If we don't studentize, then Z = θ̂ − θ is approximately N(0, ν). Then
$$\Pr(Z \le z \mid F) = \Phi\!\left(\frac{z}{\nu^{1/2}}\right) + n^{-1/2}\, a\!\left(\frac{z}{\nu^{1/2}}\right)\varphi\!\left(\frac{z}{\nu^{1/2}}\right) + O(n^{-1})$$
and
$$\Pr(Z^* \le z \mid \hat F) = \Phi\!\left(\frac{z}{\hat\nu^{1/2}}\right) + n^{-1/2}\, \hat a\!\left(\frac{z}{\hat\nu^{1/2}}\right)\varphi\!\left(\frac{z}{\hat\nu^{1/2}}\right) + O_p(n^{-1}).$$
Typically ν̂ = ν + O_p(n^{−1/2}), giving
$$\Pr(Z^* \le z \mid \hat F) - \Pr(Z \le z \mid F) = O_p(n^{-1/2}).$$
Thus the use of a studentized Z reduces the error from O_p(n^{−1/2}) to O_p(n^{−1}): better than using large-sample asymptotics, for which the error is usually O_p(n^{−1/2}).

Confidence intervals: Other confidence intervals

Problem for studentized intervals: we must obtain V, and the intervals are not scale-invariant. Simpler approaches (both sketched in code below):

- Basic bootstrap interval: treat θ̂ − θ as a pivot, giving
$$\Bigl(\hat\theta - \bigl(\hat\theta^*_{((R+1)(1-\alpha))} - \hat\theta\bigr),\ \hat\theta - \bigl(\hat\theta^*_{((R+1)\alpha)} - \hat\theta\bigr)\Bigr).$$
- Percentile interval: use the empirical quantiles of θ̂*_1, ..., θ̂*_R:
$$\Bigl(\hat\theta^*_{((R+1)\alpha)},\ \hat\theta^*_{((R+1)(1-\alpha))}\Bigr).$$
- Improved percentile intervals (BCa, ABC, ...): replace the percentile interval with
$$\Bigl(\hat\theta^*_{((R+1)\tilde\alpha)},\ \hat\theta^*_{((R+1)(1-\tilde\alpha'))}\Bigr),$$
where α̃ and α̃′ are chosen to improve its properties. Scale-invariant.
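Both percentile-type intervals from the sorted replicates; a minimal sketch with illustrative names (the BCa adjustment is omitted):

```python
import numpy as np

def basic_and_percentile_ci(theta_hat, theta_star, alpha=0.025):
    """Basic and percentile intervals from the ordered replicates."""
    ts = np.sort(theta_star)
    R = len(ts)
    lo = ts[int(alpha * (R + 1)) - 1]         # theta-hat*_((R+1)alpha)
    hi = ts[int((1 - alpha) * (R + 1)) - 1]   # theta-hat*_((R+1)(1-alpha))
    basic = (2 * theta_hat - hi, 2 * theta_hat - lo)
    percentile = (lo, hi)
    return basic, percentile
```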

Confidence intervals: Handedness data

95% confidence intervals for the correlation coefficient θ, R = 10,000:

Normal                            (0.147, 0.963)
Percentile                        (0.047, 0.758)
Basic                             (0.262, 1.043)
BCa (α̃ = 0.0485, α̃′ = 0.0085)    (0.053, 0.792)
Student                           (0.030, 1.206)
Basic (transformed)               (0.131, 0.824)
Student (transformed)             (0.277, 0.868)

Transformation is essential here!

Confidence intervals: General comparison

- Bootstrap confidence intervals are usually too short, which leads to under-coverage.
- Normal and basic intervals depend on the scale.
- The percentile interval is often too short, but is scale-invariant.
- Studentized intervals give the best coverage overall, but they depend on the scale and can be sensitive to V; their length can be very variable; they work best on a transformed scale, where V is approximately constant.
- Improved percentile intervals have, in principle, the same error as studentized intervals, but are often shorter, and so have lower coverage.

Confidence intervals: Caution

- Edgeworth theory is OK for smooth statistics; beware rough statistics: you must check the output.
- The bootstrap of the median is theoretically OK, but very sensitive to sample values in practice. A role for smoothing?

[Figure: normal Q-Q plot of T* − t for medians.]

Confidence intervals: Key points

- Numerous procedures are available for the automatic construction of confidence intervals; the computer does the work.
- Need R ≥ 1000 in most cases.
- Generally such intervals are a bit too short.
- Must examine the output to check whether the assumptions (e.g. smoothness of the statistic) are satisfied.
- May need a variance estimate V: see later.

Several samples

Several samples: Gravity data

Table: Measurements of the acceleration due to gravity, g, given as deviations from 980,000 × 10⁻³ cm s⁻², in units of cm s⁻² × 10⁻³. The columns are series 1-8, which are of unequal length, so the final rows of the printed table are incomplete.

Series:  1    2    3    4    5    6    7    8
        76   87  105   95   76   78   82   84
        82   95   83   90   76   78   79   86
        83   98   76   76   78   78   81   85
        54  100   75   76   79   86   79   82
        35  109   51   87   72   87   77   77
        46  109   76   79   68   81   79   76
        87  100   93   77   75   73   79   77
        68   81   75   71   78   67   78   80
        75   62   75   79   83   68   82   82
        81   67   83   76   78   73   78   64
        78

Several samples: Gravity data

[Figure: Gravity series boxplots, showing a reduction in variance, a shift in location, and possible outliers.]

Several samples: Gravity data

Eight series of measurements of the gravitational acceleration g, made May 1934 to July 1935 in Washington DC. The data are deviations from 9.8 m/s², in units of 10⁻³ cm/s².

Goal: estimate g and provide a confidence interval.

Weighted combination of the series averages, and its variance estimate:
$$\hat\theta = \frac{\sum_{i=1}^8 \bar y_i\, n_i/s_i^2}{\sum_{i=1}^8 n_i/s_i^2}, \qquad V = \left(\sum_{i=1}^8 n_i/s_i^2\right)^{-1},$$
giving θ̂ = 78.54, V = 0.59², and 95% confidence interval θ̂ ± 1.96 V^{1/2} = (77.5, 79.8).

Several samples: Gravity data, bootstrap

Apply stratified (re)sampling to the series, taking each series as a separate stratum, and compute θ̂*, V* for the simulated data. The confidence interval is based on Z = (θ̂ − θ)/V^{1/2}, whose distribution is approximated by the simulated values
$$z_r^* = \frac{\hat\theta_r^* - \hat\theta}{V_r^{*1/2}}, \qquad r = 1, \dots, R,$$
giving
$$\Bigl(\hat\theta - V^{1/2} z^*_{((R+1)(1-\alpha))},\ \hat\theta - V^{1/2} z^*_{((R+1)\alpha)}\Bigr).$$
For 95% limits set α = 0.025, so with R = 999 use z*_(25) and z*_(975), giving the interval (77.1, 80.3).
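A sketch of stratified resampling for data like this: each series is resampled within itself, and the statistic is the weighted combination of series averages given above. Names are illustrative.

```python
import numpy as np

def weighted_mean(series):
    """Weighted combination of series averages, weights n_i / s_i^2."""
    w = np.array([len(s) / s.var(ddof=1) for s in series])
    m = np.array([s.mean() for s in series])
    return (w * m).sum() / w.sum()

def stratified_bootstrap(series, stat, R, rng):
    """Resample each series (stratum) independently, then apply stat."""
    reps = np.empty(R)
    for r in range(R):
        boot = [s[rng.integers(0, len(s), size=len(s))] for s in series]
        reps[r] = stat(boot)
    return reps
```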

Several samples

[Figure: Summary plots for 999 nonparametric bootstrap simulations. Top: normal probability plots of t* and z* = (t* − t)/v*^{1/2}; the line on the top left has intercept t and slope v^{1/2}, the line on the top right has intercept zero and unit slope. Bottom: the smallest t*_r also has the smallest v*, leading to an outlying value of z*.]

Several samples: Key points

- For several independent samples, implement the bootstrap by stratified sampling, independently from each sample.
- The same basic ideas apply for confidence intervals.

Variance estimation

Variance estimation

A variance estimate V is needed for certain types of confidence interval (especially the studentized bootstrap). Ways to compute it:
- double bootstrap
- delta method
- nonparametric delta method
- jackknife

Variance estimation: Double bootstrap

- A bootstrap sample y*_1, ..., y*_n has corresponding estimate θ̂*.
- Take Q second-level bootstrap samples y**_1, ..., y**_n from y*_1, ..., y*_n, giving corresponding bootstrap estimates θ̂**_1, ..., θ̂**_Q.
- Compute the variance estimate V* as the sample variance of θ̂**_1, ..., θ̂**_Q.
- This requires R(Q + 1) resamples in total, so it could be expensive. Often it is reasonable to take Q = 50 for variance estimation, so we need O(50 × 1000) resamples: nowadays not infeasible.
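A sketch of the second level for one first-level sample y*; calling this inside the main bootstrap loop with Q = 50 gives the V*_r needed for studentized intervals. Names are illustrative.

```python
import numpy as np

def double_bootstrap_variance(y_star, stat, Q, rng):
    """Variance of stat over Q second-level resamples of y*."""
    n = len(y_star)
    inner = np.empty(Q)
    for q in range(Q):
        idx = rng.integers(0, n, size=n)
        inner[q] = stat(y_star[idx])   # theta-hat**_q
    return inner.var(ddof=1)           # V* for this first-level sample
```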

Variance estimation: Delta method

Computation of variance formulae for functions of averages and other estimators. Suppose ψ̂ = g(θ̂) estimates ψ = g(θ), and θ̂ is approximately N(θ, σ²/n). Then, provided g′(θ) ≠ 0, we have (D2)
$$E(\hat\psi) = g(\theta) + O(n^{-1}), \qquad \operatorname{var}(\hat\psi) = \sigma^2 g'(\theta)^2/n + O(n^{-3/2}),$$
and so
$$\widehat{\operatorname{var}}(\hat\psi) = \hat\sigma^2 g'(\hat\theta)^2/n = V.$$
Example (D3): θ̂ = Ȳ, ψ̂ = log θ̂.

Variance stabilisation (D4): if var(θ̂) = S(θ)²/n, find a transformation h such that var{h(θ̂)} is constant.

Extends to multivariate estimators, and to ψ̂ = g(θ̂₁, ..., θ̂_d).

Variance estimation: Nonparametric delta method

Write the parameter θ = t(F) as a functional of the distribution F. General approximation:
$$V = V_L = \frac{1}{n^2}\sum_{j=1}^n L(Y_j; F)^2.$$
L(y; F) is the influence function value of θ for an observation at y when the distribution is F:
$$L(y; F) = \lim_{\varepsilon \to 0} \frac{t\{(1-\varepsilon)F + \varepsilon H_y\} - t(F)}{\varepsilon},$$
where H_y puts unit mass at y. Close link to robustness. The empirical versions of L(y; F) and V_L are
$$l_j = L(y_j; \hat F), \qquad v_L = \frac{1}{n^2}\sum_j l_j^2,$$
usually obtained by analytic or numerical differentiation.

Variance estimation: Computation of the l_j

Write θ̂ in weighted form, and differentiate with respect to ε. For the sample average,
$$\hat\theta = \bar y = \frac{1}{n}\sum y_j = \sum w_j y_j, \qquad w_j \equiv 1/n.$$
Change the weights to
$$w_j \leftarrow \varepsilon + (1-\varepsilon)\frac{1}{n}, \qquad w_i \leftarrow (1-\varepsilon)\frac{1}{n},\quad i \ne j,$$
so that (D5)
$$\bar y \mapsto \bar y_\varepsilon = \varepsilon y_j + (1-\varepsilon)\bar y = \varepsilon(y_j - \bar y) + \bar y,$$
giving
$$l_j = y_j - \bar y, \qquad v_L = \frac{1}{n^2}\sum (y_j - \bar y)^2 = \frac{n-1}{n}\cdot\frac{s^2}{n}.$$
Interpretation: l_j is the standardized change in ȳ when we increase the mass on y_j.

Variance estimation: Nonparametric delta method, ratio

Population F(u, x), with y = (u, x) and
$$\theta = t(F) = \frac{\int x\, dF(u,x)}{\int u\, dF(u,x)};$$
the sample version is
$$\hat\theta = t(\hat F) = \frac{\int x\, d\hat F(u,x)}{\int u\, d\hat F(u,x)} = \frac{\bar x}{\bar u}.$$
Then, using the chain rule of differentiation (D6),
$$l_j = \frac{x_j - \hat\theta u_j}{\bar u}, \qquad v_L = \frac{1}{n^2}\sum_j \left(\frac{x_j - \hat\theta u_j}{\bar u}\right)^2.$$

Variance estimation: Handedness data, correlation coefficient

The correlation coefficient may be written as a function of averages, with $\overline{xu} = n^{-1}\sum x_j u_j$ etc.:
$$\hat\theta = \frac{\overline{xu} - \bar x\,\bar u}{\bigl\{(\overline{x^2} - \bar x^2)(\overline{u^2} - \bar u^2)\bigr\}^{1/2}},$$
from which the empirical influence values l_j can be derived. In this example (and in others involving only averages), the nonparametric delta method is equivalent to the delta method.

We get v_L = 0.029, for comparison with v = 0.043 obtained by bootstrapping: v_L typically underestimates var(θ̂), as here!

Variance estimation: Delta methods, comments

- Can be applied to many complex statistics.
- Delta method variances often underestimate the true variances: v_L < var(θ̂).
- Can be applied automatically (by numerical differentiation, sketched below) if the algorithm for θ̂ is written in weighted form, e.g. x̄_w = Σ w_j x_j with w_j ≡ 1/n for x̄: vary the weights successively for j = 1, ..., n, adding ε = 1/(100n) to w_j while keeping Σᵢ wᵢ = 1, and use the definition of the derivative.
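A sketch of that numerical scheme, under the stated assumption that the statistic is available in weighted form; names are illustrative.

```python
import numpy as np

def empirical_influence(y, weighted_stat, eps=None):
    """Numerical influence values l_j and v_L from a weighted statistic."""
    n = len(y)
    eps = 1 / (100 * n) if eps is None else eps
    w0 = np.full(n, 1 / n)
    t0 = weighted_stat(y, w0)
    l = np.empty(n)
    for j in range(n):
        w = (1 - eps) * w0          # scale all weights by (1 - eps) ...
        w[j] += eps                 # ... and put extra mass eps on y_j
        l[j] = (weighted_stat(y, w) - t0) / eps
    return l, (l ** 2).sum() / n ** 2

# Weighted form of the sample mean, for which l_j = y_j - ybar exactly
wmean = lambda y, w: (w * y).sum()
```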

Variance estimation: Jackknife

An approximation to the empirical influence values is given by
$$l_j \approx l_{\mathrm{jack},j} = (n-1)\bigl(\hat\theta - \hat\theta_{-j}\bigr),$$
where θ̂_{−j} is the value of θ̂ computed from the sample y_1, ..., y_{j−1}, y_{j+1}, ..., y_n. The jackknife bias and variance estimates are
$$b_{\mathrm{jack}} = -\frac{1}{n}\sum l_{\mathrm{jack},j}, \qquad v_{\mathrm{jack}} = \frac{1}{n(n-1)}\left(\sum l_{\mathrm{jack},j}^2 - n\, b_{\mathrm{jack}}^2\right).$$
This requires n + 1 calculations of θ̂, and corresponds to numerical differentiation of θ̂ with ε = −1/(n − 1).
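A sketch of the jackknife approximations; stat takes an array of observations, and np.delete removes the j-th row. Names are illustrative.

```python
import numpy as np

def jackknife(y, stat):
    """Jackknife influence values, bias and variance approximations."""
    n = len(y)
    theta = stat(y)
    loo = np.array([stat(np.delete(y, j, axis=0)) for j in range(n)])
    l_jack = (n - 1) * (theta - loo)                     # l_jack,j
    b = -l_jack.sum() / n                                # bias estimate
    v = ((l_jack ** 2).sum() - n * b ** 2) / (n * (n - 1))
    return b, v
```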

Variance estimation: Key points

- Several methods are available for the estimation of variances, which are needed for some types of confidence interval.
- The most general method is the double bootstrap, which can be expensive.
- Delta methods rely on a linear expansion, and can be applied numerically or analytically.
- The jackknife gives an approximation to the delta method, and can fail for rough statistics.

Tests

Tests: Ingredients

Ingredients for testing problems:
- data y_1, ..., y_n;
- a model M₀ to be tested;
- a test statistic t = t(y_1, ..., y_n), with large values giving evidence against M₀, and observed value t_obs.

P-value: p_obs = Pr(T ≥ t_obs | M₀) measures the evidence against M₀; small p_obs indicates evidence against M₀.

Difficulties: p_obs may depend upon 'nuisance' parameters, those of M₀; p_obs is often hard to calculate.

Tests: Examples

Balsam-fir seedlings in 5 × 5 quadrats: a Poisson sample?

0 1 2 3 4 3 4 2 2 1
0 2 0 2 4 2 3 3 4 2
1 1 1 1 4 1 5 2 2 3
4 1 2 5 2 0 3 2 1 1
3 1 4 3 1 0 0 2 7 0

Two-way layout: row-column independence?

1 2 2 1 1 0 1
2 0 0 2 3 0 0
0 1 1 1 2 7 3
1 1 2 0 0 0 1
0 1 1 1 1 0 0

Tests: Estimation of p_obs

Estimate p_obs by simulation from the fitted null hypothesis model M̂₀.

Algorithm: for r = 1, ..., R:
- simulate a data set y*_1, ..., y*_n from M̂₀;
- calculate the test statistic t*_r from y*_1, ..., y*_n.

Calculate the simulation estimate
$$\hat p = \frac{1 + \#\{t_r^* \ge t_{\mathrm{obs}}\}}{1 + R}$$
of p̂_obs = Pr(T ≥ t_obs | M̂₀). There are both simulation and statistical errors: p̂ ≈ p̂_obs ≈ p_obs.
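The estimator p̂ in generic form; simulate_null should draw one data set from the fitted null model M̂₀. A minimal sketch, with illustrative names:

```python
import numpy as np

def monte_carlo_pvalue(t_obs, simulate_null, stat, R, rng):
    """p-hat = (1 + #{t*_r >= t_obs}) / (1 + R) under the fitted null."""
    t_star = np.array([stat(simulate_null(rng)) for _ in range(R)])
    return (1 + np.sum(t_star >= t_obs)) / (1 + R)
```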

Tests: Handedness data, test of independence

- Are dnan and hand positively associated?
- Take T = θ̂ (the correlation coefficient), with t_obs = 0.509; this is large in the case of positive association (one-sided test).
- Null hypothesis of independence: F(u, x) = F₁(u)F₂(x).
- Take bootstrap samples independently from F̂₁ (the EDF of dnan_1, ..., dnan_n) and from F̂₂ (the EDF of hand_1, ..., hand_n), then put them together to get bootstrap data (dnan*_1, hand*_1), ..., (dnan*_n, hand*_n).
- With R = 9,999 we get 18 values of θ̂* ≥ θ̂, so p̂ = (1 + 18)/(1 + 9999) = 0.0019: hand and dnan seem to be positively associated.
- To test positive or negative association (two-sided test), take T = |θ̂|: this gives p̂ = 0.004.
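For this particular null model, the simulation step just resamples the two margins independently; a sketch (illustrative names):

```python
import numpy as np

def independence_bootstrap_test(u, x, R, rng):
    """One-sided bootstrap test of independence via the correlation."""
    n = len(u)
    t_obs = np.corrcoef(u, x)[0, 1]
    t_star = np.empty(R)
    for r in range(R):
        us = u[rng.integers(0, n, size=n)]   # sample from F1-hat
        xs = x[rng.integers(0, n, size=n)]   # independently, from F2-hat
        t_star[r] = np.corrcoef(us, xs)[0, 1]
    return (1 + np.sum(t_star >= t_obs)) / (1 + R)
```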

Tests: Handedness data, bootstrap from M̂₀

[Figure: one bootstrap sample generated under the independence model M̂₀ (left), and the simulated null distribution of the test statistic (right).]

Tests: Choice of R

Take R big enough to get a small standard error for p̂, typically ≥ 100, using the binomial calculation
$$\operatorname{var}(\hat p) = \operatorname{var}\!\left(\frac{1 + \#\{t_r^* \ge t_{\mathrm{obs}}\}}{1 + R}\right) \approx \frac{R\, p_{\mathrm{obs}}(1 - p_{\mathrm{obs}})}{R^2} = \frac{p_{\mathrm{obs}}(1 - p_{\mathrm{obs}})}{R},$$
so if p_obs = 0.05 we need R ≥ 1900 for 10% relative error (D7).

R can be chosen sequentially: e.g. if p̂ = 0.06 and R = 99, we can augment R enough to diminish the standard error. Taking R too small lowers the power of the test.

Tests: Duality with confidence interval

Often it is unclear how to impose the null hypothesis on the sampling scheme. A general approach is based on the duality between the confidence interval I_{1−α} = (θ̂_α, ∞) and the test of the null hypothesis θ = θ₀: reject the null hypothesis at level α in favour of the alternative θ > θ₀ if θ₀ < θ̂_α.

Handedness data: θ₀ = 0 ∉ I_{0.95}, but θ₀ = 0 ∈ I_{0.99}, so the estimated significance level is 0.01 < p < 0.05: weaker evidence than before.

This extends to tests of θ = θ₀ against other alternatives:
- if θ₀ ∉ I_{1−α} = (−∞, θ̃_α), we have evidence that θ < θ₀;
- if θ₀ ∉ I_{1−2α} = (θ̂_α, θ̃_α), we have evidence that θ ≠ θ₀.

Tests: Pivot tests

Equivalent to the use of confidence intervals. Idea: use an (approximate) pivot such as Z = (θ̂ − θ)/V^{1/2} as the statistic to test θ = θ₀. The observed value of the pivot is z_obs = (θ̂ − θ₀)/V^{1/2}, and the significance level is
$$\Pr\!\left(\frac{\hat\theta - \theta}{V^{1/2}} \ge z_{\mathrm{obs}} \,\Big|\, M_0\right) = \Pr(Z \ge z_{\mathrm{obs}} \mid F) \approx \Pr(Z^* \ge z_{\mathrm{obs}} \mid \hat F).$$
Compare the observed z_obs with the simulated distribution of Z* = (θ̂* − θ̂)/V*^{1/2}, without needing to construct a null hypothesis model M̂₀. Use of an (approximate) pivot is essential for success.

Tests: Example, handedness data

Test zero correlation (θ₀ = 0), not independence; θ̂ = 0.509, V = 0.170²:
$$z_{\mathrm{obs}} = \frac{\hat\theta - \theta_0}{V^{1/2}} = \frac{0.509 - 0}{0.170} = 2.99.$$
The observed significance level is
$$\hat p = \frac{1 + \#\{z_r^* \ge z_{\mathrm{obs}}\}}{1 + R} = \frac{1 + 215}{1 + 9999} = 0.0216.$$

[Figure: simulated null distribution of the test statistic.]

Tests: Exact tests

Problem: the bootstrap estimates p̂_obs = Pr(T ≥ t_obs | M̂₀) ≠ Pr(T ≥ t_obs | M₀) = p_obs, so it estimates the 'wrong thing'. In some cases we can eliminate the parameters from the null hypothesis distribution by conditioning on a sufficient statistic, and then simulate from the conditional distribution. More generally, a Metropolis-Hastings algorithm can be used to simulate from the conditional distribution (below).

Tests: Example, fir data

Data Y_1, ..., Y_n iid ~ Pois(λ), with λ unknown. The Poisson model has E(Y) = var(Y) = λ: base a test of overdispersion on
$$T = \sum_{j=1}^n (Y_j - \bar Y)^2/\bar Y \;\approx\; \chi^2_{n-1};$$
the observed value is t_obs = 55.15.

The unconditional significance level Pr(T ≥ t_obs | M₀, λ) depends on λ. Condition on the value w of the sufficient statistic W = ΣY_j:
$$p_{\mathrm{obs}} = \Pr(T \ge t_{\mathrm{obs}} \mid M_0, W = w),$$
independent of λ, owing to the sufficiency of W. Exact test: simulate from the multinomial distribution of Y_1, ..., Y_n given W = ΣY_j = 107.

Tests: Example, fir data

[Figure: Simulation results for the dispersion test. Left panel: R = 999 values of the dispersion statistic t* obtained under multinomial sampling; the data value is t_obs = 55.15 and p̂ = 0.25. Right panel: chi-squared plot of the ordered values of t*; the dotted line shows the χ²₄₉ approximation to the null conditional distribution.]

Tests: Handedness data, permutation test

- Are dnan and hand related? Take T = θ̂ (the correlation coefficient) again.
- Impose the null hypothesis of independence, F(u, x) = F₁(u)F₂(x), but condition so that the marginal distributions F̂₁ and F̂₂ are held fixed under the resampling plan: a permutation test (sketched below).
- Take resamples of the form (dnan_1, hand_{π(1)}), ..., (dnan_n, hand_{π(n)}), where π is a random permutation of (1, ..., n).
- Doing this with R = 9,999 gives one- and two-sided significance probabilities of 0.002 and 0.003.
- Typically the values of p̂ are very similar to those for the corresponding bootstrap test.
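The permutation version only shuffles one margin, so both EDFs are held exactly fixed; a sketch with illustrative names:

```python
import numpy as np

def permutation_test(u, x, R, rng):
    """One- and two-sided permutation tests based on the correlation."""
    t_obs = np.corrcoef(u, x)[0, 1]
    t_star = np.array([np.corrcoef(u, rng.permutation(x))[0, 1]
                       for _ in range(R)])
    one_sided = (1 + np.sum(t_star >= t_obs)) / (1 + R)
    two_sided = (1 + np.sum(np.abs(t_star) >= abs(t_obs))) / (1 + R)
    return one_sided, two_sided
```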

Tests: Handedness data, permutation resample

[Figure: one permutation resample (left) and the permutation null distribution of the test statistic (right).]

Tests: Contingency table

1 2 2 1 1 0 1
2 0 0 2 3 0 0
0 1 1 1 2 7 3
1 1 2 0 0 0 1
0 1 1 1 1 0 0

Are the row and column classifications independent: Pr(row i, column j) = Pr(row i) Pr(column j)?

The standard test statistic for independence is
$$T = \sum_{i,j} \frac{(y_{ij} - \hat y_{ij})^2}{\hat y_{ij}}, \qquad \hat y_{ij} = \frac{y_{i\cdot}\, y_{\cdot j}}{y_{\cdot\cdot}}.$$
We get Pr(χ²₂₄ ≥ 38.53) = 0.048, but is T really distributed as χ²₂₄ here?

Tests: Exact tests, contingency table

For an exact test, we need to simulate the distribution of T conditional on the sufficient statistics: the row and column totals.

Algorithm (D8) for conditional simulation:
1. choose two rows j₁ < j₂ and two columns k₁ < k₂ at random;
2. generate a new value of y_{j₁k₁} from its hypergeometric distribution conditional on the margins of the 2 × 2 subtable with entries y_{j₁k₁}, y_{j₁k₂}, y_{j₂k₁}, y_{j₂k₂};
3. compute the test statistic T* every I = 100 iterations, say.

Compare the observed value t_obs = 38.53 with the simulated T*: this gives p̂ = 0.08. A sketch of one update step appears below.
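One update of step 2, as a sketch: the margins of the chosen 2 × 2 subtable are fixed, so its top-left cell is hypergeometric. The table should be an integer numpy array; names are illustrative.

```python
import numpy as np

def margin_preserving_step(tab, rng):
    """One 2x2 hypergeometric update, leaving all row/column totals fixed."""
    nr, nc = tab.shape
    j1, j2 = sorted(rng.choice(nr, size=2, replace=False))
    k1, k2 = sorted(rng.choice(nc, size=2, replace=False))
    a, b = tab[j1, k1], tab[j1, k2]
    c, d = tab[j2, k1], tab[j2, k2]
    # y_{j1,k1} given the 2x2 margins: hypergeometric(a+b good, c+d bad, a+c drawn)
    new_a = rng.hypergeometric(a + b, c + d, a + c) if a + c > 0 else 0
    tab[j1, k1], tab[j1, k2] = new_a, a + b - new_a
    tab[j2, k1], tab[j2, k2] = a + c - new_a, d - a + new_a
    return tab
```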

Tests: Key points

- Tests can be performed using resampling/simulation.
- We must take account of the null hypothesis, either by modifying the sampling scheme to satisfy the null hypothesis, or by inverting a confidence interval (pivot test).
- Monte Carlo simulation can give approximations to exact tests: simulate from the null distribution of the data, conditional on the observed value of a sufficient statistic.
- We sometimes obtain permutation tests, which are often very similar to bootstrap tests.

Regression

Regression: Linear regression

Independent data (x₁, y₁), ..., (x_n, y_n), with
$$y_j = x_j^T\beta + \varepsilon_j, \qquad \varepsilon_j \sim (0, \sigma^2).$$
Least squares estimates β̂, leverages h_j, and modified residuals
$$e_j = \frac{y_j - x_j^T\hat\beta}{(1 - h_j)^{1/2}} \;\sim\; (0, \sigma^2) \text{ approximately.}$$
The design matrix X is an experimental ancillary: it should be held fixed if possible since, if the model y = Xβ + ε is correct,
$$\operatorname{var}(\hat\beta) = \sigma^2 (X^TX)^{-1}.$$

Regression: Linear regression, resampling schemes

Two main resampling schemes (sketched in code below).

Model-based resampling:
$$y_j^* = x_j^T\hat\beta + \varepsilon_j^*, \qquad \varepsilon_j^* \sim \mathrm{EDF}(e_1 - \bar e, \dots, e_n - \bar e).$$
This fixes the design, but is not robust to model failure; it assumes the ε_j are sampled from a population.

Case resampling:
$$(x_j^*, y_j^*) \sim \mathrm{EDF}\{(x_1, y_1), \dots, (x_n, y_n)\}.$$
The design X* varies, but the scheme is robust; it assumes the (x_j, y_j) are sampled from a population.

Usually the design variation is no problem; it can prove awkward in designed experiments and when the design is sensitive.
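A sketch of the two schemes for least squares, with illustrative names; each returns a resampled (X*, y*) pair to which the estimator is then re-applied.

```python
import numpy as np

def model_based_resample(X, y, rng):
    """Fix the design; resample centred modified residuals."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    h = np.diag(X @ np.linalg.inv(X.T @ X) @ X.T)   # leverages h_j
    e = (y - X @ beta) / np.sqrt(1 - h)             # modified residuals
    eps = rng.choice(e - e.mean(), size=len(y), replace=True)
    return X, X @ beta + eps

def case_resample(X, y, rng):
    """Resample whole cases (rows of the data)."""
    idx = rng.integers(0, len(y), size=len(y))
    return X[idx], y[idx]
```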

Regression: Cement data

Table: Cement data: y is the heat (calories per gram of cement) evolved while samples of cement set. The covariates are percentages by weight of four constituents: tricalcium aluminate x1, tricalcium silicate x2, tetracalcium alumino ferrite x3 and dicalcium silicate x4.

     x1  x2  x3  x4      y
 1    7  26   6  60   78.5
 2    1  29  15  52   74.3
 3   11  56   8  20  104.3
 4   11  31   8  47   87.6
 5    7  52   6  33   95.9
 6   11  55   9  22  109.2
 7    3  71  17   6  102.7
 8    1  31  22  44   72.5
 9    2  54  18  22   93.1
10   21  47   4  26  115.9
11    1  40  23  34   83.8
12   11  66   9  12  113.3
13   10  68   8  12  109.4

Regression: Cement data

Fit the linear model
$$y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \beta_3 x_3 + \beta_4 x_4 + \varepsilon$$
and apply case resampling. The covariates are compositional, x₁ + ... + x₄ ≈ 100%, so X is almost collinear: the smallest eigenvalue of XᵀX is l₅ = 0.0012.

A plot of β̂₁* against the smallest eigenvalue of X*ᵀX* reveals that var(β̂₁*) is strongly variable. Relevant subset for case resampling: post-stratification of the output based on l₅*?

Regression: Cement data

[Figure: bootstrap estimates β̂₁* and β̂₂* plotted against the smallest eigenvalue of X*ᵀX*, log scale.]

Regression: Cement data

Table: Standard errors of the linear regression coefficients for the cement data. Theoretical and error resampling assume homoscedasticity. Resampling results use R = 999 samples; the final row is based only on the 800 samples with the largest values of the smallest eigenvalue l₅* of X*ᵀX*.

                                     β̂0     β̂1    β̂2
Normal theory                       70.1   0.74  0.72
Model-based resampling, R = 999     66.3   0.70  0.69
Case resampling, all R = 999       108.5   1.13  1.12
Case resampling, largest 800        67.3   0.77  0.69

Regression: Survival data

Data on survival % for rats at different radiation doses. Linear model: log(survival) = β₀ + β₁ dose.

dose x       117.5   235.0   470.0   705.0   940.0   1410
survival %   44.000  16.000  4.000   0.500   0.110   0.700
     y       55.000  13.000  1.960   0.320   0.015   0.006
                             6.120                   0.019

[Figure: survival % (left) and log survival % (right) against dose.]

Regression: Survival data

Case resampling. Replication of the outlier: none (0), once (1), two or more (•). Model-based sampling including the residual would lead to a change in intercept but not slope.

[Figure: estimated slope against residual sum of squares for the case resamples, labelled by the number of times the outlier is replicated.]

Regression: Generalized linear model

The response may be binomial, Poisson, gamma, normal, ...: y_j has mean µ_j and variance φV(µ_j), where g(µ_j) = x_jᵀβ is the linear predictor and g(·) is the link function.

MLE β̂, fitted values µ̂_j, Pearson residuals
$$r_{Pj} = \frac{y_j - \hat\mu_j}{\{V(\hat\mu_j)(1 - h_j)\}^{1/2}} \;\sim\; (0, \phi) \text{ approximately.}$$
Bootstrapped responses
$$y_j^* = \hat\mu_j + V(\hat\mu_j)^{1/2}\,\varepsilon_j^*, \qquad \varepsilon_j^* \sim \mathrm{EDF}(r_{P1} - \bar r_P, \dots, r_{Pn} - \bar r_P).$$
However, it is possible that y*_j ∉ {0, 1, 2, ...}, and the r_{Pj} are not exchangeable, so stratified resampling may be needed.

Regression: AIDS data

Log-linear model: the number of reports in row j and column k follows a Poisson distribution with mean µ_{jk} = exp(α_j + β_k).

Log link function and variance function:
$$g(\mu_{jk}) = \log \mu_{jk} = \alpha_j + \beta_k, \qquad \operatorname{var}(y_{jk}) = \phi\, V(\mu_{jk}) = 1 \times \mu_{jk}.$$
Pearson residuals:
$$r_{jk} = \frac{y_{jk} - \hat\mu_{jk}}{\{\hat\mu_{jk}(1 - h_{jk})\}^{1/2}}.$$
Model-based simulation:
$$y_{jk}^* = \hat\mu_{jk} + \hat\mu_{jk}^{1/2}\, \varepsilon_{jk}^*.$$

Regression

[Table repeated from the Motivation section: AIDS diagnoses by diagnosis period and reporting-delay interval.]

Regression: AIDS data

The Poisson two-way model has deviance 716.5 on 413 df, indicating strong overdispersion: φ > 1, so the Poisson model is implausible. The residuals are highly inhomogeneous, so exchangeability is doubtful.

[Figure: data and fits (left); Pearson residuals against skewness (right).]

Regression: AIDS data, prediction intervals

To estimate the prediction error:
- simulate a complete table y*_{jk};
- estimate the parameters from the incomplete part of y*_{jk};
- get the estimated row totals and the 'truth',
$$\hat\mu_{+,j}^* = \sum_{k\ \mathrm{unobs}} e^{\hat\alpha_j^*} e^{\hat\beta_k^*}, \qquad y_{+,j}^* = \sum_{k\ \mathrm{unobs}} y_{jk}^*.$$
The prediction error
$$\frac{y_{+,j}^* - \hat\mu_{+,j}^*}{\hat\mu_{+,j}^{*1/2}}$$
is studentized, so as to be more nearly pivotal. Form prediction intervals from the R replicates.

Regression: AIDS data, resampling plans

Resampling schemes:
- parametric simulation, fitted Poisson model
- parametric simulation, fitted negative binomial model
- nonparametric resampling of the r_P
- stratified nonparametric resampling of the r_P

Stratification is based on the skewness of the residuals, equivalent to stratifying the original data by the values of the fitted means: take strata for which µ̂_{jk} < 1, 1 ≤ µ̂_{jk} < 2, and µ̂_{jk} ≥ 2.

Regression: AIDS data, results

Deviance/df ratios for the four sampling schemes, R = 999: Poisson variation is inadequate.

[Figure: histograms of deviance/df under the Poisson, negative binomial, nonparametric and stratified nonparametric schemes; 95% prediction limits for diagnoses, 1989-1992.]

Regression: AIDS data, semiparametric model

More realistic: a generalized additive model, µ_{jk} = exp{α(j) + β_k}, where α(j) is a locally-fitted smooth. Same resampling plans as before. The 95% intervals are now generally narrower and shifted upwards.

[Figure: data and semiparametric fit, with 95% prediction intervals for 1989-1992.]

Regression: Key points

- Key assumption: independence of cases.
- Two main resampling schemes for regression settings: model-based and case resampling. Intermediate schemes are possible.
- Resampling can help to reduce dependence on the assumptions needed for the regression model.
- These two basic approaches are also used in more complex settings (time series, ...), where the data are dependent.

Summary

- Bootstrap: simulation methods for frequentist inference.
- Useful when standard assumptions are invalid (n small, data not normal, ...); a standard problem has a non-standard twist; a complex problem has no (reliable) theory; or (almost) anywhere else.
- We have described the basic ideas: confidence intervals; tests; some approaches for regression.

Books

- Chernick (1999) Bootstrap Methods: A Practitioner's Guide. Wiley.
- Davison and Hinkley (1997) Bootstrap Methods and their Application. Cambridge University Press.
- Efron and Tibshirani (1993) An Introduction to the Bootstrap. Chapman & Hall.
- Hall (1992) The Bootstrap and Edgeworth Expansion. Springer.
- Lunneborg (2000) Data Analysis by Resampling: Concepts and Applications. Duxbury Press.
- Manly (1997) Randomisation, Bootstrap and Monte Carlo Methods in Biology. Chapman & Hall.
- Shao and Tu (1995) The Jackknife and Bootstrap. Springer.