Errata, Mahler Study Aids for Exam S, 2017 HCM, 10/24/17

1, page 2, in Table of Contents: Section 33 is Specific Examples of Markov Processes.

1, page 24, solution to the exercise: (60%)(100) + (40%)(200) = 140.

1, p. 61, sol. 2.62: 6665 - 44.5² =

1, p. 64, sol. 2.69: θ₂² = 2.25. θ₂ = 1.5.

1, p. 96: Comparing the two results for t = 1: Γ[n; λ] = 1 - Σ_{i=0}^{n-1} e^{-λ} λ^i / i! = Σ_{i=n}^{∞} e^{-λ} λ^i / i!.

1, page 135, last two lines: ∫_0^∞ S(t) dt = ∫_0^∞ (e^{-0.02t} + e^{-0.03t} - e^{-0.08t} - e^{-0.07t} + e^{-0.10t}) dt
= 1/0.02 + 1/0.03 - 1/0.08 - 1/0.07 + 1/0.10 = 50 + 33.33 - 12.5 - 14.29 + 10 = 66.54.

1, page 213, solution 6.5 is correct but needs much more explanation:
Prob[1 claim | good] = 0.05 e^{-0.05} = 0.04756. Prob[1 claim | bad] = 0.1 e^{-0.1} = 0.09048.
By Bayes Theorem, the probability that an insured that had one claim in year one is good is:
Prob[Good] Prob[1 claim | Good] / {Prob[Good] Prob[1 claim | Good] + Prob[Bad] Prob[1 claim | Bad]}
= (2/3)(0.04756) / {(2/3)(0.04756) + (1/3)(0.09048)} = 51.25%.
By Bayes Theorem, the probability that an insured that had one claim in year one is bad is:
(1/3)(0.09048) / {(2/3)(0.04756) + (1/3)(0.09048)} = 48.75% = 1 - 51.25%.
Prob[2 claims | good] = 0.05² e^{-0.05}/2 = 0.001189. Prob[2 claims | bad] = 0.1² e^{-0.1}/2 = 0.004524.
Therefore, this insured's probability of having 2 claims in year 2 is:
(51.25%)(0.001189) + (48.75%)(0.004524) = 0.281%.
Comment: Bayes Theorem is in Mahler's Guide to Bayes Analysis and Conjugate Priors.

1, p. 311, very last line of sol. 10.32: An Exponential with mean θ has a second moment of 2θ².

1, p. 343, solution to the first exercise: (82 - 37)/100 = 45%.

1, p. 346, Q. 12.7: a Poisson Process
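As a quick check of the corrected solution 6.5 on page 213, the Bayes posterior and the predictive probability of 2 claims can be reproduced numerically; this is only an illustrative sketch using the Poisson means 0.05 and 0.1 and the 2/3 - 1/3 prior quoted above:

    import math

    # Prior: 2/3 good, 1/3 bad; Poisson means 0.05 (good) and 0.1 (bad), as in the erratum.
    p1_good = 0.05 * math.exp(-0.05)          # Prob[1 claim | good] = 0.04756
    p1_bad = 0.1 * math.exp(-0.1)             # Prob[1 claim | bad]  = 0.09048
    post_good = (2/3) * p1_good / ((2/3) * p1_good + (1/3) * p1_bad)   # about 51.25%
    post_bad = 1 - post_good                                           # about 48.75%
    p2_good = 0.05**2 * math.exp(-0.05) / 2   # Prob[2 claims | good] = 0.001189
    p2_bad = 0.1**2 * math.exp(-0.1) / 2      # Prob[2 claims | bad]  = 0.004524
    print(post_good, post_bad, post_good * p2_good + post_bad * p2_bad)  # last value about 0.281%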

1, p. 365, solution to the exercise: m(5) = ∫_0^5 (4 + t) dt = 20 + 5²/2 = 32.5.

1, p. 385: Solutions 4.19 and 4.20 are reversed in how I numbered them.

1, p. 395, 4th line from the bottom: = {-e^{-0.5} - 2e^{-0.5} - (0 - 2)} + e^{0.3}(e^{-0.8} + 1.25e^{-0.8}) = 2 - 0.75e^{-0.5} = 1.545.
The final solution to the exercise is OK.

1, p. 396, 2nd line: 2 - 2e^{-0.5} + e^{0.3} 1.25e^{-0.8} = 2 - 0.75e^{-0.5} = 1.545.

1, p. 399, last line of solution to the first exercise, there is a missing decimal place: (1 - 0.6017)

1, p. 401, 3rd line from the bottom, a missing closing bracket: Φ[(1499.5 - 1560.65)/√1560.65]

1, p. 414, solution to the exercise at the bottom: on (0, 5) rather than (0, T).

1, p. 443, next to last paragraph: (0, 1, 0)P = (0.1, 0.75, 0.15) = probabilities after one day, having started in state 2.
(0, 1, 0)P² = (0.1, 0.75, 0.15)P = (0.155, 0.6275, 0.2175) = probabilities after two days.

1, p. 447, in the solution to the exercise: (0.75)(0.25) = 0.1875

1, p. 481, in solutions 18.13-18.15: (0.1265, 0.7904, 0.0831) P = (0.172031, 0.72316, 0.104809).
(0.172031, 0.72316, 0.104809) P = (0.208673, 0.67245, 0.118878).

1, p. 544, Sol. 19.23: B. (0.2)(25) + (0.5)(50) + (0.3)(100) = 60

1, p. 544, Sol. 19.24: the letter should be A. Also, (0.54, 0.35, 0.11)P = (0.4525, 0.443, 0.1045).

1, p. 610, in the exercise: the transition matrix should be:
0.8  0.2  0    0
0    0    0.8  0.2
0.8  0.2  0    0
0    0    0.8  0.2
Also the first balance equation should be: 0.8π₁ + 0.8π₃ = π₁.
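The corrected p. 365 value m(5) = ∫_0^5 (4 + t) dt = 32.5 is easy to confirm with a one-line numerical integration; a minimal sketch:

    from scipy.integrate import quad

    m5, _ = quad(lambda t: 4 + t, 0, 5)   # integrate the intensity 4 + t over (0, 5)
    print(m5)                             # 32.5, matching the erratum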

1, p. 615, Q. 21.6-10: the transition matrix should be:
Good     0.70  0.20  0.10
Typical  0.15  0.80  0.05
Poor     0.20  0.30  0.50

1, p. 620, Q. 21.42, the 3rd and 4th lines are wrong: 'If it did not rain two days ago but rained yesterday, the probability that it will rain today is 60%. If it rained two days ago but not yesterday, the probability that it will rain today is 50%.' Also, choice E should be 0.70.

1, p. 623, Q. 21.55, the transition matrix is mislabeled:
    B    C    M
B   1/2  1/2  0
C   1/4  1/2  1/4
M   0    1/3  2/3

1, p. 634, Sol. 21.22: π₃ = 12/43. Final answer is OK.

1, p. 655, next to last line: Alternately, (0, 0, 0, 1, 0) P =

1, p. 671, 4th paragraph: For j < N, let I_j = 1

1, pp. 680-681, solutions 23.7-23.10: S = (I - P_T)⁻¹ =
4  2
1  3
The final solutions are OK.

1, p. 682, solution 23.13: 4.10/6.02 =  The final solutions are OK.

1, p. 687, last line, the first summation should go to infinity: Σ_{k=1}^{∞} 6/(π² k²)

1, p. 690, solution to the second exercise: P⁸ =
0.666885  0.333115
0.666230  0.333770

1, p. 712: In the 3rd row and last column of the matrix, there is a missing decimal point: 0.82819

1, p. 712, in the line below the big matrix: Pⁿ goes to a matrix

1, p. 713, line 2: (P³²)₂,₅ ≈ 0.828
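For the corrected Good/Typical/Poor transition matrix on p. 615, the short sketch below checks that each row sums to 1 and, purely as an illustration (the stationary probabilities are not quoted in the errata), extracts the stationary distribution from the left eigenvector for eigenvalue 1:

    import numpy as np

    P = np.array([[0.70, 0.20, 0.10],    # Good
                  [0.15, 0.80, 0.05],    # Typical
                  [0.20, 0.30, 0.50]])   # Poor
    print(P.sum(axis=1))                 # each row should sum to 1
    w, v = np.linalg.eig(P.T)            # left eigenvectors of P = eigenvectors of P transpose
    pi = np.real(v[:, np.argmin(np.abs(w - 1))])
    print(pi / pi.sum())                 # stationary distribution (illustrative only)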

1, p. 713, line 5: 0.580 ≈ (P³²)₁,₅

1, p. 726, solutions 26.4: {1 - (0.52/0.48)²³} / {1 - (0.52/0.48)²⁵}. Final solution is OK.

1, p. 749, second to last line, i and j need to be subscripts: P_ij

1, p. 750, second to last line, i and j need to be subscripts: P_ij = Q_ij

1, p. 761, next to last line: 5. The new 2nd element is: 3.48445 + 4.80976 = 8.29421.

1, p. 767, 4th paragraph: Min[1, b(226.364)/b(211.202)]

1, p. 788, solution to exercise, missing decimal place: 0.45P₂ =

1, p. 791, next to last line, t missing in the exponent of the integrand: ∫_0^5 e^{-0.006t} dt

1, p. 795, solution to last exercise: The chance of exactly one visit to the SNF =
(chance of transition to SNF permanent from ILU)
+ (chance of transition ILU → SNF temp.)(chance of transition SNF temp. → SNF perm.)
+ (chance of transition ILU → SNF temp.)(chance of transition SNF temp. → Death)
+ (chance of revisit to ILU)(chance of death in ILU)
= (5/25) + (12/25)(7/24) + (12/25)(12/24) + (1/10)(8/25) = 0.612. The 0.612 is OK.

1, p. 798, second bullet: A new resident starts off in an ILU.

1, p. 825, near the bottom: dP₂₁(t)/dt = q₂₁ P₁₁(t) - q₂₁ P₂₁(t).

1, p. 826, near the bottom, missing minus signs: Thus, dP₁₂(t)/dt = -14b e^{-14t}. -14b e^{-14t} =

1, p. 831, third paragraph: q₁₂ = 0.10

1, p. 838, third paragraph: 1.02835  0.042261  0.0704349  1.09878
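The corrected p. 795 probability of exactly one visit to the SNF is simple arithmetic that can be re-checked directly; a minimal sketch using only the fractions quoted above:

    terms = [5/25, (12/25)*(7/24), (12/25)*(12/24), (1/10)*(8/25)]
    print(sum(terms))   # 0.612, matching the erratum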

1, p. 838, last page: 0.5v₁ = (λ - 0.2)(λ - 0.7)v₁/0.3.

1, p. 846 & p. 855: the given matrices in Q. 32.19 do not have their rows add to 1; change them to:
P_ij(1) =
0.749681  0.115822  0.086549  0.047948
0.078508  0.752584  0.088226  0.080681
0.043259  0.047001  0.794648  0.115092
0.044098  0.079734  0.155371  0.720798
For example, P₂₃(1) = 0.088226.
P_ij(2) =
0.576973  0.181886  0.151329  0.089812
0.125315  0.586055  0.155837  0.132793
0.075572  0.086908  0.657239  0.180282
0.077826  0.129889  0.246307  0.545978
For example, P₁₄(2) = 0.089812.
Then the solution is:
P_ij(3) =
0.457332  0.217985  0.200191  0.124492
0.152554  0.473483  0.207019  0.166944
0.099859  0.119424  0.564492  0.216225
0.103273  0.161876  0.298752  0.436099

1, p. 853, solution 32.16: (1, 0, 0) P = (0.965, 0.025, 0.01)

1, p. 861, solution to exercise: dP_ij(t)/dt = (0.08)(j-1)P_{i,j-1}(t) + (0.05)(j+1)P_{i,j+1}(t) - 0.13j P_{i,j}(t)

1, p. 871, sol. 33.6:
P_{1,2}(20) = Σ_{k=1}^{2} [∏_{m=1, m≠k}^{2} λ_m/(λ_m - λ_k)] exp[-λ_k(20)] - exp[-λ₁(20)]
= exp[-λ₁(20)] λ₂/(λ₂ - λ₁) + exp[-λ₂(20)] λ₁/(λ₁ - λ₂) - exp[-λ₁(20)]
= e^{-(20)(0.02)}(7/5) + e^{-(20)(0.07)}(-2/5) - e^{-(20)(0.02)} = 16.95%.
Alternately, the chance that there is a birth at time 0 ≤ t < 20 and no birth thereafter by time 20 is:
∫_0^20 0.02e^{-0.02t} e^{-0.07(20 - t)} dt = 0.02e^{-1.4} ∫_0^20 e^{0.05t} dt = 0.02e^{-1.4} (e^1 - 1)/0.05 = 16.95%.
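Both forms of the corrected p. 871 answer can be confirmed numerically with λ₁ = 0.02 and λ₂ = 0.07; a minimal sketch:

    import math
    from scipy.integrate import quad

    l1, l2, t = 0.02, 0.07, 20
    closed = math.exp(-l1*t) * l2/(l2 - l1) + math.exp(-l2*t) * l1/(l1 - l2) - math.exp(-l1*t)
    integral, _ = quad(lambda s: l1 * math.exp(-l1*s) * math.exp(-l2*(t - s)), 0, t)
    print(closed, integral)   # both about 0.1695, i.e. 16.95%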

1, p. 884: As t approaches infinity, S(t) approaches a Normal Distribution.

2, page 7, middle of page: Var[X₁] + Var[X₂] + Var[X₃] = 3 Var[X].

2, page 19, middle of page: (α+1) = 2.393α.

2, page 50, line 8: with α = 4

2, p. 66, 2nd line from the bottom, there is a missing summation sign: ∂Σln[f(xᵢ)]/∂θ

2, p. 68, solution to the last exercise: µ̂ = Σln(xᵢ)/N. σ̂² = Σ(ln(xᵢ) - µ̂)²/N.

2, p. 69, above the exercise: 0 = n/a + Σln[xᵢ] - n ln[θ]. n/a = -Σln[xᵢ/θ]. â = -n / Σ_{i=1}^{n} ln[xᵢ/θ].

2, p. 70, fifth line: n ln[a] + n ln[a+1] + (a-1) Σln[xᵢ] + Σln[1 - xᵢ/θ] - n a ln[θ].

2, page 73, next to last paragraph, there should not be an n before ln(3): This differs by: 2ln(x) + ln(3), from the log density of the Weibull (for τ = 3) of: -θ⁻³x³ + 2ln(x) - 3ln(θ) + ln(3).

2, page 78, reword Q. 4.10: What is the maximum likelihood estimate of θ?

2, page 79, reword Q. 4.14: What is the fitted value of α?

2, page 123, solution 4.36: ln f(x) = -(x - θ)²/2 + ln(2/π)/2. The loglikelihood is: -Σ(xᵢ - θ)²/2 + n ln(2/π)/2.

2, page 151, last line of the fourth paragraph: 84,500/θ² - 2/θ = 0.

2, page 182, sol. 5.14: ln[f(x)/S(1000)] = -x/θ - ln(θ) + 1000/θ.
Loglikelihood = -1500/θ - 3000/θ - 5000/θ - 15,000/θ - 4ln(θ) + (4)(1000/θ) = -20,500/θ - 4ln(θ).
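As a sanity check of the corrected loglikelihood in sol. 5.14 (p. 182), the sketch below evaluates both the term-by-term form and the simplified form -20,500/θ - 4 ln(θ) at an arbitrary θ (the value 6000 is only illustrative) and confirms they agree; maximizing either form numerically is then straightforward:

    import math
    from scipy.optimize import minimize_scalar

    losses = [1500, 3000, 5000, 15000]           # the four losses appearing in the erratum
    ll = lambda th: sum(-x/th - math.log(th) + 1000/th for x in losses)
    simplified = lambda th: -20500/th - 4*math.log(th)
    print(ll(6000.0), simplified(6000.0))        # the two forms agree
    res = minimize_scalar(lambda th: -simplified(th), bounds=(100, 50000), method='bounded')
    print(res.x)                                 # numerical maximum likelihood estimate of theta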

2, p. 210, sol. to first exercise: likelihood = f(x₁)f(x₂)...f(xₙ) = x₁² x₂² ... xₙ² exp[-Σxᵢ/θ] / {2ⁿ θ³ⁿ}

2, p. 212, 7th line from bottom: = -0.5Σxᵢ²/σ² + µΣxᵢ/σ² - 0.5nµ²/σ² - n ln(σ) - (n/2)ln(2π).

2, p. 216, sol. 6.5 & 6.6: loglikelihood = {µΣxᵢ/σ² - 0.5nµ²/σ²} - 0.5Σxᵢ²/σ² - n ln(σ) - (n/2)ln(2π).

2, p. 229, line 5: If each Xᵢ has mean µ and variance σ², then X̄ₙ has mean µ and variance σ²/n.

2, p. 257: the values on the x-axis of the graph should be 5.8, 7, and 8.2.
[Graph: a density curve centered at 7, with 2.28% in each tail, below 5.8 and above 8.2.]

2, p. 278, 3rd paragraph: F(6.63) = 0.990.

2, p. 280, second paragraph: ∫ (1/√(2π)) exp[-y²(1 - 2t)/2] dy = 1/√(1 - 2t). Thus the m.g.f. of a Chi-Square Distribution with one degree of freedom

2, p. 281, footnote: M(t) = 1/(1 - 2t)^{ν/2}
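The p. 278 value F(6.63) = 0.990 and the p. 281 m.g.f. M(t) = 1/(1 - 2t)^{ν/2} can both be checked numerically; the sketch below assumes the F(6.63) entry refers to a Chi-Square with one degree of freedom, which is what the surrounding p. 280 discussion uses, and picks an arbitrary ν and t for the m.g.f. check:

    import numpy as np
    from scipy import stats
    from scipy.integrate import quad

    print(stats.chi2.cdf(6.63, df=1))                       # about 0.990
    nu, t = 3, 0.2                                          # illustrative values only, t < 1/2
    mgf, _ = quad(lambda y: np.exp(t*y) * stats.chi2.pdf(y, df=nu), 0, np.inf)
    print(mgf, (1 - 2*t) ** (-nu/2))                        # the two values agree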

2, p. 282, sol. to first exercise: = 0.13298 e^{-x/2} x^{1.5}.

2, p. 283: More generally, given a sample of size n from a Normal Distribution with known mean µ, one confidence interval for σ² covering probability 1 - α is: Σ(Xᵢ - µ)²/χ²_{1-α/2}(n) to Σ(Xᵢ - µ)²/χ²_{α/2}(n), where χ²_{α/2}(n) is the value at which the Chi-Square Distribution with n d.f. is α/2.
0 to Σ(Xᵢ - µ)²/χ²_α(n), and Σ(Xᵢ - µ)²/χ²_{1-α}(n) to ∞, are also each confidence intervals for σ² covering probability 1 - α.

2, p. 284, middle of page: Coefficient of Variation = √(2ν + 4λ) / (ν + λ).

2, p. 304, third paragraph: X = -(Y₁ + Y₂)σ.

2, p. 304, fourth paragraph: = (4E[X₁²]/9 + E[X₂²]/9 + E[X₃²]/9

2, p. 304, last paragraph: (-2E[X₁²]/9 - 2E[X₂²]/9 + E[X₃²]/9

2, p. 306, 6th line from the bottom, no square on y₁: exp[-(y₁ - 2µ)²

2, p. 308 and p. 1225: Given a sample of size n from a Normal Distribution, a confidence interval for σ² covering probability 1 - α is: (n-1)S²/χ²_{1-α/2}(n-1) to (n-1)S²/χ²_{α/2}(n-1), where χ²_{α/2}(n-1) is the value at which the Chi-Square Distribution with n - 1 d.f. is α/2.
0 to (n-1)S²/χ²_α(n-1), and (n-1)S²/χ²_{1-α}(n-1) to ∞, are also each confidence intervals for σ² covering probability 1 - α.

2, p. 309, final line: (100/124.342)S² = 0.804S² to (100/77.9295)S² = 1.283S².

2, p. 321, sol. 10.11, final line, there should be no 3 in front of S²: Prob[S² ≥ 1.05] = 0.05.

2, p. 374, final line of solution 12.9: χ² = {(O₁ - np₁)/√(np₁(1 - p₁))}² approaches

2, p. 398, 6th line from the bottom, the last probability is wrong: 2800/8600
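The multipliers quoted for p. 309 can be reproduced from Chi-Square percentiles; the quoted 124.342 and 77.9295 match the 95th and 5th percentiles of a Chi-Square with 100 degrees of freedom, so the sketch below assumes n - 1 = 100 and a 90% two-sided interval:

    from scipy import stats

    df = 100
    upper_q = stats.chi2.ppf(0.95, df)     # about 124.342
    lower_q = stats.chi2.ppf(0.05, df)     # about 77.9295
    print(df / upper_q, df / lower_q)      # about 0.804 and 1.283, the multipliers of S^2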

2, p. 405, Q. 13.6, the values shown in the totals column are wrong; they should instead be: 111,611  3343  61  115,015

2, p. 442: The definition of a t-distribution with ν degrees of freedom

2, p. 442: should be ν inside the square root in the denominator rather than n: Z / √(χ²/ν)

2, p. 443: Prob[t T]

2, p. 444, line 4: f_Y(y) =

2, p. 444: Prob[t T]

2, p. 444: f_T(t) = ∫_{y=0}^{∞} √(y/ν) f_X(t√(y/ν)) f_Y(y) dy = ∫_{y=0}^{∞} √(y/ν) (1/√(2π)) exp[-t²y/(2ν)] e^{-y/2} y^{ν/2 - 1} / {2^{ν/2} Γ(ν/2)} dy

2, p. 444, footnote 115: β[a, b] = Γ[a]Γ[b] / Γ[a + b]

2, p. 452, first line of solution 14.2: Prob[t < -3.5] > 2%/2

2, p. 471, line 7: there is 5% area above 1.833.

2, p. 534, sol. 18.8: variance: σ² = σ_X² + σ_Y² - 2ρσ_Xσ_Y. Final solution is OK.

2, p. 571, footnote 155: sample size of 120

2, p. 572, 4 lines from the bottom: with mean: (10 - 7) / (σ/4) = 12/σ.
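The integral representation of the t density quoted for p. 444 can be verified numerically against a packaged t density; the degrees of freedom and the evaluation point below are arbitrary illustrative choices, not values from the errata:

    import numpy as np
    from scipy import stats
    from scipy.integrate import quad

    nu, t_val = 10, 1.5                     # illustrative choices only
    integrand = lambda y: (np.sqrt(y/nu) * stats.norm.pdf(t_val*np.sqrt(y/nu))
                           * stats.chi2.pdf(y, df=nu))
    val, _ = quad(integrand, 0, np.inf)
    print(val, stats.t.pdf(t_val, df=nu))   # the two values agree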

2, p. 573: Thus, the ratio {(X̄ - 7)/(σ/4)} / (S/σ) follows a noncentral t-distribution with ν = 15 and δ = 12/σ.
Prob[ {(X̄ - 7)/(σ/4)} / (S/σ) > 1.753 | µ = 10] = Survival Function at 1.753 of a noncentral t-distribution with ν = 15 and δ = 12/σ.
Let us assume that the sample standard deviation is 5, so that we estimate δ = 12/5 = 2.4. Then using a computer, this probability and thus the power is 74.1%.
Here is a graph of the power function for this test of H₀: µ ≤ 7 versus H₁: µ > 7:
[Graph: power of the test as a function of µ, for µ from 8 to 12.]
As µ → ∞, the power approaches 1. As µ → 7, the power approaches the significance level of 5%.
In general, the power is the Survival Function at the critical value of the t-test, of a noncentral t-distribution with ν degrees of freedom and δ = √n (µ - µ₀)/σ, where we estimate σ via S, the sample standard deviation.
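The 74.1% power figure quoted for p. 573 can be reproduced from a noncentral t survival function with ν = 15, δ = 2.4 and critical value 1.753; a minimal sketch:

    from scipy import stats

    power = stats.nct.sf(1.753, 15, 2.4)   # survival function of a noncentral t with df = 15, nc = 2.4
    print(power)                           # about 0.741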

2, p. 691, line 5: since 2.472 < 4 < 4.325, we reject at 20% and do not reject at 10%.

2, p. 720: Analysis of Covariance is discussed in Section 20 of Mahler's Guide to Regression.

2, p. 750: IOA 01, 9/01, Q. 11 should be numbered 24.29 rather than 24.26. Unfortunately, therefore all of the following problems need to have their numbers increased by one. So for example CAS S, 11/16, Q. 39 should be numbered 24.35 rather than 24.24.

2, p. 781, 3rd line from the bottom: µ₁ > µ₀ or µ₀ > µ₁.

2, p. 782, 8th line: Reject when this difference is large: -5 + {ln(0.10) - ln(0.05)}Σxᵢ ≥ b.

2, p. 795, sol. 25.4: Loglikelihood for H₁ minus loglikelihood for H₀ is: +n + {ln(2) - ln(3)}Σxᵢ.
Reject when this difference is large: +n + {ln(2) - ln(3)}Σxᵢ ≥ b. (-0.4055)Σxᵢ ≥ (b - n). Final solution is OK.

2, p. 816, sol. to the exercise: f(x) = τ(x/θ)^τ exp(-(x/θ)^τ)/x. ln f(x) = ln(τ) + (τ-1)ln(x) - (x/θ)^τ - τ ln(θ).
The loglikelihood is: n ln(τ) + (τ-1)Σln(xᵢ) - Σ(xᵢ/θ)^τ - nτ ln(θ).

2, p. 840, sol. 26.1, 2nd line of comment, no divided by sign: Prob[ X̄ - 3 ≥ 1.960 √(100/n) | H₀ ]

2, p. 862, the last two paragraphs: 0.4 = F(Q₀.₄) = 1 - exp[-Q₀.₄/1000]. α = F(Q_α) = 1 - exp[-Q_α/θ].

2, p. 865, first line: F⁻¹[i/(n + 1)].

2, p. 877, sol. 27.2: 0.37 = F(Q₀.₃₇) = 1 - exp[-Q₀.₃₇/400]. Q₀.₃₇ = -400 ln[0.63] = 185.

2, p. 887, solution to the next to last exercise: G₁(x) = 1 - Σ_{j=0}^{1-1} C(7, j) F(x)^j S(x)^{7-j}

2, p. 892, 2nd exercise: a uniform distribution on [d, u].

2, p. 896, middle of the page: {Γ(s + 2)Γ(N + 1 - s) / Γ(N + 3)} {Γ(r + 1)Γ(s - r) / Γ(s + 1)} / [{Γ(r)Γ(s - r) / Γ(s)} {Γ(s)Γ(N + 1 - s) / Γ(N + 1)}]
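The corrected sol. 27.2 on p. 877 is a direct application of the exponential quantile formula Q_α = -θ ln(1 - α); a minimal check:

    import math

    theta, alpha = 400, 0.37
    q = -theta * math.log(1 - alpha)   # Q_0.37 = -400 ln(0.63)
    print(q)                           # about 185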

2, p. 898, comment to the first exercise: E[X₍N₎] = N/(N+1).

2, p. 899, 3rd paragraph, missing superscript j: Σ_{j=r}^{N} C(N, j) (x/T)^j (1 - x/T)^{N-j}

2, p. 899, solution to first exercise, missing superscript j: Σ_{j=r}^{N} C(N, j) (x/T)^j (1 - x/T)^{N-j}

2, p. 901, second line: 2 F(x) f(x) = 2(e^{-x/θ} - e^{-2x/θ})/θ.

2, p. 902, next to last paragraph, there is an extra divided by 2: X₍₂₎ = X₍₁₎ + W₂ = W₁/2 + W₂.

2, p. 902, last paragraph, there is an extra divided by 3: X₍₂₎ = X₍₁₎ + W₂/2 = W₁/3 + W₂/2

2, p. 905, last line: 1/√(N²π²/6) = √6/(Nπ)

2, p. 936, Q. 28.90: The lowest two values are excluded

2, p. 1017, sol. 29.14, third line: 1 - Φ[(32.5 - 25)/√12.5]. Final solution is OK.

2, p. 1024, 4th line from bottom: = Var[X₁ + X₂ + X₃ + X₄ + X₅]/5²

2, page 1032, 4th line from bottom, no minus sign: ∫_{-∞}^{c-ε} ε² f(ψₙ) dψₙ + ∫_{c+ε}^{∞} ε² f(ψₙ) dψₙ

2, p. 1045, 3rd paragraph: ∂Σ(α - xᵢ)²/∂α = 2Σ(α - xᵢ),

2, p. 1050, Q. 30.14: estimator of ω.

2, p. 1115, line 5: -1 / (n E[∂² ln f(x)/∂θ²])
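The p. 898 comment E[X₍N₎] = N/(N+1) is the familiar result for the maximum of a sample from a uniform distribution on (0, 1); the errata do not restate the parent distribution, so the check below assumes that case:

    from scipy.integrate import quad

    N = 7                                                  # illustrative sample size
    mean_max, _ = quad(lambda x: x * N * x**(N - 1), 0, 1) # E[X_(N)] for a uniform(0,1) sample
    print(mean_max, N / (N + 1))                           # both equal 7/8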

2, p. 1117, 6 lines from the bottom: 1 / E[(∂ ln L/∂θ)²]

2, p. 1117, 3 lines from the bottom: -h′(θ)² / (n E[∂² ln f(x)/∂θ²])

2, p. 1138, line 6: -3σ²/σ⁴ + 1/σ²

2, p. 1139, 3rd line from the bottom: 0.03 / {(9.5)(0.0002)}

2, p. 1141, Q. 32.6: D. at least 0.00014 but less than 0.00015. E. at least 0.00015

2, p. 1185, line 8: C(x) = -ln(θ + x)

2, p. 1188, line 13: k_r = d k_{r-1} / dψ

2, p. 1193, sol. 34.5: (x+r) ln(1 + β). D(β) = -r ln(1 + β).

2, p. 1198, line 10: -1 / (n E[∂² ln f(x)/∂θ²])

2, p. 1239: -1 / (n E[∂² ln f(x)/∂θ²])

2, p. 1239: Variance of Estimated Single Parameters, Maximum Likelihood (Section 32)

3, p. 130, first line: = 60.018 {(0.02996 - 0.02097) - (0.00400 - 0.00210)}

3, p. 171, sol. 4.48-4.49, first line: λ > 0

3, p. 184, last line: [-e^{-t} - t e^{-t}] evaluated from t = 0 to t = x
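The expression -1/(n E[∂² ln f(x)/∂θ²]) that recurs in these corrections is the usual asymptotic variance of a maximum likelihood estimator; the sketch below works it out symbolically for an Exponential density with mean θ, which is only an illustrative choice and not an example taken from the manual:

    import sympy as sp

    x, theta, n = sp.symbols('x theta n', positive=True)
    f = sp.exp(-x / theta) / theta                       # illustrative density: Exponential with mean theta
    d2 = sp.diff(sp.log(f), theta, 2)                    # second partial of ln f with respect to theta
    expected = sp.integrate(d2 * f, (x, 0, sp.oo))       # E[d^2 ln f / d theta^2] = -1/theta^2
    print(sp.simplify(-1 / (n * expected)))              # theta^2 / n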

3, p. 186, line 5: n! θ^{n+1} {1 - Σ_{i=0}^{n} (x/θ)^i e^{-x/θ} / i!}

3, p. 186, line 5: x > 0

3, p. 186, solution to the first exercise: = {Γ(α+1)/Γ(α)} θ

3, p. 194, 5 lines from the bottom: = ∫_0^∞ (λⁿ e^{-λ}/n!) λ^{α-1} e^{-λ/θ} / {θ^α Γ(α)} dλ

3, p. 200, top of the page: Below, the prior mixed Negative Binomial distribution (M) and the posterior mixed Negative Binomial distribution (P) are compared:

3, p. 241, sol. 6.25: 1/θ =

3, p. 251, sol. 6.63: (6)(0.04) = 0.24.

3, p. 327, first line: f(m) = exp[-(m - 7)²/8] / {2√(2π)}

3, p. 329, last line: 9 + 36/29 = 297/29
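The p. 194 integral mixes a Poisson over a Gamma prior, and p. 200 refers to the resulting mixed Negative Binomial distribution; the sketch below confirms that equivalence numerically for illustrative parameter values (α = 3, θ = 2, 4 claims) that are not taken from the manual:

    import numpy as np
    from scipy import stats
    from scipy.integrate import quad

    alpha, theta, k = 3.0, 2.0, 4                            # illustrative values only
    integrand = lambda lam: stats.poisson.pmf(k, lam) * stats.gamma.pdf(lam, a=alpha, scale=theta)
    mixed, _ = quad(integrand, 0, np.inf)
    nb = stats.nbinom.pmf(k, alpha, 1.0 / (1.0 + theta))     # Negative Binomial with r = alpha, p = 1/(1+theta)
    print(mixed, nb)                                         # the two values agree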

5, page 15, 3 lines from the bottom: (0.34)(4161) + (0.66)(6932) = 5633.46.

5, page 16, next to last paragraph: Thus a whisker extends from 2176, the third quartile, to 4161.

5, page 43, 5 lines from the bottom: s_X = 4.5277

5, page 436, middle of the page: For the first observation, x = 1, µ = β₀ + β₁, and y = 2. For the third observation, x = 4, µ = β₀ + 4β₁, and y = 11.

5, page 89, 5 lines from the bottom: R² = β̂² Σxᵢ² / Σyᵢ²

5, page 91, lines 6 and 7: R² = 1 - RSS/TSS = 1 - (N - k - 1)s²/TSS. R² = k F_{p, n-k-1} / {k F_{p, n-k-1} + (N - k - 1)}.

5, page 124, 5th paragraph: E[R²] = 25%.

5, page 157, sol. 6.2: 25 - 2 = 23 degrees of freedom

5, p. 217: X′X =
10   520     215
520  35,050  8700
215  8700    9425

5, page 289, sol. 10.5: 4 and 18 degrees of freedom

5, page 386, top of page, subscript should be i rather than 2: s√(VIFᵢ) / {s_{Xᵢ}√(N - 1)}.

5, p. 425, Q. 15.21: Determine DFFITS₁₇.

5, p. 458, Q. 16.3: Corr[ε_{t-1}, ε_t] = 1; determine the expected value of the Error Sum of Squares.

5, p. 458, Q. 16.4: If Corr[ε_{t-1}, ε_t] = -1,

5, p. 529, line 10: F = {(1542.667 - 268.667)/2} / {148/(12 - 6)}

6, p. 5, solution to the exercise: Then if a claim is reported at time t, -1 < t < 0, the probability that it is still not settled is 1+t. The time since it was reported is -t. Thus the average time since reporting is:
∫_{-1}^{0} (1 + t)(-t) dt = ∫_{-1}^{0} (-t - t²) dt = 1/2 - 1/3 = 1/6.

6, pages 113 to 120: This is what Dobson and Barnett call Iterative Weighted Least Squares.

6, p. 269: in the last block of the table the labels should be tumor types rather than territories.

7, p. 35, solution 3.6: p₁₀₅ + ₂p₁₀₅ + ₃p₁₀₅ + ₄p₁₀₅ + ₅p₁₀₅ + ₆p₁₀₅ = (l₁₀₆ + l₁₀₇ + l₁₀₈ + l₁₀₉ + l₁₁₀ + 0) / l₁₀₅
= (727 + 292 + 108 + 36 + 11) / 1668 = 0.704 years.

7, p. 54, Q. 5.4: ȧ_x = 9.

8, p. 33, solution 2.16: - Prob[1, 2, 3, and 4 fail by time 15]

8, p. 91, line 10: For minimal cut set C_j, its indicator function is

8, p. 92, line 7: Let C_i be the minimal cut sets for a system,

9, p. 15, sol. 2.2b: (1.0394 + 0.9912 + 1.0126 + 1.0231)/4 = 1.0166.

9, p. 117, sol. 9.6: E. Prob(x_t > 8) = Prob(w_t + 1.1w_{t-1} + 0.8w_{t-2} - 0.3w_{t-3} - 0.4w_{t-4} > 8 - 5 = 3).
E[w_t + 1.1w_{t-1} + 0.8w_{t-2} - 0.3w_{t-3} - 0.4w_{t-4}] = 0.
Var[w_t + 1.1w_{t-1} + 0.8w_{t-2} - 0.3w_{t-3} - 0.4w_{t-4}] = (3)(1² + 1.1² + 0.8² + 0.3² + 0.4²) = 9.3.
Prob(w_t + 1.1w_{t-1} + 0.8w_{t-2} - 0.3w_{t-3} - 0.4w_{t-4} > 3) = 1 - Φ[(3 - 0)/√9.3] = 1 - Φ[0.984] = 16.3%

9, p. 118, sol. 9.10: -0.155 = β₂ / (1 + β₁² + β₂²). Final answer is OK.
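The corrected sol. 9.6 on p. 117 is easy to re-check: with white-noise variance 3, the variance of the moving-average combination is 3(1² + 1.1² + 0.8² + 0.3² + 0.4²) = 9.3, and the tail probability follows from the normal distribution; a minimal sketch:

    from scipy import stats

    var = 3 * (1**2 + 1.1**2 + 0.8**2 + 0.3**2 + 0.4**2)   # 9.3
    prob = stats.norm.sf(3 / var**0.5)                     # 1 - Phi(3 / sqrt(9.3))
    print(var, prob)                                       # about 9.3 and 0.163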

Errata, Mahler Study Aids for Exam S, 2017 HCM, 10/24/17 Page 16 5, p. 529, line 10: F = (1542.667-268.667) / 2 148 / (12-6) 6, p.5, solution to the exercise: Then if a claim is reported at time t, -1 < t < 0, the probability that it is still not settled is: 1+t. The time since it was reported is -t. Thus the average time since reporting is: 0 0 (1+ t) (-t) dt = -t - t 2 dt = 1/2-1/3 = 1/6. -1-1 6, pages 113 to 120: This is what Dobson and Barnett call Iterative Weighted Least Squares. 6, p.269: in last block of the table the labels should be tumor types rather than territories. 7, p. 35, solution 3.6: p 105 + 2 p 105 + 3 p 105 + 4 p 105 + 5 p 105 + 6 p 105 = (l 106 + l 107 + l 108 + l 109 + l 110 + 0) / l 105 = (727 + 292 + 108 + 36 + 11) / 1668 = 0.704 years. 7, p. 54, Q. 5.4: ȧ. x = 9. 8, p. 33, solution 2.16: - Prob[1, 2, 3, and 4 fail by time 15] 8, p. 91. line 10: For minimal cut set C j, its indicator function is 8, p. 92, line 7: Let C i be the minimal cut sets for a system, 9, p.15, sol. 2.2b: (1.0394 + 0.9912 + 1.0126 + 1.0231)/4 = 1.0166. 9, p.117, sol. 9.6: E. Prob(x t > 8) = Prob(w t + 1.1w t-1 + 0.8w t-2-0.3w t-3-0.4w t-4 > 8-5 = 3). E[w t + 1.1w t-1 + 0.8w t-2-0.3w t-3-0.4w t-4 ] = 0. Var[w t + 1.1w t-1 + 0.8w t-2-0.3w t-3-0.4w t-4 ] = (3) (1 2 + 1.1 2 + 0.8 2 + 0.3 2 + 0.4 2 ) = 9.3. Prob(w t + 1.1w t-1 + 0.8w t-2-0.3w t-3-0.4w t-4 > 3) = 1 - Φ[(3-0) / 9.3 ] = 1 - Φ[0.984] = 16.3% 9, p.118, sol. 9.10: -0.155 = β 2 / (1 + β 1 2 + β 2 2 ). Final answer is OK.