Math 181B Homework 1 Solution

1. Write down the likelihood:
\[
L(\lambda) = \prod_{i=1}^n \frac{\lambda^{X_i} e^{-\lambda}}{X_i!} = \frac{\lambda^{\sum X_i} e^{-n\lambda}}{\prod X_i!}
\]

(a) One-sided test: $H_0: \lambda = 1$ vs $H_1: \lambda = 0.1$.

The likelihood ratio:
\[
LR = \frac{L(1)}{L(0.1)} = \frac{1^{\sum X_i} e^{-n}}{0.1^{\sum X_i} e^{-0.1n}} = e^{-0.9n}\, 10^{\sum X_i} = g\Big(\sum X_i\Big),
\]
where
\[
g(t) = e^{-0.9n}\, 10^{t}, \qquad g'(t) = e^{-0.9n}\, 10^{t} \log 10 > 0 \quad \text{when } t \ge 0.
\]
So $g(t)$ is strictly increasing and invertible: $g^{-1}(g(t)) = t$. Therefore we can simplify the rejection region of the likelihood-ratio test, $LR = g(\sum X_i) \le k$, to $\sum X_i \le C = g^{-1}(k)$.

Remark: You may state without proof that $LR$ is a monotone increasing function of $\sum X_i$; the $g(t)$ argument serves to convince you. Often you first come up with the conclusion, then figure out a way to prove it (not necessary here).

Regular solution: Apply the Central Limit Theorem under $H_0$:
\[
Z = n^{-1/2} \sum_{i=1}^n (X_i - 1) \xrightarrow{H_0} N(0, 1).
\]
Obviously $\sum X_i = h(Z) = n + \sqrt{n}\, Z$, and $h(Z)$ is a strictly increasing function. The rejection region is then equivalent to $Z \le C' = h^{-1}(C) = h^{-1}(g^{-1}(k))$. To achieve level $\alpha$, we want $P_{H_0}(Z \le C') = \alpha$. This is the definition of $C'$ being the $\alpha$ quantile of $N(0,1)$, usually written as $-z_{1-\alpha} = z_{\alpha}$.

Extra credit: $\sum X_i \overset{H_0}{\sim} \mathrm{Poisson}(n)$. To achieve level $\alpha$, we want $P_{H_0}(\sum X_i \le C) = \alpha$. This is the definition of $C$ being the $\alpha$ quantile of $\mathrm{Poisson}(n)$.

(b) Two-sided test: $H_0: \lambda = 1$ vs $H_1: \lambda \ne 1$.

MLE: $\hat\lambda = \bar X$. The likelihood ratio:
\[
LR = \frac{L(1)}{L(\bar X)} = \frac{1^{\sum X_i} e^{-n}}{\bar X^{\sum X_i} e^{-n\bar X}} = e^{n(\bar X - 1)}\, \bar X^{-n\bar X} = e^{g(\bar X)},
\qquad g(t) = n(t - 1) - n t \log t.
\]

Regular solution: Apply the asymptotic theory for the generalized likelihood-ratio test:
\[
-2 \log LR = 2\big[\, n \bar X \log \bar X - n(\bar X - 1) \,\big] \xrightarrow{H_0} \chi^2_1.
\]
The rejection region is $-2\log LR \ge C$. To achieve level $\alpha$, we want $P_{H_0}(-2\log LR \ge C) = \alpha$. This is the definition of $C$ being the $1-\alpha$ quantile of $\chi^2_1$, usually written as $\chi^2_{1,\alpha}$.
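The two cutoffs in part (a) can be computed directly. The sketch below (not part of the original solution; the helper names `poisson_cutoff` and `normal_cutoff` are illustrative, and it assumes SciPy is available) finds the exact extra-credit cutoff from the $\mathrm{Poisson}(n)$ quantile and compares it with the CLT cutoff $n + \sqrt{n}\, z_\alpha$ on the $\sum X_i$ scale.

```python
# Illustrative sketch: level-alpha cutoffs for the one-sided Poisson test
# of H0: lambda = 1 vs H1: lambda = 0.1, rejecting when sum(X_i) <= C.
from scipy import stats

def poisson_cutoff(n, alpha):
    """Exact cutoff: the largest integer C with P_{H0}(sum X_i <= C) <= alpha,
    where sum X_i ~ Poisson(n) under H0 (the extra-credit solution)."""
    # ppf returns the smallest k with CDF(k) >= alpha; step down once if
    # that k already exceeds the target level.
    C = int(stats.poisson.ppf(alpha, mu=n))
    if stats.poisson.cdf(C, mu=n) > alpha:
        C -= 1
    return C

def normal_cutoff(n, alpha):
    """CLT-based cutoff on the sum(X_i) scale: n + sqrt(n) * z_alpha,
    where z_alpha is the alpha quantile of N(0, 1) (the regular solution)."""
    return n + n**0.5 * stats.norm.ppf(alpha)

n, alpha = 100, 0.05
# Both cutoffs sit below n = E[sum X_i | H0], and the two should be close.
print(poisson_cutoff(n, alpha), round(normal_cutoff(n, alpha), 2))
```

With the exact cutoff the achieved size is at most $\alpha$ (Poisson is discrete, so exactly $\alpha$ is generally unattainable); the CLT cutoff only controls the size asymptotically.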
Extra credit:
\[
g'(t) = -n \log t \;
\begin{cases}
> 0 & t < 1 \\
< 0 & t > 1
\end{cases}
\]
$LR$ is first increasing then decreasing with respect to $\bar X$. Therefore the rejection region $\{LR \le k\}$ is equivalent to $\{\bar X \le a\} \cup \{\bar X \ge b\}$ for some pair $a < 1 < b$ satisfying $g(a) = g(b)$. The distribution of $\bar X$ under $H_0$ was studied in part (a).

2. Write down the likelihood:
\[
L(p) = p^{X} (1-p)^{n-X}
\]
MLE: $\hat p = X/n$. The likelihood ratio:
\[
LR = \frac{L(0.5)}{L(X/n)} = \frac{0.5^{n}}{(X/n)^{X} (1 - X/n)^{n-X}} = e^{\,n\log 0.5 \,-\, X \log X \,-\, (n-X)\log(n-X) \,+\, n \log n} = e^{g(X)},
\]
\[
g'(x) = -\log x + \log(n - x) \;
\begin{cases}
> 0 & x < n/2 \\
< 0 & x > n/2
\end{cases}
\]
$LR$ is first increasing then decreasing with respect to $X$. Therefore the rejection region $\{LR \le k\}$ is equivalent to $\{X \le a\} \cup \{X \ge b\}$ for some pair $a < n/2 < b$ satisfying $g(a) = g(b)$.

Extra credit: Notice $g(n - x) = g(x)$, so we may set $b = n - a$. The rejection region is then $\{X \le a\} \cup \{X \ge n - a\} = \{\,|X - n/2| \ge C = n/2 - a\,\}$. Finally, figure out $a$ from the distribution of $X$ under $H_0$.

Regular solution: Normal approximation
\[
Z = \frac{X - n/2}{\sqrt{n/4}} = \frac{2(X - n/2)}{\sqrt{n}} \xrightarrow{H_0} N(0, 1).
\]
The rejection region is equivalent to $|Z| > C$. To achieve level $\alpha$, we want $P_{H_0}(|Z| > C) = \alpha$. $N(0,1)$ is symmetric; therefore $P_{H_0}(Z > C) = \alpha/2$. This is the definition of $C$ being the $1 - \alpha/2$ quantile of $N(0,1)$, usually written as $z_{\alpha/2}$.

Extra credit: $X \overset{H_0}{\sim} \mathrm{Binom}(n, 0.5)$. To achieve level $\alpha$, we want $P_{H_0}(|X - n/2| \ge n/2 - a) = \alpha$. $\mathrm{Binom}(n, 0.5)$ is also symmetric; therefore $P_{H_0}(X \le a) = \alpha/2$. This is the definition of $a$ being the $\alpha/2$ quantile of $\mathrm{Binom}(n, 0.5)$.

3. Write down the likelihood:
\[
L(\mu, \sigma^2) = \prod_{i=1}^n \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{(X_i - \mu)^2}{2\sigma^2}} = (2\pi\sigma^2)^{-n/2}\, e^{-\frac{1}{2\sigma^2}\sum (X_i - \mu)^2}
\]
Log-likelihood:
\[
\ell(\mu, \sigma^2) = -\frac{n}{2}\log(\sigma^2) - \frac{n}{2}\log(2\pi) - \frac{1}{2\sigma^2}\sum (X_i - \mu)^2
\]

(a) Regular solution: First simplify the null as $H_0: \sigma^2 = \sigma_0^2$. The full parameter space is now $\sigma^2 \ge \sigma_0^2$.
MLE: First take the partial derivative with respect to the unconstrained parameter $\mu$:
\[
\frac{\partial \ell}{\partial \mu} = \frac{n}{\sigma^2}(\bar X - \mu)
\]
The solution of this equation is always $\hat\mu = \bar X$ for any $\sigma^2 > 0$. Therefore $\hat\mu = \bar X$ is the MLE in both $\Theta_0$ and $\Theta_0 \cup \Theta_1$. The MLE for $\sigma^2$ is then the maximizer of the profile likelihood:
\[
\ell^*(\sigma^2) = \ell(\bar X, \sigma^2) = -\frac{n}{2}\log(\sigma^2) - \frac{n}{2}\log(2\pi) - \frac{1}{2\sigma^2}\sum (X_i - \bar X)^2
\]
Under $H_0$, $\sigma^2 = \sigma_0^2$. Under $H_0 + H_1$, compute the derivative of $\ell^*$:
\[
\frac{d\ell^*}{d\sigma^2} = -\frac{n}{2\sigma^2} + \frac{1}{2\sigma^4}\sum (X_i - \bar X)^2 = \frac{n}{2\sigma^4}\Big\{ n^{-1}\sum (X_i - \bar X)^2 - \sigma^2 \Big\}
\]
Denote the unconstrained MLE
\[
\hat\sigma^2 = n^{-1}\sum (X_i - \bar X)^2.
\]
$\hat\sigma^2$ depends on the observed data, so whether $\hat\sigma^2$ belongs to the full parameter space $[\sigma_0^2, +\infty)$ is random. We have to discuss the MLE in two cases.

Case 1: If $\hat\sigma^2 \ge \sigma_0^2$, then the MLE is the unconstrained MLE $\hat\sigma^2$. The likelihood ratio is
\[
LR = \frac{L(\bar X, \sigma_0^2)}{L(\bar X, \hat\sigma^2)} = \Big(\frac{\sigma_0^2}{\hat\sigma^2}\Big)^{-n/2} \exp\Big( -\frac{1}{2\sigma_0^2}\sum (X_i - \bar X)^2 + \frac{1}{2\hat\sigma^2}\sum (X_i - \bar X)^2 \Big) = \Big(\frac{\sigma_0^2}{\hat\sigma^2}\Big)^{-n/2} \exp\Big( -\frac{n\hat\sigma^2}{2\sigma_0^2} + \frac{n}{2} \Big) = e^{g(\hat\sigma^2/\sigma_0^2)},
\]
where
\[
g(t) = \frac{n}{2}\log t - \frac{n}{2}t + \frac{n}{2}, \quad t \ge 1, \qquad g'(t) = \frac{n}{2t} - \frac{n}{2} \le 0, \quad t \ge 1.
\]
The rejection region $\{LR \le k\}$ in case 1 is thus equivalent to $\{\hat\sigma^2 \ge \sigma_0^2\} \cap \{\hat\sigma^2/\sigma_0^2 \ge C\}$.

Case 2: If $\hat\sigma^2 < \sigma_0^2$, then by the derivative of the profile log-likelihood,
\[
\frac{d\ell^*}{d\sigma^2} < 0, \qquad \sigma^2 > \sigma_0^2 > \hat\sigma^2,
\]
$\ell^*(\sigma^2)$ is strictly decreasing over the parameter space $[\sigma_0^2, +\infty)$. The maximizer is then the left boundary $\sigma_0^2$. The likelihood ratio is
\[
LR = \frac{L(\bar X, \sigma_0^2)}{L(\bar X, \sigma_0^2)} = 1.
\]
If $\{LR = 1\}$ belongs to the rejection region defined by $\{LR \le k\}$, we must have $k \ge 1$. However, $LR$ is always no greater than $1$, so the type I error is $P_{H_0}(LR \le k) \ge P_{H_0}(LR \le 1) = 1 > \alpha$. Therefore, to achieve any level $\alpha < 1$, $H_0$ is always accepted in case 2.

In summary, the rejection region is $\{\, n\hat\sigma^2/\sigma_0^2 \ge \max(n, C) \,\}$ (absorbing constants into $C$). Under $H_0$,
\[
\frac{n\hat\sigma^2}{\sigma_0^2} = \frac{1}{\sigma_0^2}\sum (X_i - \bar X)^2 \sim \chi^2_{n-1}.
\]
To achieve a level-$\alpha$ test, we want
\[
P_{H_0}\Big( \frac{n\hat\sigma^2}{\sigma_0^2} \ge \max(n, C) \Big) = \alpha.
\]
This is the definition of $\max(n, C)$ being the $1-\alpha$ quantile of $\chi^2_{n-1}$, usually written as $\chi^2_{n-1,\alpha}$. Note the mean of $\chi^2_{n-1}$ is $n-1$; thus its $1-\alpha$ quantile is greater than $n$ when $\alpha$ is small.

(b) Extra-credit solution: Do not do the simplification; keep the composite null $H_0: \sigma^2 \le \sigma_0^2$.

MLE: First take the partial derivative with respect to the unconstrained parameter $\mu$:
\[
\frac{\partial \ell}{\partial \mu} = \frac{n}{\sigma^2}(\bar X - \mu)
\]
The solution of this equation is always $\hat\mu = \bar X$ for any $\sigma^2 > 0$. Therefore $\hat\mu = \bar X$ is the MLE in both $\Theta_0$ and $\Theta_0 \cup \Theta_1$. The MLE for $\sigma^2$ is then the maximizer of the profile likelihood:
\[
\ell^*(\sigma^2) = \ell(\bar X, \sigma^2) = -\frac{n}{2}\log(\sigma^2) - \frac{n}{2}\log(2\pi) - \frac{1}{2\sigma^2}\sum (X_i - \bar X)^2
\]
Under $H_0 + H_1$, $\hat\sigma^2 = n^{-1}\sum_{i=1}^n (X_i - \bar X)^2$ is the unconstrained MLE. Under $H_0$, compute the derivative of $\ell^*$:
\[
\frac{d\ell^*}{d\sigma^2} = -\frac{n}{2\sigma^2} + \frac{1}{2\sigma^4}\sum (X_i - \bar X)^2 = \frac{n}{2\sigma^4}\Big\{ n^{-1}\sum (X_i - \bar X)^2 - \sigma^2 \Big\}
\]
$\hat\sigma^2$ depends on the observed data, so whether $\hat\sigma^2$ belongs to the null parameter space $(0, \sigma_0^2]$ is random. We have to discuss the MLE in two cases.
Case 1: If $\hat\sigma^2 \ge \sigma_0^2$, then by the derivative of the profile log-likelihood,
\[
\frac{d\ell^*}{d\sigma^2} > 0, \qquad \sigma^2 \le \sigma_0^2 < \hat\sigma^2,
\]
$\ell^*(\sigma^2)$ is strictly increasing over the parameter space $(0, \sigma_0^2]$. The maximizer is then the right boundary $\sigma_0^2$. The likelihood ratio is
\[
LR = \frac{L(\bar X, \sigma_0^2)}{L(\bar X, \hat\sigma^2)} = \Big(\frac{\sigma_0^2}{\hat\sigma^2}\Big)^{-n/2} \exp\Big( -\frac{1}{2\sigma_0^2}\sum (X_i - \bar X)^2 + \frac{1}{2\hat\sigma^2}\sum (X_i - \bar X)^2 \Big) = \Big(\frac{\sigma_0^2}{\hat\sigma^2}\Big)^{-n/2} \exp\Big( -\frac{n\hat\sigma^2}{2\sigma_0^2} + \frac{n}{2} \Big) = e^{g(\hat\sigma^2/\sigma_0^2)},
\]
where
\[
g(t) = \frac{n}{2}\log t - \frac{n}{2}t + \frac{n}{2}, \quad t \ge 1, \qquad g'(t) = \frac{n}{2t} - \frac{n}{2} \le 0, \quad t \ge 1.
\]
The rejection region $\{LR \le k\}$ in case 1 is thus equivalent to $\{\hat\sigma^2 \ge \sigma_0^2\} \cap \{\hat\sigma^2/\sigma_0^2 \ge C\}$.

Case 2: If $\hat\sigma^2 < \sigma_0^2$, then the MLE is the unconstrained MLE $\hat\sigma^2$. The likelihood ratio is
\[
LR = \frac{L(\bar X, \hat\sigma^2)}{L(\bar X, \hat\sigma^2)} = 1.
\]
If $\{LR = 1\}$ belongs to the rejection region defined by $\{LR \le k\}$, we must have $k \ge 1$. However, $LR$ is always no greater than $1$, so the type I error is $P_{H_0}(LR \le k) \ge P_{H_0}(LR \le 1) = 1 > \alpha$. Therefore, to achieve any level $\alpha < 1$, $H_0$ is always accepted in case 2.

In summary, the rejection region is $\{\, n\hat\sigma^2/\sigma_0^2 \ge \max(n, C) \,\}$. Under $H_0$,
\[
\frac{n\hat\sigma^2}{\sigma^2} = \frac{1}{\sigma^2}\sum (X_i - \bar X)^2 \sim \chi^2_{n-1}.
\]
To achieve a level-$\alpha$ test, we want
\[
\max_{\sigma^2 \le \sigma_0^2} P_{\sigma^2}\Big( \frac{n\hat\sigma^2}{\sigma_0^2} \ge \max(n, C) \Big)
= \max_{\sigma^2 \le \sigma_0^2} P_{\sigma^2}\Big( \frac{n\hat\sigma^2}{\sigma^2} \ge \frac{\sigma_0^2}{\sigma^2}\max(n, C) \Big) = \alpha.
\]
For any $\sigma^2$, the statistic $n\hat\sigma^2/\sigma^2$ has the same distribution $\chi^2_{n-1}$. The smallest cutoff $(\sigma_0^2/\sigma^2)\max(n, C)$ is achieved at $\sigma^2 = \sigma_0^2$, which corresponds to the biggest type I error:
\[
\max_{\sigma^2 \le \sigma_0^2} P_{\sigma^2}\Big( \frac{n\hat\sigma^2}{\sigma_0^2} \ge \max(n, C) \Big) = P_{\sigma_0^2}\Big( \frac{n\hat\sigma^2}{\sigma_0^2} \ge \max(n, C) \Big) = \alpha.
\]
This is the definition of $\max(n, C)$ being the $1-\alpha$ quantile of $\chi^2_{n-1}$, usually written as $\chi^2_{n-1,\alpha}$. Note the mean of $\chi^2_{n-1}$ is $n-1$; thus its $1-\alpha$ quantile is greater than $n$ when $\alpha$ is small.

4. Extra credit: Do the transformation $Z_i = X_i - Y_i \sim N\big(\mu_X - \mu_Y,\ 2(1-\rho)\sigma^2\big)$. An affine (linear) transformation of a multivariate normal distribution is normal, and its parameters are given by its mean and variance:
\[
EZ = EX - EY = \mu_X - \mu_Y, \qquad \mathrm{Var}\,Z = \mathrm{Var}\,X + \mathrm{Var}\,Y - 2\,\mathrm{Cov}(X, Y) = 2(1-\rho)\sigma^2.
\]
Write $\nu = \mu_X - \mu_Y$, $\tau^2 = 2(1-\rho)\sigma^2$. Assume $\rho \in (-1, 1)$; then $\tau^2 \in (0, 4\sigma^2)$.

Write down the likelihood:
\[
L(\nu, \tau^2) = \prod_{i=1}^n \frac{1}{\tau\sqrt{2\pi}}\, e^{-\frac{(Z_i - \nu)^2}{2\tau^2}} = (2\pi\tau^2)^{-n/2}\, e^{-\frac{1}{2\tau^2}\sum (Z_i - \nu)^2}
\]
Log-likelihood:
\[
\ell(\nu, \tau^2) = -\frac{n}{2}\log(\tau^2) - \frac{n}{2}\log(2\pi) - \frac{1}{2\tau^2}\sum (Z_i - \nu)^2
\]

(a) MLE: Under $H_0$, $\nu = 0$.
\[
\frac{d}{d\tau^2}\,\ell(0, \tau^2) = -\frac{n}{2\tau^2} + \frac{1}{2\tau^4}\sum Z_i^2 = \frac{n}{2\tau^4}\Big\{ n^{-1}\sum Z_i^2 - \tau^2 \Big\}
\]
We have to discuss the MLE for $\tau^2$ in two cases. Remember there is the constraint $\tau^2 < 4\sigma^2$. Denote the unconstrained MLE under $H_0$
\[
\hat\tau_0^2 = n^{-1}\sum Z_i^2.
\]
$\hat\tau_0^2$ depends on the observed data, so whether $\hat\tau_0^2$ belongs to the parameter space $(0, 4\sigma^2)$ is random. Following the same case-wise argument as in the previous problem, you will find the MLE under $H_0$ is $\min(4\sigma^2, \hat\tau_0^2)$.

Under $H_0 + H_1$, first solve $\hat\nu = \bar Z$. Denote the unconstrained MLE
\[
\hat\tau^2 = n^{-1}\sum (Z_i - \bar Z)^2.
\]
Similarly, the constrained MLE for $\tau^2$ is $\min(4\sigma^2, \hat\tau^2)$. Therefore the likelihood-ratio statistic is
\[
\Lambda = \frac{L\big(0, \min(4\sigma^2, \hat\tau_0^2)\big)}{L\big(\bar Z, \min(4\sigma^2, \hat\tau^2)\big)}.
\]
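The statistic $\Lambda$ of part (a) is easy to evaluate numerically. The following sketch (illustrative only; the names `log_lik` and `lambda_stat` are not from the solution, and it assumes NumPy) plugs the two constrained MLEs into the log-likelihood and exponentiates the difference.

```python
# Illustrative sketch: evaluating Lambda for problem 4(a) on observed
# paired differences Z_i, with the constraint tau^2 < 4 * sigma^2.
import numpy as np

def log_lik(z, nu, tau_sq):
    """Normal log-likelihood l(nu, tau^2) of the differences Z_i."""
    n = len(z)
    return (-n / 2 * np.log(tau_sq) - n / 2 * np.log(2 * np.pi)
            - np.sum((z - nu) ** 2) / (2 * tau_sq))

def lambda_stat(z, sigma_sq):
    """Likelihood-ratio statistic Lambda with the constrained MLEs
    min(4*sigma^2, tau0_hat^2) and min(4*sigma^2, tau_hat^2)."""
    n = len(z)
    tau0_hat = np.sum(z ** 2) / n              # unconstrained MLE under H0
    tau_hat = np.sum((z - z.mean()) ** 2) / n  # unconstrained MLE under H0 + H1
    bound = 4 * sigma_sq                       # constraint on tau^2
    log_lam = (log_lik(z, 0.0, min(bound, tau0_hat))
               - log_lik(z, z.mean(), min(bound, tau_hat)))
    return np.exp(log_lam)

rng = np.random.default_rng(1)
z = rng.normal(0.0, 1.0, 40)                   # simulated data with nu = 0
lam = lambda_stat(z, sigma_sq=1.0)
print(lam)  # Lambda always lies in (0, 1]
```

Since the null parameter set is contained in the full one, $\Lambda \le 1$ always holds, whether or not the boundary $4\sigma^2$ is active.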
(b) You may use $-2\log\Lambda \to \chi^2_1$, or follow the large-sample argument below. Under $H_0$, both unconstrained MLEs $\hat\tau_0^2$ and $\hat\tau^2$ converge in probability to the true $\tau^2 \in (0, 4\sigma^2)$ by the Law of Large Numbers. When the sample size is large, we only need to consider the case $\hat\tau^2 < \hat\tau_0^2 < 4\sigma^2$. Then
\[
\Lambda = \frac{L(0, \hat\tau_0^2)}{L(\bar Z, \hat\tau^2)} = \Big(\frac{\hat\tau_0^2}{\hat\tau^2}\Big)^{-n/2}.
\]
It is a monotone decreasing function of $\hat\tau_0^2/\hat\tau^2$. Under $H_0$,
\[
F = \Big(\frac{\hat\tau_0^2}{\hat\tau^2} - 1\Big)(n-1) = \frac{n\bar Z^2}{\sum (Z_i - \bar Z)^2/(n-1)} \sim F_{1, n-1}.
\]
To achieve level $\alpha$, the rejection region is
\[
\frac{n\bar Z^2}{\sum (Z_i - \bar Z)^2/(n-1)} \ge F_{1-\alpha, 1, n-1},
\]
where $F_{1-\alpha,1,n-1}$ is the $1-\alpha$ quantile of $F_{1,n-1}$.

(c) It is equivalent to the paired t-test
\[
|T| = \frac{|\bar Z|}{\sqrt{s^2/n}} \ge t_{\alpha/2, n-1}, \qquad s^2 = \frac{\sum (Z_i - \bar Z)^2}{n-1},
\]
since $T^2 = F$ and $t^2_{\alpha/2, n-1} = F_{1-\alpha, 1, n-1}$. Read Larsen and Marx section 1.

5. Extra credit: Read Larsen and Marx Appendix 9.A.1, page 488.
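The identities behind problem 4(c), $T^2 = F$ and $t^2_{\alpha/2,n-1} = F_{1-\alpha,1,n-1}$, can be verified numerically. This check (not part of the original solution; it assumes SciPy) computes both statistics and both cutoffs on simulated differences.

```python
# Numerical check of the equivalence in problem 4(c): the squared paired
# t statistic equals the F statistic, and the squared t cutoff equals
# the F cutoff.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n = 30
z = rng.normal(0.5, 1.0, n)                   # simulated paired differences

s_sq = np.sum((z - z.mean()) ** 2) / (n - 1)  # sample variance s^2
T = z.mean() / np.sqrt(s_sq / n)              # paired t statistic
F = n * z.mean() ** 2 / s_sq                  # F statistic from part (b)

alpha = 0.05
t_cut = stats.t.ppf(1 - alpha / 2, df=n - 1)       # t_{alpha/2, n-1}
F_cut = stats.f.ppf(1 - alpha, dfn=1, dfd=n - 1)   # F_{1-alpha, 1, n-1}

print(T ** 2 - F, t_cut ** 2 - F_cut)  # both differences are ~ 0
```

The statistic identity is exact algebra; the quantile identity holds because $t_{n-1}^2 \overset{d}{=} F_{1,n-1}$, so the two tests reject on exactly the same event.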