
Chapter 4

HOMEWORK ASSIGNMENTS

These homeworks may be modified as the semester progresses. It is your responsibility to keep up to date with the currently assigned homeworks. There may be some errors in the statements of these problems, due to typographical and conceptual errors on my part. I will add 1% to the first exam scores of all those finding errors, under the following conditions: (i) typos are fixed, stated properly and then worked out; (ii) conceptual errors on my part are explained by you.

4.1 Homework #1

Problem #1 Problem 6.1 in Casella & Berger (p. 280).

Problem #2 Problem 6.6 in Casella & Berger (p. 280).
(a) What are the sufficient statistics for α and β?
(b) If α is known, show that the gamma density is a member of the class of exponential families.
(c) If β is known, is the gamma density a member of the class of exponential families? Why or why not?
(d) With neither α nor β known, is the gamma density a member of the multiple-parameter exponential family? Why or why not?

Problem #3 Suppose that x_1, ..., x_n are fixed constants. Suppose further that Y_i is normally distributed with mean β_0 + β_1 x_i and variance σ².
(a) What are the sufficient statistics for (β_0, β_1, σ²)?

Problem #4 The Rayleigh family has the density f(x, θ) = 2(x/θ²) exp(−x²/θ²), x > 0, θ > 0. Use the fact that this is an exponential family to compute the mean, variance, and 3rd and 4th moments of X², where X is Rayleigh.
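Problem #4 hinges on the fact that if X is Rayleigh(θ), then Y = X² is exponential with mean θ², so E(Yᵏ) = k! θ^{2k}. The sketch below is my own numerical check of that claim (the function names and constants are mine, not part of the assignment); it simulates Rayleigh draws via the inverse CDF and compares Monte Carlo moments with the closed form.

```python
import math
import random

def rayleigh_sample(theta, n, seed=0):
    # Inverse-CDF sampling: F(x) = 1 - exp(-x^2/theta^2), so X = theta*sqrt(-log(1-U)).
    rng = random.Random(seed)
    return [theta * math.sqrt(-math.log(1.0 - rng.random())) for _ in range(n)]

def moments_of_x_squared(theta):
    # Y = X^2 is exponential with mean theta^2, so E(Y^k) = k! * theta^(2k).
    return {k: math.factorial(k) * theta ** (2 * k) for k in (1, 2, 3, 4)}

theta = 2.0
xs = rayleigh_sample(theta, 200_000)
emp = {k: sum(x ** (2 * k) for x in xs) / len(xs) for k in (1, 2)}
exact = moments_of_x_squared(theta)
print(emp[1], exact[1])  # Monte Carlo mean of X^2 vs. theta^2 = 4
```

In particular the variance of X² is θ⁴ and the 3rd and 4th moments are 6θ⁶ and 24θ⁸, which is what the exponential-family differentiation in the problem should reproduce.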

Problem #5 Suppose I have a sample X_1, ..., X_n from the normal distribution with mean θ and variance θ. Let X̄ be the sample mean, and s² be the sample variance. Remember that X̄ and s² are independent.
(a) For any 0 ≤ α ≤ 1, compute the mean and variance of the statistic T(α) = αX̄ + (1 − α)s².
(b) Compute the limiting distribution of T(α), i.e., what does n^{1/2}{T(α) − θ} converge to in distribution?
(c) Is there a unique best value of α as n → ∞?

Problem #6 Suppose that we have a sample X_1, ..., X_n from the density
f(x, θ) = Γ(x + θ) / {x! Γ(θ) 2^{x+θ}}.
Find a minimal sufficient statistic for θ.

Problem #7 Work problem 6.20 in Casella and Berger (page 280). A function T(X) is a complete sufficient statistic if it is sufficient and, whenever E_θ[g{T(X)}] = 0 for all θ, it follows that Pr_θ[g{T(X)} = 0] = 1 for all θ.

4.2 Homework #2

Problem #1 Find the mle of θ in the Rayleigh family of Homework #1.

Problem #2 Find the mle's of (β_0, β_1, σ²) in the linear regression problem of Homework #1.

Problem #3 Suppose that X_1, ..., X_n are a sample with mass function
pr(X = k) = {(k − 2)(k − 1)/2} (1 − θ)^{k−3} θ³.
Find the mle of θ.

Problem #4 Suppose that X_1, ..., X_n are i.i.d. uniform on the interval [θ, θ²], where θ > 1.
(a) Show that a method of moments estimator of θ is
θ̂(MM) = {(8 n^{−1} Σ_{i=1}^n X_i + 1)^{1/2} − 1}/2.
(b) Find the mle for θ.
(c) By combining the central limit theorem and the delta method (Taylor–Slutsky), compute the limiting distribution of θ̂(MM).

Problem #5 Work Problem 7.7 in Casella & Berger (page 332).

Problem #6 Work Problem 7.12 of Casella & Berger (page 333).

Problem #7 Suppose that z_1, ..., z_n are fixed constants, and that the responses Y_1, ..., Y_n are independent and normally distributed with mean z_i β and variance σ² v(z_i), where the v(z_i) are known constants.
(a) Compute the mle of the parameters.
(b) Compute the mean and variance of β̂.

Problem #8 Suppose that z_1, ..., z_n are fixed constants, and that the responses Y_1, ..., Y_n are independently distributed according to a gamma distribution with mean exp(z_i β) and variance σ² exp(2 z_i β).
(a) It turns out that there is a function ψ(Y, z, β) such that the mle for β solves Σ_{i=1}^n ψ(Y_i, z_i, β) = 0. What is ψ(·)?
(b) What is the mle for σ²?
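The estimator in Problem #4(a) comes from matching the sample mean to E(X) = (θ + θ²)/2 and solving the resulting quadratic for the root exceeding 1. The following is a quick sanity-check sketch of my own (not the requested derivation; the true value and sample size are arbitrary):

```python
import random

def theta_mm(xs):
    # Solve (theta + theta^2)/2 = xbar for theta > 1:
    # theta^2 + theta - 2*xbar = 0  =>  theta = (sqrt(1 + 8*xbar) - 1)/2.
    xbar = sum(xs) / len(xs)
    return ((1 + 8 * xbar) ** 0.5 - 1) / 2

theta = 3.0  # true value; data are Uniform[theta, theta^2] = Uniform[3, 9]
rng = random.Random(1)
xs = [rng.uniform(theta, theta ** 2) for _ in range(100_000)]
est = theta_mm(xs)
print(est)  # consistent: close to 3 for large n
```

Plugging X̄ = 6 (the population mean) into the formula returns exactly 3, which is the algebraic identity part (a) asks you to show.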

4.3 Homework #3

Problem #1 If X ~ Poisson(θ), show that X is UMVUE for θ.

Problem #2 If X ~ Binomial(n, θ), show that there exists no unbiased estimator of the odds ratio g(θ) = θ/(1 − θ). HINT: Suppose there does exist an S(X) which is unbiased. Write out E_θ{S(X)} and then find a contradiction.

Problem #3 Suppose that X has the mass function Pr(X = k | θ) = θ(1 − θ)^k, k = 0, 1, 2, .... Find the mle for θ from a sample of size n, and discuss its properties, namely:
(a) mean
(b) variance
(c) is it UMVUE?

Problem #4 Suppose that (z_1, ..., z_n) are fixed constants, and that for i = 1, ..., n, X_i is normally distributed with mean z_i and variance θ z_i². Find the mle for θ from a sample of size n, and discuss its properties, namely:
(a) mean
(b) variance
(c) is it UMVUE?
HINT: If Z ~ Normal(0, 1), E(Z³) = 0 and E(Z⁴) = 3.

Problem #5 Work problem 7.56 in Casella & Berger (page 341).
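For Problem #3, the log-likelihood of a sample k_1, ..., k_n is n log θ + (Σ_i k_i) log(1 − θ), and setting the score to zero gives θ̂ = 1/(1 + X̄). The sketch below is my own cross-check of that closed form against a brute-force grid search (the data are made up for illustration):

```python
import math

def loglik(theta, ks):
    # Log-likelihood for Pr(X = k) = theta*(1-theta)^k, k = 0, 1, 2, ...
    n, s = len(ks), sum(ks)
    return n * math.log(theta) + s * math.log(1 - theta)

def theta_mle(ks):
    # Score: n/theta - sum(k)/(1-theta) = 0  =>  theta = 1/(1 + kbar).
    return 1.0 / (1.0 + sum(ks) / len(ks))

ks = [0, 2, 1, 4, 0, 3, 1, 1]
closed = theta_mle(ks)
grid = max((i / 1000 for i in range(1, 1000)), key=lambda t: loglik(t, ks))
print(closed, grid)  # the grid search agrees to within the grid spacing
```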

4.4 Homework #4

Problem #1 Find the Fisher information for the Rayleigh family.

Problem #2 If X_1, ..., X_n are i.i.d. and normally distributed with mean and variance both equal to θ, find the mle and the Fisher information for θ.

Problem #3 Let X be Poisson(λ_x) and let Y be independent of X and distributed as Poisson(λ_y). Define θ = λ_x/(λ_x + λ_y) and ξ = λ_x + λ_y.
(a) Suppose that θ is known. Show that T = X + Y is sufficient for ξ.
(b) Compute the conditional distribution of X given T.
(c) Conditioning on T, find the UMVUE for θ. I want you to show that this is really a conditional UMVUE, so I want you to cite theorems from class to justify your steps.

Problem #4 Suppose I have a sample X_1, ..., X_n from the normal distribution with mean θ and variance θ². Let X̄ be the sample mean, and s² be the sample variance.
(a) For any 0 ≤ α ≤ 1, compute the mean and variance of the statistic T(α) = αX̄² + (1 − α)s².
(b) Compute the limiting distribution of T(α).

Problem #5 Work Problem 7.55(a) in Casella & Berger (p. 340). Hint #1: An unbiased estimator is I(X_1 = 0), where I(·) is the indicator function. Hint #2: what is the distribution of X_1 given the sufficient statistic?

Problem #6 Suppose that (z_1, ..., z_n) are fixed constants, and that for i = 1, ..., n, X_i is normally distributed with mean z_i and variance θ z_i². Find the mle for θ from a sample of size n. Does the mle achieve the Fisher information bound? Do this in two ways:
(a) by direct calculation
(b) by using properties of OPEFs.

Problem #7 Suppose that X_1, ..., X_n follow the Weibull model with density f(x | λ, κ) = κλ(λx)^{κ−1} exp{−(λx)^κ}.
(a) What equations must be solved to compute the mle?
(b) Show that the mle of (λ, κ) is unique.
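For Problem #7(a), differentiating the Weibull log-likelihood gives λ̂^κ = n/Σ x_i^κ for fixed κ, and substituting back leaves a single profile equation in κ. The sketch below (my own illustration, assuming this profile form is the intended system; the data are arbitrary) solves it by bisection using only the standard library:

```python
import math

def kappa_profile(xs, kappa):
    # g(kappa) = sum(x^k ln x)/sum(x^k) - 1/kappa - mean(ln x); its root is the mle of kappa.
    sk = sum(x ** kappa for x in xs)
    skl = sum(x ** kappa * math.log(x) for x in xs)
    return skl / sk - 1.0 / kappa - sum(math.log(x) for x in xs) / len(xs)

def weibull_mle(xs, lo=1e-3, hi=100.0, tol=1e-10):
    # Bisection for kappa (the profile function is increasing),
    # then lambda from lambda^kappa = n / sum(x^kappa).
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if kappa_profile(xs, mid) > 0:
            hi = mid
        else:
            lo = mid
        if hi - lo < tol:
            break
    kappa = 0.5 * (lo + hi)
    lam = (len(xs) / sum(x ** kappa for x in xs)) ** (1.0 / kappa)
    return lam, kappa

xs = [0.5, 1.2, 0.8, 2.1, 1.7, 0.9, 1.4, 0.6]
lam, kappa = weibull_mle(xs)
print(lam, kappa)
```

The uniqueness asked for in part (b) corresponds to the monotonicity of the profile function that the bisection relies on.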

4.5 Homework #5

Problem #1 In the Rayleigh family, show directly using the weak law of large numbers that the mle is consistent. Also show it is consistent using the general theory from class about consistency of mle's in exponential families.

Problem #2 What is the asymptotic limit distribution of the mle in the Rayleigh family?

Problem #3 Let X_1, ..., X_n be i.i.d. negative exponential with mean θ.
(a) Find the mle for θ.
(b) Find the mle for pr_θ(X > t_0).
(c) Prove that the mle for pr_θ(X > t_0) is consistent.
(d) Compute the limit distribution for the mle of pr_θ(X > t_0).

Problem #4 Let X_1, ..., X_n be i.i.d. Poisson with mean θ. Its moment generating function is known to be E{exp(tX)} = exp[θ{exp(t) − 1}].
(a) Show that E(X − θ)² = θ, E(X − θ)³ = θ and E(X − θ)⁴ = θ + 3θ². I may have made an error here, so correct it if I have.
(b) Compute the limiting distribution for the mle of θ.
(c) The sample variance s² is unbiased for θ. Compute its limiting distribution.
(d) Compare the limiting variances you found in parts (b) and (c).

Problem #5 Let X_1, ..., X_n be i.i.d. from a one-parameter exponential family in canonical form, with the density function p(x | θ) = S(x) exp{θx + d(θ)}.
(a) Show that if the mle exists, it must satisfy X̄ = E_θ(X) evaluated at θ = θ̂.
(b) Cite a theorem from class showing that the mle must be consistent.

Problem #6 Suppose that X_1, ..., X_n follow the Weibull model with density f(x | λ, κ) = κλ(λx)^{κ−1} exp{−(λx)^κ}.
(a) Suppose that κ is known. What is the limit distribution for the mle of λ?
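Problem #4(d) can be previewed by simulation: using μ₄ = θ + 3θ² (the value part (a) asks you to verify), the limiting variance of √n(X̄ − θ) is θ, while that of √n(s² − θ) is μ₄ − θ² = θ + 2θ². A rough Monte Carlo sketch of my own (the sampler, seed, and constants are assumptions made for illustration):

```python
import math
import random

def poisson_sample(theta, n, rng):
    # Knuth's multiplication method for Poisson variates (fine for small theta).
    out = []
    for _ in range(n):
        limit, k, p = math.exp(-theta), 0, 1.0
        while True:
            p *= rng.random()
            if p <= limit:
                break
            k += 1
        out.append(k)
    return out

def sample_var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

theta, n, reps = 2.0, 50, 4000
rng = random.Random(7)
mles = []
s2s = []
for _ in range(reps):
    xs = poisson_sample(theta, n, rng)
    mles.append(sum(xs) / n)
    s2s.append(sample_var(xs))
print(n * sample_var(mles))  # near theta = 2
print(n * sample_var(s2s))   # near theta + 2*theta^2 = 10
```

The gap between the two scaled variances is the efficiency loss from estimating θ by s² rather than by the mle X̄.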

Problem #7 In many problems, time-to-event data would naturally be modeled via a negative exponential density. However, in some of these problems there is the worry that, with some probability, the event will never occur. Such a model has the distribution (not density) function
F(x, θ, κ) = (1 − κ){1 − exp(−x/θ)}, for 0 ≤ x < ∞.
Note that the value x = ∞ has positive probability κ. This model is not in the form of an exponential family, and in fact the data do not even have a density function.
(a) Interpret κ as a cure rate.
(b) Show that the likelihood function for this model is κ^{I(x=∞)} {(1 − κ) exp(−x/θ)/θ}^{I(x<∞)}.
(c) Show that E(X) = ∞ and hence the standard method of moments will not work.
(d) Compute the mle for κ and θ.
(e) Compute the limit distribution for the mle of κ.

4.6 Homework #6

Problem #1 Suppose that, given θ, X is Poisson with mean θ. Let θ have a negative exponential prior distribution with mean θ_0. Let the loss function be L(θ, t) = (t − θ)²/θ.
(a) Show that the posterior distribution of θ is a gamma random variable.
(b) What is the Bayes estimator of θ? Hint: you have been told a characterization of Bayes estimators in terms of minimizing a certain function. You should try to do this minimization explicitly here.

Problem #2 Let X be Binomial(n, θ_1) and let Y be Binomial(n, θ_2). Suppose the loss function is L(θ_1, θ_2, t) = (θ_1 − θ_2 − t)². Let θ_1 and θ_2 have independent beta prior distributions with parameters (α, β). Find the Bayes estimator for this loss function.

Problem #3 Work problem 7.24 in Casella & Berger (p. 335).

Problem #4 Let X_1, ..., X_n be i.i.d. Normal(0, variance = θ). Suppose I am interested only in the special class of estimators of θ defined by
F = {T : T_n(m) = (n + m)^{−1} Σ_{i=1}^n X_i²}.
Suppose that the loss function is L(t, θ) = θ^{−2}(t − θ)².
(a) In this class of estimators, which values of m, if any, yield an admissible estimator?
(b) Is m = 0 minimax?
(c) Answer (a) if the loss function is changed to L(t, θ) = θ^{−1}(t − θ)².
(d) What is the asymptotic limiting distribution of the mle, in terms of derivatives of the function d(θ)?

Problem #5 One of the more difficult aspects of Bayesian inference done by frequentists is to find a noninformative prior. The Jeffreys prior is the one in which the prior density is proportional to the square root of the Fisher information. Suppose that X_1, ..., X_n are independent and identically distributed Bernoulli(θ).
(a) Find the Jeffreys prior for this model.
(b) Interpret the Jeffreys prior as a uniform prior for arcsin(√θ).

Problem #6 One criticism of the use of beta priors for Bernoulli sampling is that they are unimodal. Thus, various people have proposed the use of a mixture-of-betas prior, namely
π(θ) = ε g_B(θ | a, b) + (1 − ε) g_B(θ | c, d),
where g_B(θ | a, b) is the Beta(a, b) density. Show that this prior is conjugate for Bernoulli sampling.
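The conjugacy claimed in Problem #6 can be seen numerically: after s successes in n Bernoulli trials, the posterior is again a two-component beta mixture whose weights are reweighted by the component marginal likelihoods (ratios of beta functions). The sketch below is my illustration, not the requested proof; the parameter values are arbitrary. It compares the closed-form mixture posterior with brute-force grid normalization of prior × likelihood:

```python
import math

def beta_pdf(t, a, b):
    # Beta(a, b) density via gamma functions (standard library only).
    B = math.gamma(a) * math.gamma(b) / math.gamma(a + b)
    return t ** (a - 1) * (1 - t) ** (b - 1) / B

def mixture_posterior(eps, a, b, c, d, s, n):
    # Each component's marginal likelihood is B(a+s, b+n-s)/B(a, b);
    # the posterior is a beta mixture with correspondingly reweighted components.
    B = lambda p, q: math.gamma(p) * math.gamma(q) / math.gamma(p + q)
    w1 = eps * B(a + s, b + n - s) / B(a, b)
    w2 = (1 - eps) * B(c + s, d + n - s) / B(c, d)
    w1, w2 = w1 / (w1 + w2), w2 / (w1 + w2)
    return lambda t: w1 * beta_pdf(t, a + s, b + n - s) + w2 * beta_pdf(t, c + s, d + n - s)

eps, a, b, c, d, s, n = 0.3, 2, 8, 8, 2, 7, 10
post = mixture_posterior(eps, a, b, c, d, s, n)
grid = [(i + 0.5) / 2000 for i in range(2000)]
prior = lambda t: eps * beta_pdf(t, a, b) + (1 - eps) * beta_pdf(t, c, d)
unnorm = [prior(t) * t ** s * (1 - t) ** (n - s) for t in grid]
Z = sum(unnorm) / 2000  # midpoint-rule normalizing constant
brute = [u / Z for u in unnorm]
closed = [post(t) for t in grid]
maxdiff = max(abs(x - y) for x, y in zip(brute, closed))
print(maxdiff)  # agreement up to grid error
```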

Problem #7 Suppose that X_1, ..., X_n are i.i.d. with a negative exponential distribution with mean 1/θ.
(a) Find the Jeffreys prior for θ.
(b) Compute the posterior distribution for θ.
(c) Compute the posterior distribution for λ = 1/θ.
(d) Discuss computing the posterior mean and mode for λ.

4.7 Homework #7

Problem #1 If X_1, ..., X_n are i.i.d. normal with mean θ and variance 1.0, consider testing the hypothesis H_0: θ ≤ 0 against the alternative H_1: θ > 0. What is the power function of the UMP level-α test?

Problem #2 In Problem #1, suppose that θ has a prior normal distribution with mean 0.0 and variance σ². Consider the 0–1 loss function discussed in class, i.e., the loss is zero if a correct decision is made, and the loss is one otherwise. What is the Bayes procedure for this problem?

Problem #3 Let X_1, ..., X_n be i.i.d. with a common density p(x | θ) = exp{−(x − θ)}, x ≥ θ. Let U = min(X_1, ..., X_n).
(a) Show that U and U − (1/n) are (respectively) an mle and a UMVUE for θ.
(b) In testing H_0: θ ≤ θ_0 against H_1: θ > θ_0 at level α, show that the UMP level-α test is of the form: reject H_0 when U > c.
(c) In part (b), express c as a function of θ_0 and α.
(d) In parts (b)–(c), what is the power function for the test?

Problem #4 Suppose that X_1, ..., X_n are a sample with mass function
pr(X = k) = {(k − 2)(k − 1)/2} (1 − θ)^{k−3} θ³.
(a) If we wish to test H_0: θ ≤ θ_0 against H_1: θ > θ_0, find the form of the UMP test.
(b) What is the Fisher information and the asymptotic distribution of the mle here?

Problem #5 Let X be a Binomial random variable based on a sample of size n = 10 with success probability θ. Let S = |X − 5|, and suppose this is all that is observed, i.e., I only observe S, and I cannot observe X. Consider testing H_0: θ ≤ 1/3 or θ ≥ 2/3 against H_1: θ = 1/2. Suppose I use the test which rejects H_0 when S = 0 or S = 1.
(a) What is the distribution of S?
(b) Find the level of this test. Remember to consider carefully what "level" means with this composite hypothesis.
(c) Is the test UMP of its level? Why or why not?
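For Problem #1, the UMP level-α test rejects when √n X̄ > z_{1−α}, so its power at θ is β(θ) = 1 − Φ(z_{1−α} − √n θ). The sketch below evaluates this closed form (my own illustration; Φ and its inverse are coded from scratch so nothing beyond the standard library is needed):

```python
import math

def Phi(z):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def z_quantile(p, lo=-10.0, hi=10.0):
    # Standard normal inverse CDF by bisection.
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if Phi(mid) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def power(theta, n, alpha=0.05):
    # beta(theta) = P_theta(sqrt(n)*Xbar > z_{1-alpha}) = 1 - Phi(z_{1-alpha} - sqrt(n)*theta)
    return 1.0 - Phi(z_quantile(1 - alpha) - math.sqrt(n) * theta)

print(power(0.0, 25))  # equals alpha = 0.05 at the H0/H1 boundary
print(power(0.5, 25))  # high power for theta well inside H1
```

Note the power equals α at θ = 0 and increases monotonically in θ, which is exactly the shape the problem asks you to derive.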
Problem #6 Suppose I take n observations from a multinomial distribution with cell probabilities as arranged in Table 4.1, and data as in Table 4.2. I am interested in testing the hypothesis H_0: θ_yy ≤ θ_yn against the alternative H_1: θ_yy > θ_yn. By thinking carefully, find an appropriate conditional test for this hypothesis. By a conditional test, I mean that you should condition on part or all of the data.

          Yes     No
   Yes    θ_yy    θ_yn
   No     θ_ny    θ_nn

Table 4.1: Table of probabilities for Problem #6. The θ's sum to 1.0.

          Yes     No
   Yes    N_yy    N_yn
   No     N_ny    N_nn

Table 4.2: Table of counts for Problem #6. The N's sum to n.
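One standard conditioning argument for a problem of this shape (a hedged sketch of my own, not necessarily the intended solution): given T = N_yy + N_yn, the count N_yy is Binomial(T, p) with p = θ_yy/(θ_yy + θ_yn), and H_0: θ_yy ≤ θ_yn becomes H_0: p ≤ 1/2, so an exact one-sided binomial test applies at the boundary p = 1/2:

```python
from math import comb

def binom_tail(t, k):
    # P(Bin(t, 1/2) >= k): the conditional p-value at the H0 boundary p = 1/2.
    return sum(comb(t, j) for j in range(k, t + 1)) / 2 ** t

def conditional_test(n_yy, n_yn, alpha=0.05):
    # Condition on T = n_yy + n_yn; under the H0 boundary, N_yy | T ~ Bin(T, 1/2).
    t = n_yy + n_yn
    pval = binom_tail(t, n_yy)
    return pval, pval <= alpha

pval, reject = conditional_test(15, 5)  # hypothetical counts for illustration
print(pval, reject)
```

Conditioning on T removes the nuisance cells θ_ny and θ_nn entirely, which is the point of the "think carefully" hint.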

4.8 Homework #8

Problem #1 Let X_1, ..., X_n be i.i.d. Poisson(θ). Suppose I want to test the hypothesis H_0: θ = θ_0 against the alternative H_1: θ ≠ θ_0.
(a) What is the form of the GLR test here?
(b) What is the form of the Wald test?
(c) What is the form of the score test?
(d) Prove directly that as n → ∞, the score test achieves its nominal level α.

Problem #2 Repeat Problem #1, but for the case of sampling from the normal distribution with mean and variance both equal to θ.

Problem #3 Let X be Binomial(n, θ_1) and let Y be Binomial(n, θ_2). Let S = X + Y.
(a) What is the distribution of X given S? You may find it useful to reparameterize θ_1 = {1 + exp(−η − Δ)}^{−1} and θ_2 = {1 + exp(−η)}^{−1}.
(b) Is this distribution a member of the one-parameter exponential family with a monotone likelihood ratio?
(c) Use the result in (a) to find a UMP conditional test of the hypothesis H_0: θ_1 ≤ θ_2 against the alternative H_1: θ_1 > θ_2.
(d) What is the conditional Wald test for this problem? This is a one-sided test, and we did not cover one-sided testing in class. I'm asking that you come up with a reasonable guess.

Problem #4 Suppose we are concerned with the lower endpoint of computer-generated random numbers which purport to be uniform on (0, 1). We have a sample X_1, ..., X_n and consider the density
f(x, θ) = I(θ ≤ x ≤ 1)/(1 − θ).
Consider the following observations: (.87, .84, .79, .33, .02, .97, .20, .47, .51, .29, .58, .69). Suppose we adopt a prior distribution with density π(θ) = (1 + a)(1 − θ)^a.
(a) What kind of prior beliefs does this prior represent?
(b) Compute the posterior density for θ.
(c) Plot this posterior density for a few values of a.
(d) Compute a 95% credible confidence interval for θ, i.e., one which covers 95% of the posterior density.

Problem #5 Consider the same model as in Problem #4, but this time compute a 95% likelihood ratio confidence interval for θ.
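For Problem #4, the likelihood is (1 − θ)^{−n} on θ ≤ min(x_i), so the posterior is proportional to (1 − θ)^{a−n} on [0, min(x_i)] and can be integrated in closed form. The sketch below computes a 95% interval of the form [q, min(x_i)]; taking that one-sided shape is my own choice (the posterior density is increasing in θ), and other 95% intervals are possible:

```python
# Posterior for Problem #4: proportional to (1 - theta)^(a - n) on [0, m], m = min(x).
xs = [.87, .84, .79, .33, .02, .97, .20, .47, .51, .29, .58, .69]
n, m = len(xs), min(xs)

def credible_interval(a, level=0.95):
    # Integrating (1 - theta)^(a - n) gives a closed-form CDF; solve F(q) = 1 - level.
    c = a - n + 1
    norm = 1 - (1 - m) ** c  # proportional to the total posterior mass on [0, m]
    q = 1 - (1 - (1 - level) * norm) ** (1.0 / c)
    return q, m

lo, hi = credible_interval(a=1.0)
print(lo, hi)  # an interval inside [0, min(x)] = [0, 0.02]
```

Part (c)'s plot amounts to evaluating (1 − θ)^{a−n}/normalizer on a grid over [0, 0.02] for a few values of a.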

STAT 611, R. J. Carroll, January 17, 1999.


More information

p y (1 p) 1 y, y = 0, 1 p Y (y p) = 0, otherwise.

p y (1 p) 1 y, y = 0, 1 p Y (y p) = 0, otherwise. 1. Suppose Y 1, Y 2,..., Y n is an iid sample from a Bernoulli(p) population distribution, where 0 < p < 1 is unknown. The population pmf is p y (1 p) 1 y, y = 0, 1 p Y (y p) = (a) Prove that Y is the

More information

Time Series and Dynamic Models

Time Series and Dynamic Models Time Series and Dynamic Models Section 1 Intro to Bayesian Inference Carlos M. Carvalho The University of Texas at Austin 1 Outline 1 1. Foundations of Bayesian Statistics 2. Bayesian Estimation 3. The

More information

Final Examination a. STA 532: Statistical Inference. Wednesday, 2015 Apr 29, 7:00 10:00pm. Thisisaclosed bookexam books&phonesonthefloor.

Final Examination a. STA 532: Statistical Inference. Wednesday, 2015 Apr 29, 7:00 10:00pm. Thisisaclosed bookexam books&phonesonthefloor. Final Examination a STA 532: Statistical Inference Wednesday, 2015 Apr 29, 7:00 10:00pm Thisisaclosed bookexam books&phonesonthefloor Youmayuseacalculatorandtwo pagesofyourownnotes Do not share calculators

More information

Part III. A Decision-Theoretic Approach and Bayesian testing

Part III. A Decision-Theoretic Approach and Bayesian testing Part III A Decision-Theoretic Approach and Bayesian testing 1 Chapter 10 Bayesian Inference as a Decision Problem The decision-theoretic framework starts with the following situation. We would like to

More information

Bayesian Inference: Posterior Intervals

Bayesian Inference: Posterior Intervals Bayesian Inference: Posterior Intervals Simple values like the posterior mean E[θ X] and posterior variance var[θ X] can be useful in learning about θ. Quantiles of π(θ X) (especially the posterior median)

More information

Math 494: Mathematical Statistics

Math 494: Mathematical Statistics Math 494: Mathematical Statistics Instructor: Jimin Ding jmding@wustl.edu Department of Mathematics Washington University in St. Louis Class materials are available on course website (www.math.wustl.edu/

More information

Lecture 1: Introduction

Lecture 1: Introduction Principles of Statistics Part II - Michaelmas 208 Lecturer: Quentin Berthet Lecture : Introduction This course is concerned with presenting some of the mathematical principles of statistical theory. One

More information

Stat 5101 Lecture Notes

Stat 5101 Lecture Notes Stat 5101 Lecture Notes Charles J. Geyer Copyright 1998, 1999, 2000, 2001 by Charles J. Geyer May 7, 2001 ii Stat 5101 (Geyer) Course Notes Contents 1 Random Variables and Change of Variables 1 1.1 Random

More information

Department of Statistics

Department of Statistics Research Report Department of Statistics Research Report Department of Statistics No. 208:4 A Classroom Approach to the Construction of Bayesian Credible Intervals of a Poisson Mean No. 208:4 Per Gösta

More information

Foundations of Statistical Inference

Foundations of Statistical Inference Foundations of Statistical Inference Julien Berestycki Department of Statistics University of Oxford MT 2016 Julien Berestycki (University of Oxford) SB2a MT 2016 1 / 20 Lecture 6 : Bayesian Inference

More information

INTRODUCTION TO BAYESIAN METHODS II

INTRODUCTION TO BAYESIAN METHODS II INTRODUCTION TO BAYESIAN METHODS II Abstract. We will revisit point estimation and hypothesis testing from the Bayesian perspective.. Bayes estimators Let X = (X,..., X n ) be a random sample from the

More information

Department of Statistical Science FIRST YEAR EXAM - SPRING 2017

Department of Statistical Science FIRST YEAR EXAM - SPRING 2017 Department of Statistical Science Duke University FIRST YEAR EXAM - SPRING 017 Monday May 8th 017, 9:00 AM 1:00 PM NOTES: PLEASE READ CAREFULLY BEFORE BEGINNING EXAM! 1. Do not write solutions on the exam;

More information

Hypothesis Testing. 1 Definitions of test statistics. CB: chapter 8; section 10.3

Hypothesis Testing. 1 Definitions of test statistics. CB: chapter 8; section 10.3 Hypothesis Testing CB: chapter 8; section 0.3 Hypothesis: statement about an unknown population parameter Examples: The average age of males in Sweden is 7. (statement about population mean) The lowest

More information

Ph.D. Qualifying Exam Friday Saturday, January 3 4, 2014

Ph.D. Qualifying Exam Friday Saturday, January 3 4, 2014 Ph.D. Qualifying Exam Friday Saturday, January 3 4, 2014 Put your solution to each problem on a separate sheet of paper. Problem 1. (5166) Assume that two random samples {x i } and {y i } are independently

More information

A Course in Statistical Theory

A Course in Statistical Theory A Course in Statistical Theory David J. Olive Southern Illinois University Department of Mathematics Mailcode 4408 Carbondale, IL 62901-4408 dolive@.siu.edu January 2008, notes September 29, 2013 Contents

More information

7. Estimation and hypothesis testing. Objective. Recommended reading

7. Estimation and hypothesis testing. Objective. Recommended reading 7. Estimation and hypothesis testing Objective In this chapter, we show how the election of estimators can be represented as a decision problem. Secondly, we consider the problem of hypothesis testing

More information

Mathematical statistics

Mathematical statistics October 4 th, 2018 Lecture 12: Information Where are we? Week 1 Week 2 Week 4 Week 7 Week 10 Week 14 Probability reviews Chapter 6: Statistics and Sampling Distributions Chapter 7: Point Estimation Chapter

More information

Completeness. On the other hand, the distribution of an ancillary statistic doesn t depend on θ at all.

Completeness. On the other hand, the distribution of an ancillary statistic doesn t depend on θ at all. Completeness A minimal sufficient statistic achieves the maximum amount of data reduction while retaining all the information the sample has concerning θ. On the other hand, the distribution of an ancillary

More information

Bayesian inference: what it means and why we care

Bayesian inference: what it means and why we care Bayesian inference: what it means and why we care Robin J. Ryder Centre de Recherche en Mathématiques de la Décision Université Paris-Dauphine 6 November 2017 Mathematical Coffees Robin Ryder (Dauphine)

More information

λ(x + 1)f g (x) > θ 0

λ(x + 1)f g (x) > θ 0 Stat 8111 Final Exam December 16 Eleven students took the exam, the scores were 92, 78, 4 in the 5 s, 1 in the 4 s, 1 in the 3 s and 3 in the 2 s. 1. i) Let X 1, X 2,..., X n be iid each Bernoulli(θ) where

More information

9 Asymptotic Approximations and Practical Asymptotic Tools

9 Asymptotic Approximations and Practical Asymptotic Tools 9 Asymptotic Approximations and Practical Asymptotic Tools A point estimator is merely an educated guess about the true value of an unknown parameter. The utility of a point estimate without some idea

More information

(a) (3 points) Construct a 95% confidence interval for β 2 in Equation 1.

(a) (3 points) Construct a 95% confidence interval for β 2 in Equation 1. Problem 1 (21 points) An economist runs the regression y i = β 0 + x 1i β 1 + x 2i β 2 + x 3i β 3 + ε i (1) The results are summarized in the following table: Equation 1. Variable Coefficient Std. Error

More information

Qualifying Exam in Probability and Statistics. https://www.soa.org/files/edu/edu-exam-p-sample-quest.pdf

Qualifying Exam in Probability and Statistics. https://www.soa.org/files/edu/edu-exam-p-sample-quest.pdf Part : Sample Problems for the Elementary Section of Qualifying Exam in Probability and Statistics https://www.soa.org/files/edu/edu-exam-p-sample-quest.pdf Part 2: Sample Problems for the Advanced Section

More information

DS-GA 1003: Machine Learning and Computational Statistics Homework 7: Bayesian Modeling

DS-GA 1003: Machine Learning and Computational Statistics Homework 7: Bayesian Modeling DS-GA 1003: Machine Learning and Computational Statistics Homework 7: Bayesian Modeling Due: Tuesday, May 10, 2016, at 6pm (Submit via NYU Classes) Instructions: Your answers to the questions below, including

More information

STAT 450: Final Examination Version 1. Richard Lockhart 16 December 2002

STAT 450: Final Examination Version 1. Richard Lockhart 16 December 2002 Name: Last Name 1, First Name 1 Stdnt # StudentNumber1 STAT 450: Final Examination Version 1 Richard Lockhart 16 December 2002 Instructions: This is an open book exam. You may use notes, books and a calculator.

More information

The Delta Method and Applications

The Delta Method and Applications Chapter 5 The Delta Method and Applications 5.1 Local linear approximations Suppose that a particular random sequence converges in distribution to a particular constant. The idea of using a first-order

More information

Estimation of Quantiles

Estimation of Quantiles 9 Estimation of Quantiles The notion of quantiles was introduced in Section 3.2: recall that a quantile x α for an r.v. X is a constant such that P(X x α )=1 α. (9.1) In this chapter we examine quantiles

More information

Interval Estimation. Chapter 9

Interval Estimation. Chapter 9 Chapter 9 Interval Estimation 9.1 Introduction Definition 9.1.1 An interval estimate of a real-values parameter θ is any pair of functions, L(x 1,..., x n ) and U(x 1,..., x n ), of a sample that satisfy

More information

4 Invariant Statistical Decision Problems

4 Invariant Statistical Decision Problems 4 Invariant Statistical Decision Problems 4.1 Invariant decision problems Let G be a group of measurable transformations from the sample space X into itself. The group operation is composition. Note that

More information

Mathematics Qualifying Examination January 2015 STAT Mathematical Statistics

Mathematics Qualifying Examination January 2015 STAT Mathematical Statistics Mathematics Qualifying Examination January 2015 STAT 52800 - Mathematical Statistics NOTE: Answer all questions completely and justify your derivations and steps. A calculator and statistical tables (normal,

More information

PMR Learning as Inference

PMR Learning as Inference Outline PMR Learning as Inference Probabilistic Modelling and Reasoning Amos Storkey Modelling 2 The Exponential Family 3 Bayesian Sets School of Informatics, University of Edinburgh Amos Storkey PMR Learning

More information

UNIVERSITY OF TORONTO SCARBOROUGH Department of Computer and Mathematical Sciences FINAL EXAMINATION, APRIL 2013

UNIVERSITY OF TORONTO SCARBOROUGH Department of Computer and Mathematical Sciences FINAL EXAMINATION, APRIL 2013 UNIVERSITY OF TORONTO SCARBOROUGH Department of Computer and Mathematical Sciences FINAL EXAMINATION, APRIL 2013 STAB57H3 Introduction to Statistics Duration: 3 hours Last Name: First Name: Student number:

More information

STAT215: Solutions for Homework 2

STAT215: Solutions for Homework 2 STAT25: Solutions for Homework 2 Due: Wednesday, Feb 4. (0 pt) Suppose we take one observation, X, from the discrete distribution, x 2 0 2 Pr(X x θ) ( θ)/4 θ/2 /2 (3 θ)/2 θ/4, 0 θ Find an unbiased estimator

More information

STA 732: Inference. Notes 2. Neyman-Pearsonian Classical Hypothesis Testing B&D 4

STA 732: Inference. Notes 2. Neyman-Pearsonian Classical Hypothesis Testing B&D 4 STA 73: Inference Notes. Neyman-Pearsonian Classical Hypothesis Testing B&D 4 1 Testing as a rule Fisher s quantification of extremeness of observed evidence clearly lacked rigorous mathematical interpretation.

More information

Unbiased Estimation. Binomial problem shows general phenomenon. An estimator can be good for some values of θ and bad for others.

Unbiased Estimation. Binomial problem shows general phenomenon. An estimator can be good for some values of θ and bad for others. Unbiased Estimation Binomial problem shows general phenomenon. An estimator can be good for some values of θ and bad for others. To compare ˆθ and θ, two estimators of θ: Say ˆθ is better than θ if it

More information

Noninformative Priors for the Ratio of the Scale Parameters in the Inverted Exponential Distributions

Noninformative Priors for the Ratio of the Scale Parameters in the Inverted Exponential Distributions Communications for Statistical Applications and Methods 03, Vol. 0, No. 5, 387 394 DOI: http://dx.doi.org/0.535/csam.03.0.5.387 Noninformative Priors for the Ratio of the Scale Parameters in the Inverted

More information

Data Mining Chapter 4: Data Analysis and Uncertainty Fall 2011 Ming Li Department of Computer Science and Technology Nanjing University

Data Mining Chapter 4: Data Analysis and Uncertainty Fall 2011 Ming Li Department of Computer Science and Technology Nanjing University Data Mining Chapter 4: Data Analysis and Uncertainty Fall 2011 Ming Li Department of Computer Science and Technology Nanjing University Why uncertainty? Why should data mining care about uncertainty? We

More information

STAT 135 Lab 3 Asymptotic MLE and the Method of Moments

STAT 135 Lab 3 Asymptotic MLE and the Method of Moments STAT 135 Lab 3 Asymptotic MLE and the Method of Moments Rebecca Barter February 9, 2015 Maximum likelihood estimation (a reminder) Maximum likelihood estimation Suppose that we have a sample, X 1, X 2,...,

More information

Lecture 23 Maximum Likelihood Estimation and Bayesian Inference

Lecture 23 Maximum Likelihood Estimation and Bayesian Inference Lecture 23 Maximum Likelihood Estimation and Bayesian Inference Thais Paiva STA 111 - Summer 2013 Term II August 7, 2013 1 / 31 Thais Paiva STA 111 - Summer 2013 Term II Lecture 23, 08/07/2013 Lecture

More information

Probability and Statistics qualifying exam, May 2015

Probability and Statistics qualifying exam, May 2015 Probability and Statistics qualifying exam, May 2015 Name: Instructions: 1. The exam is divided into 3 sections: Linear Models, Mathematical Statistics and Probability. You must pass each section to pass

More information

SOLUTION FOR HOMEWORK 7, STAT p(x σ) = (1/[2πσ 2 ] 1/2 )e (x µ)2 /2σ 2.

SOLUTION FOR HOMEWORK 7, STAT p(x σ) = (1/[2πσ 2 ] 1/2 )e (x µ)2 /2σ 2. SOLUTION FOR HOMEWORK 7, STAT 6332 1. We have (for a general case) Denote p (x) p(x σ)/ σ. Then p(x σ) (1/[2πσ 2 ] 1/2 )e (x µ)2 /2σ 2. p (x σ) p(x σ) 1 (x µ)2 +. σ σ 3 Then E{ p (x σ) p(x σ) } σ 2 2σ

More information

Introduction to Machine Learning. Maximum Likelihood and Bayesian Inference. Lecturers: Eran Halperin, Yishay Mansour, Lior Wolf

Introduction to Machine Learning. Maximum Likelihood and Bayesian Inference. Lecturers: Eran Halperin, Yishay Mansour, Lior Wolf 1 Introduction to Machine Learning Maximum Likelihood and Bayesian Inference Lecturers: Eran Halperin, Yishay Mansour, Lior Wolf 2013-14 We know that X ~ B(n,p), but we do not know p. We get a random sample

More information

Review Quiz. 1. Prove that in a one-dimensional canonical exponential family, the complete and sufficient statistic achieves the

Review Quiz. 1. Prove that in a one-dimensional canonical exponential family, the complete and sufficient statistic achieves the Review Quiz 1. Prove that in a one-dimensional canonical exponential family, the complete and sufficient statistic achieves the Cramér Rao lower bound (CRLB). That is, if where { } and are scalars, then

More information

A few basics of credibility theory

A few basics of credibility theory A few basics of credibility theory Greg Taylor Director, Taylor Fry Consulting Actuaries Professorial Associate, University of Melbourne Adjunct Professor, University of New South Wales General credibility

More information

Optimization. The value x is called a maximizer of f and is written argmax X f. g(λx + (1 λ)y) < λg(x) + (1 λ)g(y) 0 < λ < 1; x, y X.

Optimization. The value x is called a maximizer of f and is written argmax X f. g(λx + (1 λ)y) < λg(x) + (1 λ)g(y) 0 < λ < 1; x, y X. Optimization Background: Problem: given a function f(x) defined on X, find x such that f(x ) f(x) for all x X. The value x is called a maximizer of f and is written argmax X f. In general, argmax X f may

More information

Other Noninformative Priors

Other Noninformative Priors Other Noninformative Priors Other methods for noninformative priors include Bernardo s reference prior, which seeks a prior that will maximize the discrepancy between the prior and the posterior and minimize

More information

Charles Geyer University of Minnesota. joint work with. Glen Meeden University of Minnesota.

Charles Geyer University of Minnesota. joint work with. Glen Meeden University of Minnesota. Fuzzy Confidence Intervals and P -values Charles Geyer University of Minnesota joint work with Glen Meeden University of Minnesota http://www.stat.umn.edu/geyer/fuzz 1 Ordinary Confidence Intervals OK

More information