MAS3301 Bayesian Statistics Problems 2 and Solutions

Semester 2, 2008-9

Problems 2

Useful integrals: In solving these problems you might find the following useful.

Gamma functions: Let a and b be positive. Then

    ∫_0^∞ x^{a-1} e^{-bx} dx = Γ(a) / b^a,

where

    Γ(a) = ∫_0^∞ x^{a-1} e^{-x} dx = (a-1) Γ(a-1).

If a is a positive integer then Γ(a) = (a-1)!.

Beta functions: Let a and b be positive. Then

    ∫_0^1 x^{a-1} (1-x)^{b-1} dx = Γ(a) Γ(b) / Γ(a+b).

1. We are interested in the mean, λ, of a Poisson distribution. We have a prior distribution for λ with density

       f^(0)(λ) = 0                  (λ ≤ 0)
       f^(0)(λ) = k (1+λ) e^{-λ}     (λ > 0)

   (a) i. Find the value of k.
       ii. Find the prior mean of λ.
       iii. Find the prior standard deviation of λ.

   (b) We observe data x_1, ..., x_n where, given λ, these are independent observations from the Poisson(λ) distribution.
       i. Find the likelihood.
       ii. Find the posterior density of λ.
       iii. Find the posterior mean of λ.

2. We are interested in the parameter, θ, of a binomial(n, θ) distribution. We have a prior distribution for θ with density

       f^(0)(θ) = k [θ^2 (1-θ) + θ (1-θ)^2]   (0 < θ < 1)
       f^(0)(θ) = 0                            (otherwise)

   (a) i. Find the value of k.
       ii. Find the prior mean of θ.
       iii. Find the prior standard deviation of θ.

   (b) We observe x, an observation from the binomial(n, θ) distribution.
       i. Find the likelihood.
       ii. Find the posterior density of θ.
       iii. Find the posterior mean of θ.
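
An aside, not part of the original problem sheet: the Gamma and Beta integrals listed above can be checked numerically in R with integrate(). In the sketch below the values a = 3 and b = 2 are arbitrary choices made only for the check.

    # Numerical check of the Gamma and Beta integral identities (a = 3, b = 2 chosen arbitrarily)
    a <- 3
    b <- 2

    # Gamma identity: integral of x^(a-1) exp(-b x) over (0, Inf) should equal Gamma(a) / b^a
    lhs.gamma <- integrate(function(x) x^(a - 1) * exp(-b * x), 0, Inf)$value
    rhs.gamma <- gamma(a) / b^a

    # Beta identity: integral of x^(a-1) (1-x)^(b-1) over (0, 1) should equal Gamma(a)Gamma(b)/Gamma(a+b)
    lhs.beta <- integrate(function(x) x^(a - 1) * (1 - x)^(b - 1), 0, 1)$value
    rhs.beta <- gamma(a) * gamma(b) / gamma(a + b)

    print(c(lhs.gamma, rhs.gamma))   # both close to 0.25
    print(c(lhs.beta, rhs.beta))     # both close to 1/12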

3. We are interested in the parameter, θ, of a binomial(n, θ) distribution. We have a prior distribution for θ with density

       f^(0)(θ) = k θ^2 (1-θ)^3   (0 < θ < 1)
       f^(0)(θ) = 0               (otherwise)

   (a) i. Find the value of k.
       ii. Find the prior mean of θ.
       iii. Find the prior standard deviation of θ.

   (b) We observe x, an observation from the binomial(n, θ) distribution.
       i. Find the likelihood.
       ii. Find the posterior density of θ.
       iii. Find the posterior mean of θ.

4. In a manufacturing process packages are made to a nominal weight of ... kg. All underweight packages are rejected but the remaining packages may be slightly overweight. It is believed that the excess weight X, in g, has a continuous uniform distribution on (0, θ) but the value of θ is unknown. Our prior density for θ is

       f^(0)(θ) = 0                (θ < ...)
       f^(0)(θ) = k / ...          (... ≤ θ < ...)
       f^(0)(θ) = k ... θ^{-...}   (... ≤ θ < ∞)

   (a) i. Find the value of k.
       ii. Find the prior median of θ.

   (b) We observe ... packages and their excess weights, in g, are as follows: ... Assume that these are independent observations, given θ.
       i. Find the likelihood.
       ii. Find a function h(θ) such that the posterior density of θ is f^(1)(θ) = k_1 h(θ), where k_1 is a constant.
       iii. Evaluate the constant k_1. (Note that it is a very large number but you should be able to do the evaluation using a calculator.)

5. Repeat the analysis of the Chester Road example in Section 6.3, using the same likelihood but with the following prior density:

       f^(0)(λ) = k {1 + (8λ)^2}^{-1}   (0 < λ < ∞)
       f^(0)(λ) = 0                      (otherwise)

   (a) Find the value of k.

   (b) Use numerical methods in R to do the following.
       i. Find the posterior density and plot a graph showing both the prior and posterior densities.
       ii. Find the posterior mean and standard deviation.

   Note: For the numerical calculations and the plot in part (b) I suggest that you use a range 0 < λ < ... . When plotting the graph, it is easiest to plot the posterior first as this will determine the length of the vertical axis. The value of k can be found analytically. If you do use numerical integration to find it, you will need a much wider range of values of λ. A minimal sketch of the grid-based calculation appears below.
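
The following sketch, which is not part of the original problem sheet, illustrates the kind of grid-based numerical calculation asked for in Problem 5(b). The Chester Road data are in the lecture notes and are not reproduced here, so the likelihood below uses a hypothetical Poisson count total; the prior kernel is the one reconstructed above, and the grid range is also only an assumption.

    # Sketch of a grid-based numerical posterior for Problem 5(b).
    # ASSUMPTIONS: a Poisson likelihood with hypothetical totals n.obs and sum.x
    # (replace with the Chester Road data from the notes) and a grid range (0, 5).
    n.obs <- 20                                     # hypothetical number of observations
    sum.x <- 25                                     # hypothetical sum of the counts

    lambda <- seq(0.001, 5, length.out = 5000)      # grid of lambda values
    prior.kernel <- 1 / (1 + (8 * lambda)^2)        # prior kernel; k not needed explicitly
    log.lik <- -n.obs * lambda + sum.x * log(lambda)
    post.kernel <- prior.kernel * exp(log.lik - max(log.lik))   # rescaled to avoid underflow

    d <- lambda[2] - lambda[1]
    post <- post.kernel / (sum(post.kernel) * d)    # numerically normalised posterior
    prior <- prior.kernel / (sum(prior.kernel) * d) # approximate: exact k needs a wider range

    post.mean <- sum(lambda * post) * d
    post.sd <- sqrt(sum((lambda - post.mean)^2 * post) * d)

    plot(lambda, post, type = "l", xlab = expression(lambda), ylab = "density")
    lines(lambda, prior, lty = 2)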

6. We are interested in the parameter λ of a Poisson(λ) distribution. We have a prior distribution for λ with density

       f^(0)(λ) = 0               (λ < 0)
       f^(0)(λ) = k λ^3 e^{-λ}    (λ ≥ 0)

   (a) i. Find the value of k.
       ii. Find the prior mean of λ.
       iii. Find the prior standard deviation of λ.

   (b) We observe x_1, ..., x_n which are independent observations from the Poisson(λ) distribution.
       i. Find the likelihood function.
       ii. Find the posterior density of λ.
       iii. Find the posterior mean of λ.

7. In a fruit packaging factory apples are examined to see whether they are blemished. A sample of n apples is examined and, given the value of a parameter θ, representing the proportion of apples which are blemished, we regard x, the number of blemished apples in the sample, as an observation from the binomial(n, θ) distribution. The value of θ is unknown. Our prior density for θ is

       f^(0)(θ) = k (20 θ (1-θ)^3 + 1)   (0 ≤ θ ≤ 1)
       f^(0)(θ) = 0                       (otherwise)

   (a) i. Show that, for 0 ≤ θ ≤ 1, the prior density can be written as

              f^(0)(θ) = (1/2) [Γ(6)/(Γ(2)Γ(4))] θ^{2-1} (1-θ)^{4-1} + (1/2) [Γ(2)/(Γ(1)Γ(1))] θ^{1-1} (1-θ)^{1-1}.

       ii. Find the prior mean of θ.
       iii. Find the prior standard deviation of θ.

   (b) We observe n = 10 apples and x = 4.
       i. Find the likelihood function.
       ii. Find the posterior density of θ.
       iii. Find the posterior mean of θ.
       iv. Use R to plot a graph showing both the prior and posterior densities of θ. (Hint: It is easier to get the vertical axis right if you plot the posterior density first and then superimpose the prior density, rather than the other way round.)

Homework

Solutions to Questions 6 and 7 of Problems 2 are to be submitted in the Homework Letterbox no later than 4pm on Monday February 23rd.
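
An added note, not part of the original sheet: before attempting the hand calculations it can be reassuring to check them numerically. For example, the prior in Question 6 can be integrated with R's integrate(); the values produced below anticipate the solutions given later.

    # Quick numerical check for Question 6 (a): prior density k * lambda^3 * exp(-lambda)
    k <- 1 / integrate(function(l) l^3 * exp(-l), 0, Inf)$value         # should be 1/6
    m1 <- integrate(function(l) l   * k * l^3 * exp(-l), 0, Inf)$value  # prior mean
    m2 <- integrate(function(l) l^2 * k * l^3 * exp(-l), 0, Inf)$value  # E(lambda^2)
    c(k, m1, sqrt(m2 - m1^2))                                           # about 1/6, 4 and 2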

Solutions

1. (a) i.
           ∫_0^∞ f^(0)(λ) dλ = k ∫_0^∞ e^{-λ} dλ + k ∫_0^∞ λ e^{-λ} dλ = k + k = 2k = 1,
       so k = 1/2.

       ii. Prior mean:
           E(λ) = ∫_0^∞ λ f^(0)(λ) dλ = (1/2) [ ∫_0^∞ λ e^{-λ} dλ + ∫_0^∞ λ^2 e^{-λ} dλ ] = (1/2)(1 + 2) = 3/2.

       iii.
           E(λ^2) = ∫_0^∞ λ^2 f^(0)(λ) dλ = (1/2) [ ∫_0^∞ λ^2 e^{-λ} dλ + ∫_0^∞ λ^3 e^{-λ} dλ ] = (1/2)(2 + 6) = 4.
       So var(λ) = 4 - (3/2)^2 = (16 - 9)/4 = 7/4 and stddev(λ) = √(7/4) ≈ 1.32.

   (b) i. Likelihood:
           L = ∏_{i=1}^n e^{-λ} λ^{x_i} / x_i! = e^{-nλ} λ^S / ∏ x_i!,   where S = Σ_{i=1}^n x_i.

       ii. The posterior density is proportional to
           f^(0)(λ) L ∝ (1 + λ) e^{-λ} e^{-nλ} λ^S = e^{-(n+1)λ} λ^S + e^{-(n+1)λ} λ^{S+1}.
       So f^(1)(λ) = k_1 [ e^{-(n+1)λ} λ^S + e^{-(n+1)λ} λ^{S+1} ], where
           ∫_0^∞ f^(1)(λ) dλ = k_1 [ Γ(S+1)/(n+1)^{S+1} + Γ(S+2)/(n+1)^{S+2} ] = 1,
       so
           k_1 = [ Γ(S+1)/(n+1)^{S+1} + Γ(S+2)/(n+1)^{S+2} ]^{-1}
       and
           f^(1)(λ) = k_1 [ e^{-(n+1)λ} λ^S + e^{-(n+1)λ} λ^{S+1} ].

       iii. Posterior mean:
           E(λ) = ∫_0^∞ λ f^(1)(λ) dλ = k_1 [ ∫_0^∞ λ^{S+1} e^{-(n+1)λ} dλ + ∫_0^∞ λ^{S+2} e^{-(n+1)λ} dλ ]
                = k_1 [ Γ(S+2)/(n+1)^{S+2} + Γ(S+3)/(n+1)^{S+3} ]
                = [ (S+1)/(n+1) + (S+1)(S+2)/(n+1)^2 ] / [ 1 + (S+1)/(n+1) ].
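
The closed-form posterior mean in 1 (b) iii can be checked numerically; the sketch below is an addition, not part of the original solutions, and uses hypothetical values of n and S.

    # Numerical check of the posterior mean formula in Question 1 (b) iii
    n <- 5      # hypothetical number of observations
    S <- 12     # hypothetical sum of the observations

    # Unnormalised posterior: (1 + lambda) * lambda^S * exp(-(n + 1) * lambda)
    post.kernel <- function(lambda) (1 + lambda) * lambda^S * exp(-(n + 1) * lambda)

    norm.const <- integrate(post.kernel, 0, Inf)$value
    num.mean <- integrate(function(l) l * post.kernel(l), 0, Inf)$value / norm.const

    # Closed-form expression from the solution
    cf.mean <- ((S + 1)/(n + 1) + (S + 1)*(S + 2)/(n + 1)^2) / (1 + (S + 1)/(n + 1))

    print(c(num.mean, cf.mean))   # the two values should agree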

2. (a) i.
           ∫_0^1 f^(0)(θ) dθ = k [ ∫_0^1 θ^2 (1-θ) dθ + ∫_0^1 θ (1-θ)^2 dθ ]
                             = k [ Γ(3)Γ(2)/Γ(5) + Γ(2)Γ(3)/Γ(5) ]
                             = k (2! 1! + 1! 2!)/4! = k/6 = 1,
       so k = 6.

       ii. Prior mean:
           E(θ) = ∫_0^1 θ f^(0)(θ) dθ = k [ ∫_0^1 θ^3 (1-θ) dθ + ∫_0^1 θ^2 (1-θ)^2 dθ ]
                = 6 [ Γ(4)Γ(2)/Γ(6) + Γ(3)Γ(3)/Γ(6) ] = 6 (3! 1! + 2! 2!)/5! = 6 × 10/120 = 1/2.

       iii.
           E(θ^2) = ∫_0^1 θ^2 f^(0)(θ) dθ = 6 [ ∫_0^1 θ^4 (1-θ) dθ + ∫_0^1 θ^3 (1-θ)^2 dθ ]
                  = 6 [ Γ(5)Γ(2)/Γ(7) + Γ(4)Γ(3)/Γ(7) ] = 6 (4! 1! + 3! 2!)/6! = 6 × 36/720 = 3/10.
       So var(θ) = 3/10 - (1/2)^2 = 1/20 and stddev(θ) = √(1/20) ≈ 0.224.
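
As with Question 1, the prior moments can be confirmed numerically in R; this check is an addition to the original solutions.

    # Numerical check of the Question 2 prior: f0(theta) = 6 * (theta^2*(1-theta) + theta*(1-theta)^2)
    f0 <- function(theta) 6 * (theta^2 * (1 - theta) + theta * (1 - theta)^2)

    integrate(f0, 0, 1)$value                            # should be 1
    m1 <- integrate(function(t) t * f0(t), 0, 1)$value   # prior mean, should be 0.5
    m2 <- integrate(function(t) t^2 * f0(t), 0, 1)$value # E(theta^2), should be 0.3
    sqrt(m2 - m1^2)                                      # prior sd, about 0.224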

   (b) i. Likelihood:
           L = C(n, x) θ^x (1-θ)^{n-x}.

       ii. The posterior density f^(1)(θ) is proportional to
           f^(0)(θ) L ∝ [ θ^2 (1-θ) + θ (1-θ)^2 ] θ^x (1-θ)^{n-x} = θ^{x+2} (1-θ)^{n-x+1} + θ^{x+1} (1-θ)^{n-x+2}.
       So f^(1)(θ) = k_1 [ θ^{x+2} (1-θ)^{n-x+1} + θ^{x+1} (1-θ)^{n-x+2} ], where
           ∫_0^1 f^(1)(θ) dθ = k_1 [ Γ(x+3)Γ(n-x+2)/Γ(n+5) + Γ(x+2)Γ(n-x+3)/Γ(n+5) ] = 1,
       so
           k_1 = [ Γ(x+3)Γ(n-x+2)/Γ(n+5) + Γ(x+2)Γ(n-x+3)/Γ(n+5) ]^{-1}.

       iii. Posterior mean:
           E(θ) = ∫_0^1 θ f^(1)(θ) dθ = k_1 [ Γ(x+4)Γ(n-x+2)/Γ(n+6) + Γ(x+3)Γ(n-x+3)/Γ(n+6) ]
                = [ Γ(n+5)/Γ(n+6) ] × [ Γ(x+4)Γ(n-x+2) + Γ(x+3)Γ(n-x+3) ] / [ Γ(x+3)Γ(n-x+2) + Γ(x+2)Γ(n-x+3) ]
                = (1/(n+5)) × [ (x+3)(x+2) + (x+2)(n-x+2) ] / [ (x+2) + (n-x+2) ]
                = (1/(n+5)) × (x+2) [ (x+3) + (n-x+2) ] / (n+4)
                = (x+2)/(n+4).
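
A numerical check, not part of the original solutions, that the posterior mean really is (x+2)/(n+4); n and x below are hypothetical values chosen only for the check.

    # Numerical check of the Question 2 posterior mean
    n <- 10
    x <- 3

    post.kernel <- function(theta) {
      theta^(x + 2) * (1 - theta)^(n - x + 1) + theta^(x + 1) * (1 - theta)^(n - x + 2)
    }

    norm.const <- integrate(post.kernel, 0, 1)$value
    num.mean <- integrate(function(t) t * post.kernel(t), 0, 1)$value / norm.const

    print(c(num.mean, (x + 2)/(n + 4)))   # should agree: 5/14 = 0.357...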

3. (a) i.
           ∫_0^1 f^(0)(θ) dθ = k ∫_0^1 θ^2 (1-θ)^3 dθ = k Γ(3)Γ(4)/Γ(7) = 1,
       so k = Γ(7)/(Γ(3)Γ(4)) = 6!/(2! 3!) = 60.

       ii. Prior mean:
           E(θ) = ∫_0^1 θ f^(0)(θ) dθ = k ∫_0^1 θ^3 (1-θ)^3 dθ = [ Γ(7)/(Γ(3)Γ(4)) ] Γ(4)Γ(4)/Γ(8) = 3/7.

       iii.
           E(θ^2) = ∫_0^1 θ^2 f^(0)(θ) dθ = k ∫_0^1 θ^4 (1-θ)^3 dθ = [ Γ(7)/(Γ(3)Γ(4)) ] Γ(5)Γ(4)/Γ(9) = 3/14.
       So var(θ) = 3/14 - (3/7)^2 = 3/98 and stddev(θ) = √(3/98) ≈ 0.175.

   (b) i. Likelihood:
           L(θ) = C(n, x) θ^x (1-θ)^{n-x}.

       ii. Posterior density:
           f^(1)(θ) ∝ f^(0)(θ) L(θ) ∝ θ^{x+2} (1-θ)^{n-x+3}.
       Now ∫_0^1 θ^{x+2} (1-θ)^{n-x+3} dθ = Γ(x+3)Γ(n-x+4)/Γ(n+7), so
           f^(1)(θ) = [ Γ(n+7)/(Γ(x+3)Γ(n-x+4)) ] θ^{x+2} (1-θ)^{n-x+3}   (0 < θ < 1),
       that is, a beta(x+3, n-x+4) density.

       iii. Posterior mean:
           E(θ) = ∫_0^1 θ f^(1)(θ) dθ = [ Γ(n+7)/(Γ(x+3)Γ(n-x+4)) ] Γ(x+4)Γ(n-x+4)/Γ(n+8) = (x+3)/(n+7).
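
An added numerical check (not in the original solutions) that the Question 3 posterior is the beta(x+3, n-x+4) distribution with mean (x+3)/(n+7); n and x are hypothetical values.

    # Numerical check for Question 3
    n <- 12
    x <- 5

    theta <- seq(0.001, 0.999, by = 0.001)
    post.kernel <- theta^(x + 2) * (1 - theta)^(n - x + 3)
    post <- post.kernel / (sum(post.kernel) * 0.001)    # normalise on the grid

    max(abs(post - dbeta(theta, x + 3, n - x + 4)))     # should be small
    sum(theta * post) * 0.001                           # close to (x + 3)/(n + 7) = 8/19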

6. (a) i. Value of k:
           ∫_0^∞ k λ^3 e^{-λ} dλ = k Γ(4) = k × 3! = 6k = 1,
       so k = 1/6.

       ii. Prior mean:
           E(λ) = ∫_0^∞ λ k λ^3 e^{-λ} dλ = k ∫_0^∞ λ^4 e^{-λ} dλ = k Γ(5) = 4!/3! = 4.

       iii. Prior standard deviation:
           E(λ^2) = ∫_0^∞ λ^2 k λ^3 e^{-λ} dλ = k ∫_0^∞ λ^5 e^{-λ} dλ = k Γ(6) = 5!/3! = 20,
       so var(λ) = E(λ^2) - [E(λ)]^2 = 20 - 16 = 4 and the prior standard deviation is 2.

   (b) i. Likelihood:
           L = ∏_{i=1}^n e^{-λ} λ^{x_i} / x_i! = e^{-nλ} λ^{Σx_i} / ∏ x_i!.

       ii. The posterior density is proportional to
           λ^3 e^{-λ} × e^{-nλ} λ^{Σx_i} = λ^{Σx_i + 3} e^{-(n+1)λ}.
       Now ∫_0^∞ λ^{a-1} e^{-bλ} dλ = Γ(a)/b^a, so the posterior density is
           f^(1)(λ) = [ (n+1)^{Σx_i + 4} / Γ(Σx_i + 4) ] λ^{Σx_i + 3} e^{-(n+1)λ}.

       iii. To find the posterior mean, increase the power of λ by one and integrate. The posterior mean is
           [ (n+1)^{Σx_i + 4} / Γ(Σx_i + 4) ] × Γ(Σx_i + 5)/(n+1)^{Σx_i + 5} = (Σx_i + 4)/(n + 1).
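
An added check, not part of the original solutions: the Question 6 posterior is a gamma(Σx_i + 4, n + 1) distribution, so it can be compared with R's dgamma; the data vector below is hypothetical.

    # Numerical check for Question 6 (b)
    x <- c(2, 0, 3, 1, 4, 2)    # hypothetical data
    n <- length(x)
    S <- sum(x)

    post.kernel <- function(lambda) lambda^(S + 3) * exp(-(n + 1) * lambda)
    norm.const <- integrate(post.kernel, 0, Inf)$value
    num.mean <- integrate(function(l) l * post.kernel(l), 0, Inf)$value / norm.const

    print(c(num.mean, (S + 4)/(n + 1)))                          # should agree: 16/7 = 2.286
    print(dgamma(2, S + 4, n + 1) - post.kernel(2)/norm.const)   # should be near zero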

7. (a) i. The expression given is proportional to the prior density since
           Γ(6)/(Γ(2)Γ(4)) = 5!/(1! 3!) = 20   and   Γ(2)/(Γ(1)Γ(1)) = 1,
       so that
           (1/2) [Γ(6)/(Γ(2)Γ(4))] θ^{2-1} (1-θ)^{4-1} + (1/2) [Γ(2)/(Γ(1)Γ(1))] θ^{1-1} (1-θ)^{1-1} = (1/2) (20 θ (1-θ)^3 + 1).
       Now we only need to show that this expression integrates to 1, and this follows since each term is half of a beta density:
           ∫_0^1 [Γ(6)/(Γ(2)Γ(4))] θ^{2-1} (1-θ)^{4-1} dθ = 1   and   ∫_0^1 [Γ(2)/(Γ(1)Γ(1))] θ^{1-1} (1-θ)^{1-1} dθ = 1.
       As f^(0)(θ) also integrates to 1, the two expressions are equal, and it follows that k = 1/2.

       ii. Prior mean:
           E(θ) = ∫_0^1 θ f^(0)(θ) dθ
                = (1/2) [Γ(6)/(Γ(2)Γ(4))] ∫_0^1 θ^2 (1-θ)^3 dθ + (1/2) [Γ(2)/(Γ(1)Γ(1))] ∫_0^1 θ dθ
                = (1/2) [Γ(6)/(Γ(2)Γ(4))] Γ(3)Γ(4)/Γ(7) + (1/2) Γ(2)Γ(1)/Γ(3)
                = (1/2)(1/3) + (1/2)(1/2) = 5/12.

       iii. Prior standard deviation:
           E(θ^2) = ∫_0^1 θ^2 f^(0)(θ) dθ
                  = (1/2) [Γ(6)/(Γ(2)Γ(4))] Γ(4)Γ(4)/Γ(8) + (1/2) Γ(3)Γ(1)/Γ(4)
                  = (1/2)(1/7) + (1/2)(1/3) = 5/21,
       so var(θ) = 5/21 - (5/12)^2 = 65/1008 ≈ 0.0645 and the prior standard deviation is √(65/1008) ≈ 0.254.
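
The prior mean and standard deviation in 7 (a) can be confirmed numerically; this check is an addition to the original solutions.

    # Numerical check of the Question 7 prior moments
    f0 <- function(theta) 0.5 * (20 * theta * (1 - theta)^3 + 1)

    integrate(f0, 0, 1)$value                            # should be 1
    m1 <- integrate(function(t) t * f0(t), 0, 1)$value   # should be 5/12 = 0.417
    m2 <- integrate(function(t) t^2 * f0(t), 0, 1)$value # should be 5/21
    c(m1, sqrt(m2 - m1^2))                               # mean and sd: about 0.417 and 0.254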

   (b) i. Likelihood:
           L = C(10, 4) θ^4 (1-θ)^6.

       ii. Posterior density: posterior ∝ prior × likelihood, so
           f^(1)(θ) ∝ f^(0)(θ) L ∝ (20 θ (1-θ)^3 + 1) θ^4 (1-θ)^6 = 20 θ^5 (1-θ)^9 + θ^4 (1-θ)^6.
       Write f^(1)(θ) = k_1 [ 20 θ^5 (1-θ)^9 + θ^4 (1-θ)^6 ]. Then
           ∫_0^1 f^(1)(θ) dθ = k_1 [ 20 Γ(6)Γ(10)/Γ(16) + Γ(5)Γ(7)/Γ(12) ] = k_1 (2/3003 + 1/2310) = k_1 × 33/30030 = k_1/910 = 1,
       so k_1 = 910 and the posterior density is
           f^(1)(θ) = 910 [ 20 θ^5 (1-θ)^9 + θ^4 (1-θ)^6 ]   (0 ≤ θ ≤ 1).

       iii. Posterior mean:
           E(θ) = ∫_0^1 θ f^(1)(θ) dθ = 910 ∫_0^1 [ 20 θ^6 (1-θ)^9 + θ^5 (1-θ)^6 ] dθ
                = 910 [ 20 Γ(7)Γ(10)/Γ(17) + Γ(6)Γ(7)/Γ(13) ]
                = 910 (1/4004 + 1/5544) = 155/396 ≈ 0.391.

       iv. Plot: suitable R commands:

           > theta <- seq(0, 1, 0.01)
           > prior <- 0.5 * (20 * theta * ((1 - theta)^3) + 1)
           > post <- 910 * (20 * (theta^5) * ((1 - theta)^9) + (theta^4) * ((1 - theta)^6))
           > plot(theta, post, type = "l", xlab = expression(theta), ylab = "density")
           > lines(theta, prior, lty = 2)

       The plot is shown in Figure 1.
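
As an added check, not part of the original solution, the posterior mean in (b) iii can also be recovered from the same grid used for the plot:

           > sum(theta * post) * 0.01    # approximately 155/396 = 0.391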

Figure 1: Prior (dashes) and posterior (solid line) density functions for θ.
