DEPARTMENT OF MATHEMATICS AND STATISTICS QUALIFYING EXAMINATION II IN MATHEMATICAL STATISTICS SATURDAY, SEPTEMBER 24, 2016

Examiners: Drs. K. M. Ramachandran and G. S. Ladde

INSTRUCTIONS:
a. Quality is more important than quantity.
b. Attempt at least 2 problems from each part, totaling at most 5 problems.
c. In the absence of any tables, students are expected to present the solution of data-oriented problems in the form of the problem-solving process.
d. Students are also expected to exhibit reading, writing, and problem-solving abilities.
e. Mechanical work alone, even with an answer, is not enough to receive full credit.

PART 1

1. Let X be a real-valued random variable defined on a complete probability space (Ω, F, P) with singleton range R(X) = {b} for any b ∈ R, where R denotes the set of real numbers.
a. Find a distribution function F_X of X. Justify.
b. Find a probability density function f_X of X (if it exists).
c. Sketch F_X and f_X, with justifications.
d. What conclusions can be drawn from parts (a), (b), and (c)? Justify.

2. Let X_1, X_2, ..., X_n be a Bernoulli-type random sample drawn from a population with mean μ ∈ [0, 1] and variance σ² ∈ [0, 1]. Let X̄ and S² be the sample mean and sample variance of the random sample.
a. Find the joint distribution of the random sample.
b. Find E[X̄].
c. Find Var(X̄).
d. Show that E[S²] = σ².
e. Based on your responses to (a)-(d), what conclusions can you draw?
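Problem 2(d)'s claim, that the sample variance with the n − 1 divisor is unbiased for σ², can be spot-checked by simulation. This is an illustration only, not part of the exam; μ = 0.4, n = 5, the seed, and the replication count are arbitrary choices.

```python
import random

def mean_sample_variance(mu=0.4, n=5, reps=200_000, seed=3):
    """Average of S^2 = sum((Xi - Xbar)^2)/(n - 1) over many Bernoulli(mu) samples."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(reps):
        x = [1.0 if rng.random() < mu else 0.0 for _ in range(n)]
        xbar = sum(x) / n
        total += sum((xi - xbar) ** 2 for xi in x) / (n - 1)
    return total / reps

avg_s2 = mean_sample_variance()  # theory: sigma^2 = mu(1 - mu) = 0.24
```

The average of S² over many replications should be close to μ(1 − μ) = 0.24, consistent with unbiasedness.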

3. Let X_1, X_2, ..., X_n, ... be a sequence of iid random variables with mean E[X_n] = μ and Var(X_n) = σ². Show that:
(a) S_n² converges in probability to σ² as n → ∞, whenever Var(S_n²) → 0.
(b) Prove or disprove the convergence or divergence of S_n.
(c) On the basis of (a) and (b), what types of conclusions can you draw about S_n² and S_n?

4. Let X_1, X_2, ..., X_n be the random sample defined in Problem 2, and let T(X) be defined by T(X) = Σ_{i=1}^n X_i. Prove or disprove the following statements:
a. The distribution of T(X) belongs to an exponential family.
b. T(X) is a sufficient statistic.
c. The Bernoulli MLE of μ is μ̂ = (Σ_{i=1}^n x_i)/n.

5. Let X_1, X_2, ..., X_n be a random sample, X = (X_1, X_2, ..., X_n)^T, and let x ∈ χ ⊆ R^n be the observed sample set. Let f(x|θ) be the joint pdf of the given sample X, and let π(θ) be a prior distribution of θ. Let Θ be the entire parameter set in R, with Θ_0 ⊆ Θ and Θ_0 ≠ ∅. Moreover, let P(θ ∈ Θ_0 | x) = P(H_0 is true | x) and P(θ ∈ Θ_0^c | x) = P(H_1 is true | x).
(a) Define the "action set" with respect to the "Bayesian hypothesis testing problem".
(b) Is the "hypothesis test" identical to a "decision rule"? Justify.
(c) Define the loss function in the "hypothesis testing problem".
(d) Define the rejection region for the "Bayesian hypothesis testing problem".
(e) Determine the costs for the Type I and Type II errors.

PART 2

1. Let X be a binomial random variable with mean μ ∈ [0, 1] and variance σ² ∈ [0, 1], and let X_1, X_2, ..., X_n be a corresponding random sample drawn from this population. In the Bayesian analysis it is assumed that the parameter μ is a random variable. Let p(y, μ) be the joint distribution of (X, μ). The binomial sampling model is denoted by p(y|μ).
a. Justify the exchangeability property of p(y, μ).
b. Find the expression for the joint probability of X and μ.
c. Find the marginal distribution of the given random variable.
d. Define the likelihood function L(μ|y), where y is a realization of the random variable X.
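Problem 4(c)'s claim that the Bernoulli MLE of μ is the sample mean can be checked numerically by maximizing the log-likelihood over a fine grid. This sketch is illustrative and not part of the exam; the sample below is invented.

```python
import math

def bernoulli_loglik(mu, xs):
    """Log-likelihood of iid Bernoulli(mu) data xs (each entry 0 or 1)."""
    s = sum(xs)
    n = len(xs)
    return s * math.log(mu) + (n - s) * math.log(1 - mu)

xs = [1, 0, 1, 1, 0, 1, 0, 1]                 # hypothetical sample, xbar = 5/8
grid = [k / 1000 for k in range(1, 1000)]     # candidate mu values in (0, 1)
mu_hat = max(grid, key=lambda mu: bernoulli_loglik(mu, xs))
# by concavity, the grid maximizer is the grid point nearest xbar = 0.625
```

Because the Bernoulli log-likelihood is strictly concave in μ, the grid maximizer lands at the grid point nearest the sample mean, in agreement with the closed-form MLE.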

2. Let Y ~ Beta(α, β) and V ~ f_V. Further assume that Beta(α, β) and f_V have common support [0, 1], with α = 2.7, β = 6.3, and c = 2.67 = sup_y f_Y(y). Let (U, V) be independent, uniformly distributed random variables used to generate a random variable. Show that:
(a) P{ω : V(ω) ≤ y and U(ω) ≤ f_Y(V(ω))/c} = ∫_0^y ∫_0^{f_Y(v)/c} du dv.
(b) P{ω : U(ω) ≤ f_Y(V(ω))/c} = 1/c.
(c) P{ω : Y(ω) ≤ y} = P{ω : V(ω) ≤ y | ω : U(ω) ≤ f_Y(V(ω))/c}.
(d) What conclusions can you draw from the results (a)-(c)?

3. Let X be the binomial random variable corresponding to the random sample in Problem 1 (Part 2). In the Bayesian analysis it is assumed that the parameter μ is a random variable. Let p(y, μ) be the joint distribution of (X, μ). The binomial sampling model is denoted by p(y|μ). In addition, assume that p(μ) is a prior distribution of μ.
a. Find a posterior distribution of μ, p(μ|y).
b. If μ ~ Beta(α, β), then find p(μ|y).
c. What conclusions can you draw from the conclusion of (b)?
d. Using (b), find E[μ|y] and Var(μ|y).
e. Show that E[μ|y] ≈ y/n and Var(μ|y) ≈ (1/n)(y/n)(1 − y/n) for large n, and use them to obtain the α-level credible interval for μ.

4. Prove or disprove the following statements.
a. The normal distribution has a conjugate prior distribution.
b. The binomial(n, μ) distribution has no conjugate prior distribution.
c. The Poisson distribution has a conjugate prior distribution.

5. Assume that all the conditions in Problem 1 (Part 2) are valid.
a. Is it possible to determine the Fisher information I(μ), μ ∈ R? Justify.
b. If the answer to the question in (a) is "YES", then find the Jeffreys prior density.
c. Based on your work in (b), is the prior in (b) proper or improper? Justify.

GOOD LUCK
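Part 2, Problem 2 describes the standard accept-reject method: a candidate V ~ Uniform(0, 1) is accepted when an independent U ~ Uniform(0, 1) falls below f_Y(V)/c. The following sketch (an illustration, not part of the exam; the draw count and seed are arbitrary choices) checks parts (b) and (c) empirically: the acceptance rate should be near 1/c, and the accepted draws should have the Beta(2.7, 6.3) mean α/(α + β) = 0.3.

```python
import math
import random

def beta_pdf(y, a=2.7, b=6.3):
    """Beta(a, b) density on (0, 1)."""
    const = math.gamma(a + b) / (math.gamma(a) * math.gamma(b))
    return const * y ** (a - 1) * (1 - y) ** (b - 1)

def accept_reject(n_draws, c=2.67, seed=0):
    """Generate Beta(2.7, 6.3) variates from pairs (U, V) of Uniform(0, 1) draws."""
    rng = random.Random(seed)
    accepted, proposals = [], 0
    while len(accepted) < n_draws:
        proposals += 1
        v = rng.random()          # candidate V ~ Uniform(0, 1)
        u = rng.random()          # acceptance variable U ~ Uniform(0, 1)
        if u <= beta_pdf(v) / c:  # accept V when U <= f_Y(V)/c
            accepted.append(v)
    return accepted, proposals

samples, proposals = accept_reject(50_000)
rate = len(samples) / proposals        # part (b): should be close to 1/c = 0.3745
mean = sum(samples) / len(samples)     # target mean alpha/(alpha+beta) = 0.3
```

Note that c = 2.67 is a valid envelope constant here because sup_y f_Y(y) for Beta(2.7, 6.3) is just below 2.67.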

DEPARTMENT OF MATHEMATICS AND STATISTICS QUALIFYING EXAMINATION II IN MATHEMATICAL STATISTICS SATURDAY, JANUARY 28, 2017
Examiners: Drs. K. M. Ramachandran and G. S. Ladde

INSTRUCTIONS:
a. Quality is more important than quantity.
b. Attempt at least 2 problems from each part, totaling at most 5 problems.
c. In the absence of any tables, students are expected to present the solution of data-oriented problems in the form of the problem-solving process.
d. Students are also expected to exhibit reading, writing, and problem-solving abilities.
e. Mechanical work alone, even with an answer, is not enough to receive full credit.

PART 1

1. Prove or disprove the following statements:
a. The distribution of a statistic is the population distribution.
b. If the marginal densities of random variables are identical, then they are independent.
c. Var(a_1 X_1 + a_2 X_2) = a_1² Var(X_1) + a_2² Var(X_2) + 2 a_1 a_2 Cov(X_1, X_2).

2. Let X_1, X_2, X_3 be a random sample drawn from an exponential(β) population. Find the distribution of: (i) X_(1), (ii) X_(3), (iii) the sample range.

3. Let X_1, X_2, ..., X_n be a sample, X = (X_1, X_2, ..., X_n)^T, let T(X) be a statistic, let x ∈ χ ⊆ R^n be the observed sample set, and let R(T) be the range of the statistic T. Let f(x|θ) be the joint pdf of the given sample X.
(a) If X_1, X_2, ..., X_n are Bernoulli random variables with parameter 0 < θ < 1 and T(X) = Σ_{i=1}^n X_i, then show that T(X) is a sufficient statistic.
(b) If X_1, X_2, ..., X_n are iid observations from the discrete uniform distribution on {1, 2, 3, ..., θ}, θ ∈ N, and T(X) = max_i{X_1, X_2, ..., X_n}, then show that T(X) is a sufficient statistic.
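Problem 2's order-statistic results can be spot-checked by simulation: for three iid exponential variables with mean β, the minimum X_(1) is exponential with mean β/3. This sketch is illustrative only and not part of the exam; β = 6, the seed, and the trial count are arbitrary assumptions.

```python
import random

def mean_of_min(beta=6.0, trials=200_000, seed=1):
    """Empirical mean of X_(1) = min(X1, X2, X3) for Xi iid exponential with mean beta."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        total += min(rng.expovariate(1.0 / beta) for _ in range(3))
    return total / trials

m = mean_of_min()  # theory predicts beta/3 = 2.0
```

The same pattern (replace min with max, or with max minus min) gives quick checks for parts (ii) and (iii).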

DEPARTMENT OF MATHEMATICS AND STATISTICS QUALIFYING EXAMINATION II IN MATHEMATICAL STATISTICS SATURDAY, MAY 13, 2017
Examiners: Drs. K. M. Ramachandran and G. S. Ladde

INSTRUCTIONS:
a. Quality is more important than quantity.
b. Attempt at least 2 problems from each part, totaling at most 5 problems.
c. In the absence of any tables, students are expected to present the solution of data-oriented problems in the form of the problem-solving process.
d. Students are also expected to exhibit reading, writing, and problem-solving abilities.
e. Mechanical work alone, even with an answer, is not enough to receive full credit.

1. Let X be a real-valued random variable defined on a complete probability space (Ω, F, P) with singleton range R(X) = {b} for any b ∈ R, where R denotes the set of real numbers.
a. Find a distribution function F_X of X. Justify.
b. Find a probability density function f_X of X (if it exists).
c. Sketch F_X and f_X, with justifications.
d. What conclusions can be drawn from parts (a), (b), and (c)? Justify.

2. Let X_1, X_2, ..., X_n be a sample, X = (X_1, X_2, ..., X_n)^T, from an iid binomial(n, p) population. It is assumed that both n and p are unknown.
(a) Use the method of moments to find point estimators for both parameters n and p.
(b) Identify the limitations and drawbacks of this method.

3. Let T be any unbiased estimator of τ(θ), and let W be a sufficient statistic for θ. Define φ(W) = E_θ[T | W]. Show that:
(a) φ(W) is an unbiased estimator of τ(θ),
(b) Var_θ(T) = Var_θ(φ(W)) + E_θ[Var_θ(T | W)],
(c) Var_θ(φ(W)) ≤ Var_θ(T),
(d) Is the inequality in part (c) uniform in θ? Justify.
(e) Is φ(W) a uniformly better unbiased estimator of τ(θ)? Justify.
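Problem 3 is the Rao-Blackwell construction: conditioning an unbiased estimator on a sufficient statistic cannot increase its variance. A simulation sketch for the Bernoulli case (illustrative, not part of the exam): T = X_1 is a deliberately crude unbiased estimator of p, and φ(W) = E[T | W] = X̄ for W = Σ X_i; the parameter values and seed are arbitrary choices.

```python
import random

def rao_blackwell_demo(p=0.3, n=10, n_reps=100_000, seed=2):
    """Compare the variance of T = X1 with that of phi(W) = E[X1 | W] = Xbar."""
    rng = random.Random(seed)
    t_vals, phi_vals = [], []
    for _ in range(n_reps):
        x = [1 if rng.random() < p else 0 for _ in range(n)]
        t_vals.append(x[0])          # crude unbiased estimator of p
        phi_vals.append(sum(x) / n)  # Rao-Blackwellized estimator
    def var(v):
        m = sum(v) / len(v)
        return sum((a - m) ** 2 for a in v) / len(v)
    return var(t_vals), var(phi_vals)

v_t, v_phi = rao_blackwell_demo()
# theory: Var(T) = p(1-p) = 0.21 while Var(phi(W)) = p(1-p)/n = 0.021
```

The empirical variances illustrate part (c): the conditioned estimator is never worse.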

4. Let H_0, H_1, Θ, and Θ_0 be as defined in a hypothesis testing problem. Suppose that R stands for the rejection region, and β(θ) = P_θ(X ∈ R).
(a) Show that: (i) P_θ(X ∈ R), for θ ∈ Θ_0, is the Type I error probability, and (ii) P_θ(X ∈ R), for θ ∈ Θ_0^c, equals 1 minus the Type II error probability.
(b) For 0 ≤ α ≤ 1, define a level α test.
(c) Let X ~ binomial(2, p), let H_0: p ∈ {p : p ≤ 1/2} versus H_1: p ∈ {p : p > 1/2}, and reject H_0 if X = 2. Find: (i) the power function for this test, (ii) the Type I error, and (iii) the Type II error.

PART 2

1. Let X and Y be two random variables. It is known that Var(X) = E[(X − E[X])²] = E[X²] − (E[X])², E[X] = E[E[X|Y]], and Var(X|Y) = E[(X − E[X|Y])² | Y]. Show that:
(a) Var(X|Y) = E[X²|Y] − (E[X|Y])²,
(b) Var(E[X|Y]) = E[(E[X|Y])²] − (E[E[X|Y]])²,
(c) Var(X) = E[E[(X − E[X])²|Y]] = E[E[X²|Y]] − (E[E[X|Y]])²,
(d) Var(X) = Var(E[X|Y]) + E[Var(X|Y)].

2. Let X be the binomial random variable corresponding to the random sample in Problem 2 (Part 1). In the Bayesian analysis it is assumed that the parameter μ is a random variable. Let p(y, μ) be the joint distribution of (X, μ). The binomial sampling model is denoted by p(y|μ).
a. Justify the exchangeability property of p(y, μ).
b. Find the expression for the joint probability of X and μ.
c. Find the marginal distribution of the given random variable.
d. Define the likelihood function L(μ|y), where y is a realization of the random variable X.

3. Assume that all the conditions in Problem 2 are valid. In addition, assume that p(μ) is a prior distribution of μ.
a. Find a posterior distribution of μ, p(μ|y).
b. If μ ~ Beta(α, β), then find p(μ|y).
c. What conclusions can you draw from the conclusion of (b)?
d. Using (b), find E[μ|y] and Var(μ|y).
e. Show that E[μ|y] ≈ y/n and Var(μ|y) ≈ (1/n)(y/n)(1 − y/n) for large n, and use them to obtain the α-level credible interval for μ.
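Returning to Part 1, Problem 4(c): under the reading X ~ binomial(2, p) with rejection region {X = 2}, the power function is β(p) = P_p(X = 2) = p². The tiny sketch below (illustrative, not part of the exam; the alternative value p = 0.75 is an arbitrary choice) evaluates the error probabilities.

```python
def power(p):
    """Power function beta(p) = P_p(X = 2) for X ~ binomial(2, p), rejecting H0 iff X = 2."""
    return p ** 2

# Type I error: sup over p <= 1/2 of power(p), attained at p = 1/2
type_i = power(0.5)
# Type II error at a hypothetical alternative p = 0.75: 1 - power(p)
type_ii = 1 - power(0.75)
```

Since β(p) = p² is increasing, the supremum over the null {p ≤ 1/2} sits at the boundary, giving Type I error 1/4.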

4. Let Y ~ Beta(α, β) and V ~ f_V. Further assume that Beta(α, β) and f_V have common support [0, 1], with α = 2.7, β = 6.3, and c = 2.67 = sup_y f_Y(y). Let (U, V) be independent, uniformly distributed random variables used to generate a random variable. Show that:
(a) P{ω : V(ω) ≤ y and U(ω) ≤ f_Y(V(ω))/c} = ∫_0^y ∫_0^{f_Y(v)/c} du dv.
(b) P{ω : U(ω) ≤ f_Y(V(ω))/c} = 1/c.
(c) P{ω : Y(ω) ≤ y} = P{ω : V(ω) ≤ y | ω : U(ω) ≤ f_Y(V(ω))/c}.
(d) What conclusions can you draw from the results (a)-(c)?

GOOD LUCK

DEPARTMENT OF MATHEMATICS AND STATISTICS QUALIFYING EXAMINATION II IN MATHEMATICAL STATISTICS SATURDAY, SEPTEMBER 30, 2017
Examiners: Drs. K. M. Ramachandran and G. S. Ladde

INSTRUCTIONS:
a. Quality is more important than quantity.
b. Attempt at least 2 problems from each part, totaling at most 5 problems.
c. In the absence of any tables, students are expected to present the solution of data-oriented problems in the form of the problem-solving process.
d. Students are also expected to exhibit reading, writing, and problem-solving abilities.
e. Mechanical work alone, even with an answer, is not enough to receive full credit.

PART 1

1. Let {X_n}_{n≥1} be a sequence of random variables defined on the complete probability space (Ω, F, P), where the range of each term X_n is the singleton set {1/n}. Find:
(a) {F_{X_n}} and {f_{X_n}} (if possible). Justify.
(b) Does {X_n}_{n≥1} converge in distribution? Justify.
(c) Does {X_n}_{n≥1} converge in the almost sure sense? Justify.
(d) Based on your responses to the above questions, can you draw any conclusions?

2. Let X_1, X_2, ..., X_n be a random sample from a binomial(n, p) distribution, where both n and p are unknown.
(a) By employing the method of moments, find estimators for both n and p.
(b) Discuss a situation in which both n and p are unknown.
(c) Five realizations of a binomial(n, p) experiment are observed. (i) The first data set is: 16, 18, 22, 25, 27; (ii) the second data set is: 16, 18, 22, 25, 28. For these data sets, using the method of moments, calculate the estimates for n and p.
(d) Compare the estimates in Problem 2(c).
(e) What kind of conclusions can you draw from Problem 2(d)?
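Problem 2(c) can be worked numerically: matching the first two moments of binomial(n, p) gives n̂ = X̄²/(X̄ − σ̂²) and p̂ = X̄/n̂, where σ̂² is the second sample central moment. The sketch below (illustrative, not part of the exam) reproduces the instability the problem is designed to expose: changing a single observation from 27 to 28 roughly doubles n̂.

```python
def binomial_mom(data):
    """Method-of-moments estimates (n_hat, p_hat) for binomial(n, p),
    matching E[X] = n p and Var(X) = n p (1 - p)."""
    m = len(data)
    xbar = sum(data) / m
    s2 = sum((x - xbar) ** 2 for x in data) / m   # second sample central moment
    n_hat = xbar ** 2 / (xbar - s2)
    p_hat = xbar / n_hat
    return n_hat, p_hat

n1, p1 = binomial_mom([16, 18, 22, 25, 27])   # roughly (102.3, 0.21)
n2, p2 = binomial_mom([16, 18, 22, 25, 28])   # roughly (194.8, 0.11)
```

Note the estimator is only defined when X̄ > σ̂², which is itself one of the method's drawbacks here.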

3. Let g be a twice continuously differentiable function defined on R into itself. Let X be a random variable defined on the complete probability space (Ω, F, P) with mean μ. Show that:
(a) E[g(X)] ≈ g(μ). Justify.
(b) Var(g(X)) ≈ |g′(μ)|² Var(X). Justify.

4. Let X_1, X_2, ..., X_n be a random sample, X = (X_1, X_2, ..., X_n)^T, and let x ∈ χ ⊆ R^n be the observed sample set. Let f(x|θ) be the joint pdf of the given sample X, and let Θ be the entire parameter set in R. For Θ_0 ≠ ∅, define λ(x) = sup_{Θ_0} L(θ|x) / sup_Θ L(θ|x).
(a) Show that:
(i) 0 ≤ λ(x) ≤ 1 for all x ∈ χ,
(ii) as sup_Θ L(θ|x) increases, λ(x) decreases,
(iii) λ(x) = L(θ̂_0 | x) / L(θ̂ | x),
(iv) λ(x) = sup_{Θ_0} g(T(x)|θ) / sup_Θ g(T(x)|θ) = λ*(T(x)), where T(X) is a sufficient statistic and f(x|θ) = g(T(x)|θ) h(x);
(b) Under what condition(s) on Θ_0 is λ(X) called the likelihood ratio test statistic?
(c) Assuming that λ(X) is the likelihood ratio test statistic, what can you say about {x : λ(x) ≤ c} for any number c satisfying 0 ≤ c ≤ 1?

PART 2

1. Under the Bayesian approach, for given X = x, π(θ|x) is the posterior distribution of θ. For A ⊆ Θ, the credible probability of A is defined by P(θ ∈ A | x) = ∫_A π(θ|x) dθ.
(a) Is there any relationship between the credible set and the confidence set? Justify.
(b) Let X_1, X_2, ..., X_n be a random sample from Poisson(λ), and let λ have a gamma(a, b) prior. The posterior pdf of λ is gamma(a + T(x), [n + 1/b]^(-1)), where T(X) = Σ_{i=1}^n X_i. Find the credible set for λ.
(c) In addition to the conditions in (b), assume that n = 10 and T(x) = 6. Find a 90% credible set for λ.

2. Let X_1, X_2, ..., X_n be a random sample of size n from a population N(μ, σ²) with σ² = 4. Further assume that μ ~ N(0, 1).
(a) Find E[μ|x] and E[(μ − E[μ|x])² | x].
(b) Find the precision of the normal distribution.
(c) Find a 95% credible interval for μ.
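Part 2, Problem 2's posterior is the conjugate normal-normal case, which works cleanly in precisions (reciprocal variances). The sketch below uses a hypothetical data summary (n = 4 observations with sample mean 2.0 are invented numbers; σ² = 4 and the N(0, 1) prior come from the problem) and is illustrative rather than part of the exam.

```python
import math

def normal_posterior(xbar, n, sigma2=4.0, prior_mean=0.0, prior_var=1.0):
    """Posterior mean and variance of mu for N(mu, sigma2) data with a
    N(prior_mean, prior_var) prior, computed via precisions."""
    prec = n / sigma2 + 1.0 / prior_var                       # posterior precision
    post_var = 1.0 / prec
    post_mean = (n * xbar / sigma2 + prior_mean / prior_var) * post_var
    return post_mean, post_var

m, v = normal_posterior(xbar=2.0, n=4)
half = 1.96 * math.sqrt(v)       # half-width of the 95% credible interval
# the 95% credible interval for mu is (m - half, m + half)
```

The posterior precision is the sum of the data precision n/σ² and the prior precision, which is exactly the quantity Problem 2(b) asks for.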

3. Let X be a normal random variable with mean μ and variance σ². Let us further assume that μ ~ N(μ_p, σ_p²) and that its prior distribution is π(μ); μ_p and σ_p² are assumed to be either known or estimated.
(a) Find the marginal distribution of the given random variable.
(b) Define the likelihood function L(θ|x), where x is a realization of the random variable X.
(c) Find a posterior distribution of θ, π(θ|x).
(d) Find the α-level credible interval.

4. Assume that all the given statements in Problem 3 remain unchanged. Further assume that μ_p = 100, σ_p = 15, σ = 10, and x = 115.
(a) Find E[μ|x] and E[(μ − E[μ|x])² | x].
(b) Find the precision of the normal distribution.
(c) Show that E[μ] = E[E[μ|x]] and interpret it.
(d) Show that E[Var(μ|x)] = Var(μ) − Var(E[μ|x]) and interpret it.

GOOD LUCK
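Problem 4's numbers give a concrete posterior: with one observation x = 115, prior N(100, 15²), and σ = 10, the posterior mean is the precision-weighted average of x and μ_p. The sketch below (illustrative, not part of the exam) uses the standard closed-form shrinkage formulas for this conjugate case.

```python
def normal_posterior_single(x, mu_p, sigma_p, sigma):
    """Posterior mean and variance of mu given one observation x ~ N(mu, sigma^2)
    and the prior mu ~ N(mu_p, sigma_p^2)."""
    w = sigma_p ** 2 / (sigma_p ** 2 + sigma ** 2)            # weight placed on the data
    post_mean = w * x + (1 - w) * mu_p                        # shrinks x toward mu_p
    post_var = sigma ** 2 * sigma_p ** 2 / (sigma ** 2 + sigma_p ** 2)
    return post_mean, post_var

m, v = normal_posterior_single(x=115, mu_p=100, sigma_p=15, sigma=10)
# m is about 110.38 (x = 115 shrunk toward the prior mean 100); v is about 69.23
```

Equivalently, the posterior precision 1/v equals 1/σ² + 1/σ_p², the sum asked for in part (b).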