Semi Bayes Estimation of the Parameters of Binomial Distribution in the Presence of Outliers


Journal of Statistical Theory and Applications, Volume 11, Number 2, 2012

Semi Bayes Estimation of the Parameters of Binomial Distribution in the Presence of Outliers

U. J. Dixit and Shailaja G. Kelkar
Department of Statistics, University of Mumbai, Mumbai, India
ulhasdixit@yahoo.co.in, kelkar shailaja@yahoo.com

Abstract: Semi-Bayes estimation of the binomial parameter n based on several prior distributions is studied in the presence of outliers, both when the probability of success is known and when it is unknown. A simulation study suggests that some prior distributions are preferable with respect to the generalized variance of the semi-Bayes estimators of n and p.

Key Words: binomial distribution, Bayes estimators, outliers.

AMS Subject Classification: 62F10

1 Introduction

Hamedani and Walter (1988) obtained the Bayes estimate of the binomial parameter n based on a general prior distribution of n. Special cases such as an improper prior and a Poisson prior were also considered, and formulae for the marginal and posterior distributions were obtained. The standard problem associated with the binomial distribution is that of estimating its probability of success, p. A much less well studied and considerably harder problem is that of estimating the number of trials, n. The estimation of n is neither straightforward nor well known, and it does not have a large literature.

Consider the problem of a shooter in target archery. A shooter needs to aim at the target and hit it precisely. To shoot the arrow, the shooter has to adopt the correct stance: the body placed at a perpendicular angle to the shooting line, with the feet apart at shoulder width. The type of stance the shooter adopts is vital to his performance. There are four types of stances: even, open, closed and oblique. A shooter practices various stances by marking the exact placement of his feet on the shooting line. A deviation of even a couple of inches can wreck a shooter's aiming and sighting, and it is needless to say that it can begin to plague him with accuracy problems.

Let X_1, X_2, \ldots, X_m be the numbers of successful attempts made by the shooter while hitting the target. Suppose the shooter has changed his stance on some attempts, say k (known), which results in a different probability of success. That is, out of m attempts, in (m-k) attempts the probability of success is p, and in the remaining k attempts the probability of success changes to \alpha p, where \alpha > 0, 0 < \alpha p < 1 and \alpha is unknown.
Thus, we assume that the random variables X_1, X_2, \ldots, X_m are such that k of them are distributed with probability mass function g(x, n, p, \alpha), where

g(x,n,p,\alpha) = C(n,x)\,(\alpha p)^x q_1^{\,n-x}, \quad x = 0,1,\ldots,n, \quad 0 < p < 1,\ 0 < \alpha p < 1,\ q_1 = 1 - \alpha p,   (1)

and the remaining (m-k) random variables are distributed with probability mass function f(x, n, p), where

f(x,n,p) = C(n,x)\, p^x q^{\,n-x}, \quad x = 0,1,\ldots,n, \quad 0 < p < 1,\ q = 1 - p.   (2)
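The two mass functions can be checked numerically; a minimal sketch (the function names are ours, not the paper's):

```python
from math import comb

def f_pmf(x, n, p):
    """Binomial pmf f(x; n, p) of eq. (2), success probability p."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

def g_pmf(x, n, p, alpha):
    """Outlier pmf g(x; n, p, alpha) of eq. (1): success probability
    alpha*p, with q1 = 1 - alpha*p, valid while 0 < alpha*p < 1."""
    ap = alpha * p
    q1 = 1 - ap
    return comb(n, x) * ap**x * q1**(n - x)

# Both pmfs sum to one over x = 0, ..., n.
n, p, alpha = 10, 0.4, 1.5
total_f = sum(f_pmf(x, n, p) for x in range(n + 1))
total_g = sum(g_pmf(x, n, p, alpha) for x in range(n + 1))
```

Setting alpha = 1 recovers the ordinary binomial pmf, which is the homogeneous case used repeatedly below.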

In this paper we have obtained the Bayes estimate of n when the prior distribution of n is (i) an improper prior, (ii) Poisson, (iii) negative binomial, both when p is known and when it is unknown.

2 Improper Prior of n

We will consider the joint distribution of (X_1, X_2, \ldots, X_m) in the presence of k outliers:

f(x_1,\ldots,x_m,n,p) = [C(m,k)]^{-1} \prod_{i=1}^{m} C(n,x_i)\; p^T q^{mn-T} (q_1/q)^{nk}\, G(x,\alpha,p),   (3)

where C(m,k) = \frac{m!}{(m-k)!\,k!}, T = \sum_{i=1}^{m} x_i, and

G(x,\alpha,p) = \sum_{A_1=1}^{m-k+1}\ \sum_{A_2=A_1+1}^{m-k+2} \cdots \sum_{A_k=A_{k-1}+1}^{m} \left(\frac{\alpha q}{q_1}\right)^{\sum_{i=1}^{k} x_{A_i}};

see Dixit (1989). Now we will consider the prior distribution of n as g(n) = 1 for all n. Writing X = \sum_{i=1}^{k} x_{A_i}, the joint distribution of (X_1, X_2, \ldots, X_m) is

f(x_1,\ldots,x_m,p) = \sum_{n=x_{(m)}}^{\infty} [C(m,k)]^{-1} \prod_{i=1}^{m} C(n,x_i)\; p^T q^{mn-T} (q_1/q)^{nk} \left(\frac{\alpha q}{q_1}\right)^{X} g(n).   (4)

Let n - x_{(m)} = j, i.e. n = x_{(m)} + j, and consider

\sum_{n=x_{(m)}}^{\infty} \prod_{i=1}^{m} C(n,x_i)\, q^{mn-T} (q_1/q)^{nk} = q^{(m-k)x_{(m)}-T} q_1^{x_{(m)}k} \sum_{j=0}^{\infty} \prod_{i=1}^{m} C(x_{(m)}+j, x_{(i)})\, (q^{m-k} q_1^{k})^{j}   (5)

= \prod_{i=1}^{m} C(x_{(m)},x_{(i)})\; q^{(m-k)x_{(m)}-T} q_1^{x_{(m)}k} \sum_{j=0}^{\infty} \frac{[(x_{(m)}+1)_j]^{m}}{\prod_{i=1}^{m-1} (x_{(m)}-x_{(i)}+1)_j}\, \frac{(q^{m-k} q_1^{k})^{j}}{j!},   (6)

where (a)_j = \Gamma(a+j)/\Gamma(a) denotes the Pochhammer symbol,

= \prod_{i=1}^{m} C(x_{(m)},x_{(i)})\; q^{(m-k)x_{(m)}-T} q_1^{x_{(m)}k}\; {}_{m}F_{m-1}\big((x_{(m)}+1)_m;\ x_{(m)}-x_{(1)}+1,\ldots,x_{(m)}-x_{(m-1)}+1;\ q^{m-k}q_1^{k}\big),   (7)

where (x_{(m)}+1)_m stands for x_{(m)}+1 repeated m times and {}_{m}F_{m-1}(\cdot) is the generalized hypergeometric function. Hence

f(x,p) = [C(m,k)]^{-1}\, p^{T} q^{(m-k)x_{(m)}-T} q_1^{x_{(m)}k} \left(\frac{\alpha q}{q_1}\right)^{X} \prod_{i=1}^{m} C(x_{(m)},x_{(i)})\; {}_{m}F_{m-1}\big((x_{(m)}+1)_m;\ x_{(m)}-x_{(1)}+1,\ldots,x_{(m)}-x_{(m-1)}+1;\ q^{m-k}q_1^{k}\big).   (8)

The posterior mean of n is

E(n) = x_{(m)} + \frac{\sum_{n=x_{(m)}}^{\infty} (n-x_{(m)}) \prod_{i=1}^{m} C(n,x_i)\, p^T q^{mn-T}(q_1/q)^{nk} (\alpha q/q_1)^{X}}{f(x,p)}.

Let n - x_{(m)} - 1 = j, i.e. n = x_{(m)} + j + 1. Simplifying the numerator exactly as in (5)-(7), with every parameter shifted by one, gives the Bayes estimate

\hat{n}_B = x_{(m)} + (x_{(m)}+1)\, q^{m-k} q_1^{k} \left(\prod_{i=1}^{m-1} \frac{x_{(m)}+1}{x_{(m)}-x_{(i)}+1}\right) \frac{{}_{m}F_{m-1}\big((x_{(m)}+2)_m;\ x_{(m)}-x_{(1)}+2,\ldots,x_{(m)}-x_{(m-1)}+2;\ q^{m-k}q_1^{k}\big)}{{}_{m}F_{m-1}\big((x_{(m)}+1)_m;\ x_{(m)}-x_{(1)}+1,\ldots,x_{(m)}-x_{(m-1)}+1;\ q^{m-k}q_1^{k}\big)}.   (9)
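Since (9) is the posterior mean under the flat prior, it can be checked against a direct (truncated) summation of the posterior series; a sketch for the homogeneous case \alpha = 1, with the truncation point nmax being our own choice:

```python
from math import comb

def nhat_improper(xs, p, nmax=500):
    """Posterior mean of n under the flat prior g(n) = 1, homogeneous
    case (alpha = 1).  The posterior weight at n is proportional to
    prod_i C(n, x_i) * q^(m*n); the constant p^T q^(-T) cancels in the
    ratio.  The infinite sum is truncated at nmax."""
    q = 1 - p
    xm = max(xs)
    num = den = 0.0
    for n in range(xm, nmax):
        w = q ** (len(xs) * n)
        for x in xs:
            w *= comb(n, x)
        num += n * w
        den += w
    return num / den
```

For m = 1 this reduces to (x + q)/p, the closed form quoted from Hamedani and Walter (1988) in (11) below.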

For the homogeneous case, consider \alpha = 1 in (9):

\hat{n}_B = x_{(m)} + (x_{(m)}+1)\, q^{m} \left(\prod_{i=1}^{m-1} \frac{x_{(m)}+1}{x_{(m)}-x_{(i)}+1}\right) \frac{{}_{m}F_{m-1}\big((x_{(m)}+2)_m;\ x_{(m)}-x_{(1)}+2,\ldots,x_{(m)}-x_{(m-1)}+2;\ q^{m}\big)}{{}_{m}F_{m-1}\big((x_{(m)}+1)_m;\ x_{(m)}-x_{(1)}+1,\ldots,x_{(m)}-x_{(m-1)}+1;\ q^{m}\big)}.   (10)

When m = 1 in (10),

\hat{n}_B = (x_1 + q)/p.   (11)

The expression in (11) is given in Hamedani and Walter (1988, p. 1832).

Case 1) p is known and \alpha = 1: we can estimate \hat{n}_B from (10).

Case 2) p is unknown and \alpha = 1: use the moment estimate of p,

\hat{p} = 1 - \frac{m_2 - m_1^2}{m_1},   (12)

where m_i = m^{-1}\sum_{j=1}^{m} x_j^i, i = 1, 2; then we can estimate \hat{n}_B from (10).

Case 3) p and \alpha are known: we can estimate \hat{n}_B from (9).

Case 4) p and \alpha are unknown: From (3), the log-likelihood is

\ln L(p,\alpha) = T \ln p + (mn-T)\ln q + nk[\ln q_1 - \ln q] + \ln \sum \left(\frac{\alpha q}{q_1}\right)^{X}.   (13)

Let S = \sum (\alpha q/q_1)^{X}, S_x = \sum X (\alpha q/q_1)^{X} and S_{xx} = \sum X^2 (\alpha q/q_1)^{X}. The likelihood equations are

\frac{T}{p} - \frac{mn-T}{q} + nk\left[\frac{1}{q} - \frac{\alpha}{q_1}\right] + \frac{\alpha-1}{q q_1}\,\frac{S_x}{S} = 0,   (14)

-\frac{nkp}{q_1} + \frac{1}{\alpha q_1}\,\frac{S_x}{S} = 0.   (15)
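The moment estimator (12) rests on the binomial relation Var(X)/E(X) = q, so that \hat{p} = 1 - (m_2 - m_1^2)/m_1; a direct sketch:

```python
def p_moment(xs):
    """Moment estimator (12): p_hat = 1 - (m2 - m1^2)/m1, where m_i is
    the i-th raw sample moment.  Uses Var(X)/E(X) = q = 1 - p for
    X ~ Binomial(n, p)."""
    m = len(xs)
    m1 = sum(xs) / m
    m2 = sum(x * x for x in xs) / m
    return 1 - (m2 - m1 * m1) / m1

# Example: for x = (10, 12), m1 = 11 and m2 = 122, so p_hat = 1 - 1/11.
```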

From (14) and (15) we get one more equation,

m_1 = np(b\alpha + \bar{b}),   (16)

where b = k/m and \bar{b} = 1 - b, and we can write (15) as

\frac{S_x}{S} - nkp\alpha = 0.   (17)

Let

t_1(p,\alpha) = \bar{x} - np(b\alpha + \bar{b}),   (18)

t_2(p,\alpha) = \frac{S_x}{S} - nkp\alpha.   (19)

Expanding equations (18) and (19) by Taylor series around (p_0, \alpha_0),

t_1(p,\alpha) = t_1(p_0,\alpha_0) + \Delta p\, t_{1p}(p_0,\alpha_0) + \Delta\alpha\, t_{1\alpha}(p_0,\alpha_0) + o(m^{-1}),   (20)

t_2(p,\alpha) = t_2(p_0,\alpha_0) + \Delta p\, t_{2p}(p_0,\alpha_0) + \Delta\alpha\, t_{2\alpha}(p_0,\alpha_0) + o(m^{-1}),   (21)

where \Delta\alpha = \alpha - \alpha_0, \Delta p = p - p_0, and t_{1p}(p,\alpha), t_{1\alpha}(p,\alpha) are the first-order derivatives of t_1(p,\alpha) with respect to p and \alpha respectively; similarly t_{2p}(p,\alpha) and t_{2\alpha}(p,\alpha) for t_2(p,\alpha). Here

t_{1p} = -n(b\alpha + \bar{b}), \qquad t_{1\alpha} = -npb,

t_{2p} = \frac{\alpha-1}{q q_1}\left[\frac{S_{xx}}{S} - \left(\frac{S_x}{S}\right)^2\right] - nk\alpha, \qquad t_{2\alpha} = \frac{1}{\alpha q_1}\left[\frac{S_{xx}}{S} - \left(\frac{S_x}{S}\right)^2\right] - nkp.

Write (18) and (19) as

t_1(p,\alpha) = c_1 + g\,\Delta p + h\,\Delta\alpha = 0,   (22)

and

t_2(p,\alpha) = c_2 + e\,\Delta p + f\,\Delta\alpha = 0.   (23)

Solving (22) and (23),

\Delta\alpha = \frac{c_1 e - c_2 g}{fg - eh}, \qquad \Delta p = \frac{c_1 f - c_2 h}{eh - fg}.

The corrections \Delta p and \Delta\alpha are obtained such that we may expect p = \Delta p + p_0 and \alpha = \Delta\alpha + \alpha_0. Select initial solutions p_0 and \alpha_0, and from (9) calculate \hat{n}_B; call it n_0. After iterations we will get p and \alpha, and from that \hat{n}_B. Kale (1962) considered the multiparameter case and showed that this method is justified. For details, see Dixit and Kelkar (2011).

3 Prior of n is Poisson(\lambda)

Let

g(n) = \frac{e^{-\lambda}\lambda^n}{n!}, \quad n = 0,1,2,\ldots, \quad \lambda > 0.   (24)

The joint probability distribution of (X_1, X_2, \ldots, X_m) is

f(x_1,\ldots,x_m,n,p) = [C(m,k)]^{-1} \prod_{i=1}^{m} C(n,x_i)\; p^T q^{mn-T} (q_1/q)^{nk} \left(\frac{\alpha q}{q_1}\right)^{X} \frac{e^{-\lambda}\lambda^n}{n!},   (25)

f(x_1,\ldots,x_m,p) = \sum_{n=x_{(m)}}^{\infty} [C(m,k)]^{-1} \prod_{i=1}^{m} C(n,x_i)\; p^T q^{mn-T} (q_1/q)^{nk} \left(\frac{\alpha q}{q_1}\right)^{X} \frac{e^{-\lambda}\lambda^n}{n!}.   (26)
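For m = 1 the sum (26) can be evaluated by direct truncated summation: the posterior weight at n is proportional to C(n,x) q^n \lambda^n / n!, and the stable term ratio w(n+1)/w(n) = q\lambda/(n+1-x) avoids large factorials. A sketch (the truncation point is our own choice); the posterior mean comes out as x + q\lambda, the form (30) obtained below:

```python
def nhat_poisson_m1(x, p, lam, nmax=400):
    """Posterior mean of n for one observation x under the Poisson(lam)
    prior, homogeneous case (alpha = 1).  Weights are accumulated via
    the ratio w(n+1)/w(n) = q*lam/(n + 1 - x); constants cancel."""
    q = 1 - p
    w, num, den = 1.0, 0.0, 0.0
    for n in range(x, x + nmax):
        num += n * w
        den += w
        w *= q * lam / (n + 1 - x)
    return num / den
```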

After simplification,

f(x,p) = [C(m,k)]^{-1} \left(\frac{p}{q}\right)^{T} e^{-\lambda}\, (q^{m-k} q_1^{k} \lambda)^{x_{(m)}} \left(\frac{\alpha q}{q_1}\right)^{X} [x_{(m)}!]^{-1} \prod_{i=1}^{m-1} C(x_{(m)},x_{(i)})\; {}_{m-1}F_{m-1}\big((x_{(m)}+1)_{m-1};\ x_{(m)}-x_{(1)}+1,\ldots,x_{(m)}-x_{(m-1)}+1;\ q^{m-k}q_1^{k}\lambda\big).   (27)

The Bayes estimate of n under squared error loss is

\hat{n}_B = x_{(m)} + q^{m-k} q_1^{k} \lambda \left(\prod_{i=1}^{m-1} \frac{x_{(m)}+1}{x_{(m)}-x_{(i)}+1}\right) \frac{{}_{m-1}F_{m-1}\big((x_{(m)}+2)_{m-1};\ x_{(m)}-x_{(1)}+2,\ldots,x_{(m)}-x_{(m-1)}+2;\ q^{m-k}q_1^{k}\lambda\big)}{{}_{m-1}F_{m-1}\big((x_{(m)}+1)_{m-1};\ x_{(m)}-x_{(1)}+1,\ldots,x_{(m)}-x_{(m-1)}+1;\ q^{m-k}q_1^{k}\lambda\big)}.   (28)

For the homogeneous case \alpha = 1,

\hat{n}_B = x_{(m)} + q^{m}\lambda \left(\prod_{i=1}^{m-1} \frac{x_{(m)}+1}{x_{(m)}-x_{(i)}+1}\right) \frac{{}_{m-1}F_{m-1}\big((x_{(m)}+2)_{m-1};\ x_{(m)}-x_{(1)}+2,\ldots,x_{(m)}-x_{(m-1)}+2;\ q^{m}\lambda\big)}{{}_{m-1}F_{m-1}\big((x_{(m)}+1)_{m-1};\ x_{(m)}-x_{(1)}+1,\ldots,x_{(m)}-x_{(m-1)}+1;\ q^{m}\lambda\big)}.   (29)

When m = 1,

\hat{n}_B = x_1 + q\lambda.   (30)

The expression in (30) is given in Hamedani and Walter (1988, p. 1834).

Case 1) p, \lambda are known and \alpha = 1: we can estimate \hat{n}_B from (29).

Case 2) p is unknown, \lambda is known and \alpha = 1: use the moment estimate (12) of p and then estimate \hat{n}_B from (29).

Case 3) p, \alpha and \lambda are known: we can estimate \hat{n}_B from (28).

Case 4) p and \alpha are unknown, \lambda is known: by using the maximum likelihood expressions (13) to (23), we can estimate \hat{n}_B.

4 Prior of n is Negative Binomial(\theta, r)

g(n) = \frac{\Gamma(r+n)}{\Gamma(r)\Gamma(n+1)} (1-\theta)^r \theta^n, \quad n = 0,1,2,\ldots, \quad r = 1,2,3,\ldots, \quad 0 < \theta < 1.   (31)

The joint probability distribution of (X_1, X_2, \ldots, X_m) is

f(x_1,\ldots,x_m,n,p) = [C(m,k)]^{-1} \prod_{i=1}^{m} C(n,x_i)\; p^T q^{mn-T} (q_1/q)^{nk} \left(\frac{\alpha q}{q_1}\right)^{X} \frac{\Gamma(r+n)}{\Gamma(r)\Gamma(n+1)} (1-\theta)^r \theta^n,   (32)

f(x_1,\ldots,x_m,p) = \sum_{n=x_{(m)}}^{\infty} [C(m,k)]^{-1} \prod_{i=1}^{m} C(n,x_i)\; p^T q^{mn-T} (q_1/q)^{nk} \left(\frac{\alpha q}{q_1}\right)^{X} \frac{\Gamma(r+n)}{\Gamma(r)\Gamma(n+1)} (1-\theta)^r \theta^n.   (33)

After simplification,

f(x,p) = [C(m,k)]^{-1}\, C(r+x_{(m)}-1, x_{(m)}) \prod_{i=1}^{m-1} C(x_{(m)},x_{(i)}) \left(\frac{p}{q}\right)^{T} (1-\theta)^r (q^{m-k} q_1^{k}\theta)^{x_{(m)}} \left(\frac{\alpha q}{q_1}\right)^{X} {}_{m}F_{m-1}\big((x_{(m)}+1)_{m-1},\, r+x_{(m)};\ x_{(m)}-x_{(1)}+1,\ldots,x_{(m)}-x_{(m-1)}+1;\ q^{m-k}q_1^{k}\theta\big).   (34)

The Bayes estimate of n under squared error loss is

\hat{n}_B = x_{(m)} + q^{m-k} q_1^{k} \theta\, (x_{(m)}+r) \left(\prod_{i=1}^{m-1} \frac{x_{(m)}+1}{x_{(m)}-x_{(i)}+1}\right) \frac{{}_{m}F_{m-1}\big((x_{(m)}+2)_{m-1},\, r+x_{(m)}+1;\ x_{(m)}-x_{(1)}+2,\ldots,x_{(m)}-x_{(m-1)}+2;\ q^{m-k}q_1^{k}\theta\big)}{{}_{m}F_{m-1}\big((x_{(m)}+1)_{m-1},\, r+x_{(m)};\ x_{(m)}-x_{(1)}+1,\ldots,x_{(m)}-x_{(m-1)}+1;\ q^{m-k}q_1^{k}\theta\big)}.   (35)

For the homogeneous case \alpha = 1,

\hat{n}_B = x_{(m)} + q^{m}\theta\, (x_{(m)}+r) \left(\prod_{i=1}^{m-1} \frac{x_{(m)}+1}{x_{(m)}-x_{(i)}+1}\right) \frac{{}_{m}F_{m-1}\big((x_{(m)}+2)_{m-1},\, r+x_{(m)}+1;\ x_{(m)}-x_{(1)}+2,\ldots,x_{(m)}-x_{(m-1)}+2;\ q^{m}\theta\big)}{{}_{m}F_{m-1}\big((x_{(m)}+1)_{m-1},\, r+x_{(m)};\ x_{(m)}-x_{(1)}+1,\ldots,x_{(m)}-x_{(m-1)}+1;\ q^{m}\theta\big)}.   (36)

Case 1) p, \theta, r are known and \alpha = 1: we can estimate \hat{n}_B from (36).

Case 2) p is unknown, \theta, r are known and \alpha = 1: use the moment estimate (12) of p and then estimate \hat{n}_B from (36).

Case 3) p, \alpha, \theta and r are known: we can estimate \hat{n}_B from (35).

Case 4) p and \alpha are unknown, \theta and r are known: by using the maximum likelihood expressions (13) to (23), we can estimate \hat{n}_B.
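For m = 1 the negative binomial case can again be checked by direct summation: the posterior weight at n is proportional to C(n,x)\, q^n\, \Gamma(r+n)/\Gamma(n+1)\, \theta^n, with term ratio w(n+1)/w(n) = q\theta(r+n)/(n+1-x), and for a single observation the posterior mean works out to x + q\theta(x+r)/(1-q\theta). A sketch (the truncation point is our own choice):

```python
def nhat_negbin_m1(x, p, theta, r, nmax=800):
    """Posterior mean of n for one observation x under the negative
    binomial(r, theta) prior, homogeneous case (alpha = 1).  Weights
    accumulate via w(n+1)/w(n) = q*theta*(r + n)/(n + 1 - x)."""
    q = 1 - p
    w, num, den = 1.0, 0.0, 0.0
    for n in range(x, x + nmax):
        num += n * w
        den += w
        w *= q * theta * (r + n) / (n + 1 - x)
    return num / den

# Closed form for m = 1: x + q*theta*(x + r)/(1 - q*theta).
```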

5 Numerical Study

All our results for the estimation of n agree with Draper and Guttman (1971) and Hamedani and Walter (1988) in the case of the improper and Poisson (\lambda = 13) priors of n: in the homogeneous case, with the improper and Poisson priors the estimate of n is 13 for m = 2, x_1 = 10, x_2 = 12 and p = 0.8. Moreover, we have also used a negative binomial prior for n, and the estimate of n is 12 for m = 2, x_1 = 10, x_2 = 12, p = 0.8, \theta = 0.6 and r = 3.

In order to get an idea of how to select the prior of n, we have generated one thousand samples of size m = 10(20)50 from the binomial distribution with n = 10; p = 0.4, 0.5, 0.6 and 0.8; \alpha = 0.4, 0.5, 0.6 and 0.8; and k = 0, 1, 2, 3, for the following three prior distributions of n:

(i) g_1(n) = 1 for all n;
(ii) g_2(n) = e^{-\lambda}\lambda^n/n!, n = 0,1,2,\ldots, with \lambda = 2 and 3;
(iii) g_3(n) = \frac{\Gamma(r+n)}{\Gamma(r)\Gamma(n+1)}(1-\theta)^r\theta^n, n = 0,1,2,\ldots, with r = 1, 2, 3 and \theta = 0.4 and 0.6.

We would like to comment on the bias of \hat{n}, \hat{p} and \hat{\alpha}.

A) Homogeneous case (k = 0):
1. The bias of \hat{n} and \hat{p} need not be positive for m = 10(10)50.
2. The bias of \hat{p} is larger for m = 10, 20 under the improper prior of n than under the other priors of n.
3. The bias of \hat{p} is almost the same for m = 30 for all priors of n.
4. The bias of \hat{p} is smaller for m = 40, 50 under the improper prior of n than under the other priors of n.
5. No such conclusions can be drawn for the bias of \hat{n} across the priors of n.
6. For m > 50 the bias of \hat{n} is less than 1 and the bias of \hat{p} tends to zero.

B) Outlier case (k > 0): We have plotted graphs for various values of (p, \alpha) for n = 10.
1. The bias of \hat{p} is always positive for all priors of n.
2. The bias of \hat{n} and \hat{\alpha} need not be positive for all priors of n.
3. The bias of \hat{n}, \hat{p} and \hat{\alpha} is larger under the improper prior of n than under the Poisson and negative binomial priors of n.

4. Interestingly, for k = 3 the bias of \hat{\alpha} is almost the same for m = 10(10)50.
5. For m > 50 the bias of \hat{n} is less than 1, and the biases of \hat{p} and \hat{\alpha} tend to zero.

In the multiparameter case we have to compare the efficiencies of estimators on the basis of the determinant of the variance-covariance matrix. Here we have calculated the determinant of the variance-covariance matrix of (\hat{p}, \hat{\alpha}, \hat{n}) under g_1, g_2 and g_3. In the homogeneous case (k = 0 and \alpha = 1) we have presented the determinants in Tables 1-4 for m = 10(20)50 with n = 10 and p = 0.4, 0.5, 0.6 and 0.8. The determinant of the variance-covariance matrix under g_1 is larger than under g_2 and g_3; we can conjecture that the estimates of n and p under g_2 and g_3 are better than under g_1, but there is no significant difference between the determinants under g_2 and g_3.

For the outlier case (k = 1 and 3), we have plotted the generalized variance. The determinant under g_1 is significantly smaller than the determinant under g_2 and g_3; in this case the estimates of p, \alpha and n are better under g_1 than under g_2 and g_3. Here also the determinant does not change significantly between g_2 and g_3.
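The generalized variance used above is simply the determinant of the sample variance-covariance matrix of the simulated estimates; a 2x2 sketch for the homogeneous case, where each simulated sample contributes one (\hat{p}, \hat{n}) pair (the data below are illustrative, not the paper's):

```python
def generalized_variance(samples):
    """Determinant of the 2x2 sample variance-covariance matrix of the
    (p_hat, n_hat) pairs in `samples` (one tuple per simulated sample).
    For the 3-parameter outlier case a general determinant routine
    would be needed instead."""
    m = len(samples)
    means = [sum(s[i] for s in samples) / m for i in range(2)]
    cov = [[sum((s[i] - means[i]) * (s[j] - means[j]) for s in samples) / (m - 1)
            for j in range(2)] for i in range(2)]
    return cov[0][0] * cov[1][1] - cov[0][1] * cov[1][0]

pairs = [(0.4, 10.0), (0.5, 12.0), (0.6, 11.0)]  # illustrative (p_hat, n_hat)
gv = generalized_variance(pairs)
```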

Appendix

Determinant of the variance-covariance matrix of (\hat{p}, \hat{n})

Table 1 (\lambda = 2, \theta = 0.4, r = 1) and Table 2 (\lambda = 2, \theta = 0.4, r = 3): determinants for m = 10, 30, 50 under the improper, Poisson and negative binomial priors, at p = 0.4, 0.5, 0.6 and 0.8. [Numerical table entries not recoverable from the source.]

Determinant of the variance-covariance matrix of (\hat{p}, \hat{n})

Table 3 (\lambda = 3, \theta = 0.6, r = 1) and Table 4 (\lambda = 3, \theta = 0.6, r = 3): determinants for m = 10, 30, 50 under the improper, Poisson and negative binomial priors, at p = 0.4, 0.5, 0.6 and 0.8. [Numerical table entries not recoverable from the source.]

[Figure: bias of \hat{n} for n = 10, with k = 1 and k = 3, r = 3.]

[Figure: bias of \hat{p} for n = 10, with k = 1 and k = 3, r = 3.]

[Figure: bias of \hat{\alpha} for n = 10, with k = 1 and k = 3, r = 3.]

[Figure: generalized variance of (\hat{p}, \hat{\alpha}, \hat{n}) for n = 10, k = 1, \lambda = 2, \theta = 0.4, with r = 1 and r = 3.]

[Figure: generalized variance of (\hat{p}, \hat{\alpha}, \hat{n}) for n = 10, k = 1, \lambda = 3, \theta = 0.6, with r = 1 and r = 3.]

[Figure: generalized variance of (\hat{p}, \hat{\alpha}, \hat{n}) for n = 10, k = 3, \lambda = 2, \theta = 0.4, with r = 1 and r = 3.]

[Figure: generalized variance of (\hat{p}, \hat{\alpha}, \hat{n}) for n = 10, k = 3, \lambda = 3, \theta = 0.6, with r = 1 and r = 3.]

Acknowledgment

The authors are thankful to the editors and the referees for their valuable comments.

References

[1] Dixit, U.J. (1989). Estimation of parameters of the gamma distribution in the presence of outliers. Communications in Statistics - Theory and Methods, 18.

[2] Dixit, U.J. and Kelkar, S.G. (2011). Estimation of the parameter of binomial distribution in the presence of outliers. Accepted for publication in Journal of Applied Statistical Sciences.

[3] Draper, N. and Guttman, I. (1971). Bayesian estimation of the binomial parameter. Technometrics, 13.

[4] Hamedani, G.G. and Walter, G.G. (1988). Bayes estimation of the binomial parameter n. Communications in Statistics - Theory and Methods, 17(6).

[5] Kale, B.K. (1962). On the solution of the likelihood equation by iteration processes - the multiparametric case. Biometrika, 49.


More information

Review. DS GA 1002 Statistical and Mathematical Models. Carlos Fernandez-Granda

Review. DS GA 1002 Statistical and Mathematical Models.   Carlos Fernandez-Granda Review DS GA 1002 Statistical and Mathematical Models http://www.cims.nyu.edu/~cfgranda/pages/dsga1002_fall16 Carlos Fernandez-Granda Probability and statistics Probability: Framework for dealing with

More information

1 Hierarchical Bayes and Empirical Bayes. MLII Method.

1 Hierarchical Bayes and Empirical Bayes. MLII Method. ISyE8843A, Brani Vidakovic Handout 8 Hierarchical Bayes and Empirical Bayes. MLII Method. Hierarchical Bayes and Empirical Bayes are related by their goals, but quite different by the methods of how these

More information

Lecture 2: Linear Models. Bruce Walsh lecture notes Seattle SISG -Mixed Model Course version 23 June 2011

Lecture 2: Linear Models. Bruce Walsh lecture notes Seattle SISG -Mixed Model Course version 23 June 2011 Lecture 2: Linear Models Bruce Walsh lecture notes Seattle SISG -Mixed Model Course version 23 June 2011 1 Quick Review of the Major Points The general linear model can be written as y = X! + e y = vector

More information

COM336: Neural Computing

COM336: Neural Computing COM336: Neural Computing http://www.dcs.shef.ac.uk/ sjr/com336/ Lecture 2: Density Estimation Steve Renals Department of Computer Science University of Sheffield Sheffield S1 4DP UK email: s.renals@dcs.shef.ac.uk

More information

Basics on Probability. Jingrui He 09/11/2007

Basics on Probability. Jingrui He 09/11/2007 Basics on Probability Jingrui He 09/11/2007 Coin Flips You flip a coin Head with probability 0.5 You flip 100 coins How many heads would you expect Coin Flips cont. You flip a coin Head with probability

More information

Bayesian Statistics and Data Assimilation. Jonathan Stroud. Department of Statistics The George Washington University

Bayesian Statistics and Data Assimilation. Jonathan Stroud. Department of Statistics The George Washington University Bayesian Statistics and Data Assimilation Jonathan Stroud Department of Statistics The George Washington University 1 Outline Motivation Bayesian Statistics Parameter Estimation in Data Assimilation Combined

More information

Stat 135, Fall 2006 A. Adhikari HOMEWORK 6 SOLUTIONS

Stat 135, Fall 2006 A. Adhikari HOMEWORK 6 SOLUTIONS Stat 135, Fall 2006 A. Adhikari HOMEWORK 6 SOLUTIONS 1a. Under the null hypothesis X has the binomial (100,.5) distribution with E(X) = 50 and SE(X) = 5. So P ( X 50 > 10) is (approximately) two tails

More information

Theoretical Properties of Weighted Generalized. Rayleigh and Related Distributions

Theoretical Properties of Weighted Generalized. Rayleigh and Related Distributions Theoretical Mathematics & Applications, vol.2, no.2, 22, 45-62 ISSN: 792-9687 (print, 792-979 (online International Scientific Press, 22 Theoretical Properties of Weighted Generalized Rayleigh and Related

More information

STAT 302 Introduction to Probability Learning Outcomes. Textbook: A First Course in Probability by Sheldon Ross, 8 th ed.

STAT 302 Introduction to Probability Learning Outcomes. Textbook: A First Course in Probability by Sheldon Ross, 8 th ed. STAT 302 Introduction to Probability Learning Outcomes Textbook: A First Course in Probability by Sheldon Ross, 8 th ed. Chapter 1: Combinatorial Analysis Demonstrate the ability to solve combinatorial

More information

A NOTE ON BAYESIAN ESTIMATION FOR THE NEGATIVE-BINOMIAL MODEL. Y. L. Lio

A NOTE ON BAYESIAN ESTIMATION FOR THE NEGATIVE-BINOMIAL MODEL. Y. L. Lio Pliska Stud. Math. Bulgar. 19 (2009), 207 216 STUDIA MATHEMATICA BULGARICA A NOTE ON BAYESIAN ESTIMATION FOR THE NEGATIVE-BINOMIAL MODEL Y. L. Lio The Negative Binomial model, which is generated by a simple

More information

CPSC 540: Machine Learning

CPSC 540: Machine Learning CPSC 540: Machine Learning Expectation Maximization Mark Schmidt University of British Columbia Winter 2018 Last Time: Learning with MAR Values We discussed learning with missing at random values in data:

More information

Antiderivatives and Indefinite Integrals

Antiderivatives and Indefinite Integrals Antiderivatives and Indefinite Integrals MATH 151 Calculus for Management J. Robert Buchanan Department of Mathematics Fall 2018 Objectives After completing this lesson we will be able to use the definition

More information

Introduction to Machine Learning. Maximum Likelihood and Bayesian Inference. Lecturers: Eran Halperin, Lior Wolf

Introduction to Machine Learning. Maximum Likelihood and Bayesian Inference. Lecturers: Eran Halperin, Lior Wolf 1 Introduction to Machine Learning Maximum Likelihood and Bayesian Inference Lecturers: Eran Halperin, Lior Wolf 2014-15 We know that X ~ B(n,p), but we do not know p. We get a random sample from X, a

More information

Construction Of Binomial Sums For π And Polylogarithmic Constants Inspired By BBP Formulas

Construction Of Binomial Sums For π And Polylogarithmic Constants Inspired By BBP Formulas Applied Mathematics E-Notes, 77, 7-46 c ISSN 67-5 Available free at mirror sites of http://www.math.nthu.edu.tw/ amen/ Construction Of Binomial Sums For π And Polylogarithmic Constants Inspired By BBP

More information

Linear Models for Classification

Linear Models for Classification Linear Models for Classification Oliver Schulte - CMPT 726 Bishop PRML Ch. 4 Classification: Hand-written Digit Recognition CHINE INTELLIGENCE, VOL. 24, NO. 24, APRIL 2002 x i = t i = (0, 0, 0, 1, 0, 0,

More information

Review of Probabilities and Basic Statistics

Review of Probabilities and Basic Statistics Alex Smola Barnabas Poczos TA: Ina Fiterau 4 th year PhD student MLD Review of Probabilities and Basic Statistics 10-701 Recitations 1/25/2013 Recitation 1: Statistics Intro 1 Overview Introduction to

More information

1) The line has a slope of ) The line passes through (2, 11) and. 6) r(x) = x + 4. From memory match each equation with its graph.

1) The line has a slope of ) The line passes through (2, 11) and. 6) r(x) = x + 4. From memory match each equation with its graph. Review Test 2 Math 1314 Name Write an equation of the line satisfying the given conditions. Write the answer in standard form. 1) The line has a slope of - 2 7 and contains the point (3, 1). Use the point-slope

More information

Review of Probability Theory II

Review of Probability Theory II Review of Probability Theory II January 9-3, 008 Exectation If the samle sace Ω = {ω, ω,...} is countable and g is a real-valued function, then we define the exected value or the exectation of a function

More information

Statistics Ph.D. Qualifying Exam: Part II November 9, 2002

Statistics Ph.D. Qualifying Exam: Part II November 9, 2002 Statistics Ph.D. Qualifying Exam: Part II November 9, 2002 Student Name: 1. Answer 8 out of 12 problems. Mark the problems you selected in the following table. 1 2 3 4 5 6 7 8 9 10 11 12 2. Write your

More information

11. Learning graphical models

11. Learning graphical models Learning graphical models 11-1 11. Learning graphical models Maximum likelihood Parameter learning Structural learning Learning partially observed graphical models Learning graphical models 11-2 statistical

More information

Monte Carlo integration

Monte Carlo integration Monte Carlo integration Eample of a Monte Carlo sampler in D: imagine a circle radius L/ within a square of LL. If points are randoml generated over the square, what s the probabilit to hit within circle?

More information

The linear model is the most fundamental of all serious statistical models encompassing:

The linear model is the most fundamental of all serious statistical models encompassing: Linear Regression Models: A Bayesian perspective Ingredients of a linear model include an n 1 response vector y = (y 1,..., y n ) T and an n p design matrix (e.g. including regressors) X = [x 1,..., x

More information

Subject CS1 Actuarial Statistics 1 Core Principles

Subject CS1 Actuarial Statistics 1 Core Principles Institute of Actuaries of India Subject CS1 Actuarial Statistics 1 Core Principles For 2019 Examinations Aim The aim of the Actuarial Statistics 1 subject is to provide a grounding in mathematical and

More information

A Very Brief Summary of Statistical Inference, and Examples

A Very Brief Summary of Statistical Inference, and Examples A Very Brief Summary of Statistical Inference, and Examples Trinity Term 2008 Prof. Gesine Reinert 1 Data x = x 1, x 2,..., x n, realisations of random variables X 1, X 2,..., X n with distribution (model)

More information

(y 1, y 2 ) = 12 y3 1e y 1 y 2 /2, y 1 > 0, y 2 > 0 0, otherwise.

(y 1, y 2 ) = 12 y3 1e y 1 y 2 /2, y 1 > 0, y 2 > 0 0, otherwise. 54 We are given the marginal pdfs of Y and Y You should note that Y gamma(4, Y exponential( E(Y = 4, V (Y = 4, E(Y =, and V (Y = 4 (a With U = Y Y, we have E(U = E(Y Y = E(Y E(Y = 4 = (b Because Y and

More information

Things to remember when learning probability distributions:

Things to remember when learning probability distributions: SPECIAL DISTRIBUTIONS Some distributions are special because they are useful They include: Poisson, exponential, Normal (Gaussian), Gamma, geometric, negative binomial, Binomial and hypergeometric distributions

More information

A note on Gaussian distributions in R n

A note on Gaussian distributions in R n Proc. Indian Acad. Sci. (Math. Sci. Vol., No. 4, November 0, pp. 635 644. c Indian Academy of Sciences A note on Gaussian distributions in R n B G MANJUNATH and K R PARTHASARATHY Indian Statistical Institute,

More information

Chapter 5: Quadratic Functions

Chapter 5: Quadratic Functions Section 5.1: Square Root Property #1-20: Solve the equations using the square root property. 1) x 2 = 16 2) y 2 = 25 3) b 2 = 49 4) a 2 = 16 5) m 2 = 98 6) d 2 = 24 7) x 2 = 75 8) x 2 = 54 9) (x 3) 2 =

More information

STA2603/205/1/2014 /2014. ry II. Tutorial letter 205/1/

STA2603/205/1/2014 /2014. ry II. Tutorial letter 205/1/ STA263/25//24 Tutorial letter 25// /24 Distribution Theor ry II STA263 Semester Department of Statistics CONTENTS: Examination preparation tutorial letterr Solutions to Assignment 6 2 Dear Student, This

More information

EXAMINATIONS OF THE ROYAL STATISTICAL SOCIETY

EXAMINATIONS OF THE ROYAL STATISTICAL SOCIETY EXAMINATIONS OF THE ROYAL STATISTICAL SOCIETY GRADUATE DIPLOMA, 2015 MODULE 1 : Probability distributions Time allowed: Three hours Candidates should answer FIVE questions. All questions carry equal marks.

More information

Computer Vision Group Prof. Daniel Cremers. 2. Regression (cont.)

Computer Vision Group Prof. Daniel Cremers. 2. Regression (cont.) Prof. Daniel Cremers 2. Regression (cont.) Regression with MLE (Rep.) Assume that y is affected by Gaussian noise : t = f(x, w)+ where Thus, we have p(t x, w, )=N (t; f(x, w), 2 ) 2 Maximum A-Posteriori

More information

MA6451 PROBABILITY AND RANDOM PROCESSES

MA6451 PROBABILITY AND RANDOM PROCESSES MA6451 PROBABILITY AND RANDOM PROCESSES UNIT I RANDOM VARIABLES 1.1 Discrete and continuous random variables 1. Show that the function is a probability density function of a random variable X. (Apr/May

More information

AMCS243/CS243/EE243 Probability and Statistics. Fall Final Exam: Sunday Dec. 8, 3:00pm- 5:50pm VERSION A

AMCS243/CS243/EE243 Probability and Statistics. Fall Final Exam: Sunday Dec. 8, 3:00pm- 5:50pm VERSION A AMCS243/CS243/EE243 Probability and Statistics Fall 2013 Final Exam: Sunday Dec. 8, 3:00pm- 5:50pm VERSION A *********************************************************** ID: ***********************************************************

More information

Probability & Statistics - FALL 2008 FINAL EXAM

Probability & Statistics - FALL 2008 FINAL EXAM 550.3 Probability & Statistics - FALL 008 FINAL EXAM NAME. An urn contains white marbles and 8 red marbles. A marble is drawn at random from the urn 00 times with replacement. Which of the following is

More information

Northwestern University Department of Electrical Engineering and Computer Science

Northwestern University Department of Electrical Engineering and Computer Science Northwestern University Department of Electrical Engineering and Computer Science EECS 454: Modeling and Analysis of Communication Networks Spring 2008 Probability Review As discussed in Lecture 1, probability

More information

10. Exchangeability and hierarchical models Objective. Recommended reading

10. Exchangeability and hierarchical models Objective. Recommended reading 10. Exchangeability and hierarchical models Objective Introduce exchangeability and its relation to Bayesian hierarchical models. Show how to fit such models using fully and empirical Bayesian methods.

More information

f(x θ)dx with respect to θ. Assuming certain smoothness conditions concern differentiating under the integral the integral sign, we first obtain

f(x θ)dx with respect to θ. Assuming certain smoothness conditions concern differentiating under the integral the integral sign, we first obtain 0.1. INTRODUCTION 1 0.1 Introduction R. A. Fisher, a pioneer in the development of mathematical statistics, introduced a measure of the amount of information contained in an observaton from f(x θ). Fisher

More information

BTRY 4090: Spring 2009 Theory of Statistics

BTRY 4090: Spring 2009 Theory of Statistics BTRY 4090: Spring 2009 Theory of Statistics Guozhang Wang September 25, 2010 1 Review of Probability We begin with a real example of using probability to solve computationally intensive (or infeasible)

More information

Some Curiosities Arising in Objective Bayesian Analysis

Some Curiosities Arising in Objective Bayesian Analysis . Some Curiosities Arising in Objective Bayesian Analysis Jim Berger Duke University Statistical and Applied Mathematical Institute Yale University May 15, 2009 1 Three vignettes related to John s work

More information

Department of Statistical Science FIRST YEAR EXAM - SPRING 2015

Department of Statistical Science FIRST YEAR EXAM - SPRING 2015 Department of Statistical Science Duke University FIRST YEAR EXAM - SPRING 2015 Monday 4th May 2015 NOTES: PLEASE READ CAREFULLY BEFORE BEGINNING EXAM! 1. Do not write solutions on the exam; please write

More information

Page Max. Possible Points Total 100

Page Max. Possible Points Total 100 Math 3215 Exam 2 Summer 2014 Instructor: Sal Barone Name: GT username: 1. No books or notes are allowed. 2. You may use ONLY NON-GRAPHING and NON-PROGRAMABLE scientific calculators. All other electronic

More information

Part 6: Multivariate Normal and Linear Models

Part 6: Multivariate Normal and Linear Models Part 6: Multivariate Normal and Linear Models 1 Multiple measurements Up until now all of our statistical models have been univariate models models for a single measurement on each member of a sample of

More information

Petr Volf. Model for Difference of Two Series of Poisson-like Count Data

Petr Volf. Model for Difference of Two Series of Poisson-like Count Data Petr Volf Institute of Information Theory and Automation Academy of Sciences of the Czech Republic Pod vodárenskou věží 4, 182 8 Praha 8 e-mail: volf@utia.cas.cz Model for Difference of Two Series of Poisson-like

More information

6 The normal distribution, the central limit theorem and random samples

6 The normal distribution, the central limit theorem and random samples 6 The normal distribution, the central limit theorem and random samples 6.1 The normal distribution We mentioned the normal (or Gaussian) distribution in Chapter 4. It has density f X (x) = 1 σ 1 2π e

More information

Problem Selected Scores

Problem Selected Scores Statistics Ph.D. Qualifying Exam: Part II November 20, 2010 Student Name: 1. Answer 8 out of 12 problems. Mark the problems you selected in the following table. Problem 1 2 3 4 5 6 7 8 9 10 11 12 Selected

More information

The Complementary Exponential-Geometric Distribution Based On Generalized Order Statistics

The Complementary Exponential-Geometric Distribution Based On Generalized Order Statistics Applied Mathematics E-Notes, 152015, 287-303 c ISSN 1607-2510 Available free at mirror sites of http://www.math.nthu.edu.tw/ amen/ The Complementary Exponential-Geometric Distribution Based On Generalized

More information

MIT Spring 2015

MIT Spring 2015 Assessing Goodness Of Fit MIT 8.443 Dr. Kempthorne Spring 205 Outline 2 Poisson Distribution Counts of events that occur at constant rate Counts in disjoint intervals/regions are independent If intervals/regions

More information

EXAMINATIONS OF THE ROYAL STATISTICAL SOCIETY

EXAMINATIONS OF THE ROYAL STATISTICAL SOCIETY EXAMINATIONS OF THE ROYAL STATISTICAL SOCIETY GRADUATE DIPLOMA, 00 MODULE : Statistical Inference Time Allowed: Three Hours Candidates should answer FIVE questions. All questions carry equal marks. The

More information

On discrete distributions with gaps having ALM property

On discrete distributions with gaps having ALM property ProbStat Forum, Volume 05, April 202, Pages 32 37 ISSN 0974-3235 ProbStat Forum is an e-journal. For details please visit www.probstat.org.in On discrete distributions with gaps having ALM property E.

More information

University of Chicago Graduate School of Business. Business 41901: Probability Final Exam Solutions

University of Chicago Graduate School of Business. Business 41901: Probability Final Exam Solutions Name: University of Chicago Graduate School of Business Business 490: Probability Final Exam Solutions Special Notes:. This is a closed-book exam. You may use an 8 piece of paper for the formulas.. Throughout

More information