
Qatar Univ. Sci. J. (1991), 11: 19-26

PREDICTION INTERVALS FOR FUTURE SAMPLE MEAN FROM INVERSE GAUSSIAN DISTRIBUTION

By MUHAMMAD S. ABU-SALIH and RAFIQ K. AL-BAITAT
Department of Statistics, Yarmouk University, Irbid, Jordan

ABSTRACT

A random sample X_1, X_2, ..., X_n from the inverse Gaussian distribution I(μ, λ) is observed. On the basis of this observed sample, a (1 − β)100% prediction interval for the mean Ȳ of a future sample Y_1, ..., Y_m from I(μ, λ) is constructed when either or both of μ and λ are unknown.

INTRODUCTION

A prediction interval is an interval which, with a specified probability, contains the result of a future sample from a population, given the results of a past sample from the same population. Prediction intervals play an important role in quality control and reliability. In statistics they are used in goodness-of-fit tests, hypothesis testing, classification of observations, sample surveys, etc. (Engelhardt & Bain, 1979). This topic has been discussed by many authors; we mention a few who derived prediction intervals for the future sample mean. Lawless (1972) gave prediction limits for Ȳ in the case of the exponential distribution. Kaminsky and Nelson (1974) gave a prediction interval for the mean of a future sample using subsets of an observed sample when the samples are from the exponential distribution. Hahn (1975) gave a prediction interval to contain the difference between the sample means of two future samples from normal distributions. Meeker and Hahn (1980) gave a prediction interval for the ratio of the means of two future samples from normal distributions. Azzam and Awad (1981) gave prediction intervals for the difference and ratio of the means of two future samples under the assumption that the samples are from a

normal and an exponential distribution. Veen-Ming (1984) constructed prediction intervals for the mean lifetime of a future sample based on incomplete data taken from a two-parameter exponential distribution.

In this work we derive prediction intervals for the future sample mean from the inverse Gaussian distribution. Chhikara and Guttman (1982) used an observed sample from the inverse Gaussian distribution to construct a prediction interval to contain the next future observation.

The probability density function (pdf) f(.) of an inverse Gaussian distribution, denoted by I(μ, λ), is given by

    f(x; μ, λ) = (λ / (2πx³))^{1/2} exp{ −λ(x − μ)² / (2μ²x) },   x > 0,
               = 0,   otherwise,                                            (1.1)

where λ is the shape parameter (Tweedie, 1957). The pdf is skewed, unimodal, and a member of the exponential family. The inverse Gaussian distribution is used as a lifetime model (Chhikara & Folks, 1977). It is known that if X ~ I(μ, λ), then E(X) = μ and Var(X) = μ³/λ.

PREDICTION INTERVAL FOR THE FUTURE SAMPLE MEAN

Let X_1, ..., X_n be a random sample from an inverse Gaussian distribution I(μ, λ) with density function f(x; μ, λ) as given in (1.1). Let Y_1, ..., Y_m be a future random sample from the same population with the same parameters μ and λ. Assume that the two samples are independent of one another and that both μ and λ are unknown parameters. Shuster (1968) showed that if W ~ I(μ, λ), then

    λ(W − μ)² / (μ²W) ~ χ²_1.                                            (2.1)
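The short simulation sketch below is not part of the original paper; it simply illustrates Shuster's result (2.1) numerically. It assumes that SciPy's parameterization invgauss(mu=μ/λ, scale=λ) corresponds to I(μ, λ); the values μ = 3.61 and λ = 1.66 are taken from the example data purely for illustration.

```python
# Sketch (not from the paper): empirical check of Shuster's result (2.1),
# lam * (X - mu)^2 / (mu^2 * X) ~ chi^2_1 when X ~ I(mu, lam).
# Assumes scipy.stats.invgauss(mu=mu/lam, scale=lam) represents I(mu, lam).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
mu, lam = 3.61, 1.66                               # illustrative parameter values
x = stats.invgauss.rvs(mu / lam, scale=lam, size=100_000, random_state=rng)

w = lam * (x - mu) ** 2 / (mu ** 2 * x)            # the pivotal quantity in (2.1)
print("mean:", w.mean(), "variance:", w.var())     # chi^2_1 has mean 1 and variance 2
print("KS p-value vs chi^2_1:", stats.kstest(w, "chi2", args=(1,)).pvalue)
```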

Let

    X̄ = Σ_{i=1}^{n} X_i / n,                                            (2.2)

    Ȳ = Σ_{j=1}^{m} Y_j / m.                                            (2.3)

It may be seen that X̄ and n / Σ_{i=1}^{n} (1/X_i − 1/X̄) are the maximum likelihood estimators of μ and λ respectively, and that the two statistics are independent (Tweedie, 1957). Furthermore, X̄ ~ I(μ, nλ) and λ Σ_{i=1}^{n} (1/X_i − 1/X̄) ~ χ²_{n−1} (Wasan and Roy, 1969). Clearly nλ(X̄ − μ)² / (μ²X̄) ~ χ²_1, in view of (2.1).

Thus, from (2.2) and (2.3),

    λ Σ_{i=1}^{n} (X_i − μ)² / (μ²X_i) + λ Σ_{j=1}^{m} (Y_j − μ)² / (μ²Y_j) = Q_1 + Q_2 + Q_3,                    (2.4)

where

    Q_1 = λ [ Σ_{i=1}^{n} (1/X_i − 1/X̄) + Σ_{j=1}^{m} (1/Y_j − 1/Ȳ) ],
    Q_2 = nλ(X̄ − μ)² / (μ²X̄)   and   Q_3 = mλ(Ȳ − μ)² / (μ²Ȳ),                    (2.5)

and Q_1, Q_2 and Q_3 are independently distributed as χ²_{n+m−2}, χ²_1 and χ²_1 respectively (Hogg & Craig, 1978).
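As an aside (not part of the paper), the sketch below computes the maximum likelihood estimators just described, μ̂ = X̄ and λ̂ = n / Σ(1/X_i − 1/X̄), from a simulated I(μ, λ) sample, and checks by Monte Carlo that λ Σ(1/X_i − 1/X̄) has mean close to n − 1, as the χ²_{n−1} result above requires. It again assumes SciPy's invgauss(mu=μ/λ, scale=λ) parameterization for I(μ, λ).

```python
# Sketch (not from the paper): MLEs mu_hat = X-bar and lam_hat = n / sum(1/X_i - 1/X-bar),
# plus a Monte Carlo check that lam * sum(1/X_i - 1/X-bar) behaves like chi^2_{n-1}.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
mu, lam, n = 3.61, 1.66, 46          # illustrative values taken from the example data

x = stats.invgauss.rvs(mu / lam, scale=lam, size=n, random_state=rng)
mu_hat = x.mean()
lam_hat = n / np.sum(1.0 / x - 1.0 / mu_hat)
print("mu_hat =", mu_hat, "lam_hat =", lam_hat)

pivots = []
for _ in range(5000):
    s = stats.invgauss.rvs(mu / lam, scale=lam, size=n, random_state=rng)
    pivots.append(lam * np.sum(1.0 / s - 1.0 / s.mean()))
print("mean of pivot:", np.mean(pivots), "vs chi^2_{n-1} mean", n - 1)
```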

Now, Q_2 and Q_3 can be combined differently so that the right side of (2.4) equals Q_1 + A + B, where

    A = nmλ(X̄ − Ȳ)² / [ X̄ Ȳ (nX̄ + mȲ) ]                                            (2.6)

and

    B = λ [ (nX̄ + mȲ) − μ(n + m) ]² / [ μ² (nX̄ + mȲ) ].                                            (2.7)

In view of (2.1), it can be seen that B has a χ²_1 distribution. Therefore, A and B given in (2.6) and (2.7) are independently distributed, each as χ²_1 (Hogg & Craig, 1978, p. 279).

Notice that Q_1 and A do not involve the parameter μ, and so they can be used to obtain (1 − β)100% prediction intervals for Ȳ when μ is unknown and λ is known. In this case we use the statistic

    A = nmλ(X̄ − Ȳ)² / [ X̄ Ȳ (nX̄ + mȲ) ]                                            (2.8)

for the prediction of Ȳ. Since A ~ χ²_1, the (1 − β)100% prediction interval for Ȳ is obtained by solving P[A ≤ χ²_{1,β}] = 1 − β for Ȳ; that is, the prediction limits for Ȳ when λ is known are the admissible roots in Ȳ of

    nmλ(X̄ − Ȳ)² = χ²_{1,β} X̄ Ȳ (nX̄ + mȲ).                                            (2.9)

When μ and λ are both unknown, consider the statistic R = (n + m − 2) A / Q_1. Substituting for A and Q_1, we get

    R = (n + m − 2)(X̄ − Ȳ)² / [ X̄ Ȳ (nX̄ + mȲ) V ],                                            (2.10)

    V = Q_1 / (nmλ).                                            (2.11)

R does not depend on either parameter μ or λ, and it has an F distribution with 1 and (n + m − 2) degrees of freedom. Hence R can be used to construct prediction intervals for Ȳ when both μ and λ are unknown. The (1 − β)100% prediction interval for Ȳ is obtained by inverting the inequality R ≤ F_{1,n+m−2,β}, where F_{1,n+m−2,β} is the (1 − β)100% point of the F distribution with 1 and (n + m − 2) degrees of freedom.
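For λ known, (2.9) is a quadratic inequality in Ȳ and can be solved directly. The sketch below is not from the paper; it inverts (2.9) numerically using SciPy's chi-square quantile. Here conf denotes the coverage 1 − β, and the call at the end uses the illustrative values n = m = 46, X̄ = 3.61, λ = 1.66 from the example.

```python
# Sketch (not from the paper): prediction limits for Y-bar when lambda is known, from the
# quadratic (2.9): n*m*lam*(xbar - y)^2 = chi2_{1,beta} * xbar * y * (n*xbar + m*y).
import numpy as np
from scipy import stats

def ig_pred_interval_lambda_known(xbar, n, m, lam, conf=0.95):
    c = stats.chi2.ppf(conf, df=1)              # chi^2_{1,beta}, with conf = 1 - beta
    # Rearranged as a*y^2 + b*y + d <= 0 in y = Y-bar:
    a = n * m * lam - c * m * xbar
    b = -(2.0 * n * m * lam * xbar + c * n * xbar ** 2)
    d = n * m * lam * xbar ** 2
    roots = np.roots([a, b, d]).real            # the discriminant is positive here
    if a > 0:
        return roots.min(), roots.max()         # two-sided interval
    return max(roots.max(), 0.0), np.inf        # only the one-sided interval (Y-bar > 0) is admissible

# Illustrative usage with n = m = 46, X-bar = 3.61, lambda = 1.66:
print(ig_pred_interval_lambda_known(xbar=3.61, n=46, m=46, lam=1.66, conf=0.95))
```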

Using (2.10) and simplifying, the (1 − β)100% prediction interval for Ȳ when both μ and λ are unknown is

    X̄ [ 1 + (n/2) g ± { (n²/4) g² + (n + m) g }^{1/2} ] / (1 − m g),   where g = F_{1,n+m−2,β} V X̄ / (n + m − 2).          (2.12)

When μ is known and λ is unknown, consider the statistics

    Q = Σ_{i=1}^{n} (X_i − μ)² / X_i   and   T = nm(Ȳ − μ)² / (Ȳ Q).

It is easy to see that T is distributed as F_{1,n}. Again, the (1 − β)100% prediction interval for Ȳ (for μ known) is obtained by solving T ≤ F_{1,n,β}, and is given by

    μ + (Q / (2nm)) F_{1,n,β} ± (Q / (2nm)) { F_{1,n,β}² + (4nmμ / Q) F_{1,n,β} }^{1/2}.          (2.13)

Note that A, R and T do not always provide two-sided intervals, because the lower of the two limits in (2.9), (2.12) and (2.13) may be negative. Since Ȳ > 0, only the one-sided intervals with ∞ as the upper limit are then admissible; these are obtained by restricting the solution of the corresponding inequality to the positive real line.

Example: Consider the data given by Chhikara and Folks (1977), with n = 46, μ̂ = X̄ = 3.61, λ̂ = 1.66 and V = λ̂⁻¹ = 0.60. Based on these data, (1 − β)100% prediction limits were computed from (2.9) and (2.12) for various values of β. These limits are given in Table 1 and Table 2 respectively.

Table 1
Prediction intervals for Ȳ for varying λ, with n = m = 46 and X̄ = 3.61 (formula (2.9))

            λ = 1.66             λ = 2.0
1 − β    Lower    Upper      Lower    Upper
0.900     2.29    6.515       2.38    6.127
0.950     2.12    7.5         2.22    6.927
0.975     1.84   10.332       1.94    9.095
0.990     1.74   11.904       1.85   10.221

            λ = 2.5              λ = 3.0
1 − β    Lower    Upper      Lower    Upper
0.900     2.48    5.747       2.55    5.491
0.950     2.32    6.383       2.40    6.025
0.975     2.05    8.016       2.15    7.349
0.990     1.96    8.814       2.06    7.973

            λ = 3.5              λ = 4.0
1 − β    Lower    Upper      Lower    Upper
0.900     2.62    5.304       2.67    5.160
0.950     2.47    5.769       2.53    5.574
0.975     2.22    6.891       2.29    6.555
0.990     2.14    7.407       2.20    6.997

Comment: As can be seen from the results above, the length of the interval becomes smaller as λ increases, which is a desirable property.

Table 2
Prediction intervals for Ȳ for varying m, with n = 46 and X̄ = 3.61 (formula (2.12))

            m = 4                m = 16           m = 36           m = 76
1 − β    Lower     Upper      Lower   Upper    Lower   Upper    Lower   Upper
0.900     0.46     57.02       0.54     ∞       0.63     ∞       0.74     ∞
0.950     0.45    138.48       0.41     ∞       0.49     ∞       0.60     ∞
0.975     0.28   1416.28       0.33     ∞       0.40     ∞       0.51     ∞
0.990     0.21       ∞         0.26     ∞       0.32     ∞       0.42     ∞

Comment: It can be seen from the table above that as m increases the upper limit goes to ∞ and the lower limit increases. Comparing the results of Tables 1 and 2, we infer that shorter prediction intervals are obtained when λ is known than when λ is unknown.
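The sketch below is not part of the paper; it handles the case where both μ and λ are unknown by inverting R ≤ F_{1,n+m−2,β} in (2.10) numerically, rather than using the closed form (2.12). The values n = 46, X̄ = 3.61 and V = 0.60 mirror the example above and are used only for illustration; exact agreement with the published tables is not claimed.

```python
# Sketch (not from the paper): invert R <= F_{1, n+m-2, beta} in (2.10) for y = Y-bar.
# 'conf' is the coverage 1 - beta; V is the statistic of (2.11), supplied here as a number.
import numpy as np
from scipy import stats

def ig_pred_interval_lambda_unknown(xbar, n, m, V, conf=0.95):
    F = stats.f.ppf(conf, 1, n + m - 2)          # F_{1, n+m-2, beta}
    # (n+m-2)*(xbar - y)^2 <= F*V*xbar*y*(n*xbar + m*y), rearranged as a*y^2 + b*y + d <= 0:
    a = (n + m - 2) - F * V * m * xbar
    b = -(2.0 * (n + m - 2) * xbar + F * V * n * xbar ** 2)
    d = (n + m - 2) * xbar ** 2
    roots = np.roots([a, b, d]).real
    if a > 0:
        return roots.min(), roots.max()          # two-sided interval
    return max(roots.max(), 0.0), np.inf         # only the one-sided interval (Y-bar > 0) is admissible

# Illustrative usage mirroring the example (n = 46, X-bar = 3.61, V = 0.60):
for m_future in (4, 16, 36, 76):
    print(m_future, ig_pred_interval_lambda_unknown(xbar=3.61, n=46, m=m_future, V=0.60, conf=0.95))
```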

REFERENCES

Azzam, M.M. and A.M. Awad, 1981. Prediction intervals for a future sample mean and for the difference or ratio of two future sample means. Dirasat, 8: 21-27.

Chhikara, R.S. and J.L. Folks, 1977. The inverse Gaussian distribution as a lifetime model. Technometrics, 19: 461-468.

Chhikara, R.S. and I. Guttman, 1982. Prediction limits for the inverse Gaussian distribution. Technometrics, 24: 319-324.

Engelhardt, M. and L.J. Bain, 1979. Prediction limits and two-sample problems with complete or censored Weibull data. Technometrics, 21: 233-237.

Hahn, G.J. 1975. A simultaneous prediction limit on the mean of future samples from an exponential distribution. Technometrics, 17: 341-344.

Hogg, R.V. and A.T. Craig, 1978. Introduction to Mathematical Statistics, Fourth Ed. Macmillan Publishing Co. Inc., New York.

Kaminsky, K.S. and P.I. Nelson, 1974. Prediction intervals for the exponential distribution using subsets of the data. Technometrics, 16: 57-59.

Lawless, J.F. 1972. On prediction intervals for samples from the exponential distribution and prediction limits for system survival. Sankhya, Ser. B, 34: 1-12.

Meeker, J.V. and G.J. Hahn, 1980. Prediction interval for the ratio of normal distribution sample means. Technometrics, 22: 357-366.

Shuster, J.J. 1968. On the inverse Gaussian distribution function. J. Amer. Statist. Assoc., 63: 1514-1516.

Tweedie, M.C.K. 1957. Statistical properties of inverse Gaussian distributions I. Ann. Math. Statist., 28: 362-377.

Wasan, M.T. and L.K. Roy, 1969. Tables of inverse Gaussian percentage points. Technometrics, 11: 591-604.
