Statistical Analysis of Unreplicated Factorial Designs Using Contrasts


Georgia Southern University Digital Commons@Georgia Southern Electronic Theses & Dissertations Jack N. Averitt College of Graduate Studies (COGS) Summer 2014 Statistical Analysis of Unreplicated Factorial Designs Using Contrasts Meixi Yang, Georgia Southern University Follow this and additional works at: Part of the Design of Experiments and Sample Surveys Commons. Recommended Citation: Yang, Meixi, "Statistical Analysis of Unreplicated Factorial Designs Using Contrasts" (2014). Electronic Theses & Dissertations. This thesis (open access) is brought to you for free and open access by the Jack N. Averitt College of Graduate Studies (COGS) at Digital Commons@Georgia Southern. It has been accepted for inclusion in Electronic Theses & Dissertations by an authorized administrator of Digital Commons@Georgia Southern. For more information, please contact the repository administrator.

STATISTICAL ANALYSIS OF UNREPLICATED FACTORIAL DESIGNS USING CONTRASTS by MEIXI YANG (Under the Direction of Charles W. Champ) ABSTRACT Factorial designs can have a large number of treatments due to the number of factors and the number of levels of each factor. The number of experimental units required for a researcher to conduct a k-factor factorial experiment is at least the number of treatments. For such an experiment, the total number of experimental units will also depend on the number of replicates for each treatment. The more experimental units used in a study, the more the cost to the researcher. The minimum cost is associated with the case in which there is one experimental unit per treatment. That is, an unreplicated k-factor factorial experiment would be the least costly. In an unreplicated experiment, the researcher cannot use the usual analysis of variance to analyze the data. We propose a method that analyzes the data using a normal probability plot of estimated contrasts of the main effects and interactions. This method is applied to data and compared with Tukey's method that tests for non-additivity. Our method is also discussed for use when the response is a multivariate set of measurements. Key Words: Contrast, normal probability plot, factorial design, Tukey. 2009 Mathematics Subject Classification: 62K15, 62H99

STATISTICAL ANALYSIS OF UNREPLICATED FACTORIAL DESIGNS USING CONTRASTS by MEIXI YANG B.S., Shandong University of Science and Technology, China, 2009 M.S., Shandong University of Science and Technology, China, 2012 A Thesis Submitted to the Graduate Faculty of Georgia Southern University in Partial Fulfillment of the Requirements for the Degree MASTER OF SCIENCE STATESBORO, GEORGIA 2014

© 2014 MEIXI YANG All Rights Reserved

STATISTICAL ANALYSIS OF UNREPLICATED FACTORIAL DESIGNS USING CONTRASTS by MEIXI YANG
Major Professor: Charles W. Champ
Committee: Broderick O. Oluyede, Lili Yu
Electronic Version Approved: July 24, 2014

ACKNOWLEDGMENTS
I would like to thank my advisor Dr. Charles W. Champ for guiding me in my research. I also want to thank my committee members Dr. Broderick O. Oluyede and Dr. Lili Yu for reviewing my thesis and for their guidance in my academic endeavors.

TABLE OF CONTENTS
ACKNOWLEDGMENTS
LIST OF TABLES
LIST OF FIGURES
CHAPTER
1 Introduction
2 Two Factor Design Model with a Univariate Response
  2.1 Introduction
  2.2 The Two Factor Design with Replicates
  2.3 One Observation per Treatment No Interaction Assumed
  2.4 Tukey's Method for One Observation per Treatment
  2.5 Using a Normal Probability Plot
  2.6 Some Examples
  2.7 Conclusion
3 Three Factor Experiments
  3.1 Introduction
  3.2 Tukey's Method for Three Factors
  3.3 Analyzing a Reduced Model
  3.4 Analysis of Contrasts
  3.5 Unreplicated 3^k Factorial Designs
  3.6 Example
  3.7 Conclusion
4 Unreplicated Multivariate Factorial Designs
  4.1 Introduction
  4.2 Design and Data Models
  4.3 Parameter Estimation and Contrasts
  4.4 Example
  4.5 Conclusion
5 CONCLUSION
  5.1 General Conclusions
  5.2 Areas for Further Research
REFERENCES

LIST OF TABLES
2.1 Montgomery's Example
2.2 Kutner's Exercise 20.8
3.1 Montgomery's Example
4.1 Johnson's Example

LIST OF FIGURES
2.1 Montgomery Example, Probability Plot
2.2 Montgomery Example, Probability Plot and Fitted Line 11 Points
2.3 Kutner Example, Probability Plot
2.4 Kutner Example, Probability Plot and Fitted Line 9 Points
2.5 Kutner Example, Probability Plot and Fitted Line 8 Points
3.1 Montgomery Example, Probability Plot
3.2 Montgomery Example, Probability Plot and Fitted Line 8 Points
4.1 Johnson Example, Probability Plot
4.2 Johnson Example, Probability Plot

CHAPTER 1 INTRODUCTION
Analyzing data from a designed experiment using ANalysis Of VAriance (ANOVA) generally requires at least two replicates for at least one treatment. There are, however, researchers who need to use unreplicated (one observation per treatment) designs. Under the independent normal model with common variance, these designs do not provide enough data to independently estimate the overall mean, main effects, interactions, and the common variance. Montgomery (1997) states, concerning the analysis of data from an unreplicated two factor fixed effects design, that there are no tests on main effects unless the interaction effect is zero. He also points out that even for a moderate number of factors, the total number of treatment combinations in a 2^k factorial design is large. The total number of treatments is even larger for factorial designs in which one or more of the factors has more than two levels. Three methods are discussed in the literature for analyzing the response data in a two factor fixed effects model with one observation per treatment. The first of these is to assume there is no interaction between the factors. This is the additive model. The second method uses a regression model that eliminates higher-order polynomials. The third method is a test developed by Tukey (1949) for determining if there is an interaction. He states that the professional practitioner of the analysis of variance will have no difficulty in extending the process to more complex designs. These methods are discussed in Alin and Kurt (2006) and Franck, Nielsen, and Osborne (2013). We will examine an extension of Tukey's method to a three factor design. Various authors have examined methods for evaluating the data from a 2^k factorial design with no replicates for a univariate response. One of these methods that is commonly recommended is the use of a normal probability plot of the estimates of the main effects and interactions.

We plan to study the use of normal probability plots in the analysis of unreplicated k-factor factorial designs in which each of the factors has two or more levels, with at least one factor having three or more levels. This will include cases in which there is a univariate response and cases in which there is a multivariate response. As we will demonstrate, the estimators of the main effects and interactions in an unreplicated fixed effects k-factor design in which at least one factor has more than two levels are correlated. We propose a transformation of these estimators to a collection of independent random variables with common variance. Under the hypothesis of no main effects or interactions, these estimators under the independent normal model with common variance σ² (Σ for a multivariate response) will be a random sample with common N(0, σ²) (N_p(0, Σ) for a multivariate response) distribution. A normal probability plot of the transformed estimates of the main effects and interactions will be used to determine which linear combinations of the main effects and interactions are significantly different from zero. An examination of the associated parameters will reveal which, if any, of the main effects and interactions are significantly different from zero.
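To make the proposed procedure concrete, the following sketch (Python with NumPy, not part of the thesis) illustrates one way to carry out these steps for an unreplicated a x b layout: build a sum-to-zero (effects) coded design matrix, estimate the model parameters by least squares, and transform the estimates so that, under the null hypothesis of no effects, their components behave like a random sample from N(0, σ²). The whitening choice below (the Cholesky factor of X'X) is an assumption made only for illustration; the thesis develops a particular block-diagonal transformation in Chapters 2 and 3.

```python
import numpy as np

def effects_coding(levels):
    """(levels x (levels - 1)) sum-to-zero coding matrix."""
    return np.vstack([np.eye(levels - 1), -np.ones((1, levels - 1))])

def design_matrix(a, b):
    """Saturated effects-coded design matrix for an a x b unreplicated layout.

    Rows are ordered with the second factor varying fastest; columns are
    [overall mean | A main effects | B main effects | AB interactions].
    """
    Ha, Hb = effects_coding(a), effects_coding(b)
    A = np.kron(Ha, np.ones((b, 1)))
    B = np.kron(np.ones((a, 1)), Hb)
    AB = np.kron(Ha, Hb)
    return np.hstack([np.ones((a * b, 1)), A, B, AB])

def transformed_estimates(y, a, b):
    """Least squares estimates, whitened so their covariance is sigma^2 * I."""
    X = design_matrix(a, b)
    theta_hat = np.linalg.solve(X.T @ X, X.T @ y)
    L = np.linalg.cholesky(X.T @ X)        # X'X = L L'
    return L.T @ theta_hat                 # cov(L' theta_hat) = sigma^2 * I

# Example with simulated null data for a 3 x 5 layout:
rng = np.random.default_rng(1)
y = rng.normal(size=15)
print(transformed_estimates(y, 3, 5)[1:])  # drop the overall-mean component
```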

CHAPTER 2 TWO FACTOR DESIGN MODEL WITH A UNIVARIATE RESPONSE
2.1 Introduction
In a variety of studies, researchers are interested in studying the effect of two or more factors on a response variable. As has been shown by several authors, factorial designs are the most efficient way to conduct such studies. A factor is a variable whose values are selected by the researcher. The possible values of a factor are called the levels of the factor. How the values of a factor are selected determines whether the study is a fixed effects or random effects factorial design. If the levels of a factor are the only ones of interest to the researcher, then the study is a fixed effects factorial design with respect to this factor. If random selection is used to select, from a collection of possible values of a factor, the levels of the factor to be studied, then the factorial design is a random effects factorial design with respect to this factor. The treatments in a factorial design are all the possible factor level combinations. In our study, it will be convenient to discuss first the two factor design with replications before examining designs without replicates.
2.2 The Two Factor Design with Replicates
We begin our study of factorial designs by examining two (k = 2) factor designs. Under the additive model, the response variable Y_{ijl} can be expressed as
$$Y_{ijl} = \mu_{ij} + \epsilon_{ijl} \quad \text{with} \quad \mu_{ij} = \mu + (\tau_1)_i + (\tau_2)_j + (\tau_{12})_{ij}$$

for i = 1, ..., a, j = 1, ..., b, and l = 1, ..., n with n > 1. We have expressed the mean μ_{ij} of the response variable Y_{ijl} as the sum of an overall mean μ, the effect (τ₁)ᵢ due to setting the first factor at level i, the effect (τ₂)ⱼ of setting the second factor at its jth level, and an effect (τ₁₂)ᵢⱼ due to the interaction between the two factors when the first is set at its ith level and the second at its jth level. It is assumed that
$$\sum_{i=1}^{a}(\tau_1)_i = 0; \quad \sum_{j=1}^{b}(\tau_2)_j = 0; \quad \sum_{i=1}^{a}(\tau_{12})_{ij} = 0 \ \text{for } j = 1, \ldots, b; \quad \text{and} \quad \sum_{j=1}^{b}(\tau_{12})_{ij} = 0 \ \text{for } i = 1, \ldots, a.$$
We also assume that the Y_{ijl}'s are independent and ε_{ijl} ~ N(0, σ²_{ij}). We refer to these assumptions as the independent normal model. The model is further simplified by assuming a common variance, that is, σ²_{ij} = σ², the common variance, for i = 1, ..., a and j = 1, ..., b. The design is an unreplicated one if n = 1. Using matrix notation, we can write our additive model in the form Y = Xθ + ε, where Y is the abn × 1 vector of observations, X is the abn × ab design matrix, θ is the ab × 1 vector of model parameters, and ε is the abn × 1 vector of error terms. An analysis of variance (ANOVA) of the response data is based on the following partition of the total sum of squares (SST):
$$SST = SSA + SSB + SSAB + SSE,$$

where
$$SST = \sum_{i=1}^{a}\sum_{j=1}^{b}\sum_{l=1}^{n}(Y_{ijl} - \bar{Y}_{...})^2; \quad SSA = \sum_{i=1}^{a}\sum_{j=1}^{b}\sum_{l=1}^{n}(\bar{Y}_{i..} - \bar{Y}_{...})^2; \quad SSB = \sum_{i=1}^{a}\sum_{j=1}^{b}\sum_{l=1}^{n}(\bar{Y}_{.j.} - \bar{Y}_{...})^2;$$
$$SSAB = \sum_{i=1}^{a}\sum_{j=1}^{b}\sum_{l=1}^{n}(\bar{Y}_{ij.} - \bar{Y}_{i..} - \bar{Y}_{.j.} + \bar{Y}_{...})^2; \quad \text{and} \quad SSE = \sum_{i=1}^{a}\sum_{j=1}^{b}\sum_{l=1}^{n}\hat{\epsilon}_{ijl}^2.$$
It can be shown that SSA, SSB, SSAB, and SSE are stochastically independent under our independent normal model. The degrees of freedom of these sums of squares are
$$df_{SST} = abn - 1; \quad df_{SSA} = a - 1; \quad df_{SSB} = b - 1; \quad df_{SSAB} = (a-1)(b-1); \quad \text{and} \quad df_{SSE} = (n-1)ab.$$
The mean squares associated with SSA, SSB, SSAB, and SSE are, respectively,
$$MSA = \frac{SSA}{df_{SSA}}, \quad MSB = \frac{SSB}{df_{SSB}}, \quad MSAB = \frac{SSAB}{df_{SSAB}}, \quad \text{and} \quad MSE = \frac{SSE}{df_{SSE}}.$$
Note that if n = 1, then df_{SSE} = 0 and the MSE is undefined.
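As an illustration (not from the thesis), the following Python sketch computes these sums of squares, degrees of freedom, and mean squares directly from a data array; the array name y and its (a, b, n) layout are assumptions made for the example.

```python
import numpy as np

def two_way_anova(y):
    """Sums of squares, degrees of freedom, and mean squares for an
    a x b factorial with n > 1 replicates; y has shape (a, b, n)."""
    a, b, n = y.shape
    grand = y.mean()
    yi = y.mean(axis=(1, 2))            # level means of factor A
    yj = y.mean(axis=(0, 2))            # level means of factor B
    yij = y.mean(axis=2)                # cell means
    ssa = b * n * np.sum((yi - grand) ** 2)
    ssb = a * n * np.sum((yj - grand) ** 2)
    ssab = n * np.sum((yij - yi[:, None] - yj[None, :] + grand) ** 2)
    sse = np.sum((y - yij[:, :, None]) ** 2)
    sst = np.sum((y - grand) ** 2)
    df = {"A": a - 1, "B": b - 1, "AB": (a - 1) * (b - 1), "E": (n - 1) * a * b}
    ms = {"A": ssa / df["A"], "B": ssb / df["B"],
          "AB": ssab / df["AB"], "E": sse / df["E"]}
    return {"SST": sst, "SSA": ssa, "SSB": ssb, "SSAB": ssab, "SSE": sse,
            "df": df, "MS": ms}
```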

The null (H_0) and alternative (H_1) hypotheses of interest can be written in terms of the following hypotheses:
$$H_{A,0}: (\tau_1)_1 = \cdots = (\tau_1)_a = 0 \ \text{ and } \ H_{A,1}: \text{not } H_{A,0};$$
$$H_{B,0}: (\tau_2)_1 = \cdots = (\tau_2)_b = 0 \ \text{ and } \ H_{B,1}: \text{not } H_{B,0}; \ \text{and}$$
$$H_{AB,0}: (\tau_{12})_{11} = \cdots = (\tau_{12})_{ab} = 0 \ \text{ and } \ H_{AB,1}: \text{not } H_{AB,0}.$$
The alternative hypothesis in this study is
$$H_1: H_{AB,1} \cup \left[H_{AB,0} \cap (H_{A,1} \cup H_{B,1})\right]$$
with the null hypothesis H_0: not H_1. The statistical test has a decision rule that rejects the null hypothesis in favor of the alternative hypothesis if the observed value of
$$\frac{MSAB}{MSE} \ge c_{AB} \ \text{ or } \ \left[\frac{MSAB}{MSE} < c_{AB} \ \text{ and } \ \left(\frac{MSA}{MSE} \ge c_A \ \text{ or } \ \frac{MSB}{MSE} \ge c_B\right)\right].$$
It can be shown that
$$\frac{SSA}{\sigma^2} \sim \chi^2_{a-1,\xi_A^2}, \quad \frac{SSB}{\sigma^2} \sim \chi^2_{b-1,\xi_B^2}, \quad \frac{SSAB}{\sigma^2} \sim \chi^2_{(a-1)(b-1),\xi_{AB}^2}, \quad \text{and} \quad \frac{SSE}{\sigma^2} \sim \chi^2_{(n-1)ab},$$
where
$$\xi_A^2 = \frac{nb\sum_{i=1}^{a}(\tau_1)_i^2}{a\sigma^2}, \quad \xi_B^2 = \frac{na\sum_{j=1}^{b}(\tau_2)_j^2}{b\sigma^2}, \quad \text{and} \quad \xi_{AB}^2 = \frac{n\sum_{i=1}^{a}\sum_{j=1}^{b}(\tau_{12})_{ij}^2}{\sigma^2[(a-1)(b-1)+1]}.$$
If the null hypothesis is true, then we have ξ_A = ξ_B = ξ_AB = 0. The size α of the test is given by
$$\alpha = P\left(\frac{MSAB}{MSE} \ge c_{AB}\right) + P\left(\frac{MSAB}{MSE} < c_{AB}, \frac{MSA}{MSE} \ge c_A\right) + P\left(\frac{MSAB}{MSE} < c_{AB}, \frac{MSB}{MSE} \ge c_B\right)$$
$$= \bar{F}_{F_{(a-1)(b-1),(n-1)ab}}(c_{AB}) + \int_0^\infty F_{\chi^2_{(a-1)(b-1)}}\!\left(\frac{(a-1)(b-1)\,x\,c_{AB}}{(n-1)ab}\right)\bar{F}_{\chi^2_{a-1}}\!\left(\frac{(a-1)\,x\,c_{A}}{(n-1)ab}\right) f_{\chi^2_{(n-1)ab}}(x)\,dx$$
$$\quad + \int_0^\infty F_{\chi^2_{(a-1)(b-1)}}\!\left(\frac{(a-1)(b-1)\,x\,c_{AB}}{(n-1)ab}\right)\bar{F}_{\chi^2_{b-1}}\!\left(\frac{(b-1)\,x\,c_{B}}{(n-1)ab}\right) f_{\chi^2_{(n-1)ab}}(x)\,dx,$$

where $\bar{F}_{\chi^2_{\nu}}(u) = 1 - F_{\chi^2_{\nu}}(u)$ denotes the survival function of the indicated chi-square distribution. The power of the test is determined by
$$\text{power} = \bar{F}_{F_{(a-1)(b-1),(n-1)ab,\xi_{AB}^2}}(c_{AB}) + \int_0^\infty F_{\chi^2_{(a-1)(b-1),\xi_{AB}^2}}\!\left(\frac{(a-1)(b-1)\,x\,c_{AB}}{(n-1)ab}\right)\bar{F}_{\chi^2_{a-1,\xi_A^2}}\!\left(\frac{(a-1)\,x\,c_{A}}{(n-1)ab}\right) f_{\chi^2_{(n-1)ab}}(x)\,dx$$
$$\quad + \int_0^\infty F_{\chi^2_{(a-1)(b-1),\xi_{AB}^2}}\!\left(\frac{(a-1)(b-1)\,x\,c_{AB}}{(n-1)ab}\right)\bar{F}_{\chi^2_{b-1,\xi_B^2}}\!\left(\frac{(b-1)\,x\,c_{B}}{(n-1)ab}\right) f_{\chi^2_{(n-1)ab}}(x)\,dx,$$
where at least one of the values ξ_A, ξ_B, and ξ_AB is not equal to zero.
2.3 One Observation per Treatment No Interaction Assumed
For the case of one observation per treatment (n = 1), there is only enough data to estimate independently the overall mean, the main effects, and the interactions, but not the common variance in our model. One approach to analyzing the data for main effects is to assume there is no interaction between the two factors. In this case the model becomes
$$\mu_{ij} = \mu + (\tau_1)_i + (\tau_2)_j.$$
The null and alternative hypotheses are
$$H_0: (\tau_1)_1 = \cdots = (\tau_1)_a = 0 \ \text{ and } \ (\tau_2)_1 = \cdots = (\tau_2)_b = 0; \quad \text{and} \quad H_1: \text{not } H_0.$$
The total sum of squares (SST) can be partitioned into the sum of squares due to the first factor (SSA), the sum of squares due to the second factor (SSB), and the sum

of squares due to error (SSE). That is,
$$SST = SSA + SSB + SSE.$$
Their respective degrees of freedom are df_{SST} = ab − 1, df_{SSA} = a − 1, df_{SSB} = b − 1, and df_{SSE} = (a − 1)(b − 1). Under this model, it can be shown that
$$\frac{SSA}{\sigma^2} \sim \chi^2_{a-1,\xi_A^2}, \quad \frac{SSB}{\sigma^2} \sim \chi^2_{b-1,\xi_B^2}, \quad \text{and} \quad \frac{SSE}{\sigma^2} \sim \chi^2_{(a-1)(b-1)},$$
where
$$\xi_A^2 = \frac{nb\sum_{i=1}^{a}(\tau_1)_i^2}{a\sigma^2} \quad \text{and} \quad \xi_B^2 = \frac{na\sum_{j=1}^{b}(\tau_2)_j^2}{b\sigma^2}.$$
The test based on the analysis of variance rejects H_0 in favor of H_1 if the observed value of
$$\frac{MSA}{MSE} \ge c_A \ \text{ or } \ \frac{MSB}{MSE} \ge c_B,$$
where
$$MSA = \frac{b\sum_{i=1}^{a}(\bar{Y}_{i.} - \bar{Y}_{..})^2}{a-1}, \quad MSB = \frac{a\sum_{j=1}^{b}(\bar{Y}_{.j} - \bar{Y}_{..})^2}{b-1}, \quad \text{and} \quad MSE = \frac{SST - SSA - SSB}{(a-1)(b-1)}.$$
The size of the test is
$$\alpha = P\left(\frac{MSA}{MSE} \ge c_A\right) + P\left(\frac{MSB}{MSE} \ge c_B\right) - P\left(\frac{MSA}{MSE} \ge c_A, \frac{MSB}{MSE} \ge c_B\right)$$
$$= \bar{F}_{F_{a-1,(a-1)(b-1)}}(c_A) + \bar{F}_{F_{b-1,(a-1)(b-1)}}(c_B) - \int_0^\infty \bar{F}_{\chi^2_{a-1}}\!\left(\frac{x\,c_A}{b-1}\right)\bar{F}_{\chi^2_{b-1}}\!\left(\frac{x\,c_B}{a-1}\right) f_{\chi^2_{(a-1)(b-1)}}(x)\,dx,$$

where, as before, $\bar{F}_{\chi^2_{a-1}}(u) = 1 - F_{\chi^2_{a-1}}(u)$ and $\bar{F}_{\chi^2_{b-1}}(u) = 1 - F_{\chi^2_{b-1}}(u)$. For example, in the case in which a = 3 and b = 5, if the researcher selects the critical values c_A and c_B to be c_A = F_{3-1,(3-1)(5-1),0.05} and c_B = F_{5-1,(3-1)(5-1),0.05}, then the actual size of the test is
$$2(0.05) - \int_0^\infty \bar{F}_{\chi^2_{2}}\!\left(\frac{x\,c_A}{4}\right)\bar{F}_{\chi^2_{4}}\!\left(\frac{x\,c_B}{2}\right) f_{\chi^2_{8}}(x)\,dx.$$
2.4 Tukey's Method for One Observation per Treatment
Tukey (1949) developed a test for determining if there is an interaction between the two factors which assumes the interactions are of the form
$$(\tau_{12})_{ij} = \lambda(\tau_1)_i(\tau_2)_j.$$
In this model, the a + b + 2 parameters, including the common variance, can be estimated. To determine these estimates using least squares, we define the function
$$Q = Q(\mu, (\tau_1)_1, \ldots, (\tau_1)_a, (\tau_2)_1, \ldots, (\tau_2)_b, \lambda) = \sum_{i=1}^{a}\sum_{j=1}^{b}\left(Y_{ij} - \mu - (\tau_1)_i - (\tau_2)_j - \lambda(\tau_1)_i(\tau_2)_j\right)^2.$$
The least squares estimates are the solutions to the following system of equations:
$$\frac{\partial Q}{\partial \mu} = 0; \quad \frac{\partial Q}{\partial (\tau_1)_i} = 0; \quad \frac{\partial Q}{\partial (\tau_2)_j} = 0; \quad \frac{\partial Q}{\partial \lambda} = 0.$$
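As an aside, the resulting test is straightforward to compute. The following sketch (Python with NumPy and SciPy, not from the thesis) implements Tukey's single-degree-of-freedom test for non-additivity using the least squares solution derived next; the function and variable names are chosen here for illustration.

```python
import numpy as np
from scipy import stats

def tukey_nonadditivity(y):
    """Tukey's one-degree-of-freedom test for non-additivity.

    y is an (a x b) array with one observation per treatment."""
    a, b = y.shape
    grand = y.mean()
    ri = y.mean(axis=1) - grand           # row (factor A) effect estimates
    cj = y.mean(axis=0) - grand           # column (factor B) effect estimates
    num = ri @ y @ cj                     # sum_ij ri * cj * y_ij
    den = np.sum(ri ** 2) * np.sum(cj ** 2)
    lam_hat = num / den                   # estimate of lambda
    ss_nonadd = num ** 2 / den            # SSAB with 1 degree of freedom
    ssa = b * np.sum(ri ** 2)
    ssb = a * np.sum(cj ** 2)
    sst = np.sum((y - grand) ** 2)
    sse = sst - ssa - ssb - ss_nonadd
    df_e = a * b - a - b                  # error degrees of freedom
    f_stat = ss_nonadd / (sse / df_e)
    p_value = stats.f.sf(f_stat, 1, df_e)
    return lam_hat, f_stat, p_value
```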

The least squares estimates of the parameters μ, (τ₁)₁, ..., (τ₁)_a, (τ₂)₁, ..., (τ₂)_b, and λ are given, respectively, by
$$\hat{\mu} = \bar{Y}_{..}, \quad \widehat{(\tau_1)}_i = \bar{Y}_{i.} - \bar{Y}_{..}, \quad \widehat{(\tau_2)}_j = \bar{Y}_{.j} - \bar{Y}_{..}, \quad \text{and} \quad \hat{\lambda} = \frac{\sum_{i=1}^{a}\sum_{j=1}^{b}(\bar{Y}_{i.} - \bar{Y}_{..})(\bar{Y}_{.j} - \bar{Y}_{..})Y_{ij}}{\sum_{i=1}^{a}\sum_{j=1}^{b}(\bar{Y}_{i.} - \bar{Y}_{..})^2(\bar{Y}_{.j} - \bar{Y}_{..})^2}.$$
We can now express Y_{ij} as
$$Y_{ij} = \bar{Y}_{..} + (\bar{Y}_{i.} - \bar{Y}_{..}) + (\bar{Y}_{.j} - \bar{Y}_{..}) + \hat{\lambda}(\bar{Y}_{i.} - \bar{Y}_{..})(\bar{Y}_{.j} - \bar{Y}_{..}) + \hat{\epsilon}_{ij}.$$
It follows that ε̂_{ij} can be expressed as
$$\hat{\epsilon}_{ij} = Y_{ij} - \bar{Y}_{..} - (\bar{Y}_{i.} - \bar{Y}_{..}) - (\bar{Y}_{.j} - \bar{Y}_{..}) - \hat{\lambda}(\bar{Y}_{i.} - \bar{Y}_{..})(\bar{Y}_{.j} - \bar{Y}_{..}).$$
The total sum of squares SST can now be partitioned into
$$SST = SSA + SSB + SSAB + SSE,$$
where SST, SSA, and SSB are defined in the previous section, with
$$SSAB = \sum_{i=1}^{a}\sum_{j=1}^{b}\hat{\lambda}^2(\bar{Y}_{i.} - \bar{Y}_{..})^2(\bar{Y}_{.j} - \bar{Y}_{..})^2 \quad \text{and} \quad SSE = \sum_{i=1}^{a}\sum_{j=1}^{b}\hat{\epsilon}_{ij}^2.$$
The degrees of freedom of SSAB and SSE are, respectively, 1 and ab − a − b. It can be shown that under H₀: λ = 0, the random variables SSAB and SSE are stochastically independent with
$$\frac{SSAB}{\sigma^2} \sim \chi^2_{1} \quad \text{and} \quad \frac{SSE}{\sigma^2} \sim \chi^2_{ab-a-b}.$$
The appropriate hypotheses in this case are H₀: not H₁ with
$$H_1: H_{\lambda,1} \cup \left[H_{\lambda,0} \cap (H_{A,1} \cup H_{B,1})\right],$$

where H_{λ,0}: λ = 0 and H_{λ,1}: λ ≠ 0. The statistical test has a decision rule that rejects the null hypothesis in favor of the alternative hypothesis if the observed value of
$$\frac{MSAB}{MSE} \ge c_{AB} \ \text{ or } \ \left[\frac{MSAB}{MSE} < c_{AB} \ \text{ and } \ \left(\frac{MSA}{MSE} \ge c_A \ \text{ or } \ \frac{MSB}{MSE} \ge c_B\right)\right],$$
where MSAB = SSAB/1 and MSE = SSE/(ab − a − b). The size α of the test is
$$\alpha = \bar{F}_{F_{1,ab-a-b}}(c_{AB}) + \int_0^\infty F_{\chi^2_{1}}\!\left(\frac{x\,c_{AB}}{ab-a-b}\right)\bar{F}_{\chi^2_{a-1}}\!\left(\frac{(a-1)\,x\,c_{A}}{ab-a-b}\right) f_{\chi^2_{ab-a-b}}(x)\,dx$$
$$\quad + \int_0^\infty F_{\chi^2_{1}}\!\left(\frac{x\,c_{AB}}{ab-a-b}\right)\bar{F}_{\chi^2_{b-1}}\!\left(\frac{(b-1)\,x\,c_{B}}{ab-a-b}\right) f_{\chi^2_{ab-a-b}}(x)\,dx.$$
2.5 Using a Normal Probability Plot
The least squares estimate θ̂ of the vector of parameters θ in the full model with no replicates is
$$\hat{\theta} = (X^T X)^{-1} X^T Y.$$
It can be shown that if a > 2, then the estimators $\widehat{(\tau_1)}_1, \ldots, \widehat{(\tau_1)}_{a-1}$ of the parameters (τ₁)₁, ..., (τ₁)_{a-1} are not independent under our independent normal model. Likewise, if b > 2, the estimators $\widehat{(\tau_2)}_1, \ldots, \widehat{(\tau_2)}_{b-1}$ of the parameters (τ₂)₁, ..., (τ₂)_{b-1} are not stochastically independent, nor are the estimators $\widehat{(\tau_{12})}_{ij}$ of the parameters (τ₁₂)_{ij} when a > 2 and/or b > 2. This can be seen by first noting that
$$\Sigma_{\hat{\theta}} = \text{cov}(\hat{\theta}) = (X^T X)^{-1}\sigma^2$$

and observing that (X^T X)^{-1} is not the identity matrix. However, it can be shown that (X^T X)^{-1} has the block diagonal form
$$(X^T X)^{-1} = \begin{bmatrix} w & 0 & 0 & 0 \\ 0 & W_A^{(a-1)\times(a-1)} & 0 & 0 \\ 0 & 0 & W_B^{(b-1)\times(b-1)} & 0 \\ 0 & 0 & 0 & W_{AB}^{(a-1)(b-1)\times(a-1)(b-1)} \end{bmatrix},$$
where w corresponds to the overall mean, $W_A^{(a-1)\times(a-1)}$ is associated with the main effects due to factor A, $W_B^{(b-1)\times(b-1)}$ is associated with the main effects due to factor B, and $W_{AB}^{(a-1)(b-1)\times(a-1)(b-1)}$ is associated with the interactions between factors A and B. We can now see that, for example,
$$\text{cov}(\hat{\tau}_1) = W_A^{(a-1)\times(a-1)}\sigma^2, \quad \text{whereas} \quad \text{cov}(\hat{\tau}_1, \hat{\tau}_2) = 0,$$
where
$$\hat{\tau}_1 = [\widehat{(\tau_1)}_1, \ldots, \widehat{(\tau_1)}_{a-1}]^T \quad \text{and} \quad \hat{\tau}_2 = [\widehat{(\tau_2)}_1, \ldots, \widehat{(\tau_2)}_{b-1}]^T.$$
Observe that (X^T X)^{-1} is a real symmetric matrix. In particular, observe that W_A, W_B, and W_AB are real symmetric matrices. It follows that there exist matrices P_A, P_B, and P_AB such that
$$W_A = P_A P_A^T, \quad W_B = P_B P_B^T, \quad \text{and} \quad W_{AB} = P_{AB} P_{AB}^T.$$
We define P by
$$P = \begin{bmatrix} \sqrt{w} & 0 & 0 & 0 \\ 0 & P_A & 0 & 0 \\ 0 & 0 & P_B & 0 \\ 0 & 0 & 0 & P_{AB} \end{bmatrix}.$$
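This construction is easy to carry out numerically. The following sketch (an illustration, not the thesis's own code) builds such a P from Cholesky factors of the diagonal blocks of (X^T X)^{-1} and applies its inverse to θ̂; the column ordering and block sizes are assumptions matching the layout displayed above.

```python
import numpy as np
from scipy.linalg import block_diag, cholesky, solve_triangular

def contrast_transform(X, y, a, b):
    """Transform the least squares estimates so that their components are
    uncorrelated with common variance sigma^2.

    Assumes the columns of X are ordered [mean | A effects | B effects | AB]
    and that (X'X)^{-1} is block diagonal as displayed above."""
    W = np.linalg.inv(X.T @ X)
    theta_hat = W @ X.T @ y
    sizes = [1, a - 1, b - 1, (a - 1) * (b - 1)]
    blocks, start = [], 0
    for s in sizes:
        Wk = W[start:start + s, start:start + s]
        blocks.append(cholesky(Wk, lower=True))   # W_k = P_k P_k^T
        start += s
    P = block_diag(*blocks)
    # theta_tilde = P^{-1} theta_hat has covariance sigma^2 * I;
    # under H0 its non-mean components are iid N(0, sigma^2).
    return solve_triangular(P, theta_hat, lower=True)
```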

It follows that the components of the vector of transformed estimators θ̃ defined by θ̃ = P^{-1}θ̂ are stochastically independent, since
$$\Sigma_{\tilde{\theta}} = \text{cov}(P^{-1}\hat{\theta}) = P^{-1}(X^T X)^{-1}(P^{-1})^T\sigma^2 = I\sigma^2.$$
The vector estimator θ̃ is associated with the contrasts, given through W_A, W_B, and W_AB, of the vector of parameters θ. This suggests that, under our null hypothesis of no main effects or interactions, the estimators θ̃₂, ..., θ̃_{ab} are stochastically independent and identically distributed N(0, σ²). A normal probability plot of the observed values of these estimators should reveal which, if any, of these linear combinations of the estimators of the main effects and interactions are different from zero. Exact plotting positions for a normal probability plot can be found in Harter (1961) and Teichroew (1956) for selected values of the sample size. Often the ith plotting position E(Z_{i:n}) for a normal probability plot is approximated by
$$E(Z_{i:n}) \approx \Phi^{-1}\!\left(\frac{i - 3/8}{n + 1/4}\right),$$
where Z_{i:n} is the ith order statistic of a random sample of size n from a standard normal distribution and Φ(z) is the cumulative distribution function of a standard normal distribution. This approximation was originally proposed by Blom (1958). The selection of plotting positions is discussed in Champ and Vora (2005).
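The following short sketch (again Python, not from the thesis) computes Blom-type plotting positions and pairs them with the ordered transformed estimates, which is all that is needed to draw the normal probability plots used in the examples below.

```python
import numpy as np
from scipy.stats import norm
import matplotlib.pyplot as plt

def normal_probability_plot(contrasts):
    """Plot ordered contrast estimates against Blom plotting positions."""
    z = np.sort(np.asarray(contrasts))
    n = z.size
    i = np.arange(1, n + 1)
    positions = norm.ppf((i - 0.375) / (n + 0.25))   # Blom (1958) approximation
    plt.scatter(positions, z)
    plt.xlabel("expected normal order statistic")
    plt.ylabel("ordered contrast estimate")
    plt.show()
    return positions, z
```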

2.6 Some Examples
Example 1: Montgomery (1997) gives an example of a two factor experiment in which n_{ij} = 1 for i = 1, 2, 3 and j = 1, 2, 3, 4, 5. He states in his Example -2 that the impurity present in a chemical product is affected by two factors, pressure and temperature. His data are presented in the following table.
Table 2.1: Montgomery's Example -2 (impurity observed at each combination of the three temperature levels and five pressure levels)
Using Tukey's method, he concluded that there was no interaction effect but that the main effects due to both temperature and pressure are significant.

The design matrix X for this experiment is the 15 × 15 effects-coded matrix of the full model with a = 3 and b = 5, with columns corresponding to the overall mean, the main effects of the first factor, the main effects of the second factor, and their interactions. It follows that w = 1/15 and that W_A, W_B, and W_AB are the 2 × 2, 4 × 4, and 8 × 8 diagonal blocks of (X^T X)^{-1} associated with the two sets of main effects and with the interactions. It can be shown that the factors P_A, P_B, and P_AB satisfying W_A = P_A P_A^T, W_B = P_B P_B^T, and W_AB = P_AB P_AB^T can then be computed. We observe that our least squares estimates are
$$\hat{\theta} = (X^T X)^{-1} X^T y,$$
where y is the 15 × 1 vector of observed impurity values from Table 2.1. It then follows that the vector of transformed contrast estimates is θ̃ = P^{-1}θ̂.

Omitting the estimate corresponding to the overall mean μ, we find the observed order statistics for the remaining fourteen contrast estimates. From Teichroew (1956), we obtain the plotting positions for a normal probability plot for a sample size of fourteen. These ordered pairs form a 14 × 2 matrix with the plotting positions in the first column and the ordered contrast estimates in the second.

Figure 2.1: Montgomery Example, Probability Plot
This plot suggests that eleven of the points are plotting about a line whereas three are not plotting about this line. Fitting a line to the eleven points and plotting this line with our points, we obtain the graph in Figure 2.2.

Figure 2.2: Montgomery Example, Probability Plot and Fitted Line 11 Points
The three points not plotting about the line are associated with the contrast estimators $\widehat{(\tau_2)}_3$, $\widehat{(\tau_1)}_2$, and $\widehat{(\tau_1)}_1$, respectively. This plot provides evidence that there is no interaction between the two factors but that there are effects due to both factors.
Example 2: Kutner, Nachtsheim, Neter, and Li (2005) on page 890 state in Exercise 20.8 that a food technologist, testing storage capabilities for a newly developed type of imitation sausage made from soybeans, conducted an experiment to test the effects of humidity level (factor A) and temperature level (factor B) in the freezer compartment on color change in the sausage.

Three humidity levels and four temperature levels were considered. Five hundred sausages were stored at each of the 12 humidity-temperature combinations for 90 days. At the end of the storage period, the researcher determined the proportion of sausages for each humidity-temperature combination that exhibited color changes. The researcher transformed the data by means of the arcsine transformation to stabilize the variances. The transformed data Y' = 2 arcsin √Y follow.
Table 2.2: Kutner's Exercise 20.8 (transformed color-change responses for humidity levels i = 1, 2, 3 and temperature levels j = 1, 2, 3, 4)

We compute the humidity level means Ȳ₁., Ȳ₂., Ȳ₃., the temperature level means Ȳ.₁, Ȳ.₂, Ȳ.₃, Ȳ.₄, and the overall mean Ȳ.. from the transformed data. Assuming the interaction between the two factors has the form (τ₁₂)_{ij} = λ(τ₁)ᵢ(τ₂)ⱼ, the least squares estimates of the parameters μ, (τ₁)₁, (τ₁)₂, (τ₁)₃, (τ₂)₁, (τ₂)₂, (τ₂)₃, and (τ₂)₄ are
$$\hat{\mu} = \bar{Y}_{..}, \quad \widehat{(\tau_1)}_i = \bar{Y}_{i.} - \bar{Y}_{..}, \quad \text{and} \quad \widehat{(\tau_2)}_j = \bar{Y}_{.j} - \bar{Y}_{..}.$$
Recall that the formula to be used to obtain an estimate of λ is
$$\hat{\lambda} = \frac{\sum_{i=1}^{3}\sum_{j=1}^{4}(\bar{Y}_{i.} - \bar{Y}_{..})(\bar{Y}_{.j} - \bar{Y}_{..})Y_{ij}}{\sum_{i=1}^{3}\sum_{j=1}^{4}(\bar{Y}_{i.} - \bar{Y}_{..})^2(\bar{Y}_{.j} - \bar{Y}_{..})^2}.$$
Expanding the numerator and denominator term by term over the twelve humidity-temperature cells gives the value of λ̂ for these data. It follows that SST, SSA, SSB, SSAB, and SSE are obtained from
$$SST = \sum_{i=1}^{3}\sum_{j=1}^{4}(Y_{ij} - \bar{Y}_{..})^2, \quad SSA = 4\sum_{i=1}^{3}(\bar{Y}_{i.} - \bar{Y}_{..})^2, \quad SSB = 3\sum_{j=1}^{4}(\bar{Y}_{.j} - \bar{Y}_{..})^2,$$
$$SSAB = \sum_{i=1}^{3}\sum_{j=1}^{4}\hat{\lambda}^2(\bar{Y}_{i.} - \bar{Y}_{..})^2(\bar{Y}_{.j} - \bar{Y}_{..})^2, \quad \text{and} \quad SSE = SST - SSA - SSB - SSAB.$$
The associated mean squares are MSA = SSA/2, MSB = SSB/3, MSAB = SSAB/1, and MSE = SSE/(3 · 4 − 3 − 4) = SSE/5. The observed value of F = MSAB/MSE is compared with the F_{1,5} distribution to obtain the p-value P(F_{1,5} ≥ F_observed). These results suggest there is no two factor interaction. Let us assume the reduced model of no interaction between Factors A and B. Our model is
$$\mu_{ij} = \mu + (\tau_1)_i + (\tau_2)_j.$$

The design matrix X for this reduced model has columns for the overall mean and the main effects of the two factors, and our least squares estimates are (X^T X)^{-1}X^T y. Our total sum of squares, the sums of squares due to Factors A and B, and the sum of squares due to error are
$$SST = \sum_{i=1}^{3}\sum_{j=1}^{4}(Y_{ij} - \bar{Y}_{..})^2, \quad SSA = 4\sum_{i=1}^{3}(\bar{Y}_{i.} - \bar{Y}_{..})^2, \quad SSB = 3\sum_{j=1}^{4}(\bar{Y}_{.j} - \bar{Y}_{..})^2, \quad \text{and} \quad SSE = SST - SSA - SSB.$$
We observe the values of
$$\frac{SSA/(a-1)}{SSE/[(a-1)(b-1)]} = \frac{SSA/2}{SSE/6} \quad \text{and} \quad \frac{SSB/(b-1)}{SSE/[(a-1)(b-1)]} = \frac{SSB/3}{SSE/6}.$$
The associated p-values are, respectively, P(F_{2,6} ≥ SSA/2 ÷ SSE/6) and P(F_{3,6} ≥ SSB/3 ÷ SSE/6). These results suggest that there is no effect due to either of the two factors. For the full model, the design matrix X and our data in vector form y are the 12 × 12 effects-coded matrix and the 12 × 1 vector of transformed responses, and the matrix P associated with (X^T X)^{-1} is constructed from the factors P_A, P_B, and P_AB described in Section 2.5. The least squares estimates for the overall mean, main effects, and interactions are given by
$$\hat{\theta} = (X^T X)^{-1} X^T y,$$
and

the contrasts of the main effects and interactions are θ̃ = P^{-1}θ̂. The first component of θ̃ is a multiple of θ̂₁ (the overall mean), the second and third components are contrasts of the Factor A estimates θ̂₂ and θ̂₃, the fourth through sixth components are contrasts of the Factor B estimates θ̂₄, θ̂₅, and θ̂₆, and the remaining six components are contrasts of the interaction estimates θ̂₇, ..., θ̂₁₂. The coordinates of the random vector θ̃ = P^{-1}θ̂ of estimators of the contrasts of the main effects and interactions are independent. The estimates for these contrasts are computed from the observed data. Removing the estimate θ̃₁ associated with the overall mean, we have the vector of ordered estimates of the given linear contrasts of the main effects and interactions along with the plotting positions for the corresponding normal probability plot, arranged in an 11 × 2 matrix. A plot of these points is given in the following figure.

Figure 2.3: Kutner Example, Probability Plot
All the points seem to be plotting about a line except for two of the points. Using the other nine points, we fit a line to the plot. A plot of this line along with our normal probability plot of the data is shown in the following figure.

Figure 2.4: Kutner Example, Probability Plot and Fitted Line 9 Points
A third point may also be an outlier. To examine this possibility, we used the other eight points to estimate the line. A plot of this line along with the normal probability plot of the data is given in the following figure.

Figure 2.5: Kutner Example, Probability Plot and Fitted Line 8 Points
The contrasts associated with the three estimates θ̃₄, θ̃₅, and θ̃₆ are, respectively, proportional to (θ̂₄ + θ̂₅ + θ̂₆), (θ̂₄ − θ̂₆), and (θ̂₄ − 2θ̂₅ + θ̂₆). These contrasts are all associated with Factor B: temperature. The plot shows no evidence of an effect due to Factor A (humidity) or of interactions between Factors A and B, which is expected.

2.7 Conclusion
A method for analyzing unreplicated two factor experiments using selected contrasts has been presented. This method is based on a normal probability plot of the estimates of the main effect contrasts and the interaction contrasts. It provides the researcher a means of identifying the contrasts that are significantly different from zero. As was illustrated, each contrast is a contrast of a particular main effect or interaction. Hence, if a contrast is identified as being significantly different from zero, then it follows that the associated main effect or interaction is different from zero.

CHAPTER 3 THREE FACTOR EXPERIMENTS
3.1 Introduction
The model for a three factor experiment (k = 3) under the additive model expresses the response variable Y_{ijrs} as
$$Y_{ijrs} = \mu_{ijr} + \epsilon_{ijrs} \quad \text{with} \quad \mu_{ijr} = \mu + (\tau_1)_i + (\tau_2)_j + (\tau_3)_r + (\tau_{12})_{ij} + (\tau_{13})_{ir} + (\tau_{23})_{jr} + (\tau_{123})_{ijr}$$
and ε_{ijrs} iid N(0, σ²) for i = 1, ..., a, j = 1, ..., b, r = 1, ..., c, and s = 1, ..., n. It is assumed that the main effects and interactions are such that
$$\sum_{i=1}^{a}(\tau_1)_i = 0; \quad \sum_{j=1}^{b}(\tau_2)_j = 0; \quad \sum_{r=1}^{c}(\tau_3)_r = 0;$$
$$\sum_{i=1}^{a}(\tau_{12})_{ij} = 0 \ \text{for } j = 1, \ldots, b; \quad \sum_{j=1}^{b}(\tau_{12})_{ij} = 0 \ \text{for } i = 1, \ldots, a;$$
$$\sum_{i=1}^{a}(\tau_{13})_{ir} = 0 \ \text{for } r = 1, \ldots, c; \quad \sum_{r=1}^{c}(\tau_{13})_{ir} = 0 \ \text{for } i = 1, \ldots, a;$$
$$\sum_{j=1}^{b}(\tau_{23})_{jr} = 0 \ \text{for } r = 1, \ldots, c; \quad \sum_{r=1}^{c}(\tau_{23})_{jr} = 0 \ \text{for } j = 1, \ldots, b;$$
$$\sum_{r=1}^{c}(\tau_{123})_{ijr} = 0 \ \text{for } i = 1, \ldots, a,\ j = 1, \ldots, b; \quad \sum_{j=1}^{b}(\tau_{123})_{ijr} = 0 \ \text{for } i = 1, \ldots, a,\ r = 1, \ldots, c; \quad \text{and} \quad \sum_{i=1}^{a}(\tau_{123})_{ijr} = 0 \ \text{for } j = 1, \ldots, b,\ r = 1, \ldots, c.$$
This is referred to as the full model. One can reduce the model by assuming some of the interactions are zero. If this is done, we will refer to this model as the reduced model. We also assume that the ε_{ijrs}'s are independent and ε_{ijrs} ~ N(0, σ²). We refer to these assumptions as the independent normal model. The design is an unreplicated

one if n = 1. Using matrix notation, we can write our additive model in the form Y = Xθ + ε, where Y is the abcn × 1 vector of observations, X is the abcn × abc design matrix, θ is the abc × 1 vector of model parameters, and ε is the abcn × 1 vector of error terms. We are interested in studying the case in which n = 1. A study in which there is only one replicate per treatment does not allow one to perform an analysis of variance if the full model is assumed. For these data, there is not enough information to independently estimate the main effects, the interactions, and the common variance. Two methods have been suggested in the literature for analyzing the data from a design without replicates. The first of these is an extension of Tukey's method used to test for non-additivity. This is discussed in the next section. The second of these analyzes the data under a reduced model. This will be examined in Section 3.3. We present a third method in Section 3.4. In Section 3.5, we discuss the analysis of 3^k factorial designs. Some examples are given in Section 3.6.
3.2 Tukey's Method for Three Factors
Tukey's (1949) method can be extended to develop tests for non-additivity for three factor experiments. In this case, one is to assume that the two and three factor interactions can be expressed in terms of the main effects and the parameters λ₁₂, λ₁₃, λ₂₃, and λ₁₂₃. Under Tukey's model, it is assumed that
$$(\tau_{12})_{ij} = \lambda_{12}(\tau_1)_i(\tau_2)_j; \quad (\tau_{13})_{ir} = \lambda_{13}(\tau_1)_i(\tau_3)_r; \quad (\tau_{23})_{jr} = \lambda_{23}(\tau_2)_j(\tau_3)_r; \quad \text{and} \quad (\tau_{123})_{ijr} = \lambda_{123}(\tau_1)_i(\tau_2)_j(\tau_3)_r.$$

In this model, there are a + b + c + 5 parameters, including the common variance, to be estimated. To determine these estimates using least squares, we define the function
$$Q = Q(\mu, (\tau_1)_1, \ldots, (\tau_1)_a, (\tau_2)_1, \ldots, (\tau_2)_b, (\tau_3)_1, \ldots, (\tau_3)_c, \lambda_{12}, \lambda_{13}, \lambda_{23}, \lambda_{123})$$
$$= \sum_{i=1}^{a}\sum_{j=1}^{b}\sum_{r=1}^{c}\left(Y_{ijr} - \mu - (\tau_1)_i - (\tau_2)_j - (\tau_3)_r - \lambda_{12}(\tau_1)_i(\tau_2)_j - \lambda_{13}(\tau_1)_i(\tau_3)_r - \lambda_{23}(\tau_2)_j(\tau_3)_r - \lambda_{123}(\tau_1)_i(\tau_2)_j(\tau_3)_r\right)^2.$$
The least squares estimates are the solutions to the following system of equations:
$$\frac{\partial Q}{\partial \mu} = 0; \quad \frac{\partial Q}{\partial (\tau_1)_i} = 0; \quad \frac{\partial Q}{\partial (\tau_2)_j} = 0; \quad \frac{\partial Q}{\partial (\tau_3)_r} = 0; \quad \frac{\partial Q}{\partial \lambda_{12}} = 0; \quad \frac{\partial Q}{\partial \lambda_{13}} = 0; \quad \frac{\partial Q}{\partial \lambda_{23}} = 0; \quad \text{and} \quad \frac{\partial Q}{\partial \lambda_{123}} = 0.$$
It follows that the estimators for the model parameters (τ₁)ᵢ, (τ₂)ⱼ, (τ₃)ᵣ, λ₁₂, λ₁₃, λ₂₃, and λ₁₂₃ are
$$\widehat{(\tau_1)}_i = \bar{Y}_{i..} - \bar{Y}_{...}; \quad \widehat{(\tau_2)}_j = \bar{Y}_{.j.} - \bar{Y}_{...}; \quad \widehat{(\tau_3)}_r = \bar{Y}_{..r} - \bar{Y}_{...};$$
$$\hat{\lambda}_{12} = \frac{\sum_{i=1}^{a}\sum_{j=1}^{b}\sum_{r=1}^{c}(\bar{Y}_{i..} - \bar{Y}_{...})(\bar{Y}_{.j.} - \bar{Y}_{...})Y_{ijr}}{c\sum_{i=1}^{a}\sum_{j=1}^{b}(\bar{Y}_{i..} - \bar{Y}_{...})^2(\bar{Y}_{.j.} - \bar{Y}_{...})^2}; \quad \hat{\lambda}_{13} = \frac{\sum_{i=1}^{a}\sum_{j=1}^{b}\sum_{r=1}^{c}(\bar{Y}_{i..} - \bar{Y}_{...})(\bar{Y}_{..r} - \bar{Y}_{...})Y_{ijr}}{b\sum_{i=1}^{a}\sum_{r=1}^{c}(\bar{Y}_{i..} - \bar{Y}_{...})^2(\bar{Y}_{..r} - \bar{Y}_{...})^2};$$
$$\hat{\lambda}_{23} = \frac{\sum_{i=1}^{a}\sum_{j=1}^{b}\sum_{r=1}^{c}(\bar{Y}_{.j.} - \bar{Y}_{...})(\bar{Y}_{..r} - \bar{Y}_{...})Y_{ijr}}{a\sum_{j=1}^{b}\sum_{r=1}^{c}(\bar{Y}_{.j.} - \bar{Y}_{...})^2(\bar{Y}_{..r} - \bar{Y}_{...})^2}; \quad \text{and} \quad \hat{\lambda}_{123} = \frac{\sum_{i=1}^{a}\sum_{j=1}^{b}\sum_{r=1}^{c}(\bar{Y}_{i..} - \bar{Y}_{...})(\bar{Y}_{.j.} - \bar{Y}_{...})(\bar{Y}_{..r} - \bar{Y}_{...})Y_{ijr}}{\sum_{i=1}^{a}\sum_{j=1}^{b}\sum_{r=1}^{c}(\bar{Y}_{i..} - \bar{Y}_{...})^2(\bar{Y}_{.j.} - \bar{Y}_{...})^2(\bar{Y}_{..r} - \bar{Y}_{...})^2}.$$
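As an illustration (not part of the thesis), these estimators can be computed directly from a data array; the sketch below assumes the response is stored in a NumPy array y of shape (a, b, c).

```python
import numpy as np

def tukey_three_factor_estimates(y):
    """Effect and lambda estimates for an unreplicated a x b x c layout
    under the extended Tukey model described above; y has shape (a, b, c)."""
    grand = y.mean()
    t1 = y.mean(axis=(1, 2)) - grand      # (tau_1)_i estimates
    t2 = y.mean(axis=(0, 2)) - grand      # (tau_2)_j estimates
    t3 = y.mean(axis=(0, 1)) - grand      # (tau_3)_r estimates
    a, b, c = y.shape
    l12 = np.einsum('i,j,ijr->', t1, t2, y) / (c * np.sum(t1**2) * np.sum(t2**2))
    l13 = np.einsum('i,r,ijr->', t1, t3, y) / (b * np.sum(t1**2) * np.sum(t3**2))
    l23 = np.einsum('j,r,ijr->', t2, t3, y) / (a * np.sum(t2**2) * np.sum(t3**2))
    l123 = np.einsum('i,j,r,ijr->', t1, t2, t3, y) / (
        np.sum(t1**2) * np.sum(t2**2) * np.sum(t3**2))
    return t1, t2, t3, l12, l13, l23, l123
```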

We can now express Y_{ijr} as
$$Y_{ijr} = \bar{Y}_{...} + (\bar{Y}_{i..} - \bar{Y}_{...}) + (\bar{Y}_{.j.} - \bar{Y}_{...}) + (\bar{Y}_{..r} - \bar{Y}_{...}) + \hat{\lambda}_{12}(\bar{Y}_{i..} - \bar{Y}_{...})(\bar{Y}_{.j.} - \bar{Y}_{...}) + \hat{\lambda}_{13}(\bar{Y}_{i..} - \bar{Y}_{...})(\bar{Y}_{..r} - \bar{Y}_{...})$$
$$\quad + \hat{\lambda}_{23}(\bar{Y}_{.j.} - \bar{Y}_{...})(\bar{Y}_{..r} - \bar{Y}_{...}) + \hat{\lambda}_{123}(\bar{Y}_{i..} - \bar{Y}_{...})(\bar{Y}_{.j.} - \bar{Y}_{...})(\bar{Y}_{..r} - \bar{Y}_{...}) + \hat{\epsilon}_{ijr}.$$
It follows that ε̂_{ijr} can be expressed as
$$\hat{\epsilon}_{ijr} = Y_{ijr} - \bar{Y}_{...} - (\bar{Y}_{i..} - \bar{Y}_{...}) - (\bar{Y}_{.j.} - \bar{Y}_{...}) - (\bar{Y}_{..r} - \bar{Y}_{...}) - \hat{\lambda}_{12}(\bar{Y}_{i..} - \bar{Y}_{...})(\bar{Y}_{.j.} - \bar{Y}_{...})$$
$$\quad - \hat{\lambda}_{13}(\bar{Y}_{i..} - \bar{Y}_{...})(\bar{Y}_{..r} - \bar{Y}_{...}) - \hat{\lambda}_{23}(\bar{Y}_{.j.} - \bar{Y}_{...})(\bar{Y}_{..r} - \bar{Y}_{...}) - \hat{\lambda}_{123}(\bar{Y}_{i..} - \bar{Y}_{...})(\bar{Y}_{.j.} - \bar{Y}_{...})(\bar{Y}_{..r} - \bar{Y}_{...}).$$
The total sum of squares SST can be partitioned into the following sums of squares:
$$SSA = bc\sum_{i=1}^{a}(\bar{Y}_{i..} - \bar{Y}_{...})^2 \ \text{ with } df_{SSA} = a - 1; \quad SSB = ac\sum_{j=1}^{b}(\bar{Y}_{.j.} - \bar{Y}_{...})^2 \ \text{ with } df_{SSB} = b - 1; \quad SSC = ab\sum_{r=1}^{c}(\bar{Y}_{..r} - \bar{Y}_{...})^2 \ \text{ with } df_{SSC} = c - 1;$$
$$SSAB = c\,\hat{\lambda}_{12}^2\sum_{i=1}^{a}\sum_{j=1}^{b}(\bar{Y}_{i..} - \bar{Y}_{...})^2(\bar{Y}_{.j.} - \bar{Y}_{...})^2; \quad SSAC = b\,\hat{\lambda}_{13}^2\sum_{i=1}^{a}\sum_{r=1}^{c}(\bar{Y}_{i..} - \bar{Y}_{...})^2(\bar{Y}_{..r} - \bar{Y}_{...})^2;$$
$$SSBC = a\,\hat{\lambda}_{23}^2\sum_{j=1}^{b}\sum_{r=1}^{c}(\bar{Y}_{.j.} - \bar{Y}_{...})^2(\bar{Y}_{..r} - \bar{Y}_{...})^2; \quad SSABC = \hat{\lambda}_{123}^2\sum_{i=1}^{a}\sum_{j=1}^{b}\sum_{r=1}^{c}(\bar{Y}_{i..} - \bar{Y}_{...})^2(\bar{Y}_{.j.} - \bar{Y}_{...})^2(\bar{Y}_{..r} - \bar{Y}_{...})^2; \quad \text{and}$$
$$SSE = SST - SSA - SSB - SSC - SSAB - SSAC - SSBC - SSABC,$$
where df_{SSAB} = df_{SSAC} = df_{SSBC} = df_{SSABC} = 1, df_{SSE} = abc − a − b − c − 2, and
$$SST = \sum_{i=1}^{a}\sum_{j=1}^{b}\sum_{r=1}^{c}(Y_{ijr} - \bar{Y}_{...})^2.$$

Following the derivations in Tukey (1949), one can show that
$$\frac{SSAB}{\sigma^2} \sim \chi^2_{1}, \quad \frac{SSAC}{\sigma^2} \sim \chi^2_{1}, \quad \frac{SSBC}{\sigma^2} \sim \chi^2_{1}, \quad \frac{SSABC}{\sigma^2} \sim \chi^2_{1}, \quad \text{and} \quad \frac{SSE}{\sigma^2} \sim \chi^2_{abc-a-b-c-2}.$$
The observed values of the significance levels (SLs)
$$SL_{123} = P\left(F_{1,abc-a-b-c-2} \ge \frac{SSABC/1}{SSE/(abc-a-b-c-2)}\right); \quad SL_{12} = P\left(F_{1,abc-a-b-c-2} \ge \frac{SSAB/1}{SSE/(abc-a-b-c-2)}\right);$$
$$SL_{13} = P\left(F_{1,abc-a-b-c-2} \ge \frac{SSAC/1}{SSE/(abc-a-b-c-2)}\right); \quad \text{and} \quad SL_{23} = P\left(F_{1,abc-a-b-c-2} \ge \frac{SSBC/1}{SSE/(abc-a-b-c-2)}\right)$$
are then examined. The observed significance levels (OSLs) OSL₁₂₃, OSL₁₂, OSL₁₃, and OSL₂₃ can be used to judge if there is strong enough evidence in the data against the null hypotheses H₀: λ₁₂₃ = 0, H₀: λ₁₂ = 0, H₀: λ₁₃ = 0, and H₀: λ₂₃ = 0, respectively. Note that an observed significance level is commonly referred to as a p-value. The test for non-additivity has null and alternative hypotheses given by
$$H_0: \lambda_{12} = \lambda_{13} = \lambda_{23} = \lambda_{123} = 0 \quad \text{and} \quad H_1: \text{not } H_0.$$
The statistical test has a decision rule that rejects the null hypothesis in favor of the alternative hypothesis if the observed value of
$$\frac{MSAB}{MSE} \ge c_{AB}, \quad \frac{MSAC}{MSE} \ge c_{AC}, \quad \frac{MSBC}{MSE} \ge c_{BC}, \quad \text{or} \quad \frac{MSABC}{MSE} \ge c_{ABC},$$
where
$$MSAB = \frac{SSAB}{1}, \quad MSAC = \frac{SSAC}{1}, \quad MSBC = \frac{SSBC}{1}, \quad MSABC = \frac{SSABC}{1}, \quad \text{and} \quad MSE = \frac{SSE}{abc-a-b-c-2}.$$
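A sketch of how these quantities might be computed numerically follows (an illustration, not the thesis's own code); it reuses the tukey_three_factor_estimates function given earlier in this chapter.

```python
import numpy as np
from scipy import stats

def tukey_three_factor_test(y):
    """Sum-of-squares decomposition and observed significance levels for the
    extended Tukey non-additivity model; y has shape (a, b, c).

    Uses tukey_three_factor_estimates defined in the earlier sketch."""
    a, b, c = y.shape
    t1, t2, t3, l12, l13, l23, l123 = tukey_three_factor_estimates(y)
    ssa, ssb, ssc = b * c * np.sum(t1**2), a * c * np.sum(t2**2), a * b * np.sum(t3**2)
    ssab = c * l12**2 * np.sum(t1**2) * np.sum(t2**2)
    ssac = b * l13**2 * np.sum(t1**2) * np.sum(t3**2)
    ssbc = a * l23**2 * np.sum(t2**2) * np.sum(t3**2)
    ssabc = l123**2 * np.sum(t1**2) * np.sum(t2**2) * np.sum(t3**2)
    sst = np.sum((y - y.mean()) ** 2)
    sse = sst - ssa - ssb - ssc - ssab - ssac - ssbc - ssabc
    df_e = a * b * c - a - b - c - 2
    mse = sse / df_e
    # observed significance levels for each lambda, as described above
    osl = {name: stats.f.sf(ss / mse, 1, df_e)
           for name, ss in [("12", ssab), ("13", ssac), ("23", ssbc), ("123", ssabc)]}
    return osl
```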

The size of the test α is
$$\alpha = P\left(\frac{MSAB}{MSE} \ge c_{AB}\right) + P\left(\frac{MSAC}{MSE} \ge c_{AC}\right) + P\left(\frac{MSBC}{MSE} \ge c_{BC}\right) + P\left(\frac{MSABC}{MSE} \ge c_{ABC}\right).$$
If each of the critical values is selected to be the 100(1 − γ)th percentile of the appropriate F-distribution, then α = 4γ, or γ = α/4.
3.3 Analyzing a Reduced Model
For the case in which n = 1, a reduced model can be entertained by assuming some of the parameters in the model associated with interactions are zero. Under this new assumption there is information in the data that can be used to estimate the common variance. For example, if there are no three factor interactions, our reduced model becomes
$$\mu_{ijr} = \mu + (\tau_1)_i + (\tau_2)_j + (\tau_3)_r + (\tau_{12})_{ij} + (\tau_{13})_{ir} + (\tau_{23})_{jr}.$$
It follows that SSE under this reduced model is the SSABC under the full model. The SST can be partitioned as
$$SST = SSA + SSB + SSC + SSAB + SSAC + SSBC + SSE.$$
An ANOVA can then be used to analyze the data. There are many other possible reduced models that assume various parameters representing interactions are zero. For example, suppose that a = 5 and b = 7.

3.4 Analysis of Contrasts
The analysis of contrasts is the same for any factorial experiment. The matrix P is determined such that (X^T X)^{-1} = PP^T. The estimates θ̂ of the vector of parameters θ for the full model are transformed into the vector of contrast estimates θ̃ = P^{-1}θ̂. Under the independent normal model, θ̃ ~ N_{abc}(P^{-1}θ, Iσ²). We observe that the contrasts of the estimates in the vector θ̂ associated with a main effect or an interaction are the corresponding components of θ̃. Removing the contrast associated with the overall mean in θ̃, a normal probability plot of the remaining components can be examined. Points on the plot that provide evidence against the hypothesis that the corresponding contrast is zero are analyzed. These points suggest that the given parameter contrast differs from zero.
3.5 Unreplicated 3^k Factorial Designs
In the analysis of a 3^k factorial experiment using contrasts, one needs the design matrix X to estimate the parameters in the full model and the matrix P such that (X^T X)^{-1} = PP^T. However, one may have software that can be used to determine the estimates θ̂, and then one can find θ̃ = P^{-1}θ̂. In what follows, we demonstrate that the matrix P has a general form.

Define the sequence of matrices B_{2^0}, B_{2^1}, ..., B_{2^k} as
$$B_{2^0} = \frac{1}{3^k}[1], \qquad B_{2^1} = \frac{1}{3^k}\begin{bmatrix} 2 & -1 \\ -1 & 2 \end{bmatrix} = \begin{bmatrix} 2B_{2^0} & -B_{2^0} \\ -B_{2^0} & 2B_{2^0} \end{bmatrix}, \qquad B_{2^2} = \begin{bmatrix} 2B_{2^1} & -B_{2^1} \\ -B_{2^1} & 2B_{2^1} \end{bmatrix},$$
$$B_{2^3} = \begin{bmatrix} 2B_{2^2} & -B_{2^2} \\ -B_{2^2} & 2B_{2^2} \end{bmatrix}, \qquad \ldots, \qquad B_{2^k} = \begin{bmatrix} 2B_{2^{k-1}} & -B_{2^{k-1}} \\ -B_{2^{k-1}} & 2B_{2^{k-1}} \end{bmatrix}.$$
The matrix (X^T X)^{-1} can be expressed as a block diagonal matrix with 2^k matrices on the block diagonal. The first block matrix is B_{2^0}, the matrix B_{2^1} appears in the next $\binom{k}{1}$ block matrices, the matrix B_{2^2} appears in the next $\binom{k}{2}$ block matrices, and so on. It follows that P is a block diagonal matrix of the same construct as (X^T X)^{-1} with B_{2^i} replaced with P_{2^i}, where B_{2^i} = P_{2^i}P_{2^i}^T for i = 0, 1, 2, ..., k.
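The following sketch (illustrative Python, not from the thesis) constructs B_{2^i} directly as a Kronecker power, which is equivalent to the recursion above, and obtains one factor P_{2^i} with B_{2^i} = P_{2^i}P_{2^i}^T from its Cholesky decomposition.

```python
import numpy as np

def B_block(i, k):
    """B_{2^i} for a 3^k design: (1/3^k) times the i-fold Kronecker power
    of [[2, -1], [-1, 2]] (the 1 x 1 matrix [1] when i = 0)."""
    core = np.array([[2.0, -1.0], [-1.0, 2.0]])
    block = np.array([[1.0]])
    for _ in range(i):
        block = np.kron(core, block)
    return block / 3**k

def P_block(i, k):
    """Lower-triangular P_{2^i} with B_{2^i} = P_{2^i} P_{2^i}^T."""
    return np.linalg.cholesky(B_block(i, k))

# e.g., the 4 x 4 block associated with a two-factor interaction in a 3^3 design:
print(P_block(2, 3))
```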

3.6 Example
Montgomery (1997) gives an example of a three factor experiment in which n = 1. He states in his Example -3 that the process engineer can control three variables during the filling process: the percent carbonation (A), the operating pressure in the filler (B), and the bottles produced per minute, or the line speed (C). His data are presented in the following table.
Table 3.1: Montgomery's Example -3 (response at each combination of percent carbonation, operating pressure of 25 psi or 30 psi, and line speed)

We compute the carbonation level means Ȳ₁.., Ȳ₂.., Ȳ₃.., the pressure level means Ȳ.₁., Ȳ.₂., the line speed level means Ȳ..₁, Ȳ..₂, and the overall mean Ȳ... from the data. Assuming the interactions between the three factors have the form
$$(\tau_{12})_{ij} = \lambda_{12}(\tau_1)_i(\tau_2)_j, \quad (\tau_{13})_{ir} = \lambda_{13}(\tau_1)_i(\tau_3)_r, \quad (\tau_{23})_{jr} = \lambda_{23}(\tau_2)_j(\tau_3)_r, \quad \text{and} \quad (\tau_{123})_{ijr} = \lambda_{123}(\tau_1)_i(\tau_2)_j(\tau_3)_r,$$
the least squares estimates of the parameters μ, (τ₁)₁, (τ₁)₂, (τ₁)₃, (τ₂)₁, (τ₂)₂, (τ₃)₁, and (τ₃)₂ are
$$\hat{\mu} = \bar{Y}_{...}, \quad \widehat{(\tau_1)}_i = \bar{Y}_{i..} - \bar{Y}_{...}, \quad \widehat{(\tau_2)}_j = \bar{Y}_{.j.} - \bar{Y}_{...}, \quad \text{and} \quad \widehat{(\tau_3)}_r = \bar{Y}_{..r} - \bar{Y}_{...}.$$
Recall that the formulas to be used to obtain estimates of λ₁₂, λ₁₃, λ₂₃, and λ₁₂₃ are those given in Section 3.2, with a = 3, b = 2, and c = 2. Expanding each numerator and denominator over the twelve carbonation-pressure-speed cells gives the values of λ̂₁₂, λ̂₁₃, λ̂₂₃, and λ̂₁₂₃ for these data. It then follows that the sums of squares SST, SSA, SSB, SSC, SSAB, SSAC, SSBC, and SSABC are obtained from
$$SST = \sum_{i=1}^{3}\sum_{j=1}^{2}\sum_{r=1}^{2}(Y_{ijr} - \bar{Y}_{...})^2, \quad SSA = 4\sum_{i=1}^{3}(\bar{Y}_{i..} - \bar{Y}_{...})^2, \quad SSB = 6\sum_{j=1}^{2}(\bar{Y}_{.j.} - \bar{Y}_{...})^2, \quad SSC = 6\sum_{r=1}^{2}(\bar{Y}_{..r} - \bar{Y}_{...})^2,$$
$$SSAB = 2\hat{\lambda}_{12}^2\sum_{i=1}^{3}\sum_{j=1}^{2}(\bar{Y}_{i..} - \bar{Y}_{...})^2(\bar{Y}_{.j.} - \bar{Y}_{...})^2, \quad SSAC = 2\hat{\lambda}_{13}^2\sum_{i=1}^{3}\sum_{r=1}^{2}(\bar{Y}_{i..} - \bar{Y}_{...})^2(\bar{Y}_{..r} - \bar{Y}_{...})^2,$$
$$SSBC = 3\hat{\lambda}_{23}^2\sum_{j=1}^{2}\sum_{r=1}^{2}(\bar{Y}_{.j.} - \bar{Y}_{...})^2(\bar{Y}_{..r} - \bar{Y}_{...})^2, \quad \text{and} \quad SSABC = \hat{\lambda}_{123}^2\sum_{i=1}^{3}\sum_{j=1}^{2}\sum_{r=1}^{2}(\bar{Y}_{i..} - \bar{Y}_{...})^2(\bar{Y}_{.j.} - \bar{Y}_{...})^2(\bar{Y}_{..r} - \bar{Y}_{...})^2.$$


More information

Master s Written Examination

Master s Written Examination Master s Written Examination Option: Statistics and Probability Spring 016 Full points may be obtained for correct answers to eight questions. Each numbered question which may have several parts is worth

More information

Department of Economics. Business Statistics. Chapter 12 Chi-square test of independence & Analysis of Variance ECON 509. Dr.

Department of Economics. Business Statistics. Chapter 12 Chi-square test of independence & Analysis of Variance ECON 509. Dr. Department of Economics Business Statistics Chapter 1 Chi-square test of independence & Analysis of Variance ECON 509 Dr. Mohammad Zainal Chapter Goals After completing this chapter, you should be able

More information

Lecture 6: Single-classification multivariate ANOVA (k-group( MANOVA)

Lecture 6: Single-classification multivariate ANOVA (k-group( MANOVA) Lecture 6: Single-classification multivariate ANOVA (k-group( MANOVA) Rationale and MANOVA test statistics underlying principles MANOVA assumptions Univariate ANOVA Planned and unplanned Multivariate ANOVA

More information

Econ 3790: Business and Economic Statistics. Instructor: Yogesh Uppal

Econ 3790: Business and Economic Statistics. Instructor: Yogesh Uppal Econ 3790: Business and Economic Statistics Instructor: Yogesh Uppal Email: yuppal@ysu.edu Chapter 13, Part A: Analysis of Variance and Experimental Design Introduction to Analysis of Variance Analysis

More information

Reference: Chapter 13 of Montgomery (8e)

Reference: Chapter 13 of Montgomery (8e) Reference: Chapter 1 of Montgomery (8e) Maghsoodloo 89 Factorial Experiments with Random Factors So far emphasis has been placed on factorial experiments where all factors are at a, b, c,... fixed levels

More information

Assessing the Effect of Prior Distribution Assumption on the Variance Parameters in Evaluating Bioequivalence Trials

Assessing the Effect of Prior Distribution Assumption on the Variance Parameters in Evaluating Bioequivalence Trials Georgia State University ScholarWorks @ Georgia State University Mathematics Theses Department of Mathematics and Statistics 8--006 Assessing the Effect of Prior Distribution Assumption on the Variance

More information

Inferences about Parameters of Trivariate Normal Distribution with Missing Data

Inferences about Parameters of Trivariate Normal Distribution with Missing Data Florida International University FIU Digital Commons FIU Electronic Theses and Dissertations University Graduate School 7-5-3 Inferences about Parameters of Trivariate Normal Distribution with Missing

More information

Unit 27 One-Way Analysis of Variance

Unit 27 One-Way Analysis of Variance Unit 27 One-Way Analysis of Variance Objectives: To perform the hypothesis test in a one-way analysis of variance for comparing more than two population means Recall that a two sample t test is applied

More information

Chapter 5 Introduction to Factorial Designs Solutions

Chapter 5 Introduction to Factorial Designs Solutions Solutions from Montgomery, D. C. (1) Design and Analysis of Experiments, Wiley, NY Chapter 5 Introduction to Factorial Designs Solutions 5.1. The following output was obtained from a computer program that

More information

Topic 9: Factorial treatment structures. Introduction. Terminology. Example of a 2x2 factorial

Topic 9: Factorial treatment structures. Introduction. Terminology. Example of a 2x2 factorial Topic 9: Factorial treatment structures Introduction A common objective in research is to investigate the effect of each of a number of variables, or factors, on some response variable. In earlier times,

More information

Two or more categorical predictors. 2.1 Two fixed effects

Two or more categorical predictors. 2.1 Two fixed effects Two or more categorical predictors Here we extend the ANOVA methods to handle multiple categorical predictors. The statistician has to watch carefully to see whether the effects being considered are properly

More information

ANOVA Randomized Block Design

ANOVA Randomized Block Design Biostatistics 301 ANOVA Randomized Block Design 1 ORIGIN 1 Data Structure: Let index i,j indicate the ith column (treatment class) and jth row (block). For each i,j combination, there are n replicates.

More information

Stat 217 Final Exam. Name: May 1, 2002

Stat 217 Final Exam. Name: May 1, 2002 Stat 217 Final Exam Name: May 1, 2002 Problem 1. Three brands of batteries are under study. It is suspected that the lives (in weeks) of the three brands are different. Five batteries of each brand are

More information

CHAPTER 4 Analysis of Variance. One-way ANOVA Two-way ANOVA i) Two way ANOVA without replication ii) Two way ANOVA with replication

CHAPTER 4 Analysis of Variance. One-way ANOVA Two-way ANOVA i) Two way ANOVA without replication ii) Two way ANOVA with replication CHAPTER 4 Analysis of Variance One-way ANOVA Two-way ANOVA i) Two way ANOVA without replication ii) Two way ANOVA with replication 1 Introduction In this chapter, expand the idea of hypothesis tests. We

More information

Chapter 8 Student Lecture Notes 8-1. Department of Economics. Business Statistics. Chapter 12 Chi-square test of independence & Analysis of Variance

Chapter 8 Student Lecture Notes 8-1. Department of Economics. Business Statistics. Chapter 12 Chi-square test of independence & Analysis of Variance Chapter 8 Student Lecture Notes 8-1 Department of Economics Business Statistics Chapter 1 Chi-square test of independence & Analysis of Variance ECON 509 Dr. Mohammad Zainal Chapter Goals After completing

More information

In a one-way ANOVA, the total sums of squares among observations is partitioned into two components: Sums of squares represent:

In a one-way ANOVA, the total sums of squares among observations is partitioned into two components: Sums of squares represent: Activity #10: AxS ANOVA (Repeated subjects design) Resources: optimism.sav So far in MATH 300 and 301, we have studied the following hypothesis testing procedures: 1) Binomial test, sign-test, Fisher s

More information

STAT 705 Chapter 19: Two-way ANOVA

STAT 705 Chapter 19: Two-way ANOVA STAT 705 Chapter 19: Two-way ANOVA Timothy Hanson Department of Statistics, University of South Carolina Stat 705: Data Analysis II 1 / 38 Two-way ANOVA Material covered in Sections 19.2 19.4, but a bit

More information

2.830 Homework #6. April 2, 2009

2.830 Homework #6. April 2, 2009 2.830 Homework #6 Dayán Páez April 2, 2009 1 ANOVA The data for four different lithography processes, along with mean and standard deviations are shown in Table 1. Assume a null hypothesis of equality.

More information

Factorial ANOVA. Testing more than one manipulation

Factorial ANOVA. Testing more than one manipulation Factorial ANOVA Testing more than one manipulation Factorial ANOVA Today s goal: Teach you about factorial ANOVA, the test used to evaluate more than two manipulations at the same time Outline: - Why Factorial

More information

This exam contains 5 questions. Each question is worth 10 points. Therefore, this exam is worth 50 points.

This exam contains 5 questions. Each question is worth 10 points. Therefore, this exam is worth 50 points. GROUND RULES: This exam contains 5 questions. Each question is worth 10 points. Therefore, this exam is worth 50 points. Print your name at the top of this page in the upper right hand corner. This is

More information

CS 147: Computer Systems Performance Analysis

CS 147: Computer Systems Performance Analysis CS 147: Computer Systems Performance Analysis CS 147: Computer Systems Performance Analysis 1 / 34 Overview Overview Overview Adding Replications Adding Replications 2 / 34 Two-Factor Design Without Replications

More information

IX. Complete Block Designs (CBD s)

IX. Complete Block Designs (CBD s) IX. Complete Block Designs (CBD s) A.Background Noise Factors nuisance factors whose values can be controlled within the context of the experiment but not outside the context of the experiment Covariates

More information

One-Way Analysis of Variance. With regression, we related two quantitative, typically continuous variables.

One-Way Analysis of Variance. With regression, we related two quantitative, typically continuous variables. One-Way Analysis of Variance With regression, we related two quantitative, typically continuous variables. Often we wish to relate a quantitative response variable with a qualitative (or simply discrete)

More information

y ˆ i = ˆ " T u i ( i th fitted value or i th fit)

y ˆ i = ˆ  T u i ( i th fitted value or i th fit) 1 2 INFERENCE FOR MULTIPLE LINEAR REGRESSION Recall Terminology: p predictors x 1, x 2,, x p Some might be indicator variables for categorical variables) k-1 non-constant terms u 1, u 2,, u k-1 Each u

More information

Gorenstein Injective Modules

Gorenstein Injective Modules Georgia Southern University Digital Commons@Georgia Southern Electronic Theses & Dissertations Graduate Studies, Jack N. Averitt College of 2011 Gorenstein Injective Modules Emily McLean Georgia Southern

More information

A-optimal Minimax Design Criterion for. Two-level Fractional Factorial Designs

A-optimal Minimax Design Criterion for. Two-level Fractional Factorial Designs A-optimal Minimax Design Criterion for Two-level Fractional Factorial Designs by Yue Yin BA. Beijing Institute of Technology 2011 A Thesis Submitted in Partial Fullfillment of the Requirements for the

More information

STAT 705 Chapter 19: Two-way ANOVA

STAT 705 Chapter 19: Two-way ANOVA STAT 705 Chapter 19: Two-way ANOVA Adapted from Timothy Hanson Department of Statistics, University of South Carolina Stat 705: Data Analysis II 1 / 41 Two-way ANOVA This material is covered in Sections

More information

Analysis of Variance

Analysis of Variance Analysis of Variance Blood coagulation time T avg A 62 60 63 59 61 B 63 67 71 64 65 66 66 C 68 66 71 67 68 68 68 D 56 62 60 61 63 64 63 59 61 64 Blood coagulation time A B C D Combined 56 57 58 59 60 61

More information

50%-50% Beam Splitters Using Transparent Substrates Coated by Single- or Double-Layer Quarter-Wave Thin Films

50%-50% Beam Splitters Using Transparent Substrates Coated by Single- or Double-Layer Quarter-Wave Thin Films University of New Orleans ScholarWorks@UNO University of New Orleans Theses and Dissertations Dissertations and Theses 5-22-2006 50%-50% Beam Splitters Using Transparent Substrates Coated by Single- or

More information

Statistical Hypothesis Testing

Statistical Hypothesis Testing Statistical Hypothesis Testing Dr. Phillip YAM 2012/2013 Spring Semester Reference: Chapter 7 of Tests of Statistical Hypotheses by Hogg and Tanis. Section 7.1 Tests about Proportions A statistical hypothesis

More information

STAT 506: Randomized complete block designs

STAT 506: Randomized complete block designs STAT 506: Randomized complete block designs Timothy Hanson Department of Statistics, University of South Carolina STAT 506: Introduction to Experimental Design 1 / 10 Randomized complete block designs

More information

Dr. Shalabh Department of Mathematics and Statistics Indian Institute of Technology Kanpur

Dr. Shalabh Department of Mathematics and Statistics Indian Institute of Technology Kanpur nalysis of Variance and Design of Experiment-I MODULE V LECTURE - 9 FCTORIL EXPERIMENTS Dr. Shalabh Department of Mathematics and Statistics Indian Institute of Technology Kanpur Sums of squares Suppose

More information

CS 5014: Research Methods in Computer Science. Experimental Design. Potential Pitfalls. One-Factor (Again) Clifford A. Shaffer.

CS 5014: Research Methods in Computer Science. Experimental Design. Potential Pitfalls. One-Factor (Again) Clifford A. Shaffer. Department of Computer Science Virginia Tech Blacksburg, Virginia Copyright c 2015 by Clifford A. Shaffer Computer Science Title page Computer Science Clifford A. Shaffer Fall 2015 Clifford A. Shaffer

More information

Chapter 14 Simple Linear Regression (A)

Chapter 14 Simple Linear Regression (A) Chapter 14 Simple Linear Regression (A) 1. Characteristics Managerial decisions often are based on the relationship between two or more variables. can be used to develop an equation showing how the variables

More information

Unbalanced Data in Factorials Types I, II, III SS Part 2

Unbalanced Data in Factorials Types I, II, III SS Part 2 Unbalanced Data in Factorials Types I, II, III SS Part 2 Chapter 10 in Oehlert STAT:5201 Week 9 - Lecture 2b 1 / 29 Types of sums of squares Type II SS The Type II SS relates to the extra variability explained

More information

STAT 135 Lab 9 Multiple Testing, One-Way ANOVA and Kruskal-Wallis

STAT 135 Lab 9 Multiple Testing, One-Way ANOVA and Kruskal-Wallis STAT 135 Lab 9 Multiple Testing, One-Way ANOVA and Kruskal-Wallis Rebecca Barter April 6, 2015 Multiple Testing Multiple Testing Recall that when we were doing two sample t-tests, we were testing the equality

More information

Analysis of Variance

Analysis of Variance Analysis of Variance Math 36b May 7, 2009 Contents 2 ANOVA: Analysis of Variance 16 2.1 Basic ANOVA........................... 16 2.1.1 the model......................... 17 2.1.2 treatment sum of squares.................

More information

Lecture 10. Factorial experiments (2-way ANOVA etc)

Lecture 10. Factorial experiments (2-way ANOVA etc) Lecture 10. Factorial experiments (2-way ANOVA etc) Jesper Rydén Matematiska institutionen, Uppsala universitet jesper@math.uu.se Regression and Analysis of Variance autumn 2014 A factorial experiment

More information

Statistics 210 Part 3 Statistical Methods Hal S. Stern Department of Statistics University of California, Irvine

Statistics 210 Part 3 Statistical Methods Hal S. Stern Department of Statistics University of California, Irvine Thus far: Statistics 210 Part 3 Statistical Methods Hal S. Stern Department of Statistics University of California, Irvine sternh@uci.edu design of experiments two sample methods one factor ANOVA pairing/blocking

More information

Analysis Of Variance Compiled by T.O. Antwi-Asare, U.G

Analysis Of Variance Compiled by T.O. Antwi-Asare, U.G Analysis Of Variance Compiled by T.O. Antwi-Asare, U.G 1 ANOVA Analysis of variance compares two or more population means of interval data. Specifically, we are interested in determining whether differences

More information

Week 14 Comparing k(> 2) Populations

Week 14 Comparing k(> 2) Populations Week 14 Comparing k(> 2) Populations Week 14 Objectives Methods associated with testing for the equality of k(> 2) means or proportions are presented. Post-testing concepts and analysis are introduced.

More information

A Study on Factorial Designs with Blocks Influence and Inspection Plan for Radiated Emission Testing of Information Technology Equipment

A Study on Factorial Designs with Blocks Influence and Inspection Plan for Radiated Emission Testing of Information Technology Equipment A Study on Factorial Designs with Blocks Influence and Inspection Plan for Radiated Emission Testing of Information Technology Equipment By Kam-Fai Wong Department of Applied Mathematics National Sun Yat-sen

More information

Formal Statement of Simple Linear Regression Model

Formal Statement of Simple Linear Regression Model Formal Statement of Simple Linear Regression Model Y i = β 0 + β 1 X i + ɛ i Y i value of the response variable in the i th trial β 0 and β 1 are parameters X i is a known constant, the value of the predictor

More information

Linear models and their mathematical foundations: Simple linear regression

Linear models and their mathematical foundations: Simple linear regression Linear models and their mathematical foundations: Simple linear regression Steffen Unkel Department of Medical Statistics University Medical Center Göttingen, Germany Winter term 2018/19 1/21 Introduction

More information

Analysis of variance, multivariate (MANOVA)

Analysis of variance, multivariate (MANOVA) Analysis of variance, multivariate (MANOVA) Abstract: A designed experiment is set up in which the system studied is under the control of an investigator. The individuals, the treatments, the variables

More information

Two Factor Completely Between Subjects Analysis of Variance. 2/12/01 Two-Factor ANOVA, Between Subjects 1

Two Factor Completely Between Subjects Analysis of Variance. 2/12/01 Two-Factor ANOVA, Between Subjects 1 Two Factor Completely Between Subjects Analysis of Variance /1/1 Two-Factor AOVA, Between Subjects 1 ypothetical alertness data from a x completely between subjects factorial experiment Lighted room Darkened

More information

Ch 3: Multiple Linear Regression

Ch 3: Multiple Linear Regression Ch 3: Multiple Linear Regression 1. Multiple Linear Regression Model Multiple regression model has more than one regressor. For example, we have one response variable and two regressor variables: 1. delivery

More information

Lecture 9. ANOVA: Random-effects model, sample size

Lecture 9. ANOVA: Random-effects model, sample size Lecture 9. ANOVA: Random-effects model, sample size Jesper Rydén Matematiska institutionen, Uppsala universitet jesper@math.uu.se Regressions and Analysis of Variance fall 2015 Fixed or random? Is it reasonable

More information

Master s Written Examination - Solution

Master s Written Examination - Solution Master s Written Examination - Solution Spring 204 Problem Stat 40 Suppose X and X 2 have the joint pdf f X,X 2 (x, x 2 ) = 2e (x +x 2 ), 0 < x < x 2

More information

Dr. Shalabh Department of Mathematics and Statistics Indian Institute of Technology Kanpur

Dr. Shalabh Department of Mathematics and Statistics Indian Institute of Technology Kanpur Analysis of Variance and Design of Experiment-I MODULE IX LECTURE - 38 EXERCISES Dr. Shalabh Department of Mathematics and Statistics Indian Institute of Technology Kanpur Example (Completely randomized

More information

Comparing Group Means When Nonresponse Rates Differ

Comparing Group Means When Nonresponse Rates Differ UNF Digital Commons UNF Theses and Dissertations Student Scholarship 2015 Comparing Group Means When Nonresponse Rates Differ Gabriela M. Stegmann University of North Florida Suggested Citation Stegmann,

More information