Lecture 3. Experiments with a Single Factor: ANOVA Montgomery Sections 3.1 through 3.5 and Section



2 Tensile Strength Experiment
Investigate the tensile strength of a new synthetic fiber. The factor is the weight percent of cotton used in the blend of the materials for the fiber, and it has five levels.
(Data table with columns: percent of cotton, tensile strength observations, total, average; numeric values lost in transcription.)

3 Data Layout for Single-Factor Experiments
treatment | observations           | totals | averages
1         | y_11  y_12  ...  y_1n  | y_1.   | ȳ_1.
2         | y_21  y_22  ...  y_2n  | y_2.   | ȳ_2.
...       | ...                    | ...    | ...
a         | y_a1  y_a2  ...  y_an  | y_a.   | ȳ_a.

4 Analysis of Variance
Statistical Model (Factor Effects Model):
y_ij = μ + τ_i + ε_ij,   i = 1, 2, ..., a;  j = 1, 2, ..., n_i
μ - grand mean; τ_i - ith treatment effect; ε_ij iid N(0, σ²) - error
Constraint: Σ_{i=1}^a τ_i = 0 (conceptual approach; SAS uses τ_a = 0).
Estimates for parameters:
μ̂ = ȳ..,   τ̂_i = ȳ_i. − ȳ..,   ε̂_ij = y_ij − ȳ_i. (residual)
Basic Hypotheses:
H0: τ_1 = τ_2 = ... = τ_a = 0   vs   H1: τ_i ≠ 0 for at least one i
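To see how SAS's τ_a = 0 constraint shows up in practice, the parameter estimates can be requested directly. A minimal sketch, borrowing the data set lambs and variables diet and wtgain from the lamb example later in these notes; the solution option is standard PROC GLM syntax:

proc glm data=lambs;
  class diet;
  model wtgain = diet / solution;  /* print parameter estimates under the tau_a = 0 parameterization */
run; quit;

With this parameterization the printed intercept is ȳ_a. (the mean of the last diet) and each printed diet effect is ȳ_i. − ȳ_a.; SAS flags them as non-unique because the factor-effects model is over-parameterized.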

5 Analysis of Variance (ANOVA) Table
If balanced:
Source of Variation | Sum of Squares | Degrees of Freedom | Mean Square   | F0
Between             | SS_Treatment   | a − 1              | MS_Treatment  | F0
Within              | SS_E           | N − a              | MS_E          |
Total               | SS_T           | N − 1              |               |
N = n·a
SS_T = Σ_i Σ_j y_ij² − y..²/N;   SS_Treatment = (1/n) Σ_i y_i.² − y..²/N
SS_E = SS_T − SS_Treatment
If unbalanced: N = Σ_{i=1}^a n_i
SS_T = Σ_i Σ_j y_ij² − y..²/N;   SS_Treatment = Σ_i y_i.²/n_i − y..²/N
SS_E = SS_T − SS_Treatment
Also SS_Treatment = Σ_{i=1}^a n_i τ̂_i²  and  SS_E = Σ_i Σ_j ε̂_ij².

6 The Expected Mean Squares (EMS) are
E(MS_E) = σ²
E(MS_Treatment) = σ² + Σ_i n_i τ_i² / (a − 1)
Test Statistic
F0 = [SS_Treatment/(a − 1)] / [SS_E/(N − a)] = MS_Treatment / MS_E
Under H0:
F0 = [ (SS_Treatment/σ²)/(a − 1) ] / [ (SS_E/σ²)/(N − a) ] = [ χ²_{a−1}/(a − 1) ] / [ χ²_{N−a}/(N − a) ] ~ F_{a−1, N−a}
Decision Rule: If F0 > F_{α, a−1, N−a} then reject H0.
When a = 2, the square of the two-sample t-test statistic satisfies t0² = MS_Treatment/MS_E = F0, so the F-test and the two-sided two-sample t-test are equivalent.
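The F critical value and p-value can be looked up in a short SAS data step rather than a table. A small illustrative sketch; the numbers are taken from the lamb example on the next slide, with a − 1 = 2 and N − a = 9 degrees of freedom:

data fcheck;
  f0     = 0.77;                   /* observed F statistic from the lamb example */
  fcrit  = finv(0.95, 2, 9);       /* F(0.05; 2, 9) critical value */
  pvalue = 1 - probf(f0, 2, 9);    /* P(F >= f0) under H0 */
run;
proc print data=fcheck; run;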

7 Example
Twelve lambs are randomly assigned to three different diets. The weight gain (in two weeks) is recorded. Is there a difference between the diets?
(Individual weight gains by diet lost in transcription.)
N = 12, Σ y_ij = 156 and ȳ.. = 156/12 = 13.
n_1 = 3, y_1. = 33, ȳ_1. = 11;  n_2 = 5, y_2. = 75, ȳ_2. = 15;  n_3 = 4, y_3. = 48 and ȳ_3. = 12.
τ̂_1 = ȳ_1. − ȳ.. = 11 − 13 = −2; similarly, τ̂_2 = 15 − 13 = 2 and τ̂_3 = 12 − 13 = −1.
SS_T = Σ_i Σ_j (y_ij − ȳ..)² = 246.
SS_Treatment = 3·(−2)² + 5·(2)² + 4·(−1)² = 36.
SS_E = 246 − 36 = 210;   MS_E = σ̂² = 210/(12 − 3) = 23.33.
F0 = (36/2)/(210/9) = 0.77;  P-value > 0.25;  fail to reject H0: τ_1 = τ_2 = τ_3 = 0.

8 Using SAS (lambs.sas)
options nocenter ps=65 ls=80;
data lambs;
  input diet wtgain;
  datalines;
(data lines lost in transcription)
;
symbol1 bwidth=5 i=box;
axis1 offset=(5);
proc gplot;
  plot wtgain*diet / frame haxis=axis1;
run; quit;

9 proc glm;
  class diet;
  model wtgain=diet;
  output out=diag r=res p=pred;
run; quit;
(The output contains the overall ANOVA table with Sum of Squares, Mean Square, F Value and Pr > F for Model, Error and Corrected Total, the R-Square / Coeff Var / Root MSE / wtgain Mean line, and the Type I and Type III SS tables for diet. The numeric values were lost in transcription; they agree with the hand computation on Slide 7: Model SS = 36 on 2 df, Error SS = 210 on 9 df, F0 = 0.77.)

10 proc gplot;
  plot res*diet / frame haxis=axis1;
proc sort; by pred;
symbol1 v=circle i=sm50;
proc gplot;
  plot res*pred / haxis=axis1;
run; quit;

11 Model Checking and Diagnostics
Model Assumptions
1 Model is correct
2 Independent observations
3 Errors normally distributed
4 Constant variance
Decomposition:
y_ij = (ȳ.. + (ȳ_i. − ȳ..)) + (y_ij − ȳ_i.)
y_ij = ŷ_ij + ε̂_ij
observed = predicted + residual
Note that the predicted response at treatment i is ŷ_ij = ȳ_i.
Diagnostics use predicted responses and residuals.

12 Diagnostics
Normality
- Histogram of residuals
- Normal probability plot / QQ plot
- Shapiro-Wilk test
Constant Variance
- Plot ε̂_ij vs ŷ_ij (residual plot)
- Bartlett's or Modified Levene's test
Independence
- Plot ε̂_ij vs time/space
- Plot ε̂_ij vs variable of interest
Outliers

13 Diagnostics Example: Tensile Strength Experiment
options ls=80 ps=60 nocenter;
goptions device=win target=winprtm rotate=landscape ftext=swiss
  hsize=8.0in vsize=6.0in htext=1.5 htitle=1.5 hpos=60 vpos=60
  horigin=0.5in vorigin=0.5in;
data one;
  infile 'c:\saswork\data\tensile.dat';
  input percent strength time;
title1 'Tensile Strength Example';
proc print data=one;
run;
Obs  percent  strength  time
(data listing lost in transcription)

14 symbol1 v=circle i=none;
title1 'Plot of Strength vs Percent Blend';
proc gplot data=one;
  plot strength*percent/frame;
run;
proc boxplot;
  plot strength*percent/boxstyle=skeletal pctldef=4;
run;

15 proc glm data=one;
  class percent;
  model strength=percent;
  means percent / hovtest=bartlett hovtest=levene;
  output out=diag p=pred r=res;
run;
ANOVA table (Source, DF, Sum of Squares, Mean Square, F Value, Pr > F for Model, Error, Corrected Total; numeric values lost in transcription, except that the Model test shows Pr > F < .0001).
Levene's Test for Homogeneity of strength Variance
ANOVA of Squared Deviations from Group Means (Source: percent, Error; values lost in transcription)
Bartlett's Test for Homogeneity of strength Variance
(Source: percent; DF, Chi-Square, Pr > ChiSq values lost in transcription)

16 proc sort; by pred;
symbol1 v=circle i=sm50;
title1 'Residual Plot';
proc gplot;
  plot res*pred/frame;
run;

17 proc univariate data=diag normal;
  var res;
  qqplot res / normal (L=1 mu=est sigma=est);
  histogram res / normal;
run;
Moments
N 25   Sum Weights 25   Mean 0   Sum Observations 0
(Std Deviation, Variance, Skewness, Kurtosis, Uncorrected SS, Corrected SS, Coeff Variation, Std Error Mean lost in transcription)
Tests for Normality
Test                 Statistic   p Value
Shapiro-Wilk         W           Pr < W
Kolmogorov-Smirnov   D           Pr > D
Cramer-von Mises     W-Sq        Pr > W-Sq
Anderson-Darling     A-Sq        Pr > A-Sq
(statistic and p-value columns lost in transcription)

18 Histogram of Residuals & QQ Plot (figures)

19 /* Time Series Plot */
symbol1 v=circle i=none;
title1 'Plot of residuals vs time';
proc gplot;
  plot res*time / vref=0 vaxis=-6 to 6 by 1;
run;

20 Non-Constant Variance: Impact and Remedy
- Does not affect the F-test dramatically when the experiment is balanced.
- Why be concerned? Comparison of treatments depends on MS_E; non-constant variance yields incorrect intervals and comparison results.
Variance-Stabilizing Transformations
Idea for finding a proper transformation (E[Y] = μ, Var(Y) = σ²):
f(Y) ≈ f(μ) + (Y − μ) f′(μ)   ⟹   Var(f(Y)) ≈ [f′(μ)]² Var(Y) = [f′(μ)]² σ²
Find f such that Var(f(Y)) does not depend on μ anymore. Then Ỹ = f(Y) has (approximately) constant variance across different values of μ.
Common transformations: √x, log(x), 1/x, arcsin(√x), and 1/√x
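Once a transformation has been chosen, it is applied in a data step and the ANOVA is re-run on the transformed response. A minimal sketch, assuming the data set one with factor trt and response resp from the Box-Cox example later in the notes, and using the square-root transformation purely as an illustration:

data trans;
  set one;
  sresp = sqrt(resp);     /* illustrative variance-stabilizing transform */
run;
proc glm data=trans;
  class trt;
  model sresp = trt;
  output out=diag2 p=pred r=res;   /* re-check the residual plot on the new scale */
run; quit;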

21 Transformations
Suppose σ² is a function of μ, that is σ² = g(μ).
Want to find a transformation f such that Ỹ = f(Y) has constant variance: Var(Ỹ) does not depend on μ.
Have shown Var(Ỹ) ≈ [f′(μ)]² σ² = [f′(μ)]² g(μ).
Want to choose f such that [f′(μ)]² g(μ) ≡ c.
Examples
g(μ) = μ (Poisson):          f(μ) = ∫ μ^{−1/2} dμ   ⟹   f(x) = √x
g(μ) = μ(1 − μ) (Binomial):  f(μ) = ∫ [μ(1 − μ)]^{−1/2} dμ   ⟹   f(x) = arcsin(√x)
g(μ) = μ^{2β} (Box-Cox):     f(μ) = ∫ μ^{−β} dμ   ⟹   f(x) = x^{1−β}
g(μ) = μ² (Box-Cox):         f(μ) = ∫ μ^{−1} dμ   ⟹   f(x) = log x

22 Identify Box-Cox Transformation Using Data: Approximate Method
From the previous slide, if σ = θ μ^β, the transformation is
f(y) = y^{1−β} if β ≠ 1;   f(y) = log y if β = 1.
So it is crucial to estimate β based on the data y_ij, i = 1, ..., a. We have
log σ_i = log θ + β log μ_i.
Let s_i and ȳ_i. be the sample standard deviations and means. Because σ̂_i = s_i and μ̂_i = ȳ_i., for i = 1, ..., a,
log s_i ≈ constant + β log ȳ_i.,   i.e.,   log s_i = constant + β log ȳ_i. + error_i.
We can plot log s_i against log ȳ_i., fit a straight line, and use the slope to estimate β.

23 Identify Box-Cox Transformation: Formal Method
1. For a fixed λ, compute
   y_ij(λ) = (y_ij^λ − 1) / (λ ẏ^{λ−1})   if λ ≠ 0
   y_ij(λ) = ẏ log y_ij                    if λ = 0
   where ẏ = ( ∏_{i=1}^a ∏_{j=1}^{n_i} y_ij )^{1/N} is the geometric mean of the observations.
2. Step 1 generates transformed data y_ij(λ). Apply ANOVA to the new data and obtain SS_E. Because SS_E depends on λ, it is denoted SS_E(λ). Repeat 1 and 2 for various λ in an interval, e.g., [−2, 2], and record SS_E(λ).
3. Find λ_0 which minimizes SS_E(λ), and pick a meaningful λ in the neighborhood of λ_0; denote it again by λ_0. (Maximum Likelihood Principle)
4. The transformation is: ỹ_ij = y_ij^{λ_0} if λ_0 ≠ 0;  ỹ_ij = log y_ij if λ_0 = 0.

24 Example: Approximate Method (trans.sas)
data one;
  infile 'c:\saswork\data\boxcox.dat';
  input trt resp;
proc glm data=one;
  class trt;
  model resp=trt;
  output out=diag p=pred r=res;
run;
title1 'Residual Plot';
symbol1 v=circle i=none;
proc gplot data=diag;
  plot res*pred /frame;
run;

25 proc univariate data=one noprint;
  var resp;
  by trt;
  output out=two mean=mu std=sigma;
data three;
  set two;
  logmu = log(mu);
  logsig = log(sigma);
proc reg;
  model logsig = logmu;
title1 'Mean vs Std Dev';
symbol1 v=circle i=rl;
proc gplot;
  plot logsig*logmu / regeqn;
run;

26 Example: Formal Method (trans1.sas)
data one;
  infile 'c:\saswork\data\boxcox.dat';
  input trt resp;
  logresp = log(resp);
proc univariate data=one noprint;
  var logresp;
  output out=two mean=mlogresp;
data three;
  set one;
  if _n_ eq 1 then set two;
  ydot = exp(mlogresp);              /* geometric mean of the responses */
  do l=-2.0 to 2.0 by .25;
    den = l*ydot**(l-1);
    if abs(l) eq 0 then den = 1;
    yl = (resp**l - 1)/den;
    if abs(l) < 1e-8 then yl = ydot*log(resp);  /* lambda = 0 case; the original tolerance value was lost in transcription */
    output;
  end;
  keep trt yl l;
proc sort data=three out=three;
  by l;

27 proc glm data=three noprint outstat=four;
  class trt;
  model yl=trt;
  by l;
data five;
  set four;
  if _SOURCE_ eq 'ERROR';
  keep l SS;
proc print data=five;
run;
SS_E(λ) and λ
OBS  L  SS
(table of SS_E(λ) values lost in transcription)

28 symbol1 v=circle i=sm50;
proc gplot;
  plot SS*l;
run;
Plot of SS_E(λ) vs λ (figure)

29 Using PROC TRANSREG
proc transreg data=one;
  model boxcox(y/lambda=-2.0 to 2.0 by 0.1)=class(trt);
run;
Transformation Information for BoxCox(y)
Lambda   R-Square   Log Like
(table of values lost in transcription)
Legend:  <  Best Lambda;   *  Confidence Interval;   Convenient Lambda

30 Kruskal-Wallis Test: a Nonparametric Alternative for Nonnormality
a treatments; H0: the a treatments are not different.
- Rank the observations y_ij in ascending order.
- Replace each observation by its rank R_ij (assign average ranks to tied observations), and apply one-way ANOVA to the R_ij.
Test statistic (a = 2 reduces to the Wilcoxon rank-sum test):
H = (1/S²) [ Σ_{i=1}^a R_i.²/n_i − N(N+1)²/4 ]  ~  χ²_{a−1} under H0,
where S² = (1/(N−1)) [ Σ_{i=1}^a Σ_{j=1}^{n_i} R_ij² − N(N+1)²/4 ].
Decision Rule: reject H0 if H > χ²_{α, a−1}.
Let F0 be the F-test statistic in the ANOVA based on the R_ij. Then
F0 = [ H/(a−1) ] / [ (N − 1 − H)/(N − a) ]
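The "ANOVA on ranks" view can be carried out directly: compute average ranks with PROC RANK and feed them to PROC GLM. A minimal sketch, assuming the strain/nitrogen data set new defined on the next slide:

proc rank data=new out=ranked ties=mean;   /* average ranks for tied observations */
  var nitrogen;
  ranks rnitrogen;
run;
proc glm data=ranked;
  class strain;
  model rnitrogen = strain;   /* the F statistic here is related to H as shown above */
run; quit;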

31 A Nonnormality Example
data new;
  input strain nitrogen;
  cards;
(data lines lost in transcription)
;
proc glm data=new;
  class strain;
  model nitrogen=strain;
  output out=newres r=res;
run;
proc univariate data=newres normal;
  var res;
  qqplot res / normal (L=1 mu=est sigma=est);
run; quit;

32 Tests for Normality
Test                 Statistic   p Value
Shapiro-Wilk         W           Pr < W
Kolmogorov-Smirnov   D           Pr > D
Cramer-von Mises     W-Sq        Pr > W-Sq
Anderson-Darling     A-Sq        Pr > A-Sq
(statistic and p-value columns lost in transcription)

33 proc npar1way data=new;   /* May need option: wilcoxon */
  class strain;
  var nitrogen;
run;
Analysis of Variance for Variable nitrogen Classified by Variable strain
Source  DF  Sum of Squares  Mean Square  F Value  Pr > F
Among / Within   (values lost in transcription; the Among test shows Pr > F < .0001)
Kruskal-Wallis Test:  Chi-Square (value lost), DF 5, Pr > Chi-Square (value lost)
Median One-Way Analysis:  Chi-Square (value lost), DF 5, Pr > Chi-Square (value lost)

34 Linear Combinations of Treatment Means
ANOVA Model:
y_ij = μ + τ_i + ε_ij   (τ_i: treatment effect)
     = μ_i + ε_ij        (μ_i: treatment mean)
Linear combination with given coefficients c_1, c_2, ..., c_a:
L = c_1 μ_1 + c_2 μ_2 + ... + c_a μ_a = Σ_{i=1}^a c_i μ_i
Want to test: H0: L = Σ c_i μ_i = L_0
Examples:
1. Pairwise comparison: μ_i − μ_j = 0 for all possible i and j.
2. Compare treatment vs control: μ_i − μ_1 = 0 when treatment 1 is a control and i = 2, ..., a are new treatments.
3. General cases such as μ_1 − 2μ_2 + μ_3 = 0, μ_1 + 3μ_2 − 6μ_3 = 0, etc.

35 Estimate of L:
L̂ = Σ c_i μ̂_i = Σ c_i ȳ_i.
Var(L̂) = Σ c_i² Var(ȳ_i.) = σ² Σ c_i²/n_i   ( = (σ²/n) Σ c_i² when balanced )
Standard Error of L̂:
S.E.(L̂) = sqrt( MS_E Σ c_i²/n_i )
Test statistic:
t0 = (L̂ − L_0) / S.E.(L̂)  ~  t(N − a) under H0

36 Example: Lambs Diet Experiment
Denote the treatment means of the three diets by μ_1, μ_2 and μ_3. Suppose one wants to test H0: L = 60 with
L = μ_1 + 2μ_2 + 3μ_3 = 6μ + τ_1 + 2τ_2 + 3τ_3.
proc glm data=lambs;
  class diet;
  model wtgain=diet;
  means diet;
  estimate 'l1' intercept 6 diet 1 2 3;
run;
Parameter  Estimate  Standard Error  t Value  Pr > |t|
l1         (values lost in transcription)       <.0001
(The Pr > |t| printed by the estimate statement tests H0: L = 0; the test of H0: L = 60 is done by hand below.)
t0 = (77 − 60)/8.89 = 1.91
P-value = P(t ≤ −1.91 or t ≥ 1.91), t ~ t(12 − 3), = .088
Fail to reject H0: μ_1 + 2μ_2 + 3μ_3 = 60 at α = 5%.

37 Contrasts
Γ = Σ_{i=1}^a c_i μ_i is a contrast if Σ_{i=1}^a c_i = 0. Equivalently, Γ = Σ_{i=1}^a c_i τ_i.
Examples
1. Γ_1 = μ_1 − μ_2 = μ_1 − μ_2 + 0μ_3 + 0μ_4,
   c_1 = 1, c_2 = −1, c_3 = 0, c_4 = 0.
   Comparing μ_1 and μ_2.
2. Γ_2 = μ_1 − 0.5μ_2 − 0.5μ_3 = μ_1 − 0.5μ_2 − 0.5μ_3 + 0μ_4,
   c_1 = 1, c_2 = −0.5, c_3 = −0.5, c_4 = 0.
   Comparing μ_1 and the average of μ_2 and μ_3.
Estimate of Γ:  C = Σ_{i=1}^a c_i ȳ_i.
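In SAS a contrast is specified by listing its coefficients after the class variable. A minimal sketch for the two examples above, assuming a four-level factor named trt and response y (hypothetical data set and variable names, used only for illustration):

proc glm data=one;
  class trt;
  model y = trt;
  contrast 'G1: mu1 - mu2'          trt 1 -1    0    0;   /* F test of Gamma_1 = 0 */
  estimate 'G2: mu1 - (mu2+mu3)/2'  trt 1 -0.5 -0.5  0;   /* estimate, SE and t test for Gamma_2 */
run; quit;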

38 Contrast Sum of Squares
SS_C = ( Σ c_i ȳ_i. )² / ( Σ c_i²/n_i )
SS_C represents the amount of variation attributable to Γ.
Test: H0: Γ = 0
t0 = C / S.E.(C)  ~  t(N − a) under H0
t0² = ( Σ c_i ȳ_i. )² / ( MS_E Σ c_i²/n_i ) = (SS_C / 1) / MS_E  ~  F_{1, N−a} under H0

39 Tensile Strength Example (cont.sas)
proc glm data=one;
  class percent;
  model strength=percent;
  contrast 'C1' percent ... ;   /* contrast coefficients lost in transcription */
  contrast 'C2' percent ... ;
  contrast 'C3' percent ... ;
  contrast 'C4' percent ... ;
Dependent Variable: STRENGTH
Overall ANOVA table (Model, Error, Corrected Total), Type I SS table for PERCENT, and the contrast table
Contrast  DF  Contrast SS  Mean Square  F Value  Pr > F   (rows C1-C4)
(numeric values lost in transcription)

40 Orthogonal Contrasts
Two contrasts {c_i} and {d_i} are orthogonal if
Σ_{i=1}^a c_i d_i / n_i = 0   ( Σ_{i=1}^a c_i d_i = 0 for balanced experiments )
Example (Balanced Experiment)
Γ_1 = μ_1 + μ_2 − μ_3 − μ_4,  so c_1 = 1, c_2 = 1, c_3 = −1, c_4 = −1.
Γ_2 = μ_1 − μ_2 + μ_3 − μ_4,  so d_1 = 1, d_2 = −1, d_3 = 1, d_4 = −1.
It is easy to verify that both Γ_1 and Γ_2 are contrasts. Furthermore,
c_1 d_1 + c_2 d_2 + c_3 d_3 + c_4 d_4 = 1 + (−1) + (−1) + 1 = 0.
Hence, Γ_1 and Γ_2 are orthogonal to each other.
C = {Γ_1, Γ_2, ..., Γ_{a−1}} is a complete set of orthogonal contrasts if the contrasts are mutually orthogonal and there is no contrast outside of C that is orthogonal to all the contrasts in C.

41 If there are a treatments, C must contain a − 1 contrasts.
The complete set is not unique. For example, in the tensile strength example:
C_1 includes:  Γ_1 = (0, 0, 0, 1, −1),  Γ_2 = (1, 0, 1, −1, −1),  Γ_3 = (1, 0, −1, 0, 0),  Γ_4 = (1, −4, 1, 1, 1)
C_2 includes:  Γ_1 = (−2, −1, 0, 1, 2),  Γ_2 = (2, −1, −2, −1, 2),  Γ_3 = (−1, 2, 0, −2, 1),  Γ_4 = (1, −4, 6, −4, 1)

42 Suppose C_1, C_2, ..., C_{a−1} are the estimates of the contrasts in a complete set of contrasts {Γ_1, Γ_2, ..., Γ_{a−1}}. Then
SS_Treatment = SS_{C_1} + SS_{C_2} + ... + SS_{C_{a−1}}
F0 = MS_Treatment / MS_E = [ F_{10} + F_{20} + ... + F_{(a−1)0} ] / (a − 1),
where F_{i0} is the test statistic used to test contrast Γ_i.
- Orthogonal contrast estimates are independent of each other.
- These results follow from Cochran's Theorem, so the comparisons are independent. (Example on Slide 39)
- Orthogonal contrasts can also be used to study trend; see the sketch below.
  - Only interesting if treatments are quantitative (ordered): X_1, ..., X_a
  - For equally spaced treatments and n_i = n, the coefficients c_i are in Table IX.
  - This is a breakdown of the polynomial regression μ(x) = β_0 + β_1 x + ... + β_{a−1} x^{a−1} into
    μ(x) = β̃_0 + β̃_1 P_1(x) + ... + β̃_{a−1} P_{a−1}(x),
    where P_k(x) is of k-th order and Σ_{i=1}^a P_k(X_i) = 0 (contrast).
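For the tensile strength experiment the five cotton percentages are equally spaced, so the orthogonal polynomial coefficients of set C_2 on Slide 41 can be used directly as trend contrasts. A sketch, assuming the data set one and factor percent from the earlier code (the contrast labels are mine):

proc glm data=one;
  class percent;
  model strength = percent;
  contrast 'linear'    percent -2 -1  0  1  2;
  contrast 'quadratic' percent  2 -1 -2 -1  2;
  contrast 'cubic'     percent -1  2  0 -2  1;
  contrast 'quartic'   percent  1 -4  6 -4  1;
run; quit;

The four contrast sums of squares add up to the treatment sum of squares, and each F value tests one polynomial term of the trend.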

43 Determining Orthogonal Polynomial Coefficients Using SAS
Complete set: {(P_k(X_1), ..., P_k(X_a)) : k = 1, ..., a − 1}
Often the levels of the treatments are not equally spaced.
Can use PROC IML to determine the coefficients (P_k(X_1), ..., P_k(X_a)).
proc iml;
  levels={ ... };        /* the 5 levels considered; values lost in transcription */
  print levels;
  coef=orpol(levels,4);  /* Gives all coefs up through 4th order */
  coef=t(coef);          /* Puts coefs in rows instead of cols */
  coef=coef[2:5,];       /* Eliminates the first row (constant term) of coef matrix */
  print coef;
run;
LEVELS and COEF output (values lost in transcription)

44 Testing Multiple Contrasts (Multiple Comparisons)
One contrast:
H0: Γ = Σ c_i μ_i = Γ_0   vs   H1: Γ ≠ Γ_0 at level α
100(1 − α)% Confidence Interval (CI) for Γ:
CI:  Σ c_i ȳ_i. ± t_{α/2, N−a} sqrt( MS_E Σ c_i²/n_i )
P(CI does not contain Γ_0 | H0) = α (= type I error rate)
Decision Rule: Reject H0 if the CI does not contain Γ_0.

45 Multiple contrasts: for i = 1, 2, ..., m,
H0^(i): Γ^(i) = Γ_0^(i)   vs   H1^(i): Γ^(i) ≠ Γ_0^(i)
If we construct CI_1, CI_2, ..., CI_m, each at the 100(1 − α)% level, then for each CI_i,
P(reject H0^(i) | H0^(i)) = P(CI_i does not contain Γ_0^(i) | H0^(i)) = α.
But the overall Type I error rate (the probability of making any type I error in testing H0^(i) vs H1^(i), i = 1, ..., m) is inflated and much larger than α; that is,
P(reject at least one of {H0^(i), i = 1, ..., m} | H0^(1), ..., H0^(m))
  = P(at least one CI_i does not contain Γ_0^(i) | H0^(1), ..., H0^(m))  >  α.
One way to achieve a small overall error rate is to require a much smaller error rate (α*) for each individual CI_i.
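A small numerical illustration of the inflation (my own sketch, not part of the notes): for m independent tests each at level α = 0.05, the chance of at least one false rejection is 1 − (1 − α)^m, which the Bonferroni bound mα approximates from above.

data inflation;
  alpha = 0.05;
  do m = 1 to 10;
    overall   = 1 - (1 - alpha)**m;  /* exact overall type I error for m independent tests */
    bonf_bound = m*alpha;            /* Bonferroni upper bound */
    output;
  end;
run;
proc print data=inflation; run;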

46 Bonferroni Method for Testing Multiple Contrasts
Bonferroni Inequality:
P(reject at least one of {H0^(i), i = 1, ..., m} | H0^(1), ..., H0^(m))
  = P(reject H0^(1) or reject H0^(2) or ... or reject H0^(m) | H0^(1), ..., H0^(m))
  ≤ P(reject H0^(1) | H0^(1)) + ... + P(reject H0^(m) | H0^(m)) = mα*.
In order to control the overall error rate (or, equivalently, the overall confidence level), set mα* = α, which gives α* = α/m.
Bonferroni CIs:
CI_i:  Σ_j c_ij ȳ_j. ± t_{α/(2m)}(N − a) sqrt( MS_E Σ_j c_ij²/n_j )
When m is large, Bonferroni CIs are too conservative (overall type II error too large).
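For pairwise contrasts, the Bonferroni adjustment is also available directly from PROC GLM through the LSMEANS statement. A minimal sketch, using the tensile data set and factor from the earlier code (adjust=bon is standard LSMEANS syntax):

proc glm data=one;
  class percent;
  model strength = percent;
  lsmeans percent / pdiff cl adjust=bon;  /* Bonferroni-adjusted p-values and simultaneous CIs for all pairs */
run; quit;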

47 Scheffe's Method for Testing All Contrasts
Consider all possible contrasts: Γ = Σ c_i μ_i
Estimate: C = Σ c_i ȳ_i.,   Standard Error: S.E.(C) = sqrt( MS_E Σ c_i²/n_i )
Critical value: sqrt( (a − 1) F_{α, a−1, N−a} )
Scheffe's simultaneous CI:  C ± sqrt( (a − 1) F_{α, a−1, N−a} ) · S.E.(C)
Overall confidence level and error rate for m contrasts:
P(CIs contain the true parameter for every contrast) ≥ 1 − α
P(at least one CI does not contain its true parameter) ≤ α
Remark: Scheffe's method is also conservative, too conservative when m is small.

48 Methods for Pairwise Comparisons
There are a(a − 1)/2 possible pairs: μ_i − μ_j (the contrast for comparing μ_i and μ_j). We may be interested in m pairs or in all pairs.
Standard Procedure:
1. Estimation: ȳ_i. − ȳ_j.
2. Compute a Critical Difference (CD) (based on the method employed).
3. If |ȳ_i. − ȳ_j.| > CD, or equivalently if the interval
   (ȳ_i. − ȳ_j. − CD,  ȳ_i. − ȳ_j. + CD)
   does not contain zero, declare μ_i − μ_j significant.

49 Methods for Calculating CD
Least significant difference (LSD): does not control the overall error rate
CD = t_{α/2, N−a} sqrt( MS_E (1/n_i + 1/n_j) )
Bonferroni method (for m pairs): controls the overall error rate for the m comparisons
CD = t_{α/(2m), N−a} sqrt( MS_E (1/n_i + 1/n_j) )
Tukey's method (for all possible pairs): controls the overall error rate (exact for balanced experiments)
CD = [ q_α(a, N−a) / √2 ] sqrt( MS_E (1/n_i + 1/n_j) )
where q_α(a, N−a) is from the studentized range distribution (Table VII). (Example 3.7)
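The three critical differences can be computed in a data step from the ANOVA quantities. A sketch for the tensile example (a = 5, N − a = 20, MS_E = 8.06, n = 5 per treatment, taken from the GLM output on the following slides); TINV and PROBMC are standard SAS functions, and the PROBMC call for the studentized range quantile reflects my reading of its interface:

data cd;
  a = 5; df = 20; mse = 8.06; n = 5; alpha = 0.05; m = a*(a-1)/2;
  se    = sqrt(mse*(1/n + 1/n));
  lsd   = tinv(1 - alpha/2, df) * se;             /* LSD critical difference */
  bon   = tinv(1 - alpha/(2*m), df) * se;         /* Bonferroni, all 10 pairs */
  q     = probmc('RANGE', ., 1 - alpha, df, a);   /* studentized range quantile q_alpha(a, N-a) */
  tukey = q/sqrt(2) * se;                         /* Tukey critical difference */
run;
proc print data=cd; run;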

50 Comparing Treatments with a Control (Dunnett's Method)
1. Assume μ_1 is a control, and μ_2, ..., μ_a are (new) treatments.
2. Only interested in a − 1 pairs: μ_2 − μ_1, ..., μ_a − μ_1.
3. Compare |ȳ_i. − ȳ_1.| to
   CD = d_α(a − 1, N − a) sqrt( MS_E (1/n_i + 1/n_1) ),
   where d_α(p, f) is from Table VIII: critical values for Dunnett's test.
4. Remark: controls the overall error rate. Read Example 3.9.
For pairwise comparison, which method should be preferred? LSD, Bonferroni, Tukey, Dunnett or others?
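The Dunnett critical value d_α(a − 1, N − a) can also be obtained numerically instead of from Table VIII. The sketch below uses PROBMC with the two-sided 'DUNNETT2' distribution, which is my understanding of the function's interface for the equal-sample-size case; the ANOVA quantities are again from the tensile output:

data dunnett_cd;
  a = 5; df = 20; mse = 8.06; n = 5; alpha = 0.05;
  d  = probmc('DUNNETT2', ., 1 - alpha, df, a - 1);  /* two-sided Dunnett quantile, a-1 comparisons */
  cd = d * sqrt(mse*(1/n + 1/n));
run;
proc print data=dunnett_cd; run;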

51 Tensile Strength Example
data one;
  infile 'c:\saswork\data\tensile.dat';
  input percent strength time;
proc glm data=one;
  class percent;
  model strength=percent;
  /* Construct CIs for treatment means */
  means percent / alpha=.05 lsd clm;
  means percent / alpha=.05 bon clm;
  /* Pairwise comparisons */
  means percent / alpha=.05 lines lsd;
  means percent / alpha=.05 lines bon;
  means percent / alpha=.05 lines scheffe;
  means percent / alpha=.05 lines tukey;
  means percent / dunnett;
run;

52 The GLM Procedure
t Confidence Intervals for y
Alpha 0.05   Error Degrees of Freedom 20   Error Mean Square 8.06
Critical Value of t (value lost in transcription)   Half Width of Confidence Interval (value lost)
trt  N  Mean  95% Confidence Limits
(individual means and limits lost in transcription)

53 The GLM Procedure
Bonferroni t Confidence Intervals for y
Alpha 0.05   Error Degrees of Freedom 20   Error Mean Square 8.06
Critical Value of t (value lost in transcription)   Half Width of Confidence Interval (value lost)
trt  N  Mean  Simultaneous 95% Confidence Limits
(individual means and limits lost in transcription)

54 t Tests (LSD) for y
NOTE: This test controls the Type I comparisonwise error rate, not the experimentwise error rate.
Alpha 0.05   Error Degrees of Freedom 20   Error Mean Square 8.06
Critical Value of t (value lost in transcription)   Least Significant Difference (value lost)
Means with the same letter are not significantly different.
t Grouping  Mean  N  trt
(grouping letters A/B/C appear in the original; means and treatment labels lost in transcription)

55 Bonferroni (Dunn) t Tests for y
NOTE: This test controls the Type I experimentwise error rate, but it generally has a higher Type II error rate than REGWQ.
Alpha 0.05   Error Degrees of Freedom 20   Error Mean Square 8.06
Critical Value of t (value lost in transcription)   Minimum Significant Difference (value lost)
Means with the same letter are not significantly different.
Bon Grouping  Mean  N  trt
(grouping letters A/B/C appear in the original; means and treatment labels lost in transcription)

56 Scheffe's Test for y
NOTE: This test controls the Type I experimentwise error rate.
Alpha 0.05   Error Degrees of Freedom 20   Error Mean Square 8.06
Critical Value of F (value lost in transcription)   Minimum Significant Difference (value lost)
Means with the same letter are not significantly different.
Scheffe Grouping  Mean  N  trt
(grouping letters A/B/C appear in the original; means and treatment labels lost in transcription)

57 Tukey's Studentized Range (HSD) Test for y
NOTE: This test controls the Type I experimentwise error rate, but it generally has a higher Type II error rate than REGWQ.
Alpha 0.05   Error Degrees of Freedom 20   Error Mean Square 8.06
Critical Value of Studentized Range (value lost in transcription)   Minimum Significant Difference (value lost)
Means with the same letter are not significantly different.
Tukey Grouping  Mean  N  trt
(grouping letters A/B/C/D appear in the original; means and treatment labels lost in transcription)

58 Dunnett's t Tests for y
NOTE: This test controls the Type I experimentwise error for comparisons of all treatments against a control.
Alpha 0.05   Error Degrees of Freedom 20   Error Mean Square 8.06
Critical Value of Dunnett's t (value lost in transcription)   Minimum Significant Difference (value lost)
Comparisons significant at the 0.05 level are indicated by ***.
trt Comparison  Difference Between Means  Simultaneous 95% Confidence Limits
(three comparisons are flagged ***; numeric values lost in transcription)
